The Wildfire Interdisciplinary Research Center (WIRC) at San Jose State University investigates California wildfires using a range of disciplinary and research approaches, including physical, socio-economic, meteorological, and geographical methods. We use tower weather stations, Doppler radars, and aerial remote sensing technologies to study wildfire and burn events. Traditional remote sensing is limited by the coarse spatial resolution of satellite imagery and by the high cost and long preparation time of aerial mapping; drones, by contrast, offer temporal flexibility and cost-effectiveness, particularly for low-altitude mapping to capture and monitor fire events. Furthermore, once a workflow has been established, drone mapping systems are convenient and economical for repeated mapping and monitoring of post-burn effects resulting from wildfires.
In October 2022, SJSU WIRC and Cal Fire conducted a coordinated Canyon Fire experiment in a selected canyon near Salinas in Northern California. The experiment was monitored using airborne IR systems, radar/lidar, and weather tower stations, while various UAV platforms and sensors collected mapping data and ground measurements. One month prior to the experiment, drones were flown twice to capture high-resolution orthomosaic imagery in both multispectral and RGB bands. A structure-from-motion (SfM) algorithm was used to derive the pre-burn surface, represented as a Digital Elevation Model (DEM) with 3 cm spatial resolution. During the canyon fire, a Matrice 200 drone carrying a Zenmuse H20T thermal camera collected real-time thermal video of the fire behavior. Post-burn data, including optical spectral imagery and a DEM, were collected after the event. This experiment provided invaluable measurements of pre- and post-fire land change patterns and during-fire dynamics using both remotely sensed data and in-situ meteorology stations.
The project generated data on fire behavior, fuel types, wildfire mapping, and vegetation time-series change for the California canyon burn site. This data includes original drone imagery from autonomous mapping, processed drone orthomosaic imagery in both RGB and multispectral bands, drone thermal mapping video, drone LiDAR point cloud mapping, the Digital Elevation Model (DEM), the Digital Surface Model (DSM), and Ground Control Points (GCPs) obtained with high-performance GPS for drone image registration. Additionally, video footage and photos were captured for media use, and auxiliary satellite/aerial data were assimilated where applicable. Field station data was also collected on vegetation coverage and fuel distribution.
We are currently processing and analyzing the time-series drone mapping and ground data, and generating Anderson fuel classification maps based on the RGB and multispectral imagery. This dataset will be combined with the during-fire and post-fire burn data to compare how the different vegetation and fuel types that burned developed during and after the fire. The pre- and post-burn DEMs will also be differenced to assess the volume change caused by the fire, and that subset will be combined with the classification results from the multispectral data to see how the burn affected the different vegetation types. Moreover, in the following year we will continue flying multispectral surveys over the area multiple times to monitor how the vegetation regrows after the fire, compared with the growing conditions of vegetation in an unburned control area.
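The two core computations described above can be sketched briefly. The following is a minimal illustration, not our production pipeline: it differences a pre- and post-burn DEM to estimate volume lost, and computes NDVI from multispectral bands to track vegetation. The tiny arrays, the 3 cm cell size applied uniformly, and the function names are illustrative assumptions; real DEM and band rasters would be loaded from the drone-derived GeoTIFFs (e.g. with rasterio) and co-registered via the GCPs first.

```python
import numpy as np

# Assumed uniform 3 cm x 3 cm pixels, matching the stated DEM resolution.
CELL_AREA_M2 = 0.03 * 0.03

def volume_change(dem_pre, dem_post, cell_area=CELL_AREA_M2):
    """Estimate volume lost (m^3) between co-registered pre/post-burn DEMs.

    Positive values mean surface height decreased (vegetation/fuel removed).
    """
    dh = np.asarray(dem_pre, float) - np.asarray(dem_post, float)
    return float(dh.sum() * cell_area)

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from multispectral bands."""
    nir = np.asarray(nir, float)
    red = np.asarray(red, float)
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids divide-by-zero

# Hypothetical toy tiles (heights in meters) standing in for real rasters.
dem_pre = np.array([[10.0, 10.2],
                    [10.1, 10.3]])
dem_post = np.array([[9.8, 10.0],
                     [10.0, 10.3]])
lost = volume_change(dem_pre, dem_post)  # 0.5 m height sum * cell area
```

Differencing time-series NDVI rasters in the same way (burned vs. control plots) would give per-pixel regrowth curves to compare against the fuel classification maps.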
See our storymap about the Canyon Fire experiment