Introduction
A large body of research examines the variety of ways in which natural disasters, particularly urban wildfires (which are expected to increase in the coming years due to climate change), impact diverse populations. Previous studies investigate the relationship between exposure or proximity to a natural disaster and self-reports of negative mental health symptoms, physical illness, and financial strain (McDermott et al. 2005; Finlay et al. 2012; Felix and Afifi 2015; Reisen et al. 2015; Reid et al. 2016; Brown et al. 2019; Agyapong et al. 2020; Belleville et al. 2021).
While these results are important, many are based on local surveys fielded in the aftermath of a natural disaster. As a result, they leave unaddressed the possibility that the natural disaster itself may have affected survey response rates and biased estimates of impact (van den Berg et al. 2005). The bias resulting from selective response based on disaster experience could cut two ways. The first possibility is that unaffected individuals could be less interested in a survey focused on disaster impacts if they feel they have little to contribute. For example, in the aftermath of an oil spill, Foster et al. (1995) interview non-responders and find that they did not think their health was or would be affected and thus did not think the study was useful. Likewise, van den Berg et al. (2005) find that individuals were more likely to participate in post-disaster studies if they had psychological problems, as they “may have attributed their health problems to the disaster,” motivating their participation in the study (p. 785).
Bias may also emerge if the most disaster-affected individuals are too stressed, busy, or imperiled to take time to answer a survey fielded shortly after the natural disaster. For instance, van den Berg et al. (2007) note that when studying post-disaster mental wellbeing, “highly exposed or distressed survivors would be less likely to respond because they do not want to be reminded of the stressful event.” Several studies have found that those with worse mental health problems in the aftermath of a disaster were more likely to attrit from post-disaster survey projects (Horikoshi et al. 2017; Dougall et al. 2000; Benight et al. 1999; Weisæth 1989). These outcomes may extend to physical health as well, with participants who are less healthy at the baseline post-disaster survey being less likely to respond to future waves (Yu et al. 2015).
This literature, while providing important but conflicting accounts of selective attrition in post-disaster surveys, is still largely unable to estimate how disaster exposure itself impacts participation and response rates, because the characteristics (including disaster exposure) of those who do not participate in any post-disaster survey remain unknown. Existing literature is largely limited to survey projects that begin only after the disaster. As such, it is unclear, for example, whether disaster exposure is what causes those with more psychological symptoms to attrit from these studies, or whether those with mental health symptoms are simply more likely to attrit from surveys in general (Yu et al. 2022). Grievink et al. (2006) field survey waves both before and after a man-made disaster; however, the surveys were fielded one year prior to and one year after the event and are therefore unable to speak to post-disaster surveys fielded in the immediate aftermath. Bourque et al. (1997) field surveys shortly after earthquake occurrence, but they do not have data on the same respondents prior to the event. As such, the degree to which impacted populations are less likely to respond after a natural disaster remains an open question, and one that has ramifications for social science research, governments, and aid organizations that rely on post-disaster surveys to evaluate disaster impacts.
To study the potential impact of natural disasters on participant survey response rates, we match geocoded measures of wildfire exposure with longitudinal panel data from Los Angeles County to examine changes in survey response before and after the January 2025 LA County Wildfires, which occurred primarily in the Palisades and Eaton Canyon areas. While we find evidence that the January 2025 LA County Wildfires differentially impacted the survey response rates of those living in more affected areas of the County (i.e., areas in closer proximity to the burn zones or under evacuation orders or warnings), these effects are small and relatively short-lived. We also find that respondents living in more exposed areas exhibited lower response rates in both the pre-wildfire and post-wildfire periods, reinforcing the importance of utilizing pre- and post-wildfire longitudinal data so as not to conflate the effects of the wildfire with the effects of pre-wildfire population characteristics.
Data and Methods
We use longitudinal data from the Understanding America Study (UAS) and its LA County subpanel, LABarometer. The UAS is a nationally representative probability-based internet panel of about 15,000 US adults. Participants in the panel are recruited via address-based sampling and are paid about $10 for each 15-minute survey they complete.[1] The panel has been in existence since 2014 and includes a wealth of data on participant demographics, mental and physical wellbeing, and cognition, among other variables. Since October 2023, all UAS members have been invited to complete the UAS Monthly Panel Survey series, which fields surveys every month eliciting information on significant life events and on changes in health, mental health, life satisfaction, pain, stress, and other topics.[2] The UAS contains an oversample of about 2,000 LA County residents that constitutes the subpanel LABarometer. Members of the LABarometer are invited to take surveys as part of the UAS Monthly Panel Survey series in addition to receiving surveys on LA County-specific topics. As such, we can track survey response rates to the UAS Monthly Panel Survey series for all respondents living in LA County. For this project, we examine response rates to the UAS Monthly Panel Survey series beginning in October 2023 and ending in November 2025. Response rates range from 64.8% to 82.1% across waves. The invitation text, incentives, and most of the questionnaire remained the same over the study period. Questions were periodically added or removed from survey waves, but the survey structure remained largely the same. Questionnaire length varied from two to seven minutes on average (in testing), and all surveys from wave three onward ranged from four to seven minutes, with no relationship between wave number and average time to completion.
To examine how a natural disaster impacts response rates to the monthly surveys, we examine response rates both before and after the 2025 LA County Wildfires. The Los Angeles County Palisades and Eaton wildfires of January 2025 were two of the largest wildfires in California history – burning over 35,000 acres, destroying over 16,000 structures, displacing over 200,000 residents, killing 30 people, and causing billions of dollars in damage (Roger 2025; Phillips 2025). The wildfires began on January 7, 2025, and were not fully contained until January 31, 2025. We match census tract-level wildfire data to LABarometer panelists, including the Euclidean distance from the centroid of their census tract to the nearest burn zone perimeter (Distance), the share of their census tract that was within an evacuation warning zone (Evacuation Warning), and the share of their census tract that was within an evacuation order zone (Evacuation Order).[3] The evacuation variables and distance from burn zone perimeters are sourced from the Los Angeles County Enterprise Geographic Information Systems (GIS) data hub (Graham 2025; County of Los Angeles 2025). Combined with longitudinal response rates, the wildfire data allow us to see whether response rates declined after the start of the wildfires and whether any decline was more pronounced among respondents residing in more heavily impacted census tracts.
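As a minimal sketch of this matching step, the tract-level merge and the log-distance transform described above might look as follows. The GEOIDs, field names, and exposure values here are purely illustrative, not drawn from the actual LA County GIS files:

```python
import math

# Hypothetical tract-level wildfire exposure records (illustrative values only)
tract_exposure = {
    "06037101110": {"distance_m": 0.0,    "evac_order": 0.99, "evac_warning": 1.00},
    "06037207300": {"distance_m": 8500.0, "evac_order": 0.00, "evac_warning": 0.15},
}

# Hypothetical panelist records keyed by a census tract identifier
panelists = [
    {"id": 1, "tract": "06037101110"},
    {"id": 2, "tract": "06037207300"},
]

def attach_exposure(panelist, exposure):
    """Merge tract-level exposure onto a panelist record and log-transform
    distance as ln(Distance + 1), as described in the text."""
    e = exposure[panelist["tract"]]
    return {
        **panelist,
        "log_distance": math.log(e["distance_m"] + 1),
        "evac_order": e["evac_order"],
        "evac_warning": e["evac_warning"],
    }

merged = [attach_exposure(p, tract_exposure) for p in panelists]
```

The `+ 1` inside the logarithm keeps the transform defined for tracts whose centroid lies on the burn perimeter (distance zero), which would otherwise produce an undefined log.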
We estimate within-individual changes in the response rate between waves using a series of fixed-effects logistic regression models.[4] Interactions between Post-Fire (coded 1 for waves fielded on or after January 7, 2025) or Wave (month and year) and the exposure measures are included in separate models to estimate how the relationship between wave and survey response changes with wildfire exposure. The constituent exposure terms (Distance, Evacuation Warning, and Evacuation Order) are time-invariant across waves and are therefore absorbed by the individual fixed effects and dropped from the interactive models. However, the differential effect of individual waves on response across these variables can be inferred from the interaction terms retained by the model.[5] Standard errors are clustered by individual participant.
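Why the time-invariant constituent terms drop can be illustrated with the linear within transformation (the fixed-effects logit is estimated via a conditional likelihood rather than by demeaning, but the intuition is the same). The toy values below are hypothetical:

```python
def demean_within(values):
    """Subtract the within-individual mean (the 'within' transformation)."""
    m = sum(values) / len(values)
    return [v - m for v in values]

# One hypothetical panelist observed over four waves: two pre-fire, two post-fire.
post_fire = [0, 0, 1, 1]            # time-varying indicator
evac_order = [0.5, 0.5, 0.5, 0.5]   # fixed across waves for a given individual
interaction = [p * e for p, e in zip(post_fire, evac_order)]

# The constituent exposure term is constant within an individual, so demeaning
# zeroes it out -- it is absorbed by the individual fixed effect...
demeaned_exposure = demean_within(evac_order)

# ...but the Post-Fire x Evac Order interaction varies over time and survives.
demeaned_interaction = demean_within(interaction)
```

Because the interaction term retains within-individual variation, its coefficient remains identified even though the level of exposure itself cannot be estimated.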
Results
Presented in Table 1 are the odds-ratio results of four fixed-effects regression models in which the wildfire variables (Distance, Evacuation Warning, Evacuation Order) are interacted with the variable Post-Fire. The first column presents a simple additive model. The results show an odds ratio of .495 for Post-Fire, indicating that respondents invited to a survey after the fires had roughly half the odds of responding as in pre-fire waves. There is some limited evidence that proximity and exposure to the wildfires differentially impacted response rates. In column 2, the interaction term Post-Fire x Distance is positive and marginally significant (p < .1), indicating that the effect of Post-Fire becomes more positive as Distance increases. For every one-unit increase in the logged distance of the respondent's census tract from the burn perimeter, the post-wildfire odds of responding increase by 14% relative to census tracts within the burn zone. The interactions Post-Fire x Evac Order and Post-Fire x Evac Warning in columns 3 and 4, respectively, are both negative (odds ratios below 1), although only the latter is significant, indicating a greater negative effect of Post-Fire as the value of Evac Warning increases. For every one-unit increase in the logged proportion of the census tract under evacuation warning, the odds of responding to the survey decrease by 17% relative to a census tract with no areas under evacuation warning.
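Because the logit model is multiplicative in odds, the combined post-fire odds ratio at a given exposure level is the product of the Post-Fire main effect and the interaction odds ratio raised to the exposure value. A short illustration using the estimates reported above (0.495 for Post-Fire and approximately 1.14 for the Distance interaction; the distance values plugged in are hypothetical):

```python
OR_POST = 0.495        # Post-Fire main-effect odds ratio (Table 1)
OR_POST_X_DIST = 1.14  # Post-Fire x Distance interaction (14% per log unit)

def postfire_odds_ratio(log_distance, base=OR_POST, interaction=OR_POST_X_DIST):
    """Combined post-fire odds ratio for a tract at a given logged distance
    from the burn perimeter."""
    return base * interaction ** log_distance

# At the burn perimeter (logged distance = 0) the full post-fire penalty
# applies; farther away, the penalty shrinks toward no effect.
near = postfire_odds_ratio(0.0)
far = postfire_odds_ratio(5.0)
```

Under this arithmetic, a tract five log-distance units from the perimeter has a combined post-fire odds ratio much closer to 1 than a tract at the perimeter itself.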
Presented in Table 2 are the results of the fixed-effects models when the same measures of wildfire proximity and exposure are interacted with Wave (the month and year in which the monthly survey was fielded). December 2024 is the excluded (reference) month. While Post-Fire had a negative effect in the previous models, the additive model in column 1 shows no single Wave month in 2025 in which respondents were less likely to respond. However, subsequent models show that this was not the case for those residing in areas most exposed to the January 2025 wildfires. While interactions between the wave variables and Distance are insignificant (column 2), column 3 shows negative and significant interactions for Jan 2025 x Evac Order (OR = 0.770) and May 2025 x Evac Order (OR = .621), indicating decreased odds of responding to the survey in those months as the proportion of the resident's census tract under evacuation order increased. Likewise, the model in column 4 shows negative and significant interactions for Feb 2025 x Evac Warning (OR = 0.800) as well as Oct 2025 x Evac Warning (OR = .750) and Nov 2025 x Evac Warning (OR = .705), implying that residents living where a higher proportion of the census tract was under evacuation warning were less likely to respond in those months. In neither of these models (columns 3 and 4) was there any effect of Evac Order or Evac Warning prior to 2025 – indicating that differences only emerged after the January 2025 wildfires.
While predicted probabilities from fixed-effects models are not substantively interpretable due to the non-estimation of the fixed effects, the substantive effect of wildfire exposure on response rates can be demonstrated by deriving predicted probabilities from a random-effects model (Stammann et al. 2016; Abrevaya and Hsu 2021).[6] Presented in Table 3 are the predicted response rates in January 2024 and January 2025 for an individual residing in a census tract under no evacuation order (Evac Order = 0) and a census tract that is nearly fully under evacuation order (Evac Order = .99). Response rates were initially higher in evacuation order areas but declined by a predicted 18 percentage points, nearly double the predicted decline in census tracts under no evacuation orders.
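One reason percentage-point comparisons require predicted probabilities rather than odds ratios alone is that the same odds ratio implies different percentage-point declines at different baseline response rates. A brief illustration (the baseline rates and the 0.5 odds ratio below are hypothetical, chosen only to show the mechanics):

```python
def apply_odds_ratio(p, odds_ratio):
    """Shift a baseline response probability by a multiplicative change
    in the odds, then convert back to a probability."""
    odds = p / (1.0 - p) * odds_ratio
    return odds / (1.0 + odds)

# The same halving of the odds produces a larger percentage-point decline
# at a moderate baseline than at a high one.
drop_from_high_baseline = 0.90 - apply_odds_ratio(0.90, 0.5)  # ~8 points
drop_from_mid_baseline = 0.60 - apply_odds_ratio(0.60, 0.5)   # ~17 points
```

This is why the paper reports Table 3 in predicted percentage points: groups that start from different baseline response rates can experience very different absolute declines even under similar odds ratios.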
Discussion and Conclusion
Previous studies of response rates in the wake of a natural disaster have typically focused only on attrition within post-disaster or post-event survey projects. Few studies have examined how post-disaster response rates change with exposure to the disaster itself. This paper contributes to the literature on the effects of natural disasters and wildfires by comparing response rates before and after a natural disaster and by examining how exposure to the disaster may influence changes in response rates.
While we find little evidence that response rates changed significantly in LA County after the 2025 wildfires overall, we do find some evidence that response rates declined in post-wildfire months among respondents residing in census tracts in closer proximity to the burn zone, under evacuation orders, or under evacuation warnings. However, results are generally weak and inconsistent across models. On average, those residing in census tracts in closer proximity to the fires have lower odds of responding in post-wildfire months. However, the second set of models shows that, among those residing in closer proximity to the January 2025 wildfires, in no single month were they less likely to respond in 2025 compared to December 2024. Likewise, while those residing in census tracts with a higher proportion under evacuation order were less likely to respond in January 2025 and May 2025, on average they were not less likely to respond to the survey in post-wildfire months as a whole. Evacuation warning was the only wildfire-related variable whose effect was consistent across specifications. Even so, in the wave-level models the effect was only significant in February, October, and November 2025.
These results have implications for survey researchers trying to study the effects of natural disasters. Post-disaster surveys fielded in the first months after the event may suffer lower response rates among those who are more affected. However, these effects seem to be relatively short-lived or inconsistent across post-disaster waves. A limitation of this study is that it applies only to a large urban wildfire. Although destructive, the number of panel members whose homes were damaged or destroyed in the Palisades and Eaton wildfires was low (n = 36). Other natural disasters, such as extreme flooding or a hurricane, may lead to longer-term and more widespread displacement or upheaval. As a result, the effects of other types of natural disasters on survey non-response may be longer lasting. One major caveat is that this study uses a longitudinal internet panel in which respondents can take surveys on any internet-connected device, allowing them to participate even while displaced from their homes. The same would of course not be true of in-person survey modes. Researchers fielding disaster impact surveys soon after the event should consider re-contacting non-responders in the following months to re-ask questions from the prior wave.
Corresponding author contact information
Evan W. Sandlin
635 Downey Way, VPD, Los Angeles, CA 90089
esandlin@usc.edu
[1] For complete UAS recruitment methodology please see: https://uasdata.usc.edu/index.php/methodology.
[2] Support for the Monthly Panel Survey series was provided by the Social Security Administration (SSA) and the National Institute on Aging. The project was reviewed and approved by the Biomedical Research Alliance of New York (BRANY) Institutional Review Board (IRB# 22-030-1044).
[3] The variable is the natural log of the distance from the burn zone perimeter: ln(Distance + 1).
[4] Fixed-effects models are commonly used to analyze repeated panel observations within individuals (Brüderl and Ludwig 2015).
[5] A random-effects model with controls for demographics is included in the appendix. The results are nearly identical to those using fixed effects.
[6] The random-effects models, which produce nearly identical results, can be found in the supplementary appendix (Table A1).