American Trends Panel Wave 170 Survey Methodology: Sample Design, Weighting & Response Rates (2025)

by Chief Editor

Between May 5 and May 11, 2025, the Pew Research Center’s American Trends Panel (ATP) completed Wave 170, gathering responses from 8,937 U.S. adults out of 9,531 sampled. The survey, which combined online and live‑telephone interviewing in English and Spanish, underpins a broad range of findings released later this year.

Did You Know? Since 2018 the ATP has relied on address‑based sampling drawn from the U.S. Postal Service’s Computerized Delivery Sequence File, which reaches an estimated 90%–98% of U.S. households.
Expert Insight: A 94% response rate for the field period is unusually high for a nationally representative panel, bolstering confidence in the data but also reminding analysts that the cumulative response rate—including recruitment attrition—remains low at 3%, a factor that must be accounted for when extrapolating results.
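The gap between the field-period rate and the cumulative rate comes from multiplying response rates across every stage of the panel's life. A minimal sketch of that arithmetic follows; only the field-period figures (8,937 of 9,531) and the roughly 3% cumulative rate come from the survey, while the recruitment and retention rates are hypothetical placeholders chosen for illustration:

```python
# Cumulative response rate as a product of stage-level rates.
# Only the field-period figures come from the survey; the
# recruitment and retention rates below are hypothetical.
recruitment_rate = 0.06          # hypothetical: sampled households that joined
retention_rate = 0.56            # hypothetical: recruits still active at Wave 170
field_rate = 8937 / 9531         # field-period response rate, ~94%

cumulative_rate = recruitment_rate * retention_rate * field_rate
print(f"field: {field_rate:.0%}, cumulative: {cumulative_rate:.1%}")
# field: 94%, cumulative: 3.2%
```

Even a near-perfect field rate cannot offset low recruitment and retention, which is why the cumulative figure stays in the low single digits.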

Survey Overview

Wave 170 targeted non‑institutionalized U.S. adults ages 18 and older. The final sample yields a margin of error of ±1.4 percentage points. Two respondents were removed during data‑quality checks; the figures reported here reflect the final cleaned and weighted dataset.
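The ±1.4-point margin of error is wider than simple random sampling alone would produce at this sample size, because weighting inflates variance. A quick back-of-the-envelope check, assuming the conventional 95% z-value of 1.96 and a worst-case proportion of 0.5 (standard assumptions, not stated in the source), shows the design effect the reported figure implies:

```python
import math

n = 8937             # completed interviews (from the survey)
moe_reported = 1.4   # reported margin of error, percentage points

# Margin of error for a proportion of 0.5 under simple random sampling
moe_srs = 1.96 * math.sqrt(0.5 * 0.5 / n) * 100  # ≈ 1.04 points

# Design effect implied by the reported MoE (weighting inflates variance)
deff = (moe_reported / moe_srs) ** 2
print(round(moe_srs, 2), round(deff, 2))  # 1.04 1.82
```

An implied design effect around 1.8 is typical for heavily weighted probability panels.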

Recruitment and Sampling Design

The panel has added new participants each year since 2014, using address‑based sampling (ABS) to mail cover letters and pre‑incentives to randomly selected households. The adult in each household with the next birthday is invited to join. Oversamples in 2019, 2022 and 2023 boosted representation of Hispanic, Black and Asian adults, respectively.

Questionnaire Development and Testing

The questionnaire was crafted by Pew in partnership with SSRS. Prior to launch, the web instrument was stress‑tested on both desktop and mobile platforms, and test data were run through SPSS to verify logical flow and randomization.

Incentives

All participants received a post‑paid incentive, selectable as a check or a gift code for Amazon, Target or Walmart. Incentive amounts varied from $5 to $20 based on the respondent’s estimated ease of recruitment, a strategy designed to improve participation among harder‑to‑reach groups.

Data Collection Protocol

Online respondents received mailed postcard notifications on May 5, followed by a soft launch to 60 panelists and a full launch on May 6. Email and SMS reminders were sent to non‑respondents. Telephone respondents were pre‑notified by postcard on May 2, with up to six call attempts during the field period.

Weighting Methodology

Base weights reflect each panelist’s probability of recruitment. Subsequent calibrations align the sample with population benchmarks and adjust for nonresponse, attrition and differential selection probabilities. Final weights are trimmed at the 1st and 99th percentiles to limit the influence of extreme weights and improve statistical precision.
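The calibrate-then-trim sequence can be illustrated with a toy raking (iterative proportional fitting) example on simulated data. The variables, benchmark targets, and the lognormal weights used for the trimming step below are all illustrative assumptions, not Pew's actual benchmarks or weights; only the 1st/99th-percentile trimming rule comes from the methodology itself:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000
sex = rng.integers(0, 2, n)    # two simulated sex categories
age = rng.integers(0, 3, n)    # three simulated age bands
w = np.ones(n)                 # base weights (uniform for the toy example)

# Illustrative population benchmarks, expressed as weighted counts
targets = {"sex": np.array([0.49, 0.51]) * n,
           "age": np.array([0.30, 0.40, 0.30]) * n}

# Raking: repeatedly rescale weights so each variable's weighted
# margin matches its benchmark, cycling until convergence.
for _ in range(50):
    for var, tgt in ((sex, targets["sex"]), (age, targets["age"])):
        for c, t in enumerate(tgt):
            mask = var == c
            w[mask] *= t / w[mask].sum()

# Trimming: cap weights at the 1st and 99th percentiles, then rescale
# so the total weight is unchanged. Shown on simulated lognormal
# weights, whose heavy tail makes the effect visible.
heavy = rng.lognormal(sigma=0.8, size=n)
lo, hi = np.percentile(heavy, [1, 99])
trimmed = np.clip(heavy, lo, hi)
trimmed *= heavy.sum() / trimmed.sum()
```

Trimming trades a small amount of bias for a large reduction in the variance contributed by a handful of extreme weights.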

Confirming Religious Identity

All Wave 170 participants had also answered Wave 162 (February 2025) on current and childhood religious affiliation, selecting from the following categories:

  • Protestant
  • Roman Catholic
  • Mormon (LDS)
  • Orthodox
  • Jewish
  • Muslim
  • Buddhist
  • Hindu
  • Atheist
  • Agnostic
  • Something else
  • Nothing in particular

In Wave 170, 92% confirmed both identifiers, enabling robust analysis of religious switching and related topics; the remaining 8% did not confirm.

Why It Matters

The rigor of ATP’s methodology—high field‑period response rates, layered weighting, and thorough quality checks—provides a solid foundation for interpreting the panel’s insights on public opinion, demographic trends, and religious affiliation. Researchers and policymakers rely on these data to gauge shifts in the electorate and social attitudes.

However, the low cumulative response rate (3%) underscores the challenge of maintaining a representative panel over time. Weighting mitigates bias, but analysts must remain vigilant about potential undercoverage of groups less likely to stay engaged.

What May Happen Next

Analysts could see the Center refine its ABS recruitment to capture a larger share of the population, potentially increasing the cumulative response rate. The incentive structure may be further tailored to encourage continued participation among historically low‑response demographics. Future waves might also expand testing of mobile‑first survey designs, given the high proportion of online respondents (8,720 of 8,937).

Frequently Asked Questions

What was the field‑period response rate for Wave 170?

The survey achieved a 94% response rate among the 9,531 sampled panelists during the May 5–11, 2025 field period.

How does the ATP ensure the sample reflects the U.S. adult population?

Weighting adjusts for recruitment probabilities, nonresponse, and attrition, while oversampling in selected years improves representation of underrepresented groups such as Hispanic, Black and Asian adults.

Why were respondents asked to confirm their religious identity?

Confirmation was needed because Wave 170 included many questions on religious switching that depend on accurate identification of both current and childhood religion.

