Sunday 31 July 2016

Methodology

This report is drawn from two surveys conducted as part of the American Trends Panel (ATP), created by Pew Research Center, a nationally representative panel of randomly selected U.S. adults living in households. Respondents who self-identify as internet users and who provided an email address participate in the panel via monthly self-administered web surveys; those who do not use the internet or who declined to provide an email address participate by mail. The panel is managed by Abt SRBI.


Data in this report are drawn primarily from the March wave of the panel, conducted March 2-28, 2016, among 4,726 respondents (4,243 by web and 483 by mail). The margin of sampling error for the full sample of 4,726 respondents from the March wave is plus or minus 2.2 percentage points.
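For a sense of where a figure like plus or minus 2.2 points comes from, the sketch below applies the standard design-effect-adjusted margin-of-error formula. The design effect value here is an assumption backed out of the reported numbers, not a figure published in this report.

```python
import math

def margin_of_error(n, deff=1.0, p=0.5, z=1.96):
    """Margin of sampling error in percentage points for a proportion p,
    sample size n and design effect deff, at the 95% confidence level."""
    return 100 * z * math.sqrt(deff * p * (1 - p) / n)

# Simple random sampling would give roughly +/-1.4 points for n = 4,726;
# the reported +/-2.2 points implies a design effect near 2.4, reflecting
# the variance added by weighting. The deff value is inferred, not published.
print(round(margin_of_error(4726), 1))             # 1.4 (unweighted)
print(round(margin_of_error(4726, deff=2.38), 1))  # 2.2 (weighted)
```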


Members of the American Trends Panel were recruited from two large, national landline and cellphone random-digit dial (RDD) surveys conducted in English and Spanish. At the end of each survey, respondents were invited to join the panel. The first group of panelists was recruited from the 2014 Political Polarization and Typology Survey, conducted Jan. 23 to March 16, 2014. Of the 10,013 adults interviewed, 9,809 were invited to take part in the panel and a total of 5,338 agreed to participate.15 The second group of panelists was recruited from the 2015 Survey on Government, conducted Aug. 27 to Oct. 4, 2015. All 6,004 adults interviewed were invited to join the panel, and 2,976 agreed to participate.16


Participating panelists provided either a mailing address or an email address to which a welcome packet, a monetary incentive and future survey invitations could be sent. Panelists also receive a small monetary incentive after participating in each wave of the survey.


The ATP data were weighted in a multistep process that begins with a base weight incorporating the respondents’ original survey selection probability and the fact that in 2014 some panelists were subsampled for invitation to the panel. Next, an adjustment was made for the fact that the propensity to join the panel and remain an active panelist varied across different groups in the sample. The final step in the weighting uses an iterative technique that matches gender, age, education, race, Hispanic origin and region to parameters from the U.S. Census Bureau’s 2014 American Community Survey. Population density is weighted to match the 2010 U.S. Decennial Census.
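The iterative matching step described above is commonly implemented as raking (iterative proportional fitting). The following is a minimal sketch of that technique under assumed benchmark shares; it is illustrative only and does not reproduce Pew Research Center's actual weighting code, variable names or targets.

```python
import numpy as np

def rake(weights, categories, targets, n_iter=50, tol=1e-6):
    """Raking (iterative proportional fitting): repeatedly rescale the
    weights so each dimension's weighted category shares match its
    population targets, cycling until the adjustments become negligible."""
    w = weights.astype(float).copy()
    for _ in range(n_iter):
        max_shift = 0.0
        for dim, codes in categories.items():
            total = w.sum()
            for code, share in targets[dim].items():
                mask = codes == code
                current = w[mask].sum() / total
                if current > 0:
                    factor = share / current
                    w[mask] *= factor
                    max_shift = max(max_shift, abs(factor - 1.0))
        if max_shift < tol:
            break
    return w

# Hypothetical example: rake 1,000 unit base weights to assumed
# gender (52/48) and age-group (30/40/30) benchmarks.
rng = np.random.default_rng(0)
cats = {"gender": rng.choice([0, 1], size=1000, p=[0.45, 0.55]),
        "age": rng.choice([0, 1, 2], size=1000, p=[0.25, 0.45, 0.30])}
tgts = {"gender": {0: 0.52, 1: 0.48},
        "age": {0: 0.30, 1: 0.40, 2: 0.30}}
w = rake(np.ones(1000), cats, tgts)
```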


Telephone service is weighted to estimates of telephone coverage for 2016 that were projected from the January-June 2015 National Health Interview Survey. Volunteerism is weighted to match the 2013 Current Population Survey Volunteer Supplement. Internet access is adjusted using a measure from the 2015 Survey on Government. Frequency of internet use is weighted to an estimate of daily internet use projected to 2016 from the 2013 Current Population Survey Computer and Internet Use Supplement. The weighting also adjusts for party affiliation using an average of the three most recent Pew Research Center general public telephone surveys. Sampling errors and statistical tests of significance take into account the effect of weighting. Interviews are conducted in both English and Spanish, but the Hispanic sample in the American Trends Panel is predominantly native born and English speaking.


The margins of error tables show the unweighted sample sizes and the error attributable to sampling that would be expected at the 95% level of confidence for different groups in the survey. Sample sizes and sampling errors for other subgroups are available upon request.


In addition to sampling error, one should bear in mind that question wording and practical difficulties in conducting surveys can introduce error or bias into the findings of opinion polls.


The web component of the March wave had a response rate of 68% (4,243 responses among 6,267 web-based individuals in the panel); the mail component had a response rate of 68% (483 responses among 710 non-web individuals in the panel). Taking account of the combined, weighted response rate for the recruitment surveys (10.0%) and attrition from panel members who were removed at their request or for inactivity, the cumulative response rate for the March ATP wave is 3%.17
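The cumulative figure is, roughly, the product of the stage-by-stage rates. A minimal sketch of that arithmetic follows; the panel-retention share is an illustrative assumption backed out of the reported 3%, not a figure published in this report.

```python
recruitment_rr = 0.100  # combined, weighted recruitment response rate
retention = 0.44        # assumed share of recruits still active (illustrative)
wave_rr = 0.68          # March wave response rate (web and mail components)
print(f"{recruitment_rr * retention * wave_rr:.0%}")  # ~3%
```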


Additional survey data


Some data in this report are also drawn from the April wave of the same panel, conducted April 5-May 2, 2016, among 4,685 respondents (4,207 by web and 478 by mail). The margin of sampling error for the full sample of 4,685 respondents from the April wave is plus or minus 2.2 percentage points. Sample sizes and sampling errors for subgroups in this wave are available upon request.


The web component of the April wave had a response rate of 83% (4,207 responses among 5,091 web-based individuals in the panel); the mail component had a response rate of 77% (478 responses among 625 non-web individuals in the panel). Taking account of the combined, weighted response rate for the recruitment surveys (10.0%) and attrition from panel members who were removed at their request or for inactivity, the cumulative response rate for the April ATP wave is 3%.18


Questionnaire development and testing


Pew Research Center developed the questionnaire for this study. The design of the questionnaire was informed by the results of six focus groups and additional pretests with a non-probability sample, as well as input from Pew Research Center staff and six external advisers on the project.


Focus groups. Pew Research Center conducted a series of six focus groups around the country from Jan. 19-Feb. 4, 2016, designed to gain insight into Americans’ reasoning about the possibility of human enhancements. The groups considered longer versions of the three scenarios presented in the national adult survey: gene editing to reduce disease risk, brain chip implants to improve cognitive abilities and synthetic blood substitutes to improve physical abilities. This is not an exhaustive list of potential human enhancements. The discussions centered on the kinds of moral and practical considerations people bring to bear in thinking about these issues. The moderators asked participants to consider the potential use of these enhancements for healthy people, not those who are sick or in need. See “American Voices on Ways Human Enhancement Could Shape Our Future” for further details on the focus groups. [LINK TK]


Pilot testing questions. Pew Research Center conducted 17 online, non-probability surveys to test question wording options for the questionnaire. These pilot tests were also used to test the information presented about each of the three types of human enhancement. The pilot tests were completed from January through February 2016. Each survey had an average of 100 respondents, ages 18 and older, and was conducted entirely online. Each individual pilot test addressed a single type of human enhancement (e.g., gene editing) and included only a short set of about 10 questions.


Outside advisers. Pew Research Center also consulted with a number of expert advisers, listed in the acknowledgements section above, to inform the development of the questionnaire, including the scenarios or vignettes describing each type of human enhancement. We are grateful to this group for their input, but Pew Research Center bears full responsibility for the questionnaire design and analysis.


Religious commitment index


Survey respondents were classified into high, medium and low levels of religious commitment based on three indicators: frequency of religious service attendance, self-reported importance of religion in their lives and frequency of prayer. Those who attend worship services at least weekly, pray at least once a day and say religion is very important in their lives are classified as high in religious commitment. Those low in commitment say religion is not too or not at all important in their lives, seldom or never attend worship services and seldom or never pray. All others are classified as exhibiting a medium level of religious commitment.
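Expressed as a decision rule, the classification looks roughly like the sketch below. The category labels are hypothetical stand-ins; the report does not publish its response codes.

```python
def religious_commitment(attendance, importance, prayer):
    """Classify a respondent's religious commitment from the three
    indicators described above. Category labels are hypothetical."""
    if attendance == "weekly+" and prayer == "daily+" and importance == "very":
        return "high"
    if (importance in ("not too", "not at all")
            and attendance == "seldom/never"
            and prayer == "seldom/never"):
        return "low"
    return "medium"

print(religious_commitment("weekly+", "very", "daily+"))                 # high
print(religious_commitment("sometimes", "somewhat", "daily+"))           # medium
print(religious_commitment("seldom/never", "not too", "seldom/never"))   # low
```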



  15. When data collection for the 2014 Political Polarization and Typology Survey began, non-internet users were subsampled at a rate of 25%, but a decision was made shortly thereafter to invite all non-internet users to join. In total, 83% of non-internet users were invited to join the panel.

  16. Respondents to the 2014 Political Polarization and Typology Survey who indicated that they are internet users but refused to provide an email address were initially permitted to participate in the American Trends Panel by mail, but were no longer permitted to join the panel after Feb. 6, 2014. Internet users from the 2015 Survey on Government who refused to provide an email address were not permitted to join the panel.

  17. Approximately once per year, panelists who have not participated in multiple consecutive waves are removed from the panel. These cases are counted in the denominator of cumulative response rates.

  18. Approximately once per year, panelists who have not participated in multiple consecutive waves are removed from the panel. These cases are counted in the denominator of cumulative response rates.





Source: http://www.pewresearch.org/feed/


