The American Trends Panel (ATP) is a national, probability-based panel of US adults fielded for the Pew Research Center by Abt SRBI. A special Diary Study was fielded November 10 through 16, 2014, with smartphone users identified in the panel. The study consisted of 14 short surveys administered twice a day for seven consecutive days, using two different self-administered approaches, or “treatments.” One treatment required panelists to download a special app to their phones and complete the surveys within it. The other was a normal Web survey, which could be completed on a mobile device, tablet, laptop or desktop computer. Because the app is compatible only with certain smartphones, eligible panelists with a compatible phone were randomly assigned to the app treatment (60%) or the normal Web survey treatment (40%), while eligible panelists with a non-compatible phone were assigned to the normal Web survey treatment. In total, 1,635 ATP members completed at least one of the 14 surveys: 938 participated by Web and 697 via the app. The survey was administered in English and Spanish. Survey weights are provided to account for differential probabilities of selection into the panel, attrition, and differential nonresponse to the Diary Study.

Data in this report are drawn from the 1,035 respondents who completed at least 10 of the 14 surveys over the course of the study period. The margin of sampling error for these 1,035 smartphone owners is plus or minus 4.0 percentage points.
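The reported margin of error reflects the design-effect adjustment described later in this report. As a rough check on the arithmetic: a simple random sample of 1,035 would carry a margin of about 3.0 points at 95% confidence, so the reported 4.0 points implies a design effect of roughly 1.7. That implied design effect is our inference, not a figure stated in the report; a minimal sketch:

```python
import math

def moe(n, p=0.5, z=1.96, deff=1.0):
    """95% margin of error for a proportion, optionally inflated
    by a design effect (deff) to reflect precision lost to weighting."""
    return z * math.sqrt(deff * p * (1 - p) / n)

n = 1035
moe_srs = moe(n)                       # simple-random-sample margin, ~0.0305
implied_deff = (0.040 / moe_srs) ** 2  # deff implied by the reported 4.0 points

print(f"SRS margin:   {moe_srs:.4f}")      # ~3.05 percentage points
print(f"implied deff: {implied_deff:.2f}") # ~1.72
```

The p=0.5 default is the conservative choice: it maximizes p(1-p) and hence the margin of error.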

Sample Design

The target population for the Diary Study was non-institutionalized smartphone owners age 18 and over living in the US, including Alaska and Hawaii. The sample consisted of smartphone users identified and recruited in Wave 8 of the ATP, which was administered using the routine panel protocol. The ATP is a probability-based panel of adults in the United States. To date, all ATP panelists have been recruited from a large (n=10,013) national, overlapping dual-frame (landline and cellphone) random-digit-dial (RDD) survey conducted for the Pew Research Center. At the end of that RDD survey, respondents were invited to join the panel. The invitation was extended to all respondents who use the internet (from any location) and to a random subsample of respondents who do not. The RDD survey was conducted from January 23 to March 16, 2014, in English and Spanish; sample for the RDD survey was obtained from SSI. Please refer to the Pew Research Center Political Typology/Polarization Survey Methodology Report for additional information on the sample design for the RDD survey.

At the start of Wave 8, the ATP featured 4,228 active panel members, and 3,181 of them completed Wave 8. The Diary Study sample consisted of ATP panelists who had internet access, reported having a smartphone in Wave 8, and consented to participate in the smartphone follow-up (Diary) study. In total 2,188 Wave 8 panelists reported having a smartphone. Of those, 42 belonged to the non-internet arm of the panel and were ineligible for the Diary Study. Of the remaining 2,146 smartphone panelists, 1,945 consented to participate in the Diary Study. Among those consenting, 1,635 completed at least one of the 14 Diary Study surveys.
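The eligibility funnel above reduces to a few rates. A quick check of the arithmetic, using only figures stated in the text (the rate labels are ours):

```python
reported_smartphone = 2188  # Wave 8 panelists reporting a smartphone
non_internet        = 42    # non-internet arm of the panel, ineligible
eligible            = reported_smartphone - non_internet  # 2,146
consented           = 1945  # consented to the Diary Study
completed_any       = 1635  # completed at least one of the 14 surveys

consent_rate    = consented / eligible       # share of eligibles consenting
completion_rate = completed_any / consented  # share of consenters participating

print(f"consent rate:    {consent_rate:.1%}")    # ~90.6%
print(f"completion rate: {completion_rate:.1%}") # ~84.1%
```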

The Diary Study was conducted using two different self-administered approaches, or “treatments.” One treatment required panelists to download a special app (SODA®) to their phones and complete the surveys within it. The other was a normal Web survey, which could be completed on a mobile device, tablet, laptop or desktop computer. SODA® is compatible only with iPhone, Android and BlackBerry devices. Eligible panelists with one of these three phone types were randomly assigned to the app treatment (60%) or the normal Web survey treatment (40%); all eligible panelists with a different type of smartphone (e.g., Windows) were assigned to the normal Web survey treatment. Among the 1,211 panelists assigned to the app treatment, 292 declined the follow-up survey invitation and were then asked if they would complete the Diary Study via normal Web surveys. Some 195 agreed to that offer.

Data Collection Protocol

ATP panelists who agreed to participate in the special week of surveys (Diary Study) were sent an email notifying them of the upcoming week of surveys on November 7, 2014. Respondents for whom we also had a residential address received a matching letter in the mail with $5 cash enclosed, while respondents without an address were emailed a $5 Amazon gift code as a pre-incentive. The data collection for the surveys was conducted from November 10-16, 2014.

An identical survey measuring smartphone usage was dispatched to each respondent twice daily for seven days, for a total of 14 surveys. All respondents were sent an email invitation. Those who agreed to receive text message reminders were sent those, and the SODA® app had an alarm feature that notified SODA respondents of the current survey’s availability. Each survey was available for two hours, and the survey invitations were dispatched at the times specified in the table below based on the respondents’ reported local time zone.

Survey Administration Contacts

Panelists received $1 for each survey they completed during the week and an additional $5 bonus for completing 10 or more of the surveys during the week. Panelists who had previously selected a method of payment received their incentive based on their check or electronic Amazon gift code preference.

Weighting

The ATP data were weighted in a multi-step process that begins with a base weight incorporating the respondents’ original survey selection probability and the fact that some panelists were subsampled for invitation to the panel. Next, an adjustment was made for the fact that the propensity to join the panel and remain an active panelist varied across different groups in the sample. The next step was a weighting cell adjustment for nonresponse to the Diary Study, since the response rate differed somewhat across the treatment groups. The final step used an iterative technique that matches gender, age, education, race, Hispanic origin, region and smartphone type to parameters for US adults who have a smartphone, drawn from the October 2014 wave of the ATP. Normally, ATP samples are calibrated to benchmarks for the US adult population. For this study, however, the target population was US adults who have a smartphone, and there are no official government statistics on the demographics of this population. The best available data came from the October 2014 wave of the American Trends Panel, which featured a national probability-based sample of 2,188 adult smartphone users.
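The report does not name the iterative technique, but matching a set of demographic margins to population targets is conventionally implemented as raking (iterative proportional fitting): weights are rescaled so each weighted margin matches its target in turn, cycling until every margin agrees. A minimal sketch with two hypothetical dimensions; the variable names and target proportions below are illustrative, not the study’s actual parameters:

```python
import numpy as np

def rake(weights, samples, targets, iters=50, tol=1e-8):
    """Iterative proportional fitting (raking): rescale weights so the
    weighted share of each category matches its target, one dimension
    at a time, cycling until every margin agrees within tol."""
    w = weights.astype(float).copy()
    for _ in range(iters):
        max_gap = 0.0
        for dim, cats in samples.items():
            total = w.sum()
            for cat, target in targets[dim].items():
                mask = cats == cat
                share = w[mask].sum() / total
                max_gap = max(max_gap, abs(share - target))
                w[mask] *= target / share
        if max_gap < tol:
            break
    return w

# Illustrative dimensions and targets (NOT the study's actual parameters).
rng = np.random.default_rng(0)
n = 1000
samples = {
    "gender": rng.choice(["m", "f"], size=n, p=[0.6, 0.4]),
    "age":    rng.choice(["18-49", "50+"], size=n, p=[0.7, 0.3]),
}
targets = {
    "gender": {"m": 0.48, "f": 0.52},
    "age":    {"18-49": 0.55, "50+": 0.45},
}
w = rake(np.ones(n), samples, targets)

# Kish approximation of the design effect induced by weight variability.
deff = n * (w ** 2).sum() / w.sum() ** 2
```

The same weights drive the design effect discussed in the next paragraph: Kish’s approximation, n·Σw²/(Σw)², gives a quick read on how much precision the weighting costs.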

The margins of error reported and statistical tests of significance are adjusted to account for the survey’s design effect, a measure of how much efficiency is lost from the weighting procedures. The Hispanic sample in the American Trends Panel is predominantly native born and English speaking. In addition to sampling error, one should bear in mind that question wording and practical difficulties in conducting surveys can introduce error or bias into the findings of opinion polls.

The following table shows the unweighted sample sizes and the error attributable to sampling that would be expected at the 95% level of confidence for different groups in the survey:

[Table: unweighted sample sizes and margins of sampling error by group]