Author: Jeremy Candelas

Imagine you had the chance to find out what Americans thought about almost any conceivable topic. Perhaps you’d want to know how Americans felt about aggression on social media. Maybe you’d be curious about how opinions on U.S. relations with China have recently evolved. Or how Americans viewed the quality of their education?

Questions like these saturate the discussions in GOVT/COMM 3189: “Taking America’s Pulse.”

The class is relatively new at Cornell, taught by Communication Professor Jon Schuldt and Government Professor Peter Enns. I stumbled upon it during the enrollment period and was intrigued by the course description, which promised students the chance to “design, conduct, and analyze a national-level public opinion survey.” Being the political junkie that I am, I immediately jumped on the opportunity. The general concept appealed to me – a free weekly lunch with the professors was just an additional perk.

The survey is designed and run entirely by the students and covers the issues that matter to them – one question per student. Students with similar interests tend to gravitate toward related questions: some may be intrigued by certain aspects of social media, while others may be drawn to topics such as health or education.

Slope Media | Photographer: Jeremy Candelas

To put the magnitude of this opportunity into context, Professor Peter Enns insists that each question asked “is worth at least a thousand dollars,” going on to describe the survey as “a $45,000 endeavor.”

For my own research question, I plan to explore how content individuals are with the level of formal education they have received, and how that satisfaction varies across demographic groups such as gender and ethnicity.

Questions are grouped together according to similarity so that the survey will flow smoothly for both the interviewers and the respondents. Being able to participate in such a large project is no small opportunity, and the ability for each student to choose his or her own question is even more extraordinary.

To say that the course is fast-paced is an understatement. We drafted our questions and response options within the first month; they then went through numerous peer reviews before a final submission to the Cornell Survey Research Institute.

Once the survey was submitted, we began training through the Cornell Institutional Review Board for Human Subjects Research. Now that our training is complete, we’ve begun conducting the survey through the Cornell Survey Research Institute, and will continue to do so throughout the month of March.

Professor Peter Enns | Photo courtesy of Cornell.edu

Surveys are conducted entirely over the phone, with numbers dialed at random from a database of verified phone numbers. On our own time, and in the time allotted for weekly discussion sections, we are expected to complete at least five telephone surveys at the off-campus research center, with each survey taking about 15 to 20 minutes. However, it can take over an hour before an individual agrees to complete the survey. Occasionally, it takes even longer. This past Saturday, I called from 1:00 until 6:00 with no completions.

When it comes to calling individuals, students can encounter several obstacles. For example, if no one answers the randomly dialed number, we cannot simply move on to the next one; the number is marked for a callback about a week later, and students must continue trying to make contact. Other people simply hang up, mistaking us for telemarketers. Again, students are instructed to follow up with these individuals at a later point and attempt to explain the project in hopes they will change their minds.

Professor Jon Schuldt | Photo courtesy of Cornell.edu

When it comes to successfully completing surveys, Professor Schuldt recommends that students constantly try to “build rapport” with the respondents. In other words, they should encourage the individual throughout the survey while being careful not to seem biased.

The survey concludes at the end of March, at which point the data will be collected and delivered to the class by the Cornell Survey Research Institute. Our next step is to analyze the data as a class and try to understand how the responses relate to each student’s question of interest.

In April, following the data analysis, we are expected to write an op-ed on our results and submit it to various publications in hopes of getting it published. At the end of the semester, we present our findings.

All in all, this class is truly a unique and extraordinary experience. Like most of the class and the folks at the research institute, I am eager to see what we uncover through our survey and the subsequent data analysis. Professor Schuldt and Professor Enns are passionate and enthusiastic about the course, striving to help each student succeed in their research. For anyone interested in politics, polling, or policy, I highly recommend considering this class.