DATA DRIVEN RESEARCH
Measuring Our Impact
We’re part of a proud tradition of progressives using data and behavioral science to hone campaign strategy. We synthesize existing research across a variety of fields and then run our own experiments to measure the impact of our tactics.
Whether we’re asking someone on the phone to rank how likely they are to vote or talking to someone at their door about what motivated them to vote most recently, we assess the effect of each program using a randomized controlled trial (RCT).
To run an RCT, we randomly allocate people in our universe to receive one of several treatments. One group of people receives no communication at all – this is the “control group.” By comparing outcomes between the people we reached out to and the people in our control group, we can measure the impact our programs had. In many cases for us, the outcome we are testing is whether or not a person voted.
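The assignment-and-comparison logic described above can be sketched in a few lines of code. This is a minimal illustration, not PTP's actual analysis pipeline: the function names, the even group split, and the simple difference-in-turnout estimate are all assumptions made for clarity.

```python
import random

def assign_groups(universe, n_treatments=2, seed=42):
    """Randomly split a universe of voter IDs into evenly sized
    treatment arms plus a no-contact control group (hypothetical sketch)."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    shuffled = universe[:]
    rng.shuffle(shuffled)
    size = len(shuffled) // (n_treatments + 1)
    groups = {"control": shuffled[:size]}
    for t in range(n_treatments):
        groups[f"treatment_{t + 1}"] = shuffled[(t + 1) * size:(t + 2) * size]
    return groups

def turnout_rate(voter_ids, voted):
    """Fraction of a group that voted, given a {voter_id: bool} lookup
    built from the voter file after the election."""
    return sum(voted[v] for v in voter_ids) / len(voter_ids)

def estimated_effect(groups, voted, arm):
    """Estimated treatment effect: turnout in a treatment arm minus
    turnout in the control group, in percentage points."""
    return 100 * (turnout_rate(groups[arm], voted)
                  - turnout_rate(groups["control"], voted))
```

Because assignment is random, any systematic difference in turnout between an arm and the control group can be attributed to the outreach itself rather than to pre-existing differences between the groups.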
In our experimental work, we’re trying to answer three main questions:
How much can we boost turnout with all available tools?
During the 2018 midterms, PTP deployed our scientifically tested tactics to more districts than ever, helping elect 36 new Democrats to the House and 2 Democratic Senators. We did this by investing in over 104 races: running 18 intensive field programs in battleground Congressional districts, sending mail to 36 districts, and running targeted digital ads in 30 districts. We reached more people than ever: we had 171,404 conversations at the doors, sent 3.8 million mail pieces, and served more than 89.4 million digital ad impressions. On average, our field programs boosted turnout by 11.01%, helping achieve the highest midterm turnout since the 1960s.
How effective are different methods of contacting voters?
In 2017, we sent 25 staffers to Virginia to help flip the Statehouse and elect a Democratic Governor. With so many voters to turn out, we wanted to use every available method to reach them, and we had the opportunity to test the effectiveness of each method both on its own and in combination with other tactics. While it’s clear that reaching voters at the door is the most effective method on its own, we saw powerful increases in turnout when voters received targeted mail or phone calls, and even bigger effects when those came as a follow-up to a conversation at the door. We took these results into 2018 and expanded our programs even further.
Do different tools work better on certain people than others?
In 2018, we expanded our digital programs and served up almost 90 million digital ad impressions to 1.4 million target voters as part of an experiment to further develop our understanding of what messages and tactics resonate with different groups of Democratic voters. We’re still digging into these results and are excited to use our findings to run even smarter digital programs in 2020!
While field programs remain our flagship, we have seen results from various forms of media, and we will continue deploying and analyzing these additional methods to determine the most successful and cost-effective means of contacting voters. Read more about our data-driven projects here.
To further our program, we have partnered with researchers at the University of Chicago and Northwestern University to learn how we can make our programs more effective.
With the University of Chicago’s Becker Friedman Institute, we are studying how best to formulate nudges to deploy across all aspects of our programs. Nudges are indirect prompts that encourage people to vote.
With Northwestern University’s Kellogg School of Management, we are reviewing our 2018 data to identify subgroups that responded differently to our treatments than others in order to better focus our efforts in future elections. We’re also evaluating the best qualities of our field staff in an attempt to replicate as many of these qualities as possible when designing and staffing future programs.