The headline says it all — Charles testified at a hearing of the US Senate Committee on Rules and Administration earlier this week. This link will take you to his written testimony and the webcast.
Here’s a Q&A on the OUPblog that I recently did with Daniel Oberski, who has developed a helpful software package (Survey Quality Prediction) that is receiving an award at AAPOR this week.
There is a great story in the NYTimes today about new British rules related to auditing. Specifically, under the new rules:
Auditors are supposed to comment on the particular risks that companies face and to say what they did to deal with those risks.
They are supposed to discuss how much of the company they actually audited, to disclose what figure they deemed to be the lower limit for materiality [the importance/significance of an amount, transaction, or discrepancy], and to explain how they arrived at that number.
Imagine if we did this in elections! What if, in every election, we knew the particular risks that were evident in each jurisdiction — based on an audit of the jurisdiction’s election processes and procedures — and what the jurisdiction had done to mitigate those risks? It would provide excellent data on management and allow people to know how well a jurisdiction is working to minimize problems, reduce the possibility of malfeasance, and ensure elections are of the highest quality.
There’s an interesting paper now in early access at American Politics Research, by Alan Gerber, Gregory Huber, Daniel Biggers and David Hendry, “Ballot Secrecy Concerns and Voter Mobilization: New Experimental Evidence about Message Source, Context, and the Duration of Mobilization Effects.” Here’s the paper’s abstract:
Recent research finds that doubts about the integrity of the secret ballot as an institution persist among the American public. We build on this finding by providing novel field experimental evidence about how information about ballot secrecy protections can increase turnout among registered voters who had not previously voted. First, we show that a private group’s mailing designed to address secrecy concerns modestly increased turnout in the highly contested 2012 Wisconsin gubernatorial recall election. Second, we exploit this and an earlier field experiment conducted in Connecticut during the 2010 congressional midterm election season to identify the persistent effects of such messages from both governmental and non-governmental sources. Together, these results provide new evidence about how message source and campaign context affect efforts to mobilize previous non-voters by addressing secrecy concerns, as well as show that attempting to address these beliefs increases long-term participation.
My colleague Alexander Trechsel at the European University Institute and the European Union Democracy Observatory has just launched a new Voting Advice Application (VAA) for the 2014 European Parliament Elections, euandi.eu. If you are in a nation participating in these elections, check out euandi!
VAAs have proliferated in recent years, especially in European elections. They are widely used by voters, and increasingly used by researchers to study political communications, the use of new technologies in politics, voting behavior, and electoral politics. For example, I recently published a paper with Ines Levin, Alexander Trechsel and Kristjan Vassil in the Journal of Information Technology and Politics, “Voting Advice Applications: How Useful and for Whom?”. We have another paper on VAAs, in Party Politics, “Party preferences in the digital age: The impact of voting advice applications” (work that we did with the late Peter Mair).
There’s a lot of excellent new work on VAAs that has been published, or is now forthcoming. For example, Diego Garzia and Stefan Marschall have an edited volume forthcoming from ECPR Press, “Matching Voters with Parties and Candidates.” There’s much, much more; VAAs are proliferating, and many researchers are studying both their use and the data they yield.
There’s a new VTP working paper now available, “Practical End-to-End Verifiable Voting via Split-Value Representations and Randomized Partial Checking”, by Ron Rivest and Michael Rabin.
Over the years, various VTP reports and publications have discussed the need for a more vibrant and innovative voting systems industry (for a recent example, see the 2012 VTP report, Voting: What Has Changed, What Hasn’t, and What Needs Improvement). In fact, we will be discussing this in my undergraduate elections class at Caltech this week.
It was interesting to see in the news this morning that Scytl, one of the few voting systems companies, is reporting that they are receiving a $40 million investment from Paul Allen’s Vulcan Capital. That’s a pretty significant investment, at a time when there will be increasing demand for voting systems and technologies throughout the world.
The deadline for submissions for the next issue of JETS is rapidly approaching! See this URL for more information: https://www.usenix.org/jets/issues/0203/call-for-articles
The Presidential Commission on Election Administration’s report is getting a lot of attention and praise following its release on Wednesday. One aspect of the report I want to highlight is the degree to which the Commission aimed to ground their findings in the best available research, academic and otherwise. It renews my faith that it may be possible to build a field of election administration that is more technocratic than it currently is.
I want to lift up an important subset of the report’s appendix: a set of white papers written by scholars drawn from a variety of fields and perspectives, summarizing the large literatures that were relevant to the commission’s work. These papers have been assembled in one place, on the VTP web site, so that others might have easy access to them. Here are the authors and subjects:
- Barry C. Burden and Jeffrey Milyo, “The Recruitment and Training of Poll Workers: What We Know from Scholarly Research”
- Barry C. Burden and Brian J. Gaines, “Administration of Absentee Ballot Programs”
- Charles Stewart III and Daron Shaw, “Lessons from the 2012 Election Administration and Voting Survey”
- Charles Stewart III and Stephen Ansolabehere, “Waiting in Line to Vote”
- Daron Shaw and Vincent Hutchings, “Report on Provisional Ballots and American Elections”
- Lisa Schur, “Reducing Obstacles to Voting for People with Disabilities”
- Robert M. Stein, “Election Administration during Natural Disasters and Emergencies: Hurricane Sandy and the 2012 Election”
- Stephen Ansolabehere and Charles Stewart III, “Report on Registration Systems in American Elections”
- Donald S. Inbody, “Voting by Overseas Citizens and Deployed Military Personnel”
Much of this research effort was assisted by the Democracy Fund, though, of course, the research reflects the work and opinions of the authors alone. Speaking personally, I greatly appreciate the support and encouragement of the Fund through these past few months.
In the minds of some, the Presidential Commission on Election Administration was President Obama’s “long lines commission.” While that is an overly narrow description of the commission’s mandate, it identifies the most salient of the motivations behind appointing the commission — reports of voters waiting to vote in the 2012 election. In the words of President Obama, “we have to fix that.”
The commission — rightfully, in my view — didn’t weigh in with a diagnosis of what causes long lines, nor did it prescribe a magic bullet to fix them. It did pronounce that 30 minutes should be the upper bound of acceptable waiting, which, again, is defensible and achievable.
One reason for long lines is that resources are sometimes misallocated to polling places (either on Election Day or in Early Voting). The commission encourages the development of computerized tools to help local jurisdictions figure out how many resources — people, voting machines, poll books, etc. — need to be allocated to each voting location. The encouragement is so strong that a link to a collection of such tools appears on the Commission’s web site, right next to the link one clicks on to download its final report. (In addition, a little down the page, the Commission’s web site has a link to the site maintained by the Caltech/MIT Voting Technology Project that will host these tools in perpetuity, hopefully adding more as time goes by.)
I encourage people to give the tools a look. They include resource calculators developed by MIT Sloan School Professor Steve Graves, election geek Aaron Strauss, and software developer Mark Pelczarski, and various online voter registration tools developed by Rock the Vote. The tools range from efforts that have already proven themselves in past elections (the Pelczarski and RtV tools) to more notional examples that I trust will continue to develop in the coming months.
Here is the most important part of the online tool kit, in my view: Most local election officials are flying blind when it comes to knowing how many voting machines (and similar devices) they should have in order to serve their communities well, and how to spread those devices among their precincts. They will tell you, as they have told me, that they have rules they follow, based on state law and past elections. But, as far as I can tell, the reigning rules of thumb about resource allocation are unrelated to machine performance.
Most election directors in large jurisdictions, where lines were the biggest problem, could not tell you (within a reasonable degree of certainty) how many voting machines and poll books they would need to meet the commission’s 30-minute standard, because they generally don’t have access to engineering-based tools to compute the right answer.
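To make concrete what such an engineering-based calculation looks like, here is a minimal sketch (not one of the tools in the commission’s kit) that models a polling place as an M/M/c queue and uses the Erlang C formula to find the smallest number of machines that keeps the average wait under a target. The assumptions — Poisson voter arrivals, exponential service times, and the illustrative rates in the example — are simplifications of a real polling place.

```python
from math import factorial

def erlang_c(arrival_rate, service_rate, servers):
    """Probability that an arriving voter must wait (M/M/c queue)."""
    a = arrival_rate / service_rate          # offered load (in "machines" of work)
    rho = a / servers                        # utilization; must be < 1 for stability
    if rho >= 1:
        return 1.0
    top = (a ** servers / factorial(servers)) / (1 - rho)
    bottom = sum(a ** k / factorial(k) for k in range(servers)) + top
    return top / bottom

def mean_wait(arrival_rate, service_rate, servers):
    """Average queueing delay, in the same time units as the rates."""
    if arrival_rate >= servers * service_rate:
        return float("inf")                  # queue grows without bound
    p_wait = erlang_c(arrival_rate, service_rate, servers)
    return p_wait / (servers * service_rate - arrival_rate)

def machines_needed(arrival_rate, service_rate, max_wait):
    """Smallest number of machines keeping the mean wait under max_wait."""
    c = 1
    while mean_wait(arrival_rate, service_rate, c) > max_wait:
        c += 1
    return c

# Hypothetical example: 120 voters per hour arrive, and each voter takes
# about 5 minutes on a machine (service rate of 12 voters/hour/machine).
# Rates are per hour, so the 30-minute target is 0.5 hours.
print(machines_needed(arrival_rate=120, service_rate=12, max_wait=0.5))
```

Real tools must also handle time-varying arrivals (morning and evening rushes) and multiple stations in sequence (check-in, then voting), which is why average-wait rules of thumb that ignore queueing dynamics so often misallocate machines.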
To some degree, these tools do exist, and the online tool kit web site is an effort to begin collecting them. Still, even the existing tools need to be refined, in light of the needs of local election officials. It is my hope that the online tool kit site, hosted by the VTP, will be the focus of tool development in the coming years. I encourage people to give them a look, to try and improve them, and to contribute to the collection.