It’s nice to know that others at the “Voting Systems Testing Summit”, held at the beginning of this week in Sacramento, agreed with my conclusion that there is a growing sentiment for states (especially California) to play a stronger and perhaps leading role in the testing of voting systems. Writing in this morning’s Oakland Tribune, Ian Hoffman (who was at the summit) summarized the current situation of voting system testing in this story lead:
For 11 years, most states have relied on voting systems tested to minimal federal standards, the results withheld from public scrutiny and given the green light by a nongovernmental agency working on a shoestring budget.
After much more detail regarding the current situation, and comments from some participants in the summit, Hoffman noted later in his story that:
But states such as California are moving toward more rigorous testing, perhaps in league with other states. To explore those ideas, California’s Secretary of State Bruce McPherson hosted a conference on voting-systems testing that was the largest West Coast gathering of major players in the debate on voting technology in at least two years.
California uses both the national testing and its own, which under McPherson has grown in rigor.
His office now requires every voting-system maker to supply dozens of machines for a massive, mock election to ferret out manufacturing or reliability problems. McPherson also has agreed to let a computer expert try hacking into a Diebold system, and state officials are weighing whether to require the same kind of security testing for all voting systems.
No doubt, California is taking the lead in implementing a wide set of testing protocols, especially for electronic precinct voting machines. Developing a more coherent process of testing in California is a good idea, especially if it can combine the testing protocols now being used into one program that all devices used for balloting and tabulation (precinct, absentee and early; electronic and optical scanning) must go through. And then we need to get going on testing protocols for other parts of the election process, especially the procedures and technologies used for new statewide voter registration systems.
This was my argument, made during my presentation to the summit. I began by noting that security planning and testing must focus on the entire voting system, end-to-end. I then discussed both security and contingency plans, and the need for new and more comprehensive testing practices (especially field tests, and testing for other important aspects of voting system performance, such as usability).
I’m sure we will hear more discussion of these issues early next week, when a number of the participants at this week’s summit will head to Washington for two days of discussion and debate about the future of the electoral process, in a forum sponsored by the National Academy of Sciences. These are the same folks who brought us the recent “A Framework for Understanding Electronic Voting”, and the final report, “Asking the Right Questions About Electronic Voting.” Should be another productive NAS event and project!