CA2475136C - A coercion-free voting scheme - Google Patents

A coercion-free voting scheme

Info

Publication number
CA2475136C
CA2475136C, CA2475136A, CA002475136A
Authority
CA
Canada
Prior art keywords
voter
ballot
received
value
confirmation value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CA002475136A
Other languages
French (fr)
Other versions
CA2475136A1 (en)
Inventor
C. Andrew Neff
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Demoxi Inc
Original Assignee
Dategrity Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dategrity Corp filed Critical Dategrity Corp
Publication of CA2475136A1 publication Critical patent/CA2475136A1/en
Application granted granted Critical
Publication of CA2475136C publication Critical patent/CA2475136C/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • G: PHYSICS
    • G07: CHECKING-DEVICES
    • G07C: TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 13/00: Voting apparatus
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00: Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 9/00: Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L 9/32: Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Computer Hardware Design (AREA)
  • Databases & Information Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Time Recorders, Drive Recorders, Access Control (AREA)

Abstract

A facility for conducting a coercion-resistant electronic election is described. The facility receives from the voter a first voter confirmation value. At a later time, the facility receives from the voter an encrypted ballot and a second voter confirmation value. Without regard for the value of the received second voter confirmation value, the facility adds the received ballot to a publicly-available list of cast ballots. After the addition, members of the public are able to verify the addition of the received ballot to the list without being able to determine whether the ballot will be counted. The facility counts the ballot if and only if the second voter confirmation value received with the ballot matches the received first voter confirmation value.

Description

A Coercion-free Voting Scheme
Technical Field
This application is directed to the technical field of security measures for electronically-conducted elections.
Background
Various electronic and/or digital election protocols exist that provide cryptographic privacy to voters. With many of these election protocols, the voter needs to keep certain types of information secret. An example of such secret information is a voter's private key. These existing election protocols can be problematic, however, if a person threatens or entices a voter (e.g., financially) to give up the secret information. When this type of coercion occurs, it is possible for the person either to know how the voter voted or to vote on his or her behalf.
Similar problems arise with the use of absentee vote-by-mail systems.
For example, a husband might force his wife to vote a certain way. The threat of coercion intensifies in a networked world, where people can "look over each other's shoulders" from thousands of miles away. This threat is serious enough that it is often considered a reason to not allow remote electronic voting.
Under threat models that do not include coercion, the notion of a universally verifiable election is fundamental. In the past, it has been considered important that a "computing device based" election scheme be universally verifiable in order to be considered deployable on a wide scale. During elections of this type, transcripts are published that include the final tally.
Under reasonable assumptions about the safety of keys, and the intractability of some computational problems, these transcripts cannot be feasibly forged by any collection of malicious agents. Although it would be desirable to carry this property over to election schemes under the threat of coercion, this may be difficult. Election schemes under the threat of coercion lack certain very basic properties, which have generally been taken for granted in the election protocol literature, and hence may not be practical in a large-scale implementation.
Brief Description of the Drawings
Figure 1 is a block diagram showing a suitable environment for implementing the scheme.
Figure 2 is a flow diagram showing steps typically performed in accordance with the scheme.
Description
The scheme described herein allows the voter to remain in exclusive possession of the secret information that is used by the voter to cast a ballot. It allows a voter who has been pressured to reveal secret information to provide a false answer without being discovered. After providing the false answer, the voter can then proceed to cast a "real" vote on his or her own. This is achieved while still maintaining a collection of election audit properties that are characteristic of good electronic election protocols. An election scheme is coercion safe if, even in the coercion threat model, its transcript cannot be feasibly forged by any collusion of authorities that, together, are unable to compute a tally. Further, in the case of a collusion that is able to compute a tally, the extent of the forgery is limited by the number of voters coerced.
At a summary level, the invention works as follows:
1. Voters participate in a secret "voter registration" process prior to the start of the election. This process must make the voter safe from coercion by standard physical means. In practice, this means the voter must report to a county registration center, where physical privacy is guaranteed. However, the voter need only participate in this registration process once. Thereafter, the method of this invention will protect the voter against coercion through the course of multiple elections.
2. During the registration process, each voter selects a secret "confirmation code," or "confirmation pass phrase."
3. The "confirmation pass phrase" is encrypted by the voter, and the encrypted form is publicly registered to that voter.
4. In order to cast a ballot, each voter must supply an accompanying (encrypted) pass phrase. The accompanying pass phrase does not have any effect on whether the ballot is "accepted" or not, so if the voter is being "supervised" by a coercer, the voter is still free to supply any pass phrase, whether it matches the voter's registered pass phrase or not. The coercer will not be able to tell the difference. However, the accompanying pass phrase will have an effect on whether the ballot it accompanies is counted or not. The mechanism for this (described next) nevertheless assures that:
(a) Anyone, including the coercer, can inspect the ballot box contents and the tally to determine whether the tally is accurate or not (i.e., the election is Universally Verifiable).
(b) In spite of the full availability of election data, the encryption and count mechanisms ensure that the coercer will still not be able to determine what vote, if any, cast by the voter is actually included in the count.
5. The tabulation (counting) of encrypted votes is accomplished roughly by randomly mixing voted ballot / encrypted pass phrase pairs, as well as the original registration data. After randomization, the appropriate data is decrypted by election authorities holding shares of the encryption key. A ballot is counted only when a pass phrase in the randomized ballot data matches a pass phrase in the randomized registration data. The matching is done without ever decrypting either of the pass phrases. Since all the randomization is done by way of a cryptographically verifiable shuffle, the results can still be inspected and verified by anyone for accuracy. The sketch following this list illustrates the overall flow.
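The following Python sketch is only a toy illustration of that accept-everything, count-only-matching flow. All names here are hypothetical, and a hash stands in for the encrypted confirmation values; the real scheme uses threshold ElGamal encryption and verifiable shuffles so that, unlike this sketch, nobody can link a counted ballot to a voter or recompute the match off-line.

```python
# Toy illustration of the flow summarized above: every ballot is accepted
# into the public ballot box, but only ballots whose confirmation value
# matches the voter's registered value are counted.  (Hashing is a stand-in
# for the protocol's encrypted matching; it is NOT itself coercion resistant.)
import hashlib
from collections import Counter

def conf_value(pass_phrase: str) -> str:
    return hashlib.sha256(pass_phrase.encode()).hexdigest()

registry = {}      # voter id -> registered confirmation value (step 3)
ballot_box = []    # public, append-only list of cast ballots (step 4)

def register(voter_id: str, pass_phrase: str) -> None:
    registry[voter_id] = conf_value(pass_phrase)

def cast(voter_id: str, ballot: str, pass_phrase: str) -> None:
    # Appended regardless of the supplied pass phrase, so an observer
    # cannot tell from the ballot box whether this ballot will count.
    ballot_box.append((voter_id, ballot, conf_value(pass_phrase)))

def tally() -> Counter:
    counted, seen = Counter(), set()
    for voter_id, ballot, conf in ballot_box:
        if registry.get(voter_id) == conf and voter_id not in seen:
            counted[ballot] += 1          # counted only on a match (step 5)
            seen.add(voter_id)
    return counted

register("alice", "real phrase")
cast("alice", "candidate-1", "fake phrase")   # cast under a coercer's eye: ignored
cast("alice", "candidate-2", "real phrase")   # the voter's real vote: counted
print(tally())                                # Counter({'candidate-2': 1})
```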
Figure 1 and the following discussion provide a brief, general description of a suitable computing environment in which aspects of the invention can be implemented. Although not required, aspects and embodiments of the invention will be described in the general context of computer-executable instructions, such as routines executed by a general-purpose computer, e.g., a server or personal computer. Those skilled in the relevant art will appreciate that the invention can be practiced with other computer system configurations, including Internet appliances, hand-held devices, wearable computers, cellular or mobile phones, multi-processor systems, microprocessor-based or programmable consumer electronics, set-top boxes, network PCs, mini-computers, mainframe computers and the like. The invention can be embodied in a special purpose computer or data processor that is specifically programmed, configured or constructed to perform one or more of the computer-executable instructions explained in detail below. Indeed, the term "computer", as used generally herein, refers to any of the above devices, as well as any data processor.
The invention can also be practiced in distributed computing environments, where tasks or modules are performed by remote processing devices, which are linked through a communications network, such as a Local Area Network ("LAN"), Wide Area Network ("WAN") or the Internet. In a distributed computing environment, program modules or sub-routines may be located in both local and remote memory storage devices. Aspects of the invention described below may be stored or distributed on computer-readable media, including magnetic and optically readable and removable computer discs, stored as firmware in chips (e.g., EEPROM chips), as well as distributed electronically over the Internet or over other networks (including wireless networks). Those skilled in the relevant art will recognize that portions of the invention may reside on a server computer, while corresponding portions reside on a client computer. Data structures and transmission of data particular to aspects of the invention are also encompassed within the scope of the invention.
Referring to Figure 1, one embodiment of the invention employs a computer 100, such as a personal computer or workstation, having one or more processors 101 coupled to one or more user input devices 102 and data storage devices 104. The computer is also coupled to at least one output device such as a display device 106 and one or more optional additional output devices 108 (e.g., printer, plotter, speakers, tactile or olfactory output devices, etc.). The computer may be coupled to external computers, such as via an optional network connection 110, a wireless transceiver 112, or both.
The input devices 102 may include a keyboard and/or a pointing device such as a mouse. Other input devices are possible such as a microphone, joystick, pen, game pad, scanner, digital camera, video camera, and the like.
The data storage devices 104 may include any type of computer-readable media that can store data accessible by the computer 100, such as magnetic hard and floppy disk drives, optical disk drives, magnetic cassettes, tape drives, flash memory cards, digital video disks (DVDs), Bernoulli cartridges, RAMs, ROMs, smart cards, etc. Indeed, any medium for storing or transmitting computer-readable instructions and data may be employed, including a connection port to a network such as a local area network (LAN), wide area network (WAN) or the Internet (not shown in Figure 1). Aspects of the invention may also be practiced in a variety of other computing environments.
Figure 2 is a flow diagram showing steps typically performed in accordance with the scheme. These steps are described in more detail below. In step 201, voters are registered to add them to the list of registered voters eligible to cast votes, and to provide them with voting credentials. In step 202, the election is initialized to assign ballot choice values to candidates.
In step 203, voters cast their votes by submitting encrypted ballots. In step 204, the votes cast in step 203 are tabulated, and added to the vote total only if the validity of the received ballot can be verified. After step 204, these steps conclude.
1 Coercion Implications of Partitionable Tabulation
The purpose of this section is to:
1. Characterize a class of election schemes that includes the vast majority of schemes previously studied, and also seems likely to include all schemes that are "practical" for large scale, public elections.
2. Establish some bounds on what can be achieved by schemes in this class under the coercion threat model.
Definition 1 Henceforth, we call any participant in the election process, or any individual who exerts, or attempts to exert, an influence on the election process a player. Thus voters, election officials, and tabulators are all players, but so are all individuals who seek to influence the election outcome even though they may have no official role in it.
Definition 2 Player P1 coerces player P2 if P1 obtains from P2 any information that the election protocol does not require P2 to reveal to P1. Identical terminology is used when the coercer is actually a group of players. That is, no aspect of the invention limits its utility to the case where the coercer is a single individual. Therefore, henceforth, we will not endeavor to make an explicit distinction between coercion by an individual and coercion by a group of individuals acting together.
Definition 3 Coercible information is all information whose authenticity can be "verified" by the coercer. If the authenticity can not be verified, then the voter (or individual being coerced) is free to lie about it to the coercer.

Definition 4 Recall that a tally is a function t : Γ → N = Z⁺ ∪ {0}, where Γ = {c_1, . . . , c_k} is the "candidate slate". We write

|t| = Σ_{μ=1}^{k} t(c_μ)

The invention requires something roughly like a digital ballot box. At the very least, this is a storage device connected to a network, or otherwise openly accessible to voters. A standard web server and database application provides an embodiment of such a device. In practice, more security measures would be built into, or around, this device in order to protect against damage or destruction caused by either malicious or natural forces. The invention also requires that voters be able to translate their choices into a digital representation, and further encrypt that representation by the methods presented in the remainder of this invention. A generic PC provides an embodiment of such a device.
Definition 5 Since the transmission and storage of information are the key elements of this invention rather than the particular technologies that facilitate transmission and storage, we will adopt the more generic term bulletin board to denote the openly accessible storage device, and we denote the act of recording information on (or in) the bulletin board as posting.
(In the voting context, this corresponds, intuitively, to the act of "casting a ballot".) Further, we denote the strings, or records of information, that are posted to the bulletin board as posts. (In the voting context, these correspond, intuitively, to voted ballots.)
Let us now consider a set of very general properties that characterize a broad class of election protocols. The properties are considered in the absence of coercion. That is, in verifying a given property with respect to a particular protocol, we consider all potential protocol executions where the only information exchanged between players is that which is specified by the protocol. (We number these properties sequentially as PP-1, PP-2, etc.)
PP-1. Posts are always appended to the bulletin board, BB; that is, deletions are not allowed. And posting is an atomic transaction; that is, at any given time, BB will contain exactly k posts, for some non-negative integer k.
PP-2. Any player may append a post regardless of the state (contents) of BB.

PP-3. At any given time, a tally can be formed, and it is unique. That is, it is not possible (or at least "overwhelmingly improbable") that BB is in some state C(BB) that is "invalid" for tabulation, and the tally, tally(C(BB)) : Γ → N, is well defined.
PP-4. A collection of players either can or cannot compute the tally, independent of the state of BB.
Recall that the voter roll, V, is essentially a public list of players (eligible voters), {v_1, . . . , v_l}. Also, we use C(BB) to denote the contents of BB at an arbitrary time, that is, the sequence of posts p_1, . . . , p_t. Let P be the set of all players in the protocol, so V ⊂ P.
For simplicity of presentation, we assume that the ballot consists of one issue, that the candidate slate, Γ, is given by {c_1, . . . , c_k}, and that each voter is allowed to choose (vote for) "at most one" candidate. Generalizing this setting to one that includes more general ballot types (that do not include "write-ins") is fairly straightforward.
Definition 6 Let C = C(BB) be any state of BB (a sequence of posts). If p is a post, we denote by C ⊕ p the state of BB after appending the single post p. We also use the notation t_C to denote the tally, tally(C).
Definition 7 A vote function (on BB) is a map

χ : P × C(BB) → {0, 1}^Γ        (1)

characterized by the following:
vf-1. For all p ∈ P,

|χ(p, C(BB))| ∈ {0, 1}

vf-2. For all C(BB), if p ∉ V, then (with "overwhelming probability")

χ(p, C(BB)) = 0

Intuitively, this says that the protocol "only allows members of the voter roll (eligible voters) to vote".

vf-3. For all p ∈ P, if p posts p, then the following holds (with "overwhelming probability") for all q ∈ P:

χ(q, C(BB) ⊕ p) − χ(q, C(BB)) = t_{C(BB) ⊕ p} − t_{C(BB)} if q = p, and = 0 if q ≠ p

Intuitively, this says that the protocol "only allows a voter to vote on his own behalf". It rules out schemes that allow multiple voters to combine their votes into one or more posts.
vf-4. For all 1 ≤ i ≤ l, and all 1 ≤ j ≤ k, if |χ(v_i, C(BB))| = 0, then v_i can compute (with probability 1) a post p such that

t_{C(BB) ⊕ p}(c_μ) − t_{C(BB)}(c_μ) = 1 if μ = j, and = 0 if μ ≠ j

Intuitively, this simply says that if v_i has "not yet voted", then v_i can append a "vote" for any candidate. However, the statement does not preclude the possibility that the protocol may allow v_i to "cast a vote" and then later "change it". (Nevertheless, the majority of protocols in the literature, which essentially allow each voter "one and only one chance to vote", do satisfy this criterion.)
vf-5. For all 1 ≤ i ≤ l, if |χ(v_i, C(BB))| = 1, then v_i can compute, with at most negligible probability, a post p satisfying

|t_{C(BB) ⊕ p}| > |t_{C(BB)}|

Intuitively, this simply says that no voter may "vote more than once".
Again, however, the statement does not preclude the possibility that the protocol may allow a voter to "change a vote" or to "retract a vote". (As before, the majority of protocols in the literature satisfy this criterion.)
Let A_{ij} be the event that v_i computes a post p satisfying

t_{C(BB) ⊕ p}(c_j) − t_{C(BB)}(c_j) = −1        (7)

Let B_{ij} be the event that χ(v_i, C(BB))(c_j) = 1.

vf-6. There is a constant a (0 ≤ a ≤ 1) such that, for all 1 ≤ i ≤ l, and all 1 ≤ j ≤ k, the conditional probability P(A_{ij} | B_{ij}) satisfies

P(A_{ij} | B_{ij}) = a        (8)

independent of the values of i, j, and the state of the bulletin board, C(BB).
Intuitively, this says that if the protocol allows "a voter to change a vote at some time" then the protocol allows "any voter to change a vote at any time". However, it does not preclude the protocol from forbidding vote changes, which is more common in the literature.
vf-7. For all 1 ≤ i ≤ l, and all 1 ≤ j ≠ η ≤ k, the conditional probability P(A_{ij} | B_{iη}) satisfies

P(A_{ij} | B_{iη}) ≤ ε

where ε > 0 is negligible.
Intuitively, this says that the protocol only allows "a voter to reduce the count for a candidate" if "that voter has voted for that candidate". Again, this does not preclude the protocol from forbidding vote changes.
PP-5. The protocol admits a vote function. (Note that this does not require that the vote function be computable by any of the players, only that it exist.)
Definition 8 An election protocol is said to have partitionable tabulation if and only if it satisfies PP-1 through PP-5. For brevity, we will also use the term partitionable election protocol to describe any election protocol that has partitionable tabulation.
Theorem 1 If an election protocol has partitionable tabulation, and a coercer contains a collection of players capable of computing a tally, then for any 1 ≤ i ≤ l, the value of χ(v_i, C(BB)) is coercible.
Proof: (Sketch) The coercer can step through the sequence of ballot box images, at each point computing the tally (see assumption PP-4) and requiring v_i to "add a vote" of a particular value. By re-computing the tally with v_i's post appended, the coercer can determine which posts were added by v_i and their cumulative effect on the tally.
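As a concrete illustration of this argument, the following sketch (hypothetical function names; the tally oracle is the capability granted by PP-4) shows how a coercer who can tally arbitrary ballot-box states isolates the net effect of a voter's posts:

```python
# Sketch of the Theorem 1 attack: with a tally oracle over ballot-box states
# (assumption PP-4), the coercer compares tallies with and without the posts
# it attributes to the coerced voter, exposing that voter's net contribution.
from typing import Callable, Dict, List

Tally = Dict[str, int]

def net_effect(prefix: List[str], voter_posts: List[str],
               tally: Callable[[List[str]], Tally]) -> Tally:
    """Difference between the tally with and without the voter's posts."""
    without = tally(prefix)
    with_posts = tally(prefix + voter_posts)
    candidates = set(without) | set(with_posts)
    return {c: with_posts.get(c, 0) - without.get(c, 0) for c in candidates}

# The coercer demands a post of a chosen value, records the ballot-box state
# before and after, and checks that net_effect() is exactly the demanded vote;
# re-evaluating it on later states detects any attempt by the voter to change it.
```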

Note that this presumes a model in which "after the fact" coercion is allowed. That is, the coercer may interact with the voter after the bulletin board has been closed. However, this assumption can be eliminated with a reasonable assumption on the computing power of voters. In particular, we can show that the coercer is able, by way of a single coercion event, to:
1. Impersonate the voter during the course of the election, thereby "adding any chosen vote to the bulletin board (ballot box)" and consequently forging "part" of the election transcript.
2. Detect any attempts by the voter to independently change the vote.
Definition 9 A partitionable election protocol is coercion resistant if, under the assumption that there is no coercer capable of independently computing a tally:
CS-1. If p ∈ P and v_i ∈ V, v_i ≠ p, then p cannot compute χ(v_i, C(BB)) with probability higher than "random guess + ε".
CS-2. The election results are publicly verifiable.
Definition 10 A partitionable election protocol is coercion safe if it is coercion resistant and, under all collusion scenarios:
CS-3. If t_I is the "ideal tally", then verification of the election guarantees

|t_{C(BB)} − t_I| ≤ M        (10)

2 A Coercion Safe Election Protocol
We assume the standard ElGamal cryptographic setting: p and q are large primes with q | p − 1, a subgroup generator g ∈ Z_p* with |⟨g⟩| = q, and h = g^s, with s shared by a (t, n) threshold scheme among n tabulation authorities, A_1, . . . , A_n.
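For concreteness, here is a minimal Python sketch of this setting, using toy parameters (p = 23, q = 11, g = 2, so that q divides p − 1 and g has order q) in place of large primes, and a single secret s in place of the (t, n) threshold sharing:

```python
# Toy ElGamal in the order-q subgroup of Z_p*, as assumed by the protocol.
# p, q, g below are illustrative only; real parameters are large primes.
import secrets

p, q, g = 23, 11, 2
s = secrets.randbelow(q - 1) + 1      # tabulation secret (threshold-shared in the real protocol)
h = pow(g, s, p)                      # public encryption key

def encrypt(m, r=None):
    """ElGamal encryption of a subgroup element m: (g^r, h^r * m)."""
    r = secrets.randbelow(q) if r is None else r
    return pow(g, r, p), (pow(h, r, p) * m) % p

def decrypt(pair):
    x, y = pair
    return (y * pow(x, (q - s) % q, p)) % p     # y / x^s, using x^q = 1 in <g>

def mult(a, b):
    """Componentwise product of two pairs: encrypts the product of the plaintexts."""
    return (a[0] * b[0]) % p, (a[1] * b[1]) % p

m = pow(g, 7, p)                       # a message in the subgroup <g>
assert decrypt(encrypt(m)) == m
assert decrypt(mult(encrypt(m), encrypt(pow(g, 3, p)))) == (m * pow(g, 3, p)) % p
```

The componentwise product is the homomorphic property that the registration, voting, and tabulation steps below rely on.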
The protocol we next describe is coercion resistant. We will later describe how it can be easily augmented to give a coercion safe protocol. The advantage of describing the weaker version first is that most of the difficulty lies in its construction.

2.1 Registration
Recall that we assume voters are safe from coercion during their registration session. Care must still be taken to assure that information exchanged during registration is not coercible afterwards.
We denote the voter by vi.
R-1. v_i chooses a random r_i ∈ ⟨g⟩ and a random α_i ∈ Z_q, and forms

(U_{i0}, W_{i0}) = (g^{α_i}, h^{α_i} r_i)        (11)

R-2. For each 1 ≤ j ≤ n:
R-2.1. v_i obtains from A_j the pair (U_{ij}, W_{ij}) given by

(U_{ij}, W_{ij}) = (g^{α_{ij}}, h^{α_{ij}})        (12)

where α_{ij} ∈ Z_q is chosen randomly by A_j.
R-2.2. v_i and A_j execute an interactive Chaum-Pedersen proof of validity for the relation log_g U_{ij} = log_h W_{ij}. That is, the challenge is generated randomly by v_i rather than via a hash function (Fiat-Shamir heuristic). This allows v_i to later produce simulated proofs in the face of coercion.
R-3. After checking each Chaum-Pedersen proof, v_i computes

(U_i, W_i) = (∏_{j=0}^{n} U_{ij}, ∏_{j=0}^{n} W_{ij})        (13)

R-4. For each 1 ≤ j ≤ n, v_i obtains a signature on (U_i, W_i) from A_j as a receipt.
R-5. (U_i, W_i) is added to the voter roll, V. When the registration period ends, each authority should sign V.
Remark 1 As long as v_i knows that one specific authority, A_J, is not a coercer, and fewer than t authorities (the number necessary to compute a tally) are colluding to coerce (though v_i may not explicitly know their identities), the value of r_i is not coercible. This is because v_i can justify the validity of any r_i and α_i by lying to the coercer about the value of (U_{iJ}, W_{iJ}) and presenting a forged (i.e., simulated) Chaum-Pedersen proof.

The requirement that v_i knows a specific honest A_J may be relaxed if we assume that it is acceptable for v_i to be caught lying to the coercer.
Alternatively, if n ≫ t, then v_i can pick a J at random, 1 ≤ J ≤ n, assume that A_J is honest, and then know that the chance of being caught lying is at most (t − 1)/n.
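The deniability of registration rests on the fact that an interactive Chaum-Pedersen transcript can be simulated for any pair once the challenge is known, so a coerced voter can exhibit a convincing but false (U_{iJ}, W_{iJ}). A minimal sketch, reusing the toy parameters from the earlier ElGamal sketch (the function names are illustrative, not part of the patent):

```python
# Interactive Chaum-Pedersen proof that log_g U = log_h W, i.e. (U, W) = (g^a, h^a),
# together with the simulator a coerced voter uses to fake a transcript once the
# verifier's challenge is known (which is why the proof must stay interactive).
import secrets

p, q, g = 23, 11, 2
s = 5
h = pow(g, s, p)

def prove(a, challenge):
    """Honest prover for (U, W) = (g^a, h^a); the verifier supplies the challenge."""
    U, W = pow(g, a, p), pow(h, a, p)
    w = secrets.randbelow(q)
    commitment = (pow(g, w, p), pow(h, w, p))    # sent before the challenge
    response = (w + challenge * a) % q
    return U, W, commitment, response

def simulate(U, W, challenge):
    """Fake a transcript for an arbitrary pair when the challenge is known in advance."""
    response = secrets.randbelow(q)
    commitment = (pow(g, response, p) * pow(U, q - challenge, p) % p,
                  pow(h, response, p) * pow(W, q - challenge, p) % p)
    return U, W, commitment, response

def verify(U, W, commitment, challenge, response):
    return (pow(g, response, p) == commitment[0] * pow(U, challenge, p) % p and
            pow(h, response, p) == commitment[1] * pow(W, challenge, p) % p)

c = secrets.randbelow(q)                         # the voter's random challenge

U, W, commitment, resp = prove(3, c)             # genuine authority contribution
assert verify(U, W, commitment, c, resp)

fakeU, fakeW = pow(g, 4, p), pow(h, 9, p)        # NOT of the form (g^a, h^a)
U, W, commitment, resp = simulate(fakeU, fakeW, c)
assert verify(U, W, commitment, c, resp)         # yet the transcript verifies
```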
2.2 Election Initialization
EI-1. For each 1 ≤ j ≤ n, and for each 1 ≤ i ≤ l = |V|, authority A_j generates randomly and independently a pair of elements in ⟨g⟩, (ξ_{ij}, η_{ij}). The quantities

(ξ_i, η_i) = (∏_{j=1}^{n} ξ_{ij}, ∏_{j=1}^{n} η_{ij})        (14)

are publicly computed. These are all published (and signed).
EI-2. The ballot choices γ_μ ∈ ⟨g⟩, 1 ≤ μ ≤ k = |Γ|, are assigned by some public random process, or by sharing. (The value γ_μ will indicate a vote for candidate c_μ.)
2.3 Voting
V-1. v_i chooses random ν_{i1} ∈ Z_q and encrypts her vote as the ElGamal pair

(A_i, B_i) = (g^{ν_{i1}}, h^{ν_{i1}} γ(i))        (15)

V-2. v_i then chooses random ν_{i2} ∈ Z_q, computes s_i = r_i / γ(i), and encrypts it as

(C_i, D_i) = (g^{ν_{i2}}, h^{ν_{i2}} s_i)        (16)

V-3. v_i then constructs non-interactive proofs of knowledge, P_i^{AB} and P_i^{CD}, for the pairs (A_i, B_i) and (C_i, D_i), respectively. These proofs show that the pairs are of the required form, and that v_i knows the values of s_i and γ(i).
V-4. v_i submits the encrypted voted ballot

E_i = ((A_i, B_i), (C_i, D_i), P_i^{AB}, P_i^{CD})        (17)

V-5. Though not necessary in the "append only" bulletin board model, in practice, v_i would be issued a receipt for E_i.
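Continuing the toy parameters from the sketches above, the following illustrates V-1 and V-2 (the validity proofs of V-3 are omitted): the voter encrypts the ballot choice γ(i) and, separately, s_i = r_i / γ(i), so that the componentwise product of the two pairs is an encryption of r_i. The helper names are hypothetical.

```python
# Toy sketch of ballot formation (V-1, V-2): encrypt the ballot choice and the
# quotient s_i = r_i / gamma; the componentwise product of the two pairs then
# encrypts r_i, which is what tabulation later tests against the registration.
import secrets

p, q, g = 23, 11, 2
s_key = 5
h = pow(g, s_key, p)

def enc(m):
    r = secrets.randbelow(q)
    return pow(g, r, p), pow(h, r, p) * m % p

def dec(c):
    return c[1] * pow(c[0], q - s_key, p) % p

def inv(x):
    return pow(x, p - 2, p)          # inverse modulo the prime p

gamma = pow(g, 3, p)                 # ballot choice for the chosen candidate (EI-2)
r_i   = pow(g, 7, p)                 # the voter's registration secret from R-1

A, B = enc(gamma)                    # V-1: encryption of the vote
C, D = enc(r_i * inv(gamma) % p)     # V-2: encryption of s_i = r_i / gamma

# Homomorphically, (A*C, B*D) encrypts gamma * s_i = r_i, without revealing it.
assert dec((A * C % p, B * D % p)) == r_i
```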
2.4 Tabulation
In this section, we assume a subset of t authorities has been fixed. Without loss of generality, we may assume these are A_1, . . . , A_t.
T-1. For each 1 ≤ i ≤ l, the quantities

(Ū_i, W̄_i) = (ξ_i U_i, η_i W_i)        (18)

are publicly computed.
T-2. The authorities execute a verifiable shuffle of the sequence of pairs of ElGamal pairs, ((Ū_i, W̄_i), (ξ_i, η_i)), resulting in an output set of pairs of ElGamal pairs

{((Ũ_i, W̃_i), (ξ̃_i, η̃_i))}_{i=1}^{l}        (19)

where Ũ_i, W̃_i, ξ̃_i, η̃_i ∈ ⟨g⟩. The properties of this mix are that the set of decrypted value pairs (a_i, b_i) of the output sequence are exactly the same as the set of decrypted value pairs of the input sequence, but in randomly permuted order. Executing such a verifiable shuffle is discussed in greater detail in U.S. Patent Publication No. 2002-0007457-A1, entitled "VERIFIABLE, SECRET SHUFFLES OF ENCRYPTED DATA, SUCH AS ELGAMAL ENCRYPTED DATA FOR SECURE MULTI-AUTHORITY ELECTIONS," dated 17 Jan. 2002, and PCT Application No. WO 02/77929, entitled "VERIFIABLE SECRET SHUFFLES AND THEIR APPLICATION TO ELECTRONIC VOTING," dated 3 Oct. 2002, copies of each of which are included in Appendix A hereto.
T-3. Let {((A_m, B_m), (C_m, D_m))}_{m=1}^{M} be the set resulting from all voted ballots with verified validity proofs. The authorities execute another verifiable shuffle of the sequence of these M ElGamal pair pairs, with resulting output set

{((Ã_m, B̃_m), (C̃_m, D̃_m))}_{m=1}^{M}        (20)

T-4. For each 1 ≤ m ≤ M, the l ElGamal pairs

(Θ_{mi}, Ω_{mi}) = (Ã_m C̃_m ξ̃_i Ũ_i^{-1}, B̃_m D̃_m η̃_i W̃_i^{-1}),  1 ≤ i ≤ l        (21)

are publicly computed.
T-5. The authorities jointly decrypt all of the pairs (Ã_m, B̃_m) and (Θ_{mi}, Ω_{mi}), 1 ≤ i ≤ l, 1 ≤ m ≤ M. Let the decrypted values be, respectively, a_m and x_{mi}.
T-6. For each 1 ≤ m ≤ M, a_m is added to the tally if and only if
T-6.1. a_m ∈ {γ_1, . . . , γ_k}; and
T-6.2. for some 1 ≤ i ≤ l, x_{mi} = 1.
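A toy sketch of the test behind T-4 through T-6, with the verifiable shuffles and the (ξ_i, η_i) masking omitted and single-key decryption standing in for the joint threshold decryption: the ballot's combined pair is divided homomorphically by the registered pair, and the ballot is counted only if some registration slot makes the quotient decrypt to 1.

```python
# Toy sketch of the tabulation match (T-4 .. T-6), shuffles and masks omitted:
# (A*C, B*D) divided by the registered pair (U, W) decrypts to 1 exactly when
# the ballot was formed with the registered secret r; otherwise the ballot is
# still on the public list but is silently excluded from the tally.
import secrets

p, q, g = 23, 11, 2
s_key = 5
h = pow(g, s_key, p)

def enc(m):
    r = secrets.randbelow(q)
    return pow(g, r, p), pow(h, r, p) * m % p

def dec(c):
    return c[1] * pow(c[0], q - s_key, p) % p

def inv(x):
    return pow(x, p - 2, p)

gammas = {"c1": pow(g, 3, p), "c2": pow(g, 8, p)}    # ballot choices (EI-2)
r_reg  = pow(g, 7, p)                                # registered secret
U, W   = enc(r_reg)                                  # registration pair (R-1 .. R-3)

def ballot(choice, r_claimed):
    A, B = enc(gammas[choice])                       # V-1
    C, D = enc(r_claimed * inv(gammas[choice]) % p)  # V-2
    return (A, B), (C, D)

def counted(b):
    (A, B), (C, D) = b
    theta = A * C * inv(U) % p                       # quotient pair, as in T-4
    omega = B * D * inv(W) % p
    return dec((A, B)) in gammas.values() and dec((theta, omega)) == 1   # T-6

assert counted(ballot("c1", r_reg))                  # correct secret: counted
assert not counted(ballot("c2", pow(g, 9, p)))       # wrong secret: not counted
```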
2.5 Tabulation - Alternate Embodiment
In this section, we assume a subset of t authorities has been fixed. Without loss of generality, we may assume these are A_1, . . . , A_t.
T2-1. For each 1 ≤ i ≤ l, the quantities

(Ū_i, W̄_i) = (ξ_i U_i, η_i W_i)

are publicly computed.
T2-2. The authorities execute a verifiable shuffle of the sequence of ElGamal pairs (Ū_i, W̄_i), resulting in an output set of ElGamal pairs

{(Φ_i, Ψ_i)}_{i=1}^{l}

where Φ_i, Ψ_i ∈ ⟨g⟩. The properties of this mix are that the set of decrypted values of the output sequence are exactly the same as the set of decrypted values of the input sequence, but in randomly permuted order.
T2-3. For each voted ballot E_m, 1 ≤ m ≤ M, with verified validity proofs, the l ElGamal pairs

(Θ_{mi}, Ω_{mi}) = (A_m C_m ξ_i, B_m D_m η_i),  1 ≤ i ≤ l        (24)

are publicly computed.

T2-4. The authorities execute a verifiable shuffle of the sequence of M × l ElGamal pair pairs, ((A_m, B_m), (Θ_{mi}, Ω_{mi})), resulting in the output set

{((Ã_m, B̃_m), (Θ̃_{mi}, Ω̃_{mi}))}        (25)

T2-5. The authorities jointly decrypt all of the pairs (Φ_i, Ψ_i), (Ã_m, B̃_m), and (Θ̃_{mi}, Ω̃_{mi}). Let the decrypted values be, respectively, φ_i, a_m, and x_{mi}.
T2-6. For each 1 ≤ m ≤ M, a_m is added to the tally if and only if
T2-7. a_m ∈ {γ_1, . . . , γ_k}; and
T2-8. for some 1 ≤ i ≤ l and 1 ≤ j ≤ l, x_{mi} = φ_j.
2.6 Making the Protocol Coercion Safe
The protocol, as presented, is clearly not coercion safe. If t or more authorities collude, they can decrypt the original voter secrets, r_i, and this allows them to impersonate all the voters. The problem can be fixed by adding an anonymous signature requirement to the ballot casting operation. (See the aforementioned patent applications for a detailed description of an anonymous signature protocol that is "authority free".) In this case, even if a malicious agent has access to a secret, r_i, it cannot affect the tally without the corresponding private signing key, which cannot be obtained without coercion. The reason for this should be clear. An authority-free, anonymous signature on the voted ballot prevents the authorities (even in collusion) from linking the original encrypted ballot (input to the verifiable shuffle, or mix) to an individual the way they can with a standard digital signature.
A standard digital signature explicitly links signed data to a registered individual. An anonymous signature only links signed data to a member of a set, or group, of individuals.

Claims (2)

WHAT IS CLAIMED IS:
1. A method in a computing environment for conducting a coercion-resistant electronic election, comprising:
receiving from the voter a first voter confirmation value;
after receiving the first voter confirmation value, receiving from the voter a first encrypted ballot and a second voter confirmation value, the second voter confirmation value being formed based upon input from the voter, such that the voter may determine whether the first ballot will be counted by varying the input;
without regard for the value of the received second voter confirmation value, adding the received first ballot to a publicly available list of cast ballots, such that members of the public are able to verify the addition of the received first ballot to the list without being able to determine whether the first ballot will be counted;
if the value of the received second voter confirmation value received with the first ballot reflects a determination by the voter that the received ballot will not be counted, giving the voter an opportunity to provide an additional encrypted ballot and an additional second voter confirmation value; and for any ballots received from the voter, counting the ballot if and only if the second voter confirmation value received with the ballot matches the received first voter confirmation value.
2. A computer-readable medium having computer-readable code embodied therein for conducting a coercion-resistant electronic election by:
receiving from the voter a first voter confirmation value;
after receiving the first voter confirmation value, receiving from the voter a first encrypted ballot and a second voter confirmation value, the second voter confirmation value being formed based upon input from the voter, such that the voter may determine whether the first ballot will be counted by varying the input;
adding the received first ballot to a publicly-available list of cast ballots, such that members of the public are able to verify the addition of the received first ballot to the list without being able to determine whether the first ballot will be counted;
if the value of the second voter confirmation value received with the first ballot reflects a determination by the voter that the received ballot will not be counted, giving the voter an opportunity to provide a substitute encrypted ballot and a substitute second voter confirmation value; and for any ballots received from the voter, counting the ballot if and only if the second voter confirmation value received with the ballot matches the received first voter confirmation value.
CA002475136A 2002-02-14 2003-02-14 A coercion-free voting scheme Expired - Fee Related CA2475136C (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US35721002P 2002-02-14 2002-02-14
US60/357,210 2002-02-14
PCT/US2003/004798 WO2003069448A2 (en) 2002-02-14 2003-02-14 A coercion-free voting scheme

Publications (2)

Publication Number Publication Date
CA2475136A1 CA2475136A1 (en) 2003-08-21
CA2475136C true CA2475136C (en) 2007-04-17

Family

ID=27734736

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002475136A Expired - Fee Related CA2475136C (en) 2002-02-14 2003-02-14 A coercion-free voting scheme

Country Status (8)

Country Link
EP (1) EP1532556A4 (en)
JP (1) JP2005526307A (en)
KR (1) KR20040078165A (en)
CN (1) CN1659554A (en)
AU (1) AU2003215282A1 (en)
CA (1) CA2475136C (en)
RU (1) RU2292082C2 (en)
WO (1) WO2003069448A2 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2906058B1 (en) * 2006-09-14 2008-11-21 Eads Defence And Security Syst METHOD AND SERVER FOR VERIFYING THE CONTENT OF A VIRTUAL URN OF A DIGITAL ELECTRONIC VOTING SYSTEM USING A HOMOMORPHIC ALGORITHM
CN101727692B (en) * 2008-10-17 2013-01-02 中科院成都信息技术有限公司 Method and system for processing poll information
RU2452013C2 (en) * 2010-01-26 2012-05-27 Российская Федерация, от имени которой выступает Федеральная служба по техническому и экспортному контролю (ФСТЭК России) Method of creating public key certificate
CN102622572A (en) * 2012-03-16 2012-08-01 中科院成都信息技术有限公司 Anti-counterfeit method for quickly matching and associating votes
CN102629396B (en) * 2012-04-09 2014-04-02 中科院成都信息技术股份有限公司 Information encryption and quick processing method for alternative candidate-containing electronic votes
RU2751315C2 (en) * 2019-11-05 2021-07-13 Публичное Акционерное Общество "Сбербанк России" (Пао Сбербанк) Universal system of distributed secure remote voting
US11356267B2 (en) 2020-05-15 2022-06-07 Op Osuuskunta Apparatus, method and software for electronic voting during web conference

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6092051A (en) * 1995-05-19 2000-07-18 Nec Research Institute, Inc. Secure receipt-free electronic voting
US6081793A (en) * 1997-12-30 2000-06-27 International Business Machines Corporation Method and system for secure computer moderated voting
US20010034640A1 (en) * 2000-01-27 2001-10-25 David Chaum Physical and digital secret ballot systems

Also Published As

Publication number Publication date
AU2003215282A1 (en) 2003-09-04
RU2292082C2 (en) 2007-01-20
EP1532556A2 (en) 2005-05-25
EP1532556A4 (en) 2007-09-19
KR20040078165A (en) 2004-09-08
CN1659554A (en) 2005-08-24
AU2003215282A8 (en) 2003-09-04
CA2475136A1 (en) 2003-08-21
JP2005526307A (en) 2005-09-02
WO2003069448A2 (en) 2003-08-21
WO2003069448A3 (en) 2005-03-24
RU2004127443A (en) 2005-05-27

Legal Events

Date Code Title Description
EEER Examination request
MKLA Lapsed