
Patent Searching and Data


Title:
SYSTEM AND DEVICE BINDING METADATA WITH HARDWARE INTRINSIC PROPERTIES
Document Type and Number:
WIPO Patent Application WO/2015/200196
Kind Code:
A1
Abstract:
A system, device, and method for binding metadata, such as information derived from the output of a biometric sensor, to hardware intrinsic properties by obtaining authentication-related metadata and combining it with information pertaining to a root of trust, such as a physical unclonable function. The metadata may be derived from a sensor such as a biometric sensor, the root of trust may be a physical unclonable function, the combination of the metadata and root of trust information may employ a hash function, and output from such a hash process may be used as an input to the root of trust. The combined information can be used in interactive or non-interactive authentication.

Inventors:
WALSH JOHN (US)
WALLRABENSTEIN JOHN ROSS (US)
Application Number:
PCT/US2015/036937
Publication Date:
December 30, 2015
Filing Date:
June 22, 2015
Assignee:
SYPRIS ELECTRONICS LLC (US)
International Classes:
H04L9/32
Foreign References:
US20100122093A12010-05-13
US20140108786A12014-04-17
US20130046990A12013-02-21
Other References:
DWOSKIN, J. et al., "Hardware-rooted Trust for Secure Key Management and Transient Trust," Proceedings of the 14th ACM Conference on Computer & Communications Security, 2 November 2007, pages 389-400, XP055225494. Retrieved from the Internet [retrieved on 2015-08-25]
Attorney, Agent or Firm:
BRINDISI, Thomas (Key West, Florida, US)
Claims:
What is claimed is:

1. A device configured to bind authentication-related metadata with hardware intrinsic properties, the device having associated device enrollment parameters and associated authentication-related metadata, and comprising a) a root of trust having a root of trust input and a root of trust output and constructed to generate, in response to input, an output value that is characteristic to the root of trust; and

b) a processor including a processor input that is connected to the root of trust output, and including a processor output connected to the root of trust input, the processor configured to:

i) algorithmically combine device enrollment parameters associated with the device with authentication-related metadata associated with the device to produce a binding value;

ii) convey the binding value to the root of trust input such that the root of trust generates a corresponding output; and

iii) create a token as a function of the root of trust output corresponding to the conveyed binding value.

2. The device of claim 1, wherein the authenticating zero knowledge proof that the processor is configured to perform incorporates non-sensitive metadata.

3. The device of claim 1, wherein the authenticating zero knowledge proof that the processor is configured to perform is non-interactive and includes a nonce that incorporates non-sensitive metadata.

4. The device of claim 1, wherein the authenticating zero knowledge proof that the processor is configured to perform is interactive.

5. The device of claim 1, wherein the processor is configured to algorithmically combine enrollment parameters with metadata using a cryptographic hash function.

6. The device of claim 1, wherein the processor is configured to algorithmically combine enrollment parameters with metadata using an iterative cryptographic hash function.

7. The device of claim 1, wherein the metadata that the processor is configured to algorithmically combine with enrollment parameters consists solely of sensitive metadata.

8. The device of claim 1, wherein the token that the processor is configured to create is a public identity token.

9. The device of claim 1, wherein the metadata that the processor is configured to algorithmically combine with enrollment parameters consists solely of sensitive metadata, and wherein the token that the processor is configured to create is a public identity token.

10. The device of claim 1, wherein the processor is configured to algorithmically combine enrollment parameters that include values pertaining to a cryptographic mathematical framework.

11. The device of claim 10, wherein the processor is further configured to perform elliptic curve cryptography.

12. The device of claim 11, wherein the values pertaining to a cryptographic mathematical framework include a challenge value, an elliptic curve base point, and a modulus.

13. The device of claims 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, or 12, wherein the processor is further configured to, in response to an authentication request from an external verifying entity, perform an authenticating zero knowledge proof without conveying any sensitive metadata to the external verifying entity.

14. The device of claims 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, or 12, wherein the root of trust is a physical unclonable function (PUF), the root of trust input is a PUF input, and the root of trust output is a PUF output, and wherein the PUF is constructed to generate, in response to the input of a specific challenge, an output value that is characteristic to the PUF and the specific challenge.

15. The device of claim 14, wherein the processor is further configured to, in response to an authentication request from an external verifying entity, perform an authenticating zero knowledge proof without conveying any sensitive metadata to the external verifying entity.

16. The device of claim 14, wherein the processor is configured to algorithmically combine enrollment parameters that include a challenge value.

17. The device of claim 15, wherein the processor is configured to algorithmically combine enrollment parameters that include a challenge value.

Description:
SYSTEM AND DEVICE BINDING METADATA WITH HARDWARE INTRINSIC

PROPERTIES

CROSS REFERENCE TO RELATED APPLICATIONS

[0001] This application claims the benefit of the priority of provisional U.S. Patent Application S.N. 62/017,045, filed June 25, 2014, and U.S. Patent Application S.N. 14/704,963, filed May 5, 2015, each of which is incorporated by reference here.

BACKGROUND OF THE INVENTION

[0002] Authentication protocols generally rely on private information held by entities in order to establish identity. In traditional systems, the private data may consist of a username and password pair, personal identification numbers (PINs) or cryptographic keys. Multi-factor authentication protocols generally require two or more kinds of identifying information such as information the entity knows (e.g., a username and password), something the entity has (e.g., a smart card or token), and information representing what the entity is (e.g., a fingerprint).

[0003] Metadata comprises auxiliary information relating to the identity or state of an entity involved in authentication. Examples of metadata include biometric data, sensor output, global positioning data, passwords or PINs and similar auxiliary information that may be used to construct a characterization of an entity's identity or state. Biometric data comprises measurements of physical characteristics of a user (e.g., fingerprints, retina, iris, voice and vein patterns) that are adequately unique to be used as a proof of identity.

[0004] Systems that rely on sensor output, however, may be vulnerable to forged sensor output; and while biometric systems utilize a potent property for authentication, they can face challenges relating to the exposure and/or loss of sensitive biometric data. Since a sensor transforms a measured physical characteristic into a binary string, which is stored (enrolled) by the computer system and then compared to the binary strings subsequently generated by the sensor upon authentication requests, without further measures the system cannot distinguish between a string returned from the sensor and a string supplied by an adversary without the sensor. Thus, for example, an adversary may attempt to observe the output of a biometric sensor for a particular user and "clone" the user by supplying the surreptitiously-obtained biometric data to the system. An adversary may similarly attempt to clone a user by reading biometric data stored in the system. Further, since the features utilized in biometric systems tend by definition to be substantially immutable, the compromise of a user's biometric data cannot be remedied in the way that a lost user password can simply be changed.

[0005] Characteristics that are unique and intrinsic to individual hardware devices (e.g., wire resistance, initial memory state, CPU instruction timing) can also be extracted and used as part of authentication protocols. A leading example of this is the physical unclonable function (PUF). A PUF function f(c) maps an input domain (or challenges) c to an output range (or responses) r, where the mapping is defined based on characteristics unique to the device computing f(·). A circuit or hardware description of f(·) may be identical across all devices, yet the mapping from the domain to the range will be unique based on the specific hardware device executing the circuit computing f(·).

[0006] In various device authentication schemes, physical unclonable functions (PUFs) have been used such that each device has a unique identity intrinsically-linked to the device. Ruhrmair et al. ("Modeling Attacks on Physical Unclonable Functions," Proceedings of the 17th ACM conference on Computer and communications security, CCS '10, pages 237-249, ACM, 2010) define three distinct classes of PUF devices:

• A Weak PUF is typically used only to derive a secret key. The challenge space may be limited, and the response space is assumed to never be revealed. Typical constructions include the SRAM (Holcomb et al., "Initial SRAM State as a Fingerprint and Source of True Random Numbers for RFID Tags," In Proceedings of the Conference on RFID Security, 2007), Butterfly (Kumar et al., "Extended abstract: The Butterfly PUF Protecting IP on Every FPGA," IEEE International Workshop on Hardware-Oriented Security and Trust, pages 67-70, 2008), Arbiter (Lee et al., "A technique to build a secret key in integrated circuits for identification and authentication applications," IEEE Symposium on VLSI Circuits: Digest of Technical Papers, pages 176-179, 2004), Ring Oscillator (Suh et al., "Physical Unclonable Functions for Device Authentication and Secret Key Generation," Proceedings of the 44th annual Design Automation Conference, DAC '07, pages 9-14, ACM, 2007), and Coating (Tuyls et al., "Read-Proof Hardware from Protective Coatings," Proceedings of the 8th international conference on Cryptographic Hardware and Embedded Systems, CHES '06, pages 369-383, Springer, 2006) PUFs.

• A Strong PUF is assumed to be (i) physically impossible to clone, (ii) impossible to collect a complete set of challenge-response pairs in a reasonable time (typically taken to be on the order of weeks), and (iii) difficult to predict the response to a random challenge. For example, the super-high information content (SHIC) PUF described by Ruhrmair ("Applications of High-Capacity Crossbar Memories in Cryptography," IEEE Trans. Nanotechnol., volume 10, no. 3:489-498, 2011) may be considered a Strong PUF.

• A Controlled PUF satisfies all of the criteria for strong PUFs, and additionally implements an auxiliary control unit capable of computing more advanced functionalities to cryptographically augment protocols.

[0007] PUF output is noisy in that it varies slightly despite evaluating the same input. This is generally addressed with fuzzy extraction, a method developed to eliminate noise in biometric measurements. (See Juels et al., "A Fuzzy Commitment Scheme," Proceedings of the 6th ACM conference on Computer and Communications Security, CCS '99, pages 28-36, ACM, 1999). Fuzzy extraction may in part be employed within a device having a PUF, such as within an auxiliary control unit, such that the output is constant for a fixed input. Fuzzy extraction (or reverse fuzzy extraction) may for example employ a "secure sketch," as described by Juels et al., to store a sensitive value p_i^priv to be reconstructed and a helper string helper_i for recovering p_i^priv. A secure sketch SS for input string O, where ECC is a binary (n, k, 2t+1) error correcting code of length n capable of correcting t errors and p_i^priv ∈ {0,1}^k is a k-bit value, may for example be defined as SS(O; p_i^priv) = O ⊕ ECC(p_i^priv). The original value p_i^priv then may be reproduced given the helper string helper_i and an input O′ within a maximum Hamming distance t of O, using a decoding scheme D for the error-correcting code ECC: p_i^priv = D(helper_i ⊕ O′).
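
The XOR-based secure sketch described in paragraph [0007] can be illustrated with a toy example. The 3x repetition code, 4-bit sensitive value, and single flipped bit below are illustrative choices for exposition only; a real implementation would use a stronger binary error correcting code.

```python
import secrets

def ecc_encode(bits, rep=3):
    """Repetition code ECC: repeat each bit `rep` times (corrects (rep-1)//2 errors per block)."""
    return [b for bit in bits for b in [bit] * rep]

def ecc_decode(bits, rep=3):
    """Decoding scheme D: majority vote within each block of `rep` bits."""
    return [int(sum(bits[i:i + rep]) > rep // 2) for i in range(0, len(bits), rep)]

def xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

# Sensitive k-bit value p_priv and an n-bit input string O (n = k * rep)
p_priv = [1, 0, 1, 1]
O = [secrets.randbits(1) for _ in range(len(p_priv) * 3)]

# SS(O; p_priv) = O xor ECC(p_priv): the public helper string
helper = xor(O, ecc_encode(p_priv))

# A later reading O2 within Hamming distance t of O still recovers p_priv
O2 = list(O)
O2[0] ^= 1                       # one flipped bit, inside the code's correction radius
recovered = ecc_decode(xor(helper, O2))
assert recovered == p_priv       # p_priv = D(helper xor O2)
```

The helper string alone reveals nothing useful about p_priv when O is unpredictable, since it is a one-time-pad-style blinding of the codeword.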

[0008] A physical unclonable function P : {0,1}^κ1 → {0,1}^κ2 bound to a device d preferably exhibits the following properties:

1. Unclonability: Pr[dist(y, z) ≤ t | x ← U_κ1, y ← P(x), z ← P′(x)] ≤ ε1, i.e., the probability of duplicating PUF P with a clone PUF P′ such that their output distributions are t-statistically close is less than some sufficiently small ε1.

2. Unpredictability: It is desirable that an adversary cannot predict a device's PUF response r for a challenge c with more than negligible probability (at least without physical access to the device), and that helper data does not reveal anything to an adversary about PUF responses. Assuming that all entities are bound to probabilistic polynomial-time (PPT), i.e., can only efficiently perform computation requiring polynomially many operations with respect to a global security parameter λ (which refers to the number of bits in the relevant parameter), Adv_A^PUF-PRED(κ2) = Pr[r = r′], denoting the probability of the adversary A guessing the correct response r of the PUF P to the challenge c, is preferably negligible in κ2. This can be assessed, for example, through a game between an adversary A and a PUF device P : {0,1}^κ1 → {0,1}^κ2 mapping input strings from the challenge space C_P of length κ1 to the response space R_P of length κ2, where λ is the security parameter for the protocol, given in unary as 1^λ.

PUF-PRED: PUF Prediction Game

The game proceeds as follows:

1. The adversary A issues polynomially many (w.r.t. the security parameter λ) challenges c_i ∈ C to the PUF device P, where the challenge set C is a proper subset of the entire challenge space C_P.

2. The PUF device P returns the responses {r_i | r_i ← P(c_i)} to A.

3. The adversary A eventually outputs a challenge c that was not in the original set of challenge queries C. The adversary is not allowed to query the PUF device P on the committed challenge c.

4. The adversary A may once again issue a new set of polynomially many challenges c_i′ ∈ C′ to the PUF device P. The adversary is not allowed to query the PUF device P on the committed challenge c.

5. The PUF device P returns the responses {r_i′ | r_i′ ← P(c_i′)} to A.

6. The adversary A eventually outputs a guess r′ for P's response to the committed challenge c.

The adversary only wins the game when the guess r′ is equal to P's actual response r ← P(c) to A's committed challenge c. (As noted, the PUF's output is noisy and will vary slightly on any fixed input, so the equality is typically taken with respect to the output of a fuzzy extractor (e.g., Dodis et al., "Fuzzy Extractors: How to Generate Strong Keys from Biometrics and Other Noisy Data," SIAM J. Comput., volume 38, no. 1:97-139, 2008)).

3. Robustness: Pr[dist(y, z) > t | x ← U_κ1, y ← P(x), z ← P(x)] ≤ ε2, i.e., the probability of a fixed PUF P yielding responses t-distant on the same input x is less than some sufficiently small ε2.

4. Indistinguishability: The output of the PUF device (typically fuzzy extractor output) preferably is computationally indistinguishable from a random string of the same length ℓ, such that a PPT adversary A's advantage Adv_A^PUF-IND(ℓ) is at most negligibly more than ½. The indistinguishability of a PUF can be assessed, for example, through a game in which an adversary A is asked to differentiate between the output r of the fuzzy extractor for a PUF P and a randomly chosen string s ∈ {0,1}^ℓ of the same length ℓ.

PUF-IND: PUF Indistinguishability Game

This game proceeds as follows:

1. Adversary A executes the enrollment phase on any challenge c_i ∈ C_P.

2. The PUF device returns the corresponding helper string H_i, which blinds the error-corrected sensitive value ECC(R_i) with the output of the PUF P(c_i). Denote this set of challenge-helper pairs (c_i, H_i) as CH.

3. Adversary A now requests the PUF response r_i = P(c_i) for any c_i not in CH. Denote the set of challenges requested in this step as C_req.

4. For all requests c_i ∈ C_req, the PUF device returns the set {r_i | r_i ← P(c_i)}.

5. Adversary A selects a challenge c such that A has the helper string H_i but not the response R_i for c. The PUF device chooses a bit b ∈ {0,1} uniformly at random.

6. If b = 0, A is given R_i = D(H_i ⊕ P(c)). Otherwise, if b = 1, then A is given a random string s ∈ {0,1}^ℓ.

7. Adversary A is allowed to query the PUF device for any challenge c_i′ so long as no c_i′ = c.

8. For all requests c_i′ ≠ c, the PUF device returns the set {r_i′ | r_i′ ← P(c_i′)}.

9. The adversary outputs a guess bit b′, and succeeds when b′ = b.

Related assessments of PUFs are provided by Hori et al., "Quantitative and Statistical Performance Evaluation of Arbiter Physical Unclonable Functions on FPGAs," 2010 International Conference on Reconfigurable Computing and FPGAs (ReConFig), pages 298-303, 2010; Maiti, A Systematic Approach to Design an Efficient Physical Unclonable Function, dissertation, Virginia Tech, 2012, and others.

[0009] Various authentication schemes utilize zero knowledge proofs of knowledge, a method for proving that a given statement is true while revealing nothing beyond this fact. The zero knowledge proof is an interaction between two parties: a prover 𝒫 that wishes to establish the validity of a statement, and a verifier 𝒱 that must be convinced the statement is true. The verifier should be convinced with overwhelming probability that a true statement is indeed true. With a zero knowledge proof of knowledge, the verifier could not use the messages from a previous proof to convince a new party of the statement's validity, and the messages reveal only a single bit of information: whether or not the prover 𝒫 possesses the secret. There are two general classes of zero knowledge proofs: interactive zero knowledge proofs, where a series of messages are exchanged between the prover 𝒫 and verifier 𝒱, and non-interactive zero knowledge proofs, where the prover 𝒫 conveys a single message M without interaction with 𝒱, yet 𝒱 is convinced that 𝒫 possesses the secret. Many (interactive) zero knowledge proof systems require multiple iterations to establish the validity of a statement. That is, each interaction may succeed with some probability, even if the prover does not possess the secret (or the statement is false). Thus, if the probability of success when the statement is false is p, the protocol is run n times until 1 − p^n is sufficiently close to 1.
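
The iteration count at the end of paragraph [0009] can be computed directly. The target soundness error of 2^-40 below is an arbitrary illustrative choice, not a value taken from the patent.

```python
import math

def rounds_needed(p_false, target=2**-40):
    """Smallest n such that a false statement survives all n rounds with
    probability p_false**n <= target, i.e. 1 - p_false**n is sufficiently
    close to 1."""
    return math.ceil(math.log2(target) / math.log2(p_false))

# A cut-and-choose style proof where a cheating prover succeeds per round
# with probability 1/2 needs 40 rounds for a 2^-40 soundness error.
n = rounds_needed(0.5)   # -> 40
```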

[0010] U.S. Patent Nos. 8,577,091 to Ivanov et al. and 8,566,579 to Armstrong et al. describe authentication systems wherein intrinsic hardware characteristics (e.g., PUF output) as well as human biometrics are required to successfully complete an authentication, but neither provides a method for inexorably linking the PUF with the authentication or a method for handling non-sensitive sensor output.

[0011] Frikken et al. ("Robust Authentication using Physically Unclonable Functions," Information Security, volume 5735 of Lecture Notes in Computer Science, pages 262-277, Springer, 2009) teach a method for combining metadata (e.g., a PIN) into the input of a PUF, but do not provide an extension to arbitrary metadata (e.g., biometrics) or non-sensitive metadata (e.g., temperature, pressure).

[0012] Rust (editor) in "Dl.l Report on use case and architecture requirements," Holistic Approaches for Integrity of ICT-Systems (2013) mentions the idea of merging biometric features with a cell-based PUF, but does not elaborate on a means for achieving this.

[0013] U.S. Patent Application Publication No. 20110002461 to Erhart et al. describes a method for authenticating sensor output by employing a PUF, which extracts unique characteristics of the physical sensor hardware. The method does not directly link the output of the sensor to the authentication of the hardware, however, and also requires that sensitive biometric sensor output leave the device.

SUMMARY OF THE INVENTION

[0014] An intrinsic identity of a device is constructed by generating an enrollment token or public key, which is based on intrinsic characteristics unique to the device such as a physical unclonable function (PUF). An authentication system utilizes the device's enrollment token or public key to verify the device's authenticity, preferably through a zero knowledge proof. Sensitive metadata is preferably also incorporated into the enrollment token or public key, which may be accomplished through an algorithmic means such as a hash function combining the metadata with hardware-intrinsic (e.g., PUF) data. The authentication may be interactive or non-interactive.

BRIEF DESCRIPTION OF THE DRAWING

[0015] Fig. 1 is a functional diagram illustrating metadata binding operational flow in an embodiment of the present invention; and

[0016] Fig. 2 is a functional diagram illustrating an embodiment that provides zero knowledge proof generation for arbitrary sensor output.

DETAILED DESCRIPTION OF EMBODIMENTS

[0017] Although the invention applies to metadata generally, an exemplary embodiment utilizing biometric sensors is described. The present invention is also described with reference to the example of an embodiment utilizing elliptic curve cryptography (including the associated terminology and conventions), but the inventive concept and teachings herein apply equally to various other cryptographic schemes such as ones employing different problems like discrete logarithm or factoring, and the invention is not limited by the various additional features described herein that may be employed with or by virtue of the invention.

[0018] In order to construct an intrinsic identity of a device, a public representation of the device's identity (referred to here as an enrollment token or public key) is generated. In this process of device enrollment, a cryptographic enrollment token is collected from the device. An elliptic curve mathematical framework for enrollment and authentication may be used, but other suitable frameworks (e.g., discrete logarithm frameworks, in which regard U.S. Patent No. 8,918,647 is incorporated here by reference) will provide the same functionality. A cryptographic enrollment token (or series of tokens) {(c_i, A_i mod p)} is collected from the PUF device in response to a challenge query c_i (or queries) by the server. In one embodiment, the device chooses a private key p_i^priv uniformly at random from the space {0,1}^λ, where λ is the security parameter (e.g., the number of bits in the modulus p), and calculates A_i = p_i^priv · G mod p as the device's public key, where G is a base point of order q on an elliptic curve over F_p. Using Algorithm 1, a device can optionally perform a local enrollment protocol using the PUF without interacting with a server. This allows each PUF circuit to generate a local public key p_i^pub. This can also be useful in bootstrapping more complex key setup algorithms, though bootstrapping may be unnecessary if key setup is performed internally within the device (rather than externally among a set of distinct devices). In another embodiment, the device may accept elliptic curve parameters and a challenge from an external server, with the server subsequently storing the challenge and helper data for the device.

Algorithm 1 Enrollment

for Device d do

Select finite field F_p of order p

Select E, an elliptic curve over F_p

Find G ∈ E/F_p, a base point of order q

Select c_i ∈ F_p, a group element

x = H(c_i, G, p, q)

O = PUF(x)

helper_i = O ⊕ ECC(p_i^priv mod q)

A_i = p_i^priv · G mod p

Store {A_i, c_i, helper_i}

end for

[0019] In the example of an embodiment employing elliptic curve cryptography, Algorithms 2 and 3 below can optionally be used to allow a PUF-enabled device to store and retrieve a sensitive value without storing any sensitive information in non-volatile memory. Algorithm 2 illustrates the storing of a sensitive value p_i^priv using a PUF, and Algorithm 3 illustrates its retrieval. The challenge c_i and helper data helper_i can be public, as neither reveals anything about the sensitive value p_i^priv. These values may be stored locally to the device, or externally on a different device. If stored externally, the values would be provided to the device prior to running the algorithm. While the present example uses encryption of p_i^priv by exclusive-or, ⊕, p_i^priv could also be used as a key to other encryption algorithms (e.g., AES) to enable storage and retrieval of arbitrarily sized values.

Algorithm 2 PUF-Store

Goal: Store value p_i^priv

for PUF Device d do

x = H(c_i, G, p, q)

O = PUF(x)

helper_i = O ⊕ ECC(p_i^priv)

Write {c_i, helper_i}

end for

Algorithm 3 PUF-Retrieve

Goal: Retrieve value p_i^priv

for PUF Device d do

Read {c_i, helper_i}

x = H(c_i, G, p, q)

O′ = PUF(x)

p_i^priv = D(helper_i ⊕ O′)

end for

Whenever O and O′ are t-close, the error correcting code ECC can be passed to a decoding algorithm D which will recover the sensitive value p_i^priv.
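
Algorithms 2 and 3 can be sketched end to end. The SHA-256-based PUF simulation, the 3x repetition code, the single-bit noise model, and the toy parameter values (c = 17, etc.) below are all stand-ins of the author's choosing for the PUF circuit, ECC, and curve parameters that the text leaves open.

```python
import hashlib

REP = 3  # 3x repetition code corrects 1 flipped bit per block

class SimulatedPUF:
    """Deterministic stand-in for a device-unique function; a real PUF
    additionally exhibits per-evaluation noise, modeled manually below."""
    def __init__(self, device_secret: bytes, n_bits: int):
        self.secret, self.n = device_secret, n_bits
    def eval(self, x: bytes):
        d = hashlib.sha256(self.secret + x).digest()
        return [(d[i // 8] >> (i % 8)) & 1 for i in range(self.n)]

def ecc_encode(bits): return [b for bit in bits for b in [bit] * REP]
def ecc_decode(bits): return [int(sum(bits[i:i + REP]) > REP // 2)
                              for i in range(0, len(bits), REP)]
def xor(a, b): return [u ^ v for u, v in zip(a, b)]

def puf_input(c, G, p, q) -> bytes:          # x = H(c_i, G, p, q)
    return hashlib.sha256(f"{c}|{G}|{p}|{q}".encode()).digest()

# --- Algorithm 2: PUF-Store ---
p_priv = [1, 0, 1, 1, 0, 1, 0, 0]
puf = SimulatedPUF(b"device-unique-entropy", len(p_priv) * REP)
x = puf_input(c=17, G=(5, 1), p=23, q=11)    # toy parameters, not real curve values
O = puf.eval(x)
helper = xor(O, ecc_encode(p_priv))          # public; reveals nothing by itself

# --- Algorithm 3: PUF-Retrieve ---
O2 = puf.eval(x)
O2[0] ^= 1                                   # model PUF noise: O2 is t-close to O
recovered = ecc_decode(xor(helper, O2))
assert recovered == p_priv
```

Only {c_i, helper_i} ever touch non-volatile storage; p_priv exists transiently and is re-derived from the hardware on each retrieval.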

[0020] In an elliptic curve embodiment, upon receiving an authentication request from a device, the server can conduct an elliptic curve variant of Chaum et al.'s ("An Improved Protocol for Demonstrating Possession of Discrete Logarithms and some Generalizations," Proceedings of the 6th annual international conference on Theory and application of cryptographic techniques, EUROCRYPT '87, pages 127-141, Springer, 1988) zero knowledge proof protocol with the device d in order to authenticate the device d, as shown in Algorithm 4.

Algorithm 4 Authentication

for PUF Device d do

Server s ← request

end for

for Server s do

Device d ← {c_i, helper_i, G, p, q, N} where N is a nonce

end for

for PUF Device d do

x ← H(c_i, G, p, q)

p_i^priv ← D(helper_i ⊕ PUF(x))

A_i = p_i^priv · G mod p

r ← random ∈ F_p, a random group element

B ← r · G mod p

c ← Hash(G, B, A_i, N)

m ← r + c · p_i^priv mod q

Server s ← {B, m}

end for

for Server s do

c′ ← Hash(G, B, A_i, N)

Accept device d if and only if B = m · G − c′ · A_i mod p
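
The commitment-challenge-response exchange of Algorithm 4 can be sketched in a multiplicative group modulo a prime instead of an elliptic curve; this substitution, the toy Schnorr group (p = 23, q = 11, g = 2), and the use of a plain secret in place of a PUF-derived key are illustrative simplifications, not the patent's construction. The multiplicative check g^m = B · A^c′ corresponds to the additive check B = m · G − c′ · A_i.

```python
import hashlib, secrets

# Toy Schnorr group: p = 23, q = 11, and g = 2 has order 11 mod 23.
# Real deployments use an elliptic curve group, as in the patent's embodiment.
p, q, g = 23, 11, 2

def H(*vals):
    """Challenge hash c = Hash(G, B, A, N), reduced into Z_q."""
    h = hashlib.sha256("|".join(map(str, vals)).encode()).digest()
    return int.from_bytes(h, "big") % q

x = secrets.randbelow(q - 1) + 1      # stand-in for p_i^priv
A = pow(g, x, p)                      # public key A_i (enrollment token)

# --- Verifier: send a fresh nonce for this proof ---
N = secrets.randbits(64)

# --- Prover (device): commitment, challenge, response ---
r = secrets.randbelow(q - 1) + 1      # random group element
B = pow(g, r, p)                      # commitment (B = r*G in the EC version)
c = H(g, B, A, N)
m = (r + c * x) % q                   # m = r + c * p_i^priv mod q

# --- Verifier: recompute the challenge and check the proof ---
c2 = H(g, B, A, N)
assert pow(g, m, p) == (B * pow(A, c2, p)) % p   # g^m == B * A^c'
```

The verifier learns nothing about x beyond the validity of the statement: (B, m) can be simulated without knowing x, which is the zero knowledge property.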

[0021] The verifying server in Algorithm 4 provides the device a nonce value specific to the current proof, so as to prevent an eavesdropping adversary from using previous proofs from a valid device to successfully complete an authentication protocol and masquerade as the device. A non-interactive zero knowledge proof removes this communication requirement, and allows the proof to be completed without interacting with the verifying end point. Achieving a non-interactive construction requires the proving device to generate the nonce on behalf of the verifier in a manner that prevents the proving end device from manipulating the proof.

[0022] One method for constructing a non-interactive zero knowledge proof is for the device to construct a nonce N as N ← H(A || τ), where A is the device's public key, H(·) is a cryptographic hash function, τ is a timestamp and x || y denotes concatenation of x and y. The timestamp ensures that previous proofs constructed by the proving device cannot be replayed by an adversary in the future, while the hash function ensures that the proving device cannot manipulate the nonce in an adversarial manner. The reliance on timestamps is substantially less onerous than reliance on globally synchronized clocks. That is, the timestamp need not match the current timestamp on arrival at the prover exactly, which eliminates the potential of network delay to affect the proof. Rather, the verifying end point checks that the timestamp is reasonably current (e.g., second granularity) and monotonically increasing to prevent replay attacks. An exemplary non-interactive zero knowledge proof for a PUF-enabled device is described in Algorithm 5.
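
The nonce construction and the verifier's freshness check of paragraph [0022] can be sketched as follows; the 2-second skew window and the placeholder public-key bytes are illustrative assumptions.

```python
import hashlib, time

MAX_SKEW = 2.0        # acceptable clock difference in seconds (illustrative)
last_tau = 0.0        # verifier state: last accepted timestamp

def make_nonce(pub_key: bytes, tau: float) -> bytes:
    """Prover side: N = H(A || tau). Hashing prevents the prover from
    steering N; tau prevents future replay of this proof."""
    return hashlib.sha256(pub_key + str(tau).encode()).digest()

def accept_timestamp(tau: float, now: float) -> bool:
    """Verifier side: tau must be reasonably current and strictly increasing."""
    global last_tau
    if abs(now - tau) > MAX_SKEW or tau <= last_tau:
        return False
    last_tau = tau
    return True

tau = time.time()
N = make_nonce(b"A_i-public-key-bytes", tau)     # placeholder key material
assert accept_timestamp(tau, time.time())        # fresh proof: accepted
assert not accept_timestamp(tau, time.time())    # replayed tau: rejected
```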

Algorithm 5 Non-Interactive Authentication

for PUF Device d do

p_i^priv ← PUF-Retrieve(c_i, helper_i)

A_i = p_i^priv · G mod p

r ← random ∈ F_p, a random group element

B ← r · G mod p

N ← Hash(A_i || τ) where τ is the current timestamp

c ← Hash(G, B, A_i, N)

m ← r + c · p_i^priv mod q

Server s ← {B, m, τ}

end for

for Server s do

A_i = p_i^priv · G mod p (public key stored from device enrollment)

Verify τ is reasonably current (e.g., within ε of the current time)

N ← Hash(A_i || τ)

c′ ← Hash(G, B, A_i, N)

Accept device d if and only if B = m · G − c′ · A_i mod p

Metadata Binding

[0023] Let metadata binding refer to the process of incorporating auxiliary metadata into the authentication process. Metadata is arbitrary auxiliary information upon which the authentication protocol should depend. That is, without the correct metadata, the authentication should fail. Metadata may be characterized as either sensitive or non-sensitive, where sensitive metadata should not leave the device (e.g., password, PIN, biometric) and non-sensitive metadata may leave the device (e.g., sensor output on temperature, pressure).

[0024] Sensitive metadata is preferably incorporated into the public identity token created during enrollment. For example, when no sensitive metadata is provided, device enrollment outputs a public identity that characterizes only the device. However, when sensitive metadata is provided during enrollment (e.g., biometrics, PIN, etc.), the public identity characterizes both the device and the sensitive metadata. One embodiment of the invention never requires the sensitive metadata to leave the device, as the zero knowledge proof protocol is completed without the verifier having access to the sensitive metadata.

[0025] Non-sensitive metadata is preferably not incorporated into the enrollment process such that the public identity output from enrollment does not depend on non-sensitive metadata (e.g., sensor output for temperature, pressure, etc.). Rather, non-sensitive metadata is preferably incorporated into the zero knowledge proof protocol, such that the proof of device and/or user authenticity is only valid if the corresponding non-sensitive metadata is also provided to the verifier. This allows the device and/or user to have a single public identity, and yet a verifier given access to the non-sensitive metadata can verify both the authenticity of the device and/or user as well as the origin of the metadata.

[0026] Fig. 1 illustrates the process flow of metadata binding. First, Enrollment Parameters 1 are retrieved and may be combined with Sensitive Metadata 2 through a Cryptographic Hash Function 3. The output of the cryptographic hash function is used as input to the Physical Unclonable Function 4, which links the enrollment parameters and optional metadata to the hardware identity. Finally, an Enrollment Token 5 is returned as a function of the PUF output.

[0027] In general, a hash function is defined as H(·) : {0,1}* → {0,1}^λ, where λ is a fixed constant. That is, a hash function H(·) (or written explicitly as Hash(·)) takes an input of arbitrary size, and maps to a finite output domain. For cryptographic settings, hash functions must satisfy additional properties. In the context of binding metadata in authentication protocols, a hash function should preferably be one-way, collision-resistant and satisfy an avalanche condition. One-way means that it is computationally infeasible to determine the input x when given the output H(x), ensuring that the output H(x) does not reveal anything about the input x. Collision-resistant means that it is computationally infeasible to provide a different set of metadata y such that H(x) = H(y), where x is the proper metadata for a given entity. The avalanche condition means that each bit of H(x′) is complemented from H(x) with probability ½, where x is any hash input and x′ is x with a single bit complemented, ensuring that the output H(x) changes substantially in response to a minor change to the input x, which allows any change to the metadata to be detected and forces a failed authentication.
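
The avalanche condition of paragraph [0027] is easy to observe empirically; SHA-256 and the sample metadata string below are illustrative choices.

```python
import hashlib

def bits(data: bytes):
    """Expand a byte string into its individual bits."""
    return [(byte >> i) & 1 for byte in data for i in range(8)]

x = b"enrollment metadata: fingerprint template v1"   # hypothetical metadata
x_flipped = bytes([x[0] ^ 1]) + x[1:]                  # complement one input bit

hx = hashlib.sha256(x).digest()
hx2 = hashlib.sha256(x_flipped).digest()
diff = sum(a != b for a, b in zip(bits(hx), bits(hx2)))

# Each of the 256 output bits flips with probability ~1/2, so roughly half
# differ: any change to bound metadata yields an unrelated digest, forcing
# the authentication to fail.
assert hx != hx2 and 32 < diff < 224
```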

[0028] One way to bind the metadata M_i is to simply redefine the PUF input x as H(c_i, M_i, G, p, q) rather than H(c_i, G, p, q). Thus, the revised Enrollment Algorithm 6 becomes:

Algorithm 6 Metadata Enrollment

for Device d do

Select finite field F_p of order p

Select E, an elliptic curve over F_p

Find G ∈ E/F_p, a base point of order q

Retrieve Metadata M_i

c_i ∈ F_p, a group element

x = H(c_i, M_i, G, p, q)

O = PUF(x)

helper_i = O ⊕ ECC(p_i^priv mod q)

A_i = p_i^priv · G mod p

Store {A_i, c_i, helper_i}

end for

Various other permutations of values (pertinent to the mathematical framework used) may be hashed to produce a PUF input incorporating metadata, however; moreover, one or more values could be iteratively hashed and/or hashed values nested (e.g., H(H(c_i||M_i), G, p, q), etc.). Further, other methods for linking and/or combining the parameters (e.g., an all-or-nothing transformation) may be employed.

[0029] Due to the avalanche property of hash functions, the metadata must be exactly the same in order for the authentication to succeed. However, the exemplary embodiment of biometric authentication frequently involves noise, where scans differ slightly despite observing the same characteristic (e.g., fingerprint, iris, etc.). Thus, means such as a fuzzy extractor may preferably be employed to ensure that the biometric reliably returns a constant value. For example, a constant value M_i for the metadata may be chosen and linked to an associated public helper data value h_f. A noisy biometric scan S_f can then be used to compute h_f ← ECC(M_i) ⊕ S_f, where ECC is an error correcting code, and given access to a new biometric scan S_f′ that is t-close to S_f, the constant value M_i can be recovered by computing M_i ← D(h_f ⊕ S_f′), where D is the corresponding error decoding algorithm.
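A minimal sketch of this fuzzy-extractor idea follows, using a simple bit-repetition code as the error correcting code ECC and majority-vote decoding as D. These choices are illustrative only; a practical design would use a stronger code (e.g., BCH) and a proper entropy analysis:

```python
import secrets

REP = 5  # repetition factor; majority vote corrects < REP/2 flips per block

def ecc_encode(bits):
    # ECC: repeat each bit REP times
    return [b for bit in bits for b in [bit] * REP]

def ecc_decode(bits):
    # D: majority vote over each REP-bit block
    return [int(sum(bits[i:i + REP]) > REP // 2)
            for i in range(0, len(bits), REP)]

def xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

# Enrollment: fix a constant metadata value M and derive public helper
# data h = ECC(M) XOR S from the enrollment scan S.
M = [1, 0, 1, 1, 0, 0, 1, 0]
S = [secrets.randbits(1) for _ in range(len(M) * REP)]  # noisy scan S_f
helper = xor(ecc_encode(M), S)

# Authentication: a new scan S2 close to S (here, two flipped bits).
S2 = list(S)
S2[0] ^= 1
S2[7] ^= 1
M_recovered = ecc_decode(xor(helper, S2))  # M = D(h XOR S2)
assert M_recovered == M
```

Because helper ⊕ S2 = ECC(M) ⊕ (S ⊕ S2), the decoder sees the codeword plus the scan-to-scan noise, and recovers the constant M whenever that noise stays within the code's correction capacity.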

[0030] Fig. 2 illustrates the process flow of constructing a zero knowledge proof demonstrating sensor integrity, user authentication, and sensor output verification in an embodiment utilizing a biometric authentication sensor (e.g., fingerprint scanner). The hardware may be part of a module (e.g., U.are.U 4500) or limited to only a sensor (e.g., TCS4K Swipe Sensor). Preferably, the PUF circuit (e.g., a ring oscillator, SRAM, arbiter, etc.) will be directly integrated with the sensor hardware, such that modification to the sensor hardware alters the PUF mapping. One embodiment of such a device may comprise a Xilinx Artix-7 field programmable gate array (FPGA) platform, equipped, e.g., with 215,000 logic cells, 13 megabits of block random access memory, and 700 digital signal processing (DSP) slices. In an embodiment employing elliptic curve cryptography, for example, the hardware mathematics engine may be instantiated in the on-board DSP slices, with the PUF construction positioned within the logic cells, and a logical processing core including an input and output to the PUF and constructed to control those and the device's external input and output and to perform algorithms (sending elliptic curve and other mathematical calculations to the math engine) such as those described above.

First, Enrollment Parameters 6 are retrieved and may be combined with Sensitive Metadata 7 through a Cryptographic Hash Function 8. The output of the cryptographic hash function is used as input to the Physical Unclonable Function 9, which links the enrollment parameters and sensitive metadata to the hardware identity. Next, the Proof Parameters 10 and Sensor Output 11 are linked through a Cryptographic Hash Function 12. The output of Physical Unclonable Function 9 and Cryptographic Hash Function 12 are synthesized to generate a Zero Knowledge Proof 13, which outputs a Proof Token 14 that will convince a verifier of the integrity of the sensor, authenticate the user, and validate the sensor output.

[0031] Algorithm 7 provides an example of how a fingerprint scan may be bound to the authentication protocol such that both the device and fingerprint must match those originally enrolled. Non-sensitive metadata (e.g., sensor output for temperature, pressure, etc.) M_i^pub may be incorporated into the non-interactive authentication algorithm by incorporating it into the construction of the nonce N, and providing M_i^pub to the verifier. Thus, the verifier is only able to construct the nonce N (and, consequently, the variable c′) if M_i^pub matches the output from the sensor.

Algorithm 7 Device & Fingerprint Authentication

for User do

Scan Fingerprint

FP ← Scan

Read Fingerprint Helper Data h_f

M_i ← D(h_f ⊕ FP)

end for

for PUF Device d do

p_i^priv ← PUF-Retrieve(c_i, M_i, helper_i)

A_i = p_i^priv · G mod p

r ← random ∈ F_p, a random group element

B ← r · G mod p

N ← Hash(A_i || M_i^pub || τ), where τ is the current timestamp

c ← Hash(G, B, A_i, N)

m ← r + c · p_i^priv mod q

Server s ← {B, m, M_i^pub, τ}

end for

for Server s do

A_i = p_i^priv · G mod p (public key stored from device enrollment)

N ← Hash(A_i || M_i^pub || τ)

c′ ← Hash(G, B, A_i, N)

D ← m · G − c′ · A_i mod p

Device verified if D = B

end for

First, a user's fingerprint scan FP is used in combination with the helper data h_f for the original fingerprint scan to recover the metadata value M_i. Next, the metadata value M_i is used as input to the PUF, such that the PUF output depends on the metadata. In order to bind non-sensitive metadata M_i^pub to the proof, it is used to construct the nonce N, which depends on the public identity A_i as well as the current timestamp τ (which prevents replay attacks). The non-sensitive metadata M_i^pub is then provided to the verifier, as it is now necessary to verify the proof. (If the non-sensitive metadata should only be revealed to the verifier, it may be sent encrypted.) Finally, the device constructs the non-interactive zero knowledge proof, which enables the server to verify that both the device and the (sensitive and non-sensitive) metadata are correct.
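The structure of this non-interactive proof can be sketched in Python over a toy Schnorr-style group. This is a multiplicative group standing in for the elliptic curve setting of the text; the parameters and names are illustrative only, and the private key would come from PUF-Retrieve rather than being a constant:

```python
import hashlib
import secrets
import time

# Toy Schnorr-style group for illustration: q divides p - 1, g has order q.
# The patent's construction uses an elliptic curve group instead.
p, q, g = 2039, 1019, 4

def H(*parts) -> int:
    data = b"|".join(str(x).encode() for x in parts)
    return int.from_bytes(hashlib.sha256(data).digest(), "big")

# Device side. In the text, priv is re-derived via PUF-Retrieve;
# here it is a fixed stand-in value.
priv = 7
A = pow(g, priv, p)              # public identity A from enrollment
M_pub = "temp=21C"               # non-sensitive metadata
tau = int(time.time())           # timestamp tau, prevents replay

r = secrets.randbelow(q - 1) + 1
B = pow(g, r, p)
N = H(A, M_pub, tau)             # nonce binds the non-sensitive metadata
c = H(g, B, A, N) % q
m = (r + c * priv) % q
proof = (B, m, M_pub, tau)       # sent to the server

# Server side: reconstructs N and c' from the supplied metadata and
# timestamp. A mismatched M_pub yields a different N, so the check fails.
B, m, M_pub, tau = proof
c2 = H(g, B, A, H(A, M_pub, tau)) % q
assert pow(g, m, p) == (B * pow(A, c2, p)) % p   # accept: g^m == B * A^c'
```

The verification identity g^m = B · A^c′ is the multiplicative-group analogue of the elliptic-curve check m · G = B + c′ · A_i in the algorithm above.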

[0032] An interactive zero knowledge proof may also be constructed by requiring the server to issue a nonce N to the device. This exemplary construction is illustrated in Algorithm 8.

Algorithm 8 Interactive Device & Fingerprint Authentication

for Server s do

Send nonce N ∈ {0, 1}^λ to Device, where λ is the number of bits in the modulus p

end for

for User do

Scan Fingerprint

FP ← Scan

Read Fingerprint Helper Data h_f

M_i ← D(h_f ⊕ FP)

end for

for PUF Device d do

p_i^priv ← PUF-Retrieve(c_i, M_i, helper_i)

A_i = p_i^priv · G mod p

r ← random ∈ F_p, a random group element

B ← r · G mod p

c ← Hash(G, B, A_i, N)

m ← r + c · p_i^priv mod q

Server s ← {B, m}

end for

for Server s do

A_i = p_i^priv · G mod p (public key stored from device enrollment)

c′ ← Hash(G, B, A_i, N)

D ← m · G − c′ · A_i mod p

Device verified if D = B

end for
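For comparison, the interactive flow can be sketched the same way, again over a toy multiplicative group rather than the elliptic curve of the text; names are illustrative and priv stands in for the PUF-derived key. The difference from the non-interactive case is that the server chooses the nonce before the device responds:

```python
import hashlib
import secrets

# Toy Schnorr-style group (q divides p - 1, g has order q); the patent
# uses an elliptic curve group, and priv stands in for PUF-Retrieve.
p, q, g = 2039, 1019, 4

def H(*parts) -> int:
    data = b"|".join(str(x).encode() for x in parts)
    return int.from_bytes(hashlib.sha256(data).digest(), "big")

priv = 9
A = pow(g, priv, p)                   # public key stored at enrollment

# Server: issue a fresh random nonce N to the device.
N = secrets.randbits(128)

# Device: commit to B, derive the challenge from the server's N, respond.
r = secrets.randbelow(q - 1) + 1
B = pow(g, r, p)
c = H(g, B, A, N) % q
m = (r + c * priv) % q

# Server: recompute c' with its own copy of N and verify the response.
c2 = H(g, B, A, N) % q
assert pow(g, m, p) == (B * pow(A, c2, p)) % p   # device verified
```

Because the server supplies N itself, it needs no timestamp to rule out replays: a recorded (B, m) pair is useless against a fresh nonce.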

[0033] The addition of (sensitive and/or non-sensitive) metadata is optional in embodiments of the invention. That is, non-sensitive metadata may be included while sensitive metadata is excluded; this requires only that the public identity token did not incorporate the sensitive metadata. Similarly, sensitive metadata may be included while non-sensitive metadata is excluded; this requires only that the nonce is not constructed using non-sensitive metadata.

[0034] As one embodiment of the present invention relies on an elliptic curve mathematical framework, one skilled in the art will realize that it may be extended to support cryptographically enforced role-based access control (RBAC). That is, data access policies and device credentials may be specified mathematically, and the RBAC algorithm computes a function f(P, C) → {0, 1} mapping policies P and credentials C to an access decision in {0, 1}. This is typically accomplished by constructing a bilinear pairing (e.g., a Weil or Tate pairing), and is a natural extension of the present invention.
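As a purely illustrative, non-cryptographic sketch of the decision function f(P, C) → {0, 1}, policies and credentials can be modeled as role sets; the pairing-based enforcement the text mentions is not shown, and the role names are hypothetical:

```python
# Hypothetical illustration of f(P, C) -> {0, 1}: a policy P is the set
# of roles required for access, and a credential C is the set of roles
# a device or user actually holds. Access is granted iff P is a subset
# of C. A pairing-based scheme would enforce this check cryptographically.
def access_decision(policy: set, credentials: set) -> int:
    return int(policy <= credentials)

print(access_decision({"operator"}, {"operator", "auditor"}))  # 1
print(access_decision({"admin"}, {"operator"}))                # 0
```

In a pairing-based realization, the subset test itself would be replaced by an algebraic check over the bilinear group, so that a device lacking the required credential cannot even evaluate f to 1.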

[0035] While the foregoing embodiments have been described with various features, one of ordinary skill in the art will recognize that the authentication protocol need not be limited to zero knowledge, and could be based on other cryptographic constructions for establishing identity. For example, the device could use its hardware identity to digitally sign the contents of a packet and include this signature in the packet header (e.g., the TCP Options Header, where an example header would include {B = r · G mod p, m = r + Hash(G, B, A, N) · p^priv mod q, τ}). The hardware identity may likewise be applied to a variety of other cryptographic authentication techniques, and need not be limited by the zero knowledge aspect of the example provided.