


Title:
KEY GENERATION FOR USE IN SECURED COMMUNICATION
Document Type and Number:
WIPO Patent Application WO/2021/062537
Kind Code:
A1
Abstract:
Systems and methods for key generation for secure communication between a first user computing device and a second user computing device without requiring direct communication during key generation. The method using a plurality of privacy providers and a first private table and a second private table. The method including: performing by the second user computing device: receiving indexes each associated with a value in the second private table, each index received from the respective privacy provider sharing those values, each index associated with a value that matches an indexed value in the first private table received by the respective privacy provider from the first user computing device; and generating a common key for the secure communication by combining the indexed values of the second private table.

Inventors:
LO HOI-KWONG (CA)
MONTAGNA MATTIA (IT)
Application Number:
PCT/CA2020/051307
Publication Date:
April 08, 2021
Filing Date:
September 30, 2020
Assignee:
GOVERNING COUNCIL UNIV TORONTO (CA)
International Classes:
H04L9/30; G06F21/62; H04L9/08; H04L9/32
Domestic Patent References:
WO2019076737A1 2019-04-25
Foreign References:
CA3078558A1 2019-04-11
JP2012147341A 2012-08-02
Other References:
See also references of EP 4029188A4
Attorney, Agent or Firm:
BHOLE IP LAW (CA)
Claims:
CLAIMS

1. A method for key generation for secure communication between a first user computing device and a second user computing device without requiring direct communication during key generation between the first user computing device and the second user computing device, the method for key generation using a plurality of privacy providers each comprising a computing device, the method for key generation using a first private table comprising values with associated indexes each shared with the first user computing device and one of the privacy providers and a second private table comprising values with associated indexes each shared with the second user computing device and one of the privacy providers, the method comprising: performing by the second user computing device: receiving indexes each associated with a value in the second private table, each index received from the respective privacy provider sharing those values, each index associated with a value that matches an indexed value in the first private table received by the respective privacy provider from the first user computing device; and generating a common key for the secure communication by combining the indexed values of the second private table.

2. The method of claim 1, wherein generating the common key comprises performing exclusive-or (XOR) operations on the indexed values of the second private table.

3. The method of claim 2, further comprising using the common key to securely communicate between the first user computing device and the second user computing device using a one-time-pad (OTP) encryption approach.

4. The method of claim 3, wherein the number of indexed values represents the number of bits to be communicated by the first user computing device to the second user computing device.

5. The method of claim 1, wherein the values of the first private table and the second private table are generated using a quantum random number generator (QRNG).

6. The method of claim 1, further comprising marking as used, by the second user computing device, values in the second private table that have been used to generate the common key.

7. The method of claim 1, further comprising performing erasure, by the second user computing device, of values in the second private table that have been used to generate the common key.

8. The method of claim 1, further comprising authenticating, by the second user computing device, the common key using an authentication service.

9. The method of claim 1, further comprising performing, by the second user computing device, privacy amplification comprising applying a linear matrix to the common key.

10. The method of claim 1, wherein receiving the indexes associated with the values in the second private table comprises receiving values determined by performing exclusive-or (XOR) operations, by each respective privacy provider, on the indexed values in the first private table and the second private table.

11. A system for key generation for secure communication with a first user computing device without requiring direct communication during key generation with the first user computing device, the system comprising one or more processors and a data storage, the system in communication with one or more privacy providers each comprising a computing device, a first private table comprising values with associated indexes each shared with the first user computing device and one of the privacy providers, the one or more processors in communication with the data storage device and configured to execute: a table module for receiving indexes each associated with a value in a second private table, the second private table comprising values with associated indexes each shared with one of the privacy providers, each index received from the respective privacy provider sharing those values, each index associated with a value that matches an indexed value in the first private table received by the respective privacy provider from the first user computing device; and a common key module for generating a common key for the secure communication by combining the indexed values of the second private table.

12. The system of claim 11, wherein the common key module generates the common key by performing exclusive-or (XOR) operations on the indexed values of the second private table.

13. The system of claim 12, wherein the one or more processors are further configured to execute a communication module to use the common key to securely communicate with the first user computing device using a one-time-pad (OTP) encryption approach.

14. The system of claim 13, wherein the number of indexed values represents the number of bits to be communicated by the first user computing device.

15. The system of claim 11, wherein the values of the first private table and the second private table are generated using a quantum random number generator (QRNG).

16. The system of claim 11, wherein the table module marks, as used, values in the second private table that have been used to generate the common key.

17. The system of claim 11, wherein the one or more processors are further configured to execute an erasure module to perform erasure of values in the second private table that have been used to generate the common key.

18. The system of claim 11, wherein the one or more processors are further configured to execute an authentication module to authenticate the common key using an authentication service.

19. The system of claim 11, wherein the one or more processors are further configured to execute an amplification module to perform privacy amplification comprising applying a linear matrix to the common key.

20. The system of claim 11, wherein receiving the indexes associated with the values in the second private table comprises receiving values determined by performing exclusive-or (XOR) operations, by each respective privacy provider, on the indexed values in the first private table and the second private table.

Description:
KEY GENERATION FOR USE IN SECURED COMMUNICATION

TECHNICAL FIELD

[0001] The following relates to data communication systems and encryptions schemes utilized in such systems; and more specifically, to a method and system for key generation for secure communication.

BACKGROUND

[0002] Data communication systems are used to exchange information between devices. The information to be exchanged comprises data that is organized as strings of digital bits formatted so as to be recognizable by other devices and to permit the information to be processed and/or recovered. The exchange of information may occur over a publicly accessible network, such as a communication link between two devices, over a dedicated network within an organization, or may be between two devices within the same dedicated component, such as within a computer or point of sale device. A large number of communication protocols have been developed to allow the exchange of data between different devices. The communication protocols permit the exchange of data in a robust manner, often with error correction and error detection functionality, and for the data to be directed to the intended recipient and recovered for further use. Because the data may be accessible to other devices, it is vulnerable to interception and observation or manipulation. The sensitive nature of the information requires that steps are taken to secure the information and ensure its secrecy, commonly using various types of encryption schemes.

[0003] Various encryption schemes, such as the advanced encryption standard (AES), are based on computational assumptions, which may be susceptible to being broken by advances in algorithms and hardware. For example, the emergence of quantum computers provides a substantial leap in computing ability. However, adversaries or interlopers looking to intercept encrypted communication may also gain access to the power of quantum computing to break encryption and gain access to supposedly secured communications. For medium-term and long-term security, this can present a significant challenge because an interloper or eavesdropper (an entity referred to as "Eve") might save communications sent now and wait for the proliferation of algorithms and hardware some years or decades later that can decode present encryption schemes.

[0004] It is therefore an object of the present invention to provide systems and methods in which the above disadvantages are obviated or mitigated and attainment of the desirable attributes is facilitated.

SUMMARY

[0005] In an aspect, there is provided a method for key generation for secure communication between a first user computing device and a second user computing device without requiring direct communication during key generation between the first user computing device and the second user computing device, the method for key generation using a plurality of privacy providers each comprising a computing device, the method for key generation using a first private table comprising values with associated indexes each shared with the first user computing device and one of the privacy providers and a second private table comprising values with associated indexes each shared with the second user computing device and one of the privacy providers, the method comprising: performing by the second user computing device: receiving indexes each associated with a value in the second private table, each index received from the respective privacy provider sharing those values, each index associated with a value that matches an indexed value in the first private table received by the respective privacy provider from the first user computing device; and generating a common key for the secure communication by combining the indexed values of the second private table.

[0006] In a particular case of the method, generating the common key comprises performing exclusive-or (XOR) operations on the indexed values of the second private table.

[0007] In another case of the method, the method further comprising using the common key to securely communicate between the first user computing device and the second user computing device using a one-time-pad (OTP) encryption approach.

[0008] In yet another case of the method, the number of indexed values represents the number of bits to be communicated by the first user computing device to the second user computing device.

[0009] In yet another case of the method, the values of the first private table and the second private table are generated using a quantum random number generator (QRNG).

[0010] In yet another case of the method, the method further comprising marking as used, by the second user computing device, values in the second private table that have been used to generate the common key.

[0011] In yet another case of the method, the method further comprising performing erasure, by the second user computing device, of values in the second private table that have been used to generate the common key.

[0012] In yet another case of the method, the method further comprising authenticating, by the second user computing device, the common key using an authentication service.

[0013] In yet another case of the method, the method further comprising performing, by the second user computing device, privacy amplification comprising applying a linear matrix to the common key.

[0014] In yet another case of the method, receiving the indexes associated with the values in the second private table comprises receiving values determined by performing exclusive-or (XOR) operations, by each respective privacy provider, on the indexed values in the first private table and the second private table.
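In the variant above, each privacy provider transmits the XOR of the two indexed bits rather than a matching index. Because the XOR of two secret bits reveals neither bit individually, Bob can recover Alice's bit locally from his own table. A toy illustration (function names are assumptions, not from the source):

```python
def provider_xor(alice_bit: int, bob_bit: int) -> int:
    """The provider publishes the XOR of Alice's indexed bit from the first
    private table and Bob's indexed bit from the second private table; this
    single bit leaks neither table value on its own."""
    return alice_bit ^ bob_bit

def bob_recover(received: int, bob_bit: int) -> int:
    """Bob XORs the received value with his own indexed bit, recovering
    Alice's bit, since (a ^ b) ^ b == a."""
    return received ^ bob_bit

# All four bit combinations round-trip correctly.
for a in (0, 1):
    for b in (0, 1):
        assert bob_recover(provider_xor(a, b), b) == a
```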

[0015] In another aspect, there is provided a system for key generation for secure communication with a first user computing device without requiring direct communication during key generation with the first user computing device, the system comprising one or more processors and a data storage, the system in communication with one or more privacy providers each comprising a computing device, a first private table comprising values with associated indexes each shared with the first user computing device and one of the privacy providers, the one or more processors in communication with the data storage device and configured to execute: a table module for receiving indexes each associated with a value in a second private table, the second private table comprising values with associated indexes each shared with one of the privacy providers, each index received from the respective privacy provider sharing those values, each index associated with a value that matches an indexed value in the first private table received by the respective privacy provider from the first user computing device; and a common key module for generating a common key for the secure communication by combining the indexed values of the second private table.

[0016] In a particular case of the system, the common key module generates the common key by performing exclusive-or (XOR) operations on the indexed values of the second private table.

[0017] In another case of the system, the one or more processors are further configured to execute a communication module to use the common key to securely communicate with the first user computing device using a one-time-pad (OTP) encryption approach.

[0018] In yet another case of the system, the number of indexed values represents the number of bits to be communicated by the first user computing device.

[0019] In yet another case of the system, the values of the first private table and the second private table are generated using a quantum random number generator (QRNG).

[0020] In yet another case of the system, the table module marks, as used, values in the second private table that have been used to generate the common key.

[0021] In yet another case of the system, the one or more processors are further configured to execute an erasure module to perform erasure of values in the second private table that have been used to generate the common key.

[0022] In yet another case of the system, the one or more processors are further configured to execute an authentication module to authenticate the common key using an authentication service.

[0023] In yet another case of the system, the one or more processors are further configured to execute an amplification module to perform privacy amplification comprising applying a linear matrix to the common key.

[0024] In yet another case of the system, receiving the indexes associated with the values in the second private table comprises receiving values determined by performing exclusive-or (XOR) operations, by each respective privacy provider, on the indexed values in the first private table and the second private table.

[0025] These and other aspects are contemplated and described herein. The foregoing summary sets out representative aspects of systems and methods to assist skilled readers in understanding the following detailed description.
DESCRIPTION OF THE DRAWINGS

[0026] An embodiment of the present invention will now be described by way of example only with reference to the accompanying drawings, in which:

[0027] FIG. 1 is a conceptual diagram of a computing environment for key generation for secure communication, according to an embodiment;

[0028] FIG. 2 is a conceptual diagram of a system for key generation for secure communication, according to an embodiment;

[0029] FIG. 3 is a flow chart for a method for key generation for secure communication, according to an embodiment;

[0030] FIG. 4 is a diagram illustrating an example representation of results of generation and distribution of private tables, in accordance with the system of FIG. 2;

[0031] FIG. 5A is a diagram illustrating an example of key reconstruction for Alice, in accordance with the system of FIG. 2;

[0032] FIG. 5B is a diagram illustrating an example of key reconstruction for Bob, in accordance with the system of FIG. 2;

[0033] FIG. 6 illustrates a plot of an example of a minimum fraction of dishonest providers normalized by total number of providers, in accordance with the system of FIG. 2;

[0034] FIG. 7 illustrates a plot of an example of probabilities of an attack for five different scenarios, in accordance with the system of FIG. 2;

[0035] FIG. 8 illustrates a plot showing a probability to have an attack, for a case of ten privacy providers, as a function of a probability that a privacy provider fails, in accordance with the system of FIG. 2;

[0036] FIG. 9 illustrates a plot showing a probability to have an attack, for a case of fifty privacy providers, as a function of a probability that a privacy provider fails, in accordance with the system of FIG. 2; and

[0037] FIG. 10 illustrates a plot showing a probability to have an attack, for a case of four privacy providers, as a function of a probability that a privacy provider fails, in accordance with the system of FIG. 2.

DETAILED DESCRIPTION

[0038] Embodiments will now be described with reference to the figures. It will be appreciated that for simplicity and clarity of illustration, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein may be practiced without these specific details.

In other instances, well-known methods, procedures and components have not been described in detail so as not to obscure the embodiments described herein. Also, the description is not to be considered as limiting the scope of the embodiments described herein.

[0039] It will also be appreciated that any module, unit, component, server, computer, computing device, mechanism, terminal or other device exemplified herein that executes instructions may include or otherwise have access to computer readable media such as storage media, computer storage media, or data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Computer storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. Examples of computer storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by an application, module, or both. Any such computer storage media may be part of the device or accessible or connectable thereto. Any application or module herein described may be implemented using computer readable/executable instructions that may be stored or otherwise held by such computer readable media and executed by the one or more processors.

[0040] The following relates to data communication systems and encryptions schemes utilized in such systems; and more specifically, to a method and system for key generation for secure communication.

[0041] Cyber-security is considered by many to be among the most important public assets. Encryption schemes are employed ubiquitously for digital communication and are generally based upon some computational assumptions about a hypothetical eavesdropper or interloper. An example is the widely used RSA (Rivest-Shamir-Adleman) protocol, which assumes that factoring a large integer into its two prime factors is intractable. For an adversary in possession of this factorization, the communication between the parties involved would appear totally transparent. In 1994, an efficient quantum algorithm for factoring large integers was discovered, which breaks the RSA scheme once a sufficiently powerful quantum computer is available. Problematically, a malicious eavesdropper could easily store current encrypted communications and, in the future, use advances in technology to decrypt the information.

[0042] In various approaches, the two communicating computing entities (referred to as Alice and Bob) must pre-share a quantum string; for example, via quantum key distribution (QKD) techniques. Advantageously, the present embodiments can, in some cases, avoid pre-sharing via QKD and the inherent risks that pre-sharing entails; thus, no pre-shared material is required directly between Alice and Bob. While some approaches use a Certification Authority (CA) to support authentication and help Alice and Bob generate keys, such approaches generally either rely on computational assumptions to guarantee security against a potential adversary, or still require pre-sharing material between Alice and Bob to generate the key.

[0043] Generally, one-time-pad (OTP) approaches offer information-theoretic security against an eavesdropper. However, it would likely be naive to trust any single party to distribute the keys necessary for communication in a key-agreement scheme. Embodiments of the present disclosure address this substantial challenge, among others. Protocols for securitization of data provided herein can substantially reduce the probability of data breaches or denial of service (DoS) attacks by using a protocol whose security does not depend on the time and computational resources of Eve. Furthermore, protocols for securitization of data provided herein can be considered secure against most computational attacks, including quantum computing attacks.

[0044] The one-time-pad (OTP) approach generally offers information-theoretic security. In the one-time-pad approach, two communicating parties share a random key known only to the two of them; thus, they can communicate safely by using the key to encrypt a message. Any eavesdropper listening to their conversation would generally not be able to, by any means, distinguish the intercepted messages from random noise. Thus, the communication is information-theoretically secure. However, a substantial technical challenge is distributing the secret key between any two parties, known as the key distribution problem. In a particular approach, a key provider entity can fulfill this service by distributing a secret key to any two parties requiring the key. However, such a protocol would require total trust in the key provider entity, as that entity could easily use the provided key to access all communication between the two parties. Furthermore, such a configuration introduces a single point of failure, which is particularly dangerous in most practical situations.
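The one-time-pad operation itself is a bitwise XOR of message and key. A minimal illustrative sketch (not the patented implementation; the key here is generated with Python's `secrets` module, standing in for the common key the protocol produces):

```python
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    """XOR each data byte with the corresponding key byte. Because
    x ^ k ^ k == x, the same function both encrypts and decrypts."""
    if len(key) < len(data):
        raise ValueError("OTP requires a key at least as long as the message")
    return bytes(d ^ k for d, k in zip(data, key))

key = secrets.token_bytes(14)            # stand-in for the shared common key
ciphertext = otp_xor(b"attack at dawn", key)
assert otp_xor(ciphertext, key) == b"attack at dawn"
```

Information-theoretic security holds only if the key is truly random, at least as long as the message, and never reused; hence the emphasis in the disclosure on marking or erasing used table values.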

[0045] One-time-pad encryption has recently become much cheaper because the cost of hard drives and computer memory has been dropping exponentially. However, scalability is generally an issue for various approaches to OTP. The present embodiments can make OTP encryption widespread by providing a key management layer for OTP, thus addressing the scalability issue.

[0046] Embodiments of the present disclosure provide a protocol for key distribution that, advantageously, does not require full trust in any particular agent or entity involved in the protocol. The present embodiments provide an information-theoretically secure mobile communication protocol that can be used for peer-to-peer encrypted communication.

[0047] Embodiments of the present disclosure use a certain number of entities or parties, referred to as privacy providers. Decrypting data exchanged between any two parties using this protocol for communication, or accessing private data stored on a server, would necessitate compelling or coordinating a large fraction of the privacy providers. In an example, this possibility can be substantially minimized by locating different privacy providers in different countries; for example, in Switzerland, Panama, Cyprus, Canada, U.S.A., and Germany.

In this way, no single government can effectively legally enforce data disclosure. In further cases, heavy financial forfeitures can be levied on privacy providers in case of disclosures. Advantageously, the protocol of the present disclosure defines a communication protocol in which cybersecurity does not rely on computational assumptions; thus, it can be considered safe against future attacks enabled by advances in technology (such as quantum computers).

[0048] In the protocol of the present disclosure, given a first party (referred to as A or Alice) and a second party (referred to as B or Bob) who want to communicate, the privacy providers can be collectively used to generate the common key necessary for the parties to communicate using, for example, OTP encryption.

[0049] In some cases, a key agreement phase of the protocol may require at least a subset of the privacy providers to be online; however, actual communication between parties A and B can occur after the key agreement phase when it is not required that the privacy providers are online. In some cases, the protocol described herein can be initiated to set up a network of privately shared keys among parties that may have never met each other, and those keys can be used at a point in the future to communicate, even if the privacy providers are offline. Advantageously, each party need only communicate with a limited number of the privacy providers, but can still have key distribution with any other party using the scheme; effectively obtaining the same outcomes as a bilateral key exchange.

[0050] Advantageously, the protocol of the present embodiments can achieve information-theoretic secure communication under no computational assumptions. Also advantageously, the protocol of the present embodiments only requires at least one of the privacy providers to be uncorrupted and uncompromised. Also advantageously, the protocol of the present embodiments can be future-proof, such that encrypted data stolen today can likely be considered safe against future attacks. In this way, the protocol of the present embodiments can be considered quantum secure, even in view of future advancements in technology.

[0051] FIG. 1 shows, in an example embodiment, a diagram of a computing environment that can include a first user computing device 10 (referred to as 'A', 'Alice', or first correspondent device) interconnected by a network 20 of communication links to a second user computing device 12 (referred to as 'B', 'Bob', or second correspondent device). The example environment also includes one or more privacy providers 15 in communication with the network 20. The network can be any suitable communication architecture; for example, the Internet, a wide-area network, a local-area network, a mobile communication structure, or the like. The communication links may be any suitable communication approach; for example, fixed telephone lines, wireless connections, near-field communication connections, or other forms of communication. The devices 10, 12 and the privacy providers 15 may execute on any suitable type of computing device; for example, a desktop computer, a laptop computer, a tablet, a smartphone, a wearable device, an internet-of-things (IoT) device, a server, a distributed computing system, a dedicated piece of hardware, and the like.

[0052] Turning to FIG. 2, shown is a diagram of a system 150 for key generation for secure communication, according to an embodiment. In this embodiment, the system 150 is run on a single computing device 10, 12. In further embodiments, the functions of the system 150 can be run on multiple devices 10, 12, and/or the privacy providers 15. In other embodiments, the components of the system 150 can be distributed among two or more computer systems that may be locally or remotely distributed; for example, using cloud-computing resources.

[0053] FIG. 2 shows various physical and logical components of an embodiment of the system 150. As shown, the system 150 has a number of physical and logical components, including a central processing unit (“CPU”) 152 (comprising one or more processors), random access memory (“RAM”) 154, a user interface 156, a network interface 160, non-volatile storage 162, and a local bus 164 enabling CPU 152 to communicate with the other components. CPU 152 executes an operating system, and various modules, as described below in greater detail. RAM 154 provides relatively responsive volatile storage to CPU 152. The user interface 156 enables an administrator or user to provide input via an input device, for example a mouse or a touchscreen. The user interface 156 also outputs information to output devices; for example, a display. The network interface 160 permits communication with the network 20 or other computing devices and servers remotely located from the system 150. Non-volatile storage 162 stores the operating system and programs, including computer-executable instructions for implementing the operating system and modules, as well as any data used by these services. Additional stored data can be stored in a database 166. During operation of the system 150, the operating system, the modules, and the related data may be retrieved from the non-volatile storage 162 and placed in RAM 154 to facilitate execution.

[0054] In an embodiment, the system 150 further includes a number of conceptual modules to be executed on the one or more processors 152, including a table module 172, a common key module 174, an authentication module 176, an amplification module 178, a communication module 180, and an erasure module 182.

[0055] Advantageously, in some cases, hardware comprising critical cryptographic information, for example pre-installed keys and any additional generated keys, can be physically separated from hardware performing steps of the protocol described herein. For example, hardware security modules (HSMs) may be used for storing key materials and/or performing secure cryptographic operations. In this way, a physical layer of cyber-security can guarantee that access to encrypted data is restricted to a final user 10, 12.

[0056] In embodiments of the present disclosure, one or more keys 50 can be stored by user 'A' and one or more keys 52 can be stored by user 'B'. In some cases, these keys can be installed on the computing device 10, 12 itself, or on a separate device collocated with the respective correspondent device, enabling a one-time-pad (OTP) encryption scheme for any communication taking place from/to each device. In some cases, a large quantity of keys can be stored on each device prior to the device being received by its user.

[0057] The system 150 can use a random number generator (RNG) to generate a plurality of random numbers. In some cases, the RNG can be a quantum random number generator (QRNG) based on natural inputs, for example, light. Some eavesdroppers try to find deficiencies in pseudo-RNGs to break the encryption. Accordingly, using a QRNG can create a defense against such a class of attacks. In an example, it has been determined that it is possible to implement a QRNG using a camera of a smartphone, and it can produce random bit rates greater than 1 Mbps.

[0058] In an embodiment, the protocol includes two phases: 1) a key distribution phase, and 2) a key agreement phase. At the end of the protocol, two devices, Alice 10 and Bob 12, share a common key which can then be used for, for example, secure communication via one-time-pad (OTP). Generation of the keys, and physical distribution of the keys to the users from the privacy providers, is referred to as the first phase of the protocol: the key distribution phase.

[0059] For the key distribution phase, consider the two devices, Alice 10 and Bob 12, connected to the communication network 20. Also consider a set of n privacy providers 15, P_1, P_2, ..., P_n. Alice's device is equipped with n tables of keys, K_1, K_2, ..., K_n, where K_i ∈ {0,1}^l and l is very large. In some cases, those tables of keys can be previously installed on Alice's device before she gained access to it. Table K_i was provided by the i-th privacy provider P_i at construction time; in this way, Alice and P_i are the only devices in possession of K_i. This strict dual-possession is true for every table of keys. In most cases, each table of keys is shared with the respective user and only one of the privacy providers. Similarly for Bob: he also shares n tables of keys with the n privacy providers. We call H_1, H_2, ..., H_n the tables of keys known only by P_i and Bob, where H_i ∈ {0,1}^l and l is very large. In particular, table H_i was provided to Bob by the i-th privacy provider P_i at the time of construction; in this way, Bob and P_i are the only devices in possession of H_i. Let us denote by K_(i,j) the j-th bit of K_i (and similarly, by H_(i,j) the j-th bit of H_i). As described herein, indexes are associated with one value in the tables of keys. For example, for Bob, the index (3;28) will be assigned to the corresponding value H_(3,28), which is the 28th value in the table H_3 uniquely shared between Bob and P_3. The value H_(3,28) can either be 0 or 1.
While the present disclosure generally refers to each index referring to a single bit as the associated value, it is understood that the value can be any suitable denomination; for example, a bit, a byte, a number or series of numbers, an alphanumeric digit or series of digits, or the like.

[0060] The tables of keys K_1, K_2, ..., K_n, held by Alice, are permitted to be different from the tables of keys H_1, H_2, ..., H_n held by Bob. This is advantageous in a network setting with many users (e.g., 10 or 1 million users), where it may not be feasible for each user to share some tables of keys with all other users. By allowing the tables of keys to be different, the approach of the present embodiments is highly scalable in a network setting.

[0061] In a second phase of the protocol, the key agreement phase, Alice and Bob wish to achieve the sharing of a common key k_AB between each other, in some cases without any single privacy provider having access to the value of k_AB. In an embodiment, this second phase of the protocol can comprise four parts: "Authentication", "Key distribution", "Key authenticity control" and "Privacy amplification".

[0062] As a result of the second phase, the two parties, Alice 10 and Bob 12, then share a secret common key that they can use to communicate at an immediate or a future point in time. Advantageously, after the completion of the second phase, the privacy providers do not need to be online or accessible in order for Alice and Bob to communicate in the future. In some cases, future communication between Alice and Bob can comprise using an OTP approach with the shared key. In some cases, after communication, Alice and Bob can erase information used in previous phases of the protocol from their tables shared with the privacy providers. This can be referred to as a key erasure phase.

[0063] FIG. 3 is a flow diagram of a method 300 for key generation for secure communication, according to an embodiment.

[0064] At block 302, for example as part of the key distribution phase, the table module 172 on both of the first user device (Alice) and the second user device (Bob) receives respective private tables of keys. Each private table is received from a respective one of the privacy providers and is unique for each pair of user and privacy provider. These private tables are distributed in a secure fashion. In some cases, each provider needs to maintain a number of tables at least equal to the number of users (or a subset thereof), and each user needs to maintain a number of tables equal to the number of privacy providers (or a subset thereof). In some cases, when the keys in a private table have all been used, a new private table needs to be received from the corresponding privacy provider. Accordingly, Alice receives table K_1 from privacy provider P_1, K_2 from privacy provider P_2, and so on. For clarity of illustration, these tables can be organized into a matrix K, with dimension n x l_A (where n is the number of privacy providers and l_A is a very large number representing the size of each table), and with elements in {0,1}. Matrix H is a matrix shared between Bob and the privacy providers; i.e., the first row H_1 (of length l_B, which is not necessarily equal to l_A) is shared between Bob and P_1, the second row H_2 between Bob and P_2, and so on. In an embodiment, table K_i will have length l_(A,i) and table H_j will have length l_(B,j). For simplicity of exposition, l_(A,i) = l_(B,j) = l for each i and j and for some l > 0.

[0065] FIG. 4 illustrates a representation of an example of results of generation and distribution of the private tables. In this example, Alice and Bob share an ordered table of bits with each of the privacy providers, and each provider only knows their respective part of the users’ table K and H.

[0066] In some cases, the private tables can be organized in memory in a matrix fashion. In an example, information in the first row of the table is shared only between the respective user and one of the privacy providers. In most cases, different users have different series of binary numbers shared with the same privacy provider. In this way, each user has a substantially different private table than other users; except for coincidentally similar entries. In some cases, the series of binary numbers can be indexed by integers, and these indexes can be used in the key agreement phase to construct a common key, as described herein. In this way, the common key will only be known by the respective users. In an embodiment, each privacy provider provides, to each of the private tables for each user, a subgroup of the binary numbers included in the private tables. In some cases, the binary numbers can be generated by a QRNG. For example, the first row of the private tables can be provided by a first privacy provider, a second row of the private tables can be provided by a second privacy provider, and so on. In some cases, the private tables can be generated by a pseudo RNG, a physical RNG, or a QRNG, and stored on a physical device and distributed to the users by a trusted agent. In some cases, the private tables may be stored in hardware security modules (HSMs).
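The table generation and distribution described above can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; the variable names and table sizes are assumptions for demonstration only.

```python
# Hypothetical sketch: each privacy provider issues one independent row of
# random bits per user; the provider keeps a copy of only the rows it issued.
import secrets

def generate_table_row(length):
    """One row of random bits, shared only between a user and one provider."""
    return [secrets.randbelow(2) for _ in range(length)]

n_providers, table_len = 3, 16

# Each provider generates an independent row for Alice (K) and for Bob (H).
alice_K = [generate_table_row(table_len) for _ in range(n_providers)]
bob_H = [generate_table_row(table_len) for _ in range(n_providers)]

# Provider i retains exactly the rows it issued, and nothing else.
provider_view = [{"K_i": alice_K[i], "H_i": bob_H[i]}
                 for i in range(n_providers)]

assert provider_view[0]["K_i"] == alice_K[0]  # P_1 knows only its own rows
assert all(b in (0, 1) for row in alice_K for b in row)
```

In a deployment, `secrets` (a CSPRNG) would stand in for the QRNG or physical RNG mentioned above, and the rows would be delivered inside HSMs or by a trusted agent rather than held in plain lists.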

[0067] In some cases, Alice and Bob can authenticate to each other and/or to the privacy providers. Any suitable authentication protocol can be used; for example, the Kerberos Network Authentication Service. In some cases, the authentication protocol can be assumed to be communicated over a secure connection. Advantageously, once the key agreement phase is completed, Alice 10 and Bob 12 can communicate any time in the future and their communication will remain secure; even when a future protocol can be found to break their previous authentication protocol.

[0068] For the key agreement phase, at block 304, the common key module 174 on both of the first user device (Alice) and the second user device (Bob), via communication with the privacy providers over the network interface 160, generates a common key. In this way, Alice and Bob, who may have never communicated before and likely do not share any initial information, use the support of the privacy providers to construct a common key k without directly communicating any key parameters to generate the common key. This common key is privately shared between Alice and Bob, and remains unknown to any individual privacy provider where more than one privacy provider is used. In most cases, it can be assumed that authentication and key agreement can be performed over authenticated channels. In an embodiment, generation of the common key can include:

• Alice generates n random indexing groups X_A,i. Each group X_A,i is a collection of integers of size m < l.

• Alice sends (for example, on public communication channels) each indexing group X_A,i to a respective privacy provider P_i. An eavesdropper can listen to these communications as they only contain the serial indexing numbers necessary to locate the values stored in the table, and there is no information obtainable about the values themselves. This is because there is no correlation between the indexes and the binary values associated with them.

• Alice records the bits related to the random indexes she generates. In this way, she builds a matrix Y_A whose entries Y_A[i,j] represent the bit of K_i at position X_A,i[j] within that key table: Y_A[i,j] = K_(i,X_A,i[j]). Further, Alice marks K_(i,X_A,i[j]) as used.

• Alice creates a common key k_A by XORing the rows of Y_A. This key has length m.

• Each privacy provider P_i receives from Alice the respective set X_A,i and marks the values in K_i whose positions are specified in X_A,i as used.

• Each privacy provider P_i matches the bits specified by the indexes sent over by Alice to equivalent bits in H_i. By doing this, each privacy provider P_i creates a set of random indexes X_B,i that the privacy provider sends to Bob via, for example, the public channel. The indexes created by each privacy provider therefore have the property: H_(i,X_B,i[j]) = K_(i,X_A,i[j]) for each j = 1, 2, ..., m.

• Bob receives the respective sets X_B,i and builds up the matrix Y_B, where Y_B[i,j] = H_(i,X_B,i[j]). Furthermore, Bob marks all the bits in H_i with indexes that appear in X_B,i as used.

• Bob creates a common key k_B by XORing the rows of Y_B. This key has length m and is equal to Alice's common key k_A.
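The steps above can be sketched end to end in code. This is a minimal illustrative sketch under assumed parameters (n = 3 providers, tables of 64 bits, key length m = 8); all function names are invented for the example and it is not the patent's implementation.

```python
# Sketch of the first key-agreement variant: Alice picks random indexes into
# K_i; provider P_i translates them into indexes of H_i carrying the same
# bits; Alice and Bob each XOR their n component rows into a common key.
import secrets

def make_table(length):
    return [secrets.randbelow(2) for _ in range(length)]

def alice_pick_indexes(table, m):
    """Alice picks m distinct random positions in one of her tables."""
    idx = []
    while len(idx) < m:
        j = secrets.randbelow(len(table))
        if j not in idx:
            idx.append(j)
    return idx

def provider_translate(k_table, h_table, alice_idx, used_h):
    """Provider maps Alice's bits into unused positions of Bob's table
    holding the same values: H[i][X_B[j]] == K[i][X_A[j]]."""
    bob_idx = []
    for j in alice_idx:
        bit = k_table[j]
        pos = next(p for p, b in enumerate(h_table)
                   if b == bit and p not in used_h and p not in bob_idx)
        bob_idx.append(pos)
    used_h.update(bob_idx)
    return bob_idx

def xor_rows(rows):
    """Bitwise XOR of the n component keys (the rows of Y)."""
    key = rows[0][:]
    for row in rows[1:]:
        key = [a ^ b for a, b in zip(key, row)]
    return key

n, l, m = 3, 64, 8
K = [make_table(l) for _ in range(n)]  # shared Alice <-> P_i
H = [make_table(l) for _ in range(n)]  # shared Bob   <-> P_i

X_A = [alice_pick_indexes(K[i], m) for i in range(n)]
Y_A = [[K[i][j] for j in X_A[i]] for i in range(n)]

X_B = [provider_translate(K[i], H[i], X_A[i], set()) for i in range(n)]
Y_B = [[H[i][j] for j in X_B[i]] for i in range(n)]

k_A = xor_rows(Y_A)
k_B = xor_rows(Y_B)
assert k_A == k_B  # Alice and Bob derive the same common key
```

Only the index sets X_A,i and X_B,i cross the public channel in this sketch; the bit values never leave the parties that already share them.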

[0069] In another embodiment, generation of the common key can include:

• Alice decides a certain quantity of bits she would like to share with Bob, referred to as m bits.

• Alice determines the key values of her private table she would like to use. In some cases, Alice may use each row of her private tables in order from left to right. She keeps track of the location, r, of the first unused bit (i.e., the r-th bit is the first unused bit in her list).

• Alice broadcasts the values of r, m and the identity of Bob, to all privacy providers, in most cases, over authenticated channels.

• Each privacy provider P i relays the values of r, m and the identity of Alice to Bob.

• Bob determines the key values of his private table he would like to use. In some cases, Bob may use each row of his private tables from left to right. He keeps track of the location, s, of the first unused bit.

• Bob broadcasts the value of s to all privacy providers.

• Each privacy provider P_i broadcasts the value of L_(i,r+j,s+j) = K_(i,r+j) XOR H_(i,s+j) for j = 0, 1, 2, ..., m-1, as well as the identity of Alice and Bob, to the first user, Alice.

• Alice confirms to each privacy provider P_i the receipt of the above communication.

• Using the value of K_(i,r+j) and the received value of L_(i,r+j,s+j), Alice privately determines the parity bit N_(i,r+j,s+j) = L_(i,r+j,s+j) XOR K_(i,r+j) for each j = 0, 1, 2, ..., m-1. Assuming P_i is honest in executing the protocol, then N_(i,r+j,s+j) = L_(i,r+j,s+j) XOR K_(i,r+j) = K_(i,r+j) XOR H_(i,s+j) XOR K_(i,r+j) = H_(i,s+j). Thus, Alice and Bob now share component keys H_(i,s+j) with each other. Note that only the i-th privacy provider P_i, Alice, and Bob have knowledge of H_(i,s+j).

• In some cases, Alice and Bob can use key authentication, as described herein, to verify that they share the same component keys H_(i,s+j) with each other.

• Each of Alice and Bob takes the XOR of the component keys to generate a common key, referred to as K_AB or K_common, whose j-th bit is K_j = [Σ_(i=1)^n H_(i,s+j)] mod 2.

• In some cases, as described herein, Alice and Bob can perform privacy amplification, PA, on the string K to distill a secure key K_common = PA(K), about which an eavesdropper does not have any information. In some cases, PA can be performed by applying a linear matrix; for example, a Toeplitz matrix, T.

• In some cases, once Alice and Bob have confirmed to each other that they have successfully generated a common key K_common, they can direct each privacy provider, P_i, to erase the used key materials, K_(i,r+j) and H_(i,s+j) for j = 0, 1, 2, ..., m-1. In some cases, they can also direct the deletion of the fact that Alice is sharing a key with Bob. This can ensure that a future eavesdropper who breaks into the privacy provider, P_i, can no longer find the used key materials.

• In some cases, Alice can erase her used key materials, K_(i,r+j) for j = 0, 1, 2, ..., m-1. Similarly, Bob can erase his used key materials, H_(i,s+j) for j = 0, 1, 2, ..., m-1.

• Alice updates the location of her first unused bit in her tables of keys to r+m. Similarly, Bob updates the location of his first unused bit to s+m.
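The parity-bit mechanism at the heart of the steps above can be checked with a short sketch. This is illustrative only, with assumed parameters; provider P_i broadcasts L = K_(i,r+j) XOR H_(i,s+j), and Alice cancels her own bit to recover Bob's component key.

```python
# Sketch of the parity-bit variant: no bit of H ever crosses the channel in
# the clear; only the XOR of two secret bits (L) is broadcast.
import secrets

n, l, m = 3, 32, 8
K = [[secrets.randbelow(2) for _ in range(l)] for _ in range(n)]  # Alice <-> P_i
H = [[secrets.randbelow(2) for _ in range(l)] for _ in range(n)]  # Bob   <-> P_i
r, s = 4, 10  # first unused positions announced by Alice and Bob

# Each provider broadcasts the parity of the two shared bits.
L_parity = [[K[i][r + j] ^ H[i][s + j] for j in range(m)] for i in range(n)]

# Alice XORs out her own bit to learn Bob's component key H[i][s+j].
alice_H = [[L_parity[i][j] ^ K[i][r + j] for j in range(m)] for i in range(n)]
assert alice_H == [[H[i][s + j] for j in range(m)] for i in range(n)]

# Both sides combine the n component keys: K_j = [sum_i H_(i,s+j)] mod 2.
k_common = [sum(alice_H[i][j] for i in range(n)) % 2 for j in range(m)]
k_bob = [sum(H[i][s + j] for i in range(n)) % 2 for j in range(m)]
assert k_common == k_bob
```

An eavesdropper who sees only L learns nothing about H_(i,s+j) individually, since K_(i,r+j) is a uniformly random mask known only to Alice and P_i.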

[0070] Note that, at the end of the key agreement phase, Alice and Bob share a common secure key K common , which can be used either immediately or in future applications. For instance, it could be used for achieving information-theoretic security in communication via a one-time-pad approach in general-purpose communication channels; such as optical fiber, mobile phone, or Internet.

[0071] While the above embodiments describe certain communications and determinations, other suitable variants can be used. In the above embodiment, Alice uses the bits in her private tables of keys from left to right. In another embodiment, Alice may apply a random permutation before choosing each bit to use. More specifically, note that in case Alice wants to send m bits to Bob, she needs a key of length m; i.e., a series of purely random binary numbers of length m. To build this key, Alice can decide on n series of indexes, one for each privacy provider, that she can re-map into binary numbers by looking up their values in her table.

[0072] In an example, Alice can pick numbers 34, 45 and 104 for the first privacy provider. Then she can look into the first row of her table (in an example, the row shared only with the first privacy provider) and look at the values in positions 34, 45 and 104. Suppose she finds 1,1,0. She does not share the binary values with anyone; instead, she shares the values 34, 45 and 104 with the first privacy provider. The first privacy provider receives the numbers 34, 45 and 104, and can reconstruct the same series (1,1,0) as Alice. Alice can repeat these steps with the other privacy providers. This way she will have n series of binary numbers (the first being 1,1,0), with every series being shared with only one privacy provider. Anyone could listen to the conversation between Alice and the provider without getting any information regarding the binary values themselves. For example, anyone who listens to 34, 45, 104 has no information at all about the series (1,1,0), as the information in the table is only available to Alice and the respective privacy provider. Alice can XOR all the n series of length m (in the example here, m is equal to 3), and she can obtain another series of length m, which only she knows. To rebuild this series, an interloper would need all the series entering into the XOR operation. Therefore, the privacy providers would all need to cooperate in order to rebuild the series, which is highly unlikely. The resultant series from the XOR operation is the secret common key that can be used by Bob and Alice for communication; for example, using OTP.

[0073] In the above example, for Bob to arrive at the common key, each privacy provider can communicate the associated series to Bob. In this example, the first privacy provider communicates the series 1,1,0 (i.e., the first series of binary numbers used by Alice to rebuild the secret key) to Bob. To communicate the series (1,1,0) to Bob, the first privacy provider can pick three indexes from the table shared with Bob that re-map into (1,1,0); in some cases, the indexes can be randomly chosen as long as they re-map to (1,1,0). For example, suppose that in the first row of Bob's table (in this example, the series of binary numbers shared with the first privacy provider), at positions 78, 98, 132 there are the binary numbers 1,1,0 respectively. The first privacy provider communicates 78, 98, 132 to Bob. An eavesdropper listening to this conversation will not be able to rebuild any further information. Bob, however, once he has received the ordered set of indexes from the privacy provider (i.e., 78, 98, 132), can re-build the series 1,1,0 by looking up the binary numbers in the respective row of his private table; that is, looking up in the first row the positions specified by the first privacy provider. The above steps are repeated by all the privacy providers. Bob therefore arrives at a matrix n x m, where n is the number of privacy providers, and m is the length of the secret key k. Bob can XOR all the n series of length m to arrive at the common key K_common.

[0074] In another example, Alice can have a table with n rows (in this example, n is also the number of privacy providers) and a significantly large number of columns. For the purposes of this illustrative example, the number of columns is equal to 10 for simplicity. The first row can be: (1 0 0 1 1 0 1 0 0 1). This series of binary numbers is known only by Alice and the first privacy provider. In contrast, the first row of Bob's table can be: (0 0 0 1 1 0 1 1 0 1). This series of binary numbers is known only by Bob and the first privacy provider. Alice can send the series (2,5,1) to the first privacy provider, where the series is determined by Alice translating the positions indicated by the series, as indexes of the table, into binary numbers: (2,5,1) -> (0,1,1). The first privacy provider can also perform this translation, such that only Alice and the first privacy provider know the binary series. The first privacy provider can then communicate the same series (0,1,1) to Bob. To do so, the first privacy provider can pick random indexes in the series shared with Bob that re-map into (0,1,1). In this example, any of (3,4,5), (9,10,8), or (6,10,8), or any other series of indexes that re-map into (0,1,1), can be communicated to Bob. For example, if the first privacy provider sends (9,10,8) to Bob, Bob can look up these indexes in his private table to arrive at the same binary sequence as Alice: (0,1,1). Alice and Bob can do the above steps with all the n privacy providers, such that they each will have n series of binary numbers. If they each apply an XOR operation to these series of numbers, they can construct a secret common key only they can re-build. These steps can be performed vice versa for communications from Bob to Alice. Suppose Eve, a malicious intruder, wants to rebuild the secret key. Even in the case where Eve listens to all the conversations between Alice and Bob and the privacy providers, it is impossible for her to rebuild the secret key, as she has no way to convert each index into the re-mapped binary digit (in the example above, Eve cannot map (2,5,1) into (0,1,1)), because the binary numbers are generated randomly. Suppose now a privacy provider would like to rebuild the secret key. As each privacy provider only knows its own part of the series used by Alice and Bob in the XOR operation, it is impossible to predict the final result of the XOR, and therefore the key can remain secret.
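The concrete numbers in the example above can be verified directly; the tables and index choices below are taken from the text, with 1-based indexes as written.

```python
# Check of the worked example: both rows are shared with the first privacy
# provider, and each listed index set re-maps to the same series (0,1,1).
alice_row = [1, 0, 0, 1, 1, 0, 1, 0, 0, 1]  # shared Alice <-> P_1
bob_row = [0, 0, 0, 1, 1, 0, 1, 1, 0, 1]    # shared Bob   <-> P_1

def lookup(row, indexes):
    """Translate 1-based positions into their binary values."""
    return [row[i - 1] for i in indexes]

assert lookup(alice_row, [2, 5, 1]) == [0, 1, 1]  # Alice's series
# Any of these index sets re-maps to (0,1,1) in Bob's row:
for idx in ([3, 4, 5], [9, 10, 8], [6, 10, 8]):
    assert lookup(bob_row, idx) == [0, 1, 1]
```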

[0075] FIGS. 5A and 5B illustrate another illustrative example of key generation for Alice and Bob, respectively. Alice initiates an exchange with the three privacy providers P_1, P_2, P_3. The public messages from Alice to the three privacy providers are respectively [3,5,6], [1,2,6] and [3,4,5]. In this example, for n = 3: X_A,1 = [3,5,6], X_A,2 = [1,2,6] and X_A,3 = [3,4,5]. These sets of serial numbers are associated, within the respective tables, with key values [1,1,0], [0,1,1], [1,0,1]. The privacy providers then generate the serial numbers for Bob such that the serial numbers are a set of indexes that reproduce the same key values in Bob's tables. Among the many possibilities, the ones sent by the privacy providers in the figure are [2,3,4], [4,5,6] and [1,4,5]. In this example, for n = 3: X_B,1 = [2,3,4], X_B,2 = [4,5,6] and X_B,3 = [1,4,5]. At the end of this phase, all the bits employed in the communication are marked as used such that the bits will not be used again.

[0076] Advantageously, the approach of the present embodiment is independent of the type of communication; accordingly, it can be used for video calls, voice calls, emails, messages, and any sort of data Alice wishes to send to Bob. Also advantageously, it is not necessary that Alice know in advance the amount of bits to be transmitted to Bob, as this amount can be adjusted during runtime.

[0077] In another embodiment, RNG at the user level may not be required, by letting each privacy provider P_i select the key to be used for the encryption. In this way, when Alice starts a communication with Bob, both parties immediately receive a set of indexes that can be locally mapped into the same binary string; the indexes are different for Alice and Bob for the same privacy provider. The different strings obtained from the different privacy providers are combined as described herein into a common key that will be equal for Alice and Bob. In this embodiment, generation of the common key can include:

• Alice decides a certain quantity of bits she wants to communicate to Bob. If Alice needs to send m bits, Alice sends this information (the number of bits m and the receiver Bob) to each privacy provider P i .

• Each privacy provider P_i generates a random binary string s_i ∈ {0,1}^m of length m, for example using a QRNG. Each privacy provider matches m indexes in the table K_i (shared with Alice) which (i) have never been used and (ii) re-map exactly to s_i. This set of indexes can be referred to as k_i. In the same way, each privacy provider can generate a set of indexes, h_i, for Bob. Each privacy provider marks as used the indexes used for this communication round.

• Each privacy provider communicates, via public channel, k i to Alice and h i to Bob. Advantageously, as these indexes contain serial numbers to reconstruct the keys, they have no information about the real key; thus, eavesdroppers can listen to this exchange without gaining useful information.

• Alice creates a matrix Y_A by matching the indexes sent by each privacy provider to the shared tables. Y_A is a matrix with n rows and m columns, where element Y_A[i,j] is equal to K_(i,k_i[j]).

• Alice generates a common key k_A by XORing the rows of Y_A, where the key has length m.

• Alice marks as used the indexes used in this communication round.

• Bob creates a matrix Y_B by matching the indexes sent by each privacy provider to the shared tables. Y_B is a matrix with n rows and m columns, where element Y_B[i,j] is equal to H_(i,h_i[j]).

• Bob generates a common key k_B by XORing the rows of Y_B, where the key has length m.

• Bob marks as used the indexes used in this communication round.
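The provider-driven variant above can be sketched as follows. This is an illustrative sketch under assumed parameters, not the patent's implementation; here each provider draws its own random string s_i and publishes only index sets.

```python
# Sketch: each P_i draws s_i, then picks unused indexes in K_i (for Alice)
# and H_i (for Bob) that re-map to s_i; both users XOR their n rows.
import secrets

def indexes_for(table, target_bits, used):
    """Pick unused positions in `table` whose bits spell out `target_bits`."""
    out = []
    for bit in target_bits:
        pos = next(p for p, b in enumerate(table)
                   if b == bit and p not in used and p not in out)
        out.append(pos)
    used.update(out)
    return out

n, l, m = 3, 64, 8
K = [[secrets.randbelow(2) for _ in range(l)] for _ in range(n)]
H = [[secrets.randbelow(2) for _ in range(l)] for _ in range(n)]

k_rows, h_rows = [], []
for i in range(n):
    s_i = [secrets.randbelow(2) for _ in range(m)]  # provider's random string
    k_i = indexes_for(K[i], s_i, set())             # indexes sent to Alice
    h_i = indexes_for(H[i], s_i, set())             # indexes sent to Bob
    k_rows.append([K[i][p] for p in k_i])           # Alice's row of Y_A
    h_rows.append([H[i][p] for p in h_i])           # Bob's row of Y_B

xor = lambda rows: [sum(col) % 2 for col in zip(*rows)]
assert xor(k_rows) == xor(h_rows)  # identical common key on both sides
```

No RNG runs on the user devices in this variant; the users only perform table lookups and XORs.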

[0078] In another embodiment, RNG at the user level may not be required by letting each privacy provider P i select a key to be used for encryption. In this way, when Alice needs to rebuild a secret key shared with Bob, Bob immediately receives a set of values from each privacy provider that can be used to generate the secret key. In this embodiment, generation of the common key can include:

• Alice decides a certain quantity of bits she wants to communicate to Bob. If Alice needs to send m bits, Alice sends this information (the number of bits m and the receiver Bob) to each privacy provider Pi.

• For each of her tables K_i (shared with privacy provider P_i), Alice considers the first m unused bits. Suppose the last used bit in table K_i is at index j; then Alice considers the series of bits: Y_A[i] = (K_(i,j+1), K_(i,j+2), ..., K_(i,j+m)). Alice marks those bits as used. Alice builds a matrix Y_A where the i-th row is Y_A[i].

• Each privacy provider P_i computes the series of bits N_i = (K_(i,j+1) XOR H_(i,s+1), K_(i,j+2) XOR H_(i,s+2), ..., K_(i,j+m) XOR H_(i,s+m)), where j was the last used bit in table K_i and s was the last used bit in table H_i. Each privacy provider P_i marks the bits used to construct N_i as used.

• Each privacy provider communicates, via public channel, N_i to Bob. Advantageously, as these values are the exclusive-OR of series of random bits, they carry no information about the real key; thus, eavesdroppers can listen to this exchange without gaining useful information.

• Bob creates a matrix Y_B where the i-th row is the series: Y_B[i] = (N_i[1] XOR H_(i,s+1), N_i[2] XOR H_(i,s+2), ..., N_i[m] XOR H_(i,s+m)) = Y_A[i]. Bob marks all the bits employed to build Y_B as used.

• Alice generates a common key k_A by XORing the rows of Y_A, where the key has length m.

• Bob generates a common key k_B by XORing the rows of Y_B, where the key has length m.
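The last variant can also be checked with a short sketch (illustrative parameters and names; 0-based positions are used for simplicity). Each provider sends Bob the parity string N_i, and Bob XORs it against his own table to reproduce Alice's rows.

```python
# Sketch: Alice takes her next m unused bits directly; provider P_i sends
# N_i = K-bits XOR H-bits; Bob cancels the H bits to recover Y_A exactly.
import secrets

n, l, m = 3, 32, 8
K = [[secrets.randbelow(2) for _ in range(l)] for _ in range(n)]
H = [[secrets.randbelow(2) for _ in range(l)] for _ in range(n)]
j0, s0 = 2, 5  # first unused positions in K_i and H_i (0-based sketch)

Y_A = [[K[i][j0 + t] for t in range(m)] for i in range(n)]

# Provider P_i computes and publishes the parity string N_i.
N = [[K[i][j0 + t] ^ H[i][s0 + t] for t in range(m)] for i in range(n)]

# Bob XORs out his H bits: Y_B[i] == Y_A[i].
Y_B = [[N[i][t] ^ H[i][s0 + t] for t in range(m)] for i in range(n)]
assert Y_B == Y_A

xor = lambda rows: [sum(col) % 2 for col in zip(*rows)]
assert xor(Y_A) == xor(Y_B)  # k_A == k_B
```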

[0079] The above embodiments for generating the common keys are much less computationally and/or resource intensive for the user devices, as they do not require random number generation (especially QRNG) at the user level, and they involve the same number of bits exchanged over the network.

[0080] Alice and Bob both have a key, k_A and k_B respectively, that can be used for encrypted communication with each other. In some cases, at block 306, before using the common keys for communication, the authentication module 176 on both of the first user device (Alice) and the second user device (Bob) can validate the authenticity of the keys; i.e., verify that k_A = k_B. Any suitable approach for key verification can be used. In an example approach, Alice and Bob can authenticate the final keys k_A and k_B. In this example approach:

• Alice uses a hash-function H to compute s_A = H(k_A), where s_A is a binary string of length L. The choice of the hash function may be application specific, as the requirements in terms of cost and security of this function may vary considerably depending on the context of use. An example can be using the SHA-256 hash function. Another example can be a set of two-universal hash functions, such as the Toeplitz matrix T.

• Alice broadcasts a certain number of pairs (j; s_A,j), with j ∈ {1, 2, ..., L}.

• Bob uses the same hash-function H to compute s_B = H(k_B), where s_B is a binary string of length L.

• Bob collects the pairs (j; s_A,j) broadcasted by Alice, and determines whether s_A,j = s_B,j for all the pairs communicated by Alice.
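A minimal sketch of this hash-based check, using SHA-256 as the example hash named above; the helper name and the toy keys are assumptions for illustration.

```python
# Sketch of key validation: both parties hash their key; matching digests
# imply k_A == k_B with overwhelming probability.
import hashlib

def key_digest(key_bits):
    """Hash a bit-list into a digest string s = H(k)."""
    return hashlib.sha256(bytes(key_bits)).hexdigest()

k_A = [1, 0, 1, 1, 0, 0, 1, 0]
k_B = [1, 0, 1, 1, 0, 0, 1, 0]

s_A = key_digest(k_A)  # Alice broadcasts (parts of) s_A
s_B = key_digest(k_B)  # Bob computes s_B locally and compares
assert s_A == s_B      # matching digests -> keys agree

assert key_digest([0] * 8) != s_A  # a differing key is detected
```

Note that broadcasting digest bits leaks some information about k, which is why the privacy amplification step described below follows validation.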

[0081] In another example approach to key validation, the validation of the key can be performed over each of the key components Y_A[i] and Y_B[i], for i = 1, 2, ..., n. While this approach generally involves more bits to be exchanged between Alice and Bob, and hence higher communication costs, it allows the users to identify and exclude privacy providers potentially performing a denial of service attack. In this example approach:

• Alice uses a hash-function H to compute s_i,A = H(Y_A[i]), where s_i,A is a binary string of length L, and Y_A[i] represents the bits of K_i at the positions X_A,i within that key group.

• Alice sends the pair (i; s_i,A) to Bob.

• Bob uses the same hash-function H to compute s_i,B = H(Y_B[i]), where s_i,B is a binary string of length L, and Y_B[i] is equal to the bits of H_i at the positions X_B,i.

• Bob collects the pair (i; s_i,A) broadcasted by Alice, and determines whether s_i,A = s_i,B. Bob communicates this result to Alice.

• The above steps are repeated for each i. In the end, Alice and Bob have a shared list of indexes in {1, 2, ..., n} that passed the test of whether s_i,A = s_i,B.

• Alice and Bob use the common set of indexes to rebuild their keys k_A and k_B by the XOR operation performed on Y_A and Y_B, respectively, on the rows which passed the test.

[0082] In the above approaches to key validation, some information about the key k may be disclosed. In the first approach, there is a trade-off between having a large L, where more information gets disclosed but the authenticity of the key becomes more reliable, and having a small L, where less information gets disclosed but the authenticity of the key is less certain. In the second approach, a substantial advantage is the ability of Alice and Bob to identify privacy providers running a DoS attack. In another approach, Alice and Bob can check randomly some of the bits of k_A and k_B until the probability that k_A ≠ k_B is smaller than an acceptable threshold.

[0083] In some cases, at block 308, the amplification module 178 on both of the first user device (Alice) and the second user device (Bob) can perform privacy amplification. During key validation, Alice and Bob may have needed to release some information about the keys, which Eve may have accumulated in an attempt to rebuild the key. To arbitrarily reduce the amount of information possessed by Eve, Alice and Bob can publicly choose, from among a set of two-universal hash functions, a function g: {0,1}^m → {0,1}^r, with r < m, and build up a new revised key k' = g(k). The resulting revised key k' can be generally uniformly distributed given all information in possession of Eve, and can thus be safely used. The size r of the new revised key, as well as the choice of the function g, can depend on the amount of information Eve could have obtained during key validation, which in turn depends on the hash function used for their key validation test. As the hash function is application specific, the choice of both H and g can depend on specific usages.
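Privacy amplification with a Toeplitz matrix, as mentioned above, can be sketched as follows; the function name, parameters, and the matrix indexing convention are assumptions for illustration.

```python
# Hedged sketch of privacy amplification: compress an m-bit key k to r bits
# with a random Toeplitz matrix T over GF(2), giving k' = T*k (mod 2).
import secrets

def toeplitz_amplify(key_bits, r, diag):
    """Apply the r x m Toeplitz matrix defined by `diag`, a list of
    r + m - 1 random bits, with T[i][j] = diag[i - j + m - 1]."""
    m = len(key_bits)
    out = []
    for i in range(r):
        acc = 0
        for j in range(m):
            acc ^= diag[i - j + m - 1] & key_bits[j]
        out.append(acc)
    return out

m, r = 16, 8
k = [secrets.randbelow(2) for _ in range(m)]
diag = [secrets.randbelow(2) for _ in range(r + m - 1)]  # public choice

k_prime = toeplitz_amplify(k, r, diag)
assert len(k_prime) == r
# Both parties applying the same public T to the same key get the same k'.
assert toeplitz_amplify(k, r, diag) == k_prime
```

The matrix (i.e., `diag`) can be chosen publicly; only the key k must remain secret, and the compression from m to r bits is what removes Eve's partial information.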

[0084] At block 310, the communication module 180 on both of the first user device (Alice) and the second user device (Bob) can communicate, via each respective network interface 160, using the shared key, k = k_A = k_B. In some cases, the communication can use an OTP approach:

• Alice uses k in a one-time pad scheme to encrypt her initial message msg: ctx = OTP(msg; k), where ctx represents the encrypted message, msg the original message, and OTP the one-time pad scheme, used with key k.

• Alice sends the message to Bob over a public communication channel.

• Bob decrypts the message: msg = OTP^(-1)(ctx; k).

[0085] The above OTP approach applies vice versa with Bob’s encrypted messages sent to Alice.
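The OTP scheme above reduces to bitwise XOR with the shared key; a minimal sketch (toy message and key, which must be the same length and never reused):

```python
# One-time pad as bitwise XOR: encryption and decryption are the same
# operation, so applying the key twice restores the message.
def otp(bits, key):
    assert len(bits) == len(key)  # the pad must cover the whole message
    return [b ^ x for b, x in zip(bits, key)]

msg = [0, 1, 1, 0, 1, 0, 0, 1]
k = [1, 1, 0, 0, 1, 0, 1, 0]

ctx = otp(msg, k)          # Alice encrypts: ctx = OTP(msg; k)
assert otp(ctx, k) == msg  # Bob decrypts: OTP is its own inverse
```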

[0086] In some cases, at block 312, the erasure module 182 on both of the first user device (Alice) and the second user device (Bob) can perform key erasure to ensure security even in the eventuality that any of the tables are stolen from either user. In an example, both parties can flip all the used bits with probability 0.5; i.e., for each bit that has been marked as used, a 0.5 probability is evaluated for whether the respective bit x will be flipped into 1 - x. This ensures that, even if the device is lost, previous communications cannot be decrypted. In this way, even if the device gets stolen, there is still no way to rebuild the original data, even if such data is stored in a cloud computing environment. In some cases, the privacy providers can also erase their keys. In these cases, there is definitively no way of recovering the information transmitted. In another case, consider the case where Alice is a user who wants to store information on one or more servers and access it at a later stage, or even grant the access to Bob. The information can be a key, and can also be the data itself; this information can be stored with the providers in a fashion described herein. The privacy providers, by maintaining the information intact, allow for the data to be rebuilt at a future time, by Alice or by any other authorized user who shares key tables with the privacy providers.
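The probabilistic bit-flip erasure described above can be sketched briefly; the helper name and toy table are illustrative assumptions.

```python
# Sketch of key erasure: each used bit is flipped with probability 0.5, so
# a stolen table carries no information about bits already consumed.
import secrets

def erase_used(table, used_positions):
    """Randomize used bits in place; unused bits are left intact."""
    for p in used_positions:
        if secrets.randbelow(2):  # flip with probability 1/2
            table[p] ^= 1

table = [1, 0, 1, 1, 0, 0, 1, 0]
snapshot = table[:]
used = {0, 1, 2, 3}  # positions already consumed by the protocol

erase_used(table, used)
assert table[4:] == snapshot[4:]  # unused bits are untouched
assert all(b in (0, 1) for b in table)
```

After this step, each used position is a uniformly random bit, statistically independent of the value it held during the protocol.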

[0087] In an example for encrypted phone calls, for each m bits of message sent, a further nm bits must be sent (where n is the number of privacy providers), plus a certain (negligible) number of bits for authenticating (for example, Alice can send l bits to party P_i, taken as the first l unused bits in K_i, to authenticate). Therefore, to initiate a 2-minute call (roughly 1.2 MB on a VoIP protocol), the amount of data to be exchanged in advance between Alice and the privacy providers is roughly 3.6 MB (assuming n = 3); this requires (assuming the QRNG always keeps a small register updated with new random numbers) roughly 0.04 s for reading the data from memory, plus the time required for sending the data to the providers. The privacy providers need to match the key and send the same amount of data to Bob. Assuming Bob has the same speed in reading data from his registers (90 MB/s), there will be approximately a 0.3 s latency to initiate the 2-minute call. This latency is commercially reasonable considering the added value of the encryption.
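The paragraph's arithmetic can be checked directly; this sketch assumes the example's figures (n = 3 providers, a 1.2 MB call, and a 90 MB/s register read speed):

```python
call_mb = 1.2          # ~2-minute VoIP call
n = 3                  # number of privacy providers (example assumption)
read_mb_per_s = 90     # assumed register read speed

pre_shared_mb = n * call_mb            # data exchanged with the providers
read_time_s = pre_shared_mb / read_mb_per_s

assert round(pre_shared_mb, 6) == 3.6  # roughly 3.6 MB exchanged in advance
assert round(read_time_s, 6) == 0.04   # roughly 0.04 s to read from memory
```

The remaining ~0.3 s total setup latency in the example also includes network transfer to the providers and from the providers to Bob, which this sketch does not model.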

[0088] In some cases of the method 300, the randomness can be assumed to arrive from an RNG (e.g., QRNG) located on the user’s device. In other cases, a large number of random numbers (e.g., quantum random numbers) can be pre-installed on the user’s device. In these cases, for example, the user devices may be replaced whenever the keys have been completely used; such as via a subscription service. In further cases, the quantum random numbers can be pre-installed and stored on a separate memory storage device that is in communication with the user’s device; whereby the memory storage device can be replaced when the pre-installed numbers have been completely used.

[0089] Generally, privacy providers and the communication system between the devices are likely the weakest portions of the protocol. In some cases, legal enforcement can be used to ensure compliance by the privacy providers, for example by having the privacy providers sign a contract. This contract can legally force them to reimburse final users in case of data (key) leaks for which they are liable. While, even with legal contracts, privacy providers may still engage in unwanted activities, the probability that all of the privacy providers engage in those activities can be considered negligible. Particularly, the probability of all cooperating is significantly low where the privacy providers are diverse; for example, diverse in interests, background, geographic location, and the like. In some cases, the role of privacy providers can be provided, at least in part, to national authorities. In some cases, a further layer of cryptography can be used; for example, at least AES-128 can be used. This further layer of security may be implemented on the user devices.

[0090] In some cases, for large enough organizations, assuming the devices are used to only communicate among each other within the same organization, the organization itself can also act as the privacy providers and manage production and distribution of the keys.

[0091] In some cases, the privacy providers can be authenticated such that a DoS attack by any privacy provider can be immediately identified. The privacy provider can then be expelled and removed from their role. In cases where privacy providers receive compensation for their role, a DoS attack would also be disincentivized as it results in a permanent loss of income.

[0092] Advantageously, it is impossible for any single privacy provider to reconstruct the encryption key without collaborating with all other privacy providers. This implies that, as long as at least one of the privacy providers is honest, the key cannot be reconstructed: a key-reconstruction (KR) attack is impossible. In some cases, however, a single dishonest privacy provider can perform a Denial of Service (DoS) attack by providing wrong data to Alice and/or Bob during the exchange phase of the protocol. The following provides common key generation at block 304, according to another embodiment. In this embodiment, components are provided to avoid potential DoS attacks.

[0093] In this embodiment, to generate the key that Alice and Bob need to use to communicate, using for example OTP, Alice generates n random points on a Cartesian plane that she distributes to the n privacy providers P_1, P_2, ..., P_n using the shared keys between Alice and the privacy providers. The n points generated by Alice lie on a polynomial of degree g, where 1 ≤ g ≤ n − 1. The intersection between the polynomial and the y-axis of the Cartesian plane uniquely identifies the key that will be used in communication between Alice and Bob. The privacy providers, using the table shared with Bob, deliver the n points to him, which can be used to rebuild the polynomial and therefore the common key.

[0094] The degree g of the polynomial affects the security of the protocol; such that:

• When g is overly small, for example g = 1, it may be easy for a small number of privacy providers to cluster and perform KR attacks, as few points are enough to rebuild the polynomial. At the same time, when g is small, it is very hard to conduct a DoS attack, as many privacy providers have to cluster to stop Bob from reconstructing the key.

• When g is overly large, a larger number of corrupted providers is likely necessary to conduct a KR attack, but a smaller number of privacy providers is enough to conduct a DoS attack. For example, when g = n − 1, the previously described approach for performing block 304 should be used.

[0095] In view of the above trade-off, determining a value for g depends on the number of providers n, on a probability of them being dishonest, on the cost of a KR attack (for example, high for mobile communication and data storage applications) versus the cost of a DoS attack (for example, high for data storage), and on the cost of having a large number of privacy providers.

[0096] In this embodiment, there are the two users, Alice and Bob, and the set of n providers P_1, P_2, ..., P_n sharing a set of private keys with Alice and Bob. In particular, K_i is the set of bits of length l privately shared between Alice and P_i, so that the jth bit in the set shared between Alice and P_i takes a value in {0, 1}. In the same way, H_i is the set of keys shared between Bob and P_i. Consider the situation where Alice would like to construct a shared key with Bob that can be used for communication or any other application requiring a key; for example, using OTP encryption. Alice can use a secret sharing protocol (for example, Shamir's Secret Sharing protocol) to build such a key and share it with Bob. For the purposes of illustration, the key can be considered a number k; in further cases, the key can be described in terms of binary bits. In this embodiment, a particular secret sharing protocol is used as an example; however, any suitable secret sharing protocol can be used. In this embodiment, an example approach of common key generation can include:

• Alice generates a g-degree polynomial f_A that intersects the y-axis of the Cartesian plane in k, where g is an integer and g ∈ [1, n − 1]; that is, f_A is a polynomial such that f_A(0) equals the common key k_A.

• Alice selects exactly n random points a_1, a_2, ..., a_n on the generated polynomial.

• For each point a_i, Alice prepares a series of indexes never used before in communication between Alice and P_i, whose indexed values uniquely identify a_i on the Cartesian plane. In some cases, the length of the ith series can be the amount of bits necessary to identify one point (i.e., two coordinates) on the Cartesian plane.

• Alice sends the ith series to provider P_i over a public channel.

• Alice denotes the indexes used from each table K_i as ‘used’.

• Alice sends the number g to Bob.

• Each privacy provider P_i prepares a series of indexes never used before in communication between Bob and P_i, whose indexed values uniquely identify a_i on the Cartesian plane.

• P_i sends the series to Bob, over a public channel.

• Bob receives the series from the privacy providers, generates the points a_1, ..., a_n, and rebuilds the polynomial. Bob measures the intersection point of the curve with the y-axis, which forms the common key k_B.
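The share-and-rebuild steps above can be sketched with a Shamir-style scheme over a prime field (a minimal illustration; the field size, helper names, and integer key encoding are assumptions, not the patent's implementation):

```python
import secrets

P = 2**61 - 1  # a Mersenne prime; the field size is an illustrative choice

def make_shares(k: int, g: int, n: int) -> list[tuple[int, int]]:
    """Embed key k as f(0) of a random degree-g polynomial and
    return n points (x, f(x)) with distinct nonzero x."""
    coeffs = [k] + [secrets.randbelow(P) for _ in range(g)]
    def f(x: int) -> int:
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(points: list[tuple[int, int]]) -> int:
    """Lagrange interpolation at x = 0; any g + 1 points suffice."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num = den = 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

k_a = secrets.randbelow(P)                 # Alice's key, f_A(0)
shares = make_shares(k_a, g=4, n=10)       # one point per privacy provider
k_b = reconstruct(shares[:5])              # Bob needs any g + 1 = 5 points
assert k_b == k_a
```

Here Alice would deliver each point to one provider (encoded via the index series), and Bob rebuilds f_A(0) once he has received at least g + 1 of them.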

[0097] Note that when g = n − 1, the above approach is information-theoretically equivalent to the previously described approach.

[0098] To determine an approximately optimal g, the following can be considered:

• Given n privacy providers and a g-degree polynomial, at least g + 1 dishonest privacy providers are required to conduct a Key Reconstruction attack. The probability for a provider to not follow the protocol can be considered equal to q, and the probabilities can be considered uncorrelated. The final probability of a KR attack is then the probability that at least g + 1 of the n providers are dishonest.

• Given n privacy providers and a g-degree polynomial, at least n − g dishonest providers are required to conduct a Denial of Service attack. The probability for a provider to not follow the protocol can be considered equal to q, and the probabilities can be considered uncorrelated. The final probability of a DoS attack is then the probability that at least n − g of the n providers are dishonest.
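Under the stated independence assumption, both probabilities are binomial tail probabilities. The sketch below is a reconstruction from the text's description (the patent's own equations are not reproduced here, so the exact formulas are an assumed reading):

```python
from math import comb

def tail(n: int, m: int, q: float) -> float:
    """P(at least m of n independent providers are dishonest),
    with each provider dishonest with probability q."""
    return sum(comb(n, j) * q**j * (1 - q)**(n - j) for j in range(m, n + 1))

def p_kr(n: int, g: int, q: float) -> float:
    return tail(n, g + 1, q)   # KR needs at least g + 1 dishonest providers

def p_dos(n: int, g: int, q: float) -> float:
    return tail(n, n - g, q)   # DoS needs at least n - g dishonest providers

# The trade-off: raising g makes KR harder but DoS easier, and vice versa.
assert p_kr(10, 7, 0.1) < p_kr(10, 2, 0.1)
assert p_dos(10, 7, 0.1) > p_dos(10, 2, 0.1)
```

This makes the trade-off explicit: the two tails move in opposite directions as g varies, which is what motivates choosing an intermediate degree.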

[0099] In the worst-case scenario, a dishonest privacy provider tries to engage in both KR attacks and DoS attacks at the same time. FIG. 6 illustrates a plot of an example of a minimum fraction of dishonest providers as a function of g normalized by the total number of providers n. On the vertical axis is the minimum fraction of providers that must be dishonest to conduct the attack, as a function of polynomial degree g; both quantities are divided by n. Taking the minimum of the two curves gives, for every point g, the minimum fraction of dishonest players necessary to conduct at least one of the two attacks. It is clear from the plot that a value of g equal to n/2 provides the safest configuration, although the different attacks may have different costs for the final users, and therefore the optimal g can be moved to the left or to the right. For example, for mobile communication, a KR attack may be far more costly than a DoS attack, and therefore the optimal g would be closer to n.
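The two curves of FIG. 6 follow from the attack thresholds above: KR requires at least (g + 1)/n of the providers to be dishonest, and DoS at least (n − g)/n. A short sketch (not the patent's plotting code) confirming that the minimum of the two curves peaks near g = n/2:

```python
n = 10

# Minimum fraction of dishonest providers needed for at least one attack,
# as a function of the polynomial degree g (cf. FIG. 6):
#   KR needs at least (g + 1) / n, DoS needs at least (n - g) / n.
frac_needed = {g: min((g + 1) / n, (n - g) / n) for g in range(1, n)}

# The safest degree maximizes this minimum fraction, which occurs near n / 2.
assert max(frac_needed.values()) == frac_needed[n // 2]
```

Varying n shows the same shape: the intersection of the two lines, and hence the safest degree, always sits in the middle of the [1, n − 1] range.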

[0100] The actual probability of an attack as a function of g can be determined by calling q the probability for a provider to be corrupted, and by assuming no correlation among these probabilities. The total probability of an attack is then the probability that enough providers are dishonest to conduct at least one of the two attacks.

[0101] Consider an example where n = 10, and different values for q are used; in particular, q_1 = 0.01, q_2 = 0.02, q_3 = 0.03, q_4 = 0.05, q_5 = 0.1. FIG. 7 illustrates a plot of an example of the probabilities of an attack for the five different scenarios q_1 to q_5. On the vertical axis is the probability of an attack, for the case n = 10, as a function of g. In the optimal case g = 4, the following probabilities for an attack are obtained: p_1 = 10^-10 (for scenario q_1), p_2 = 10^-8, p_3 = 10^-7, p_4 = 10^-6, and p_5 = 10^-4. In particular, even in an adverse scenario where the probability of having a corrupted provider is on the order of 0.1, the final gain on the probability of an attack amounts to three orders of magnitude (going from 0.1 to 0.0001). Accordingly, this risk is well within what would be considered practically commercially reasonable.

[0102] Consider, for each value of q, again in the example case of n = 10, the probability of an attack; considering an example for the optimal degree g equal to n/2. FIG. 8 illustrates a plot showing an example of the result. On the vertical axis is the probability of an attack, for the case n = 10, as a function of q, computed for the optimal g = n/2; the straight line is y = x. As shown, for values of q below 0.55, there is an improvement in the total probability of an attack. For q smaller than 0.2, the improvement amounts to several orders of magnitude. FIG. 9 illustrates a plot showing, on the vertical axis, the probability of an attack, for the case n = 50, as a function of q, computed for the optimal g = n/2; the straight line is y = x. FIG. 10 illustrates a plot showing, on the vertical axis, the probability of an attack, for the case n = 4, as a function of q, computed for the optimal g = n/2; the straight line is y = x.

[0103] Two assumptions are generally made for the security of the embodiments described herein. First, at least one of the privacy providers is honest and sound; i.e., it does not coordinate with other privacy providers and does not reveal any of the keys to anyone who should not have access to them. Second, the keys are correctly prepared; i.e., none of the other parties (interlopers, the privacy providers, and/or the manufacturer of the device) knows all the sets of keys on or sent to any device.

[0104] Advantageously, the present embodiments allow for scalability. In a network, the number, n, of users can be very large. With a large number of users, requiring each user to share a key with every other user would require n(n-1)/2 (i.e., on the order of n^2) keys, which can be extremely large. In contrast, using a privacy provider, as in the present embodiments, can reduce the total number of keys, such as only requiring on the order of n keys. This is because each user only needs to share a key with a privacy provider, instead of with the other users. The same scaling, on the order of n keys, also applies in the case of multiple, e.g., m, privacy providers.
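The key-count comparison above is straightforward arithmetic; a short sketch (the user and provider counts are illustrative):

```python
n = 1000   # number of users (illustrative)
m = 3      # number of privacy providers (illustrative)

pairwise_keys = n * (n - 1) // 2   # every pair of users shares a key: O(n^2)
provider_keys = n * m              # one key table per user per provider: O(n)

assert pairwise_keys == 499_500
assert provider_keys == 3_000
```

Even with several providers, the per-user burden stays constant (m key tables), so total key material grows linearly rather than quadratically in the user count.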

[0105] Advantageously, the present embodiments can be implemented via modification of various regimes, for example, IPSec and VPN.

[0106] Advantageously, the present embodiments allow for the use of a plurality of privacy providers for greater security and reliability. For example, in an approach with only one privacy provider, there is generally a single node of failure; thus, the probability of having a data breach is the probability of having a data breach in that privacy provider (referred to as p). In the present embodiments with a plurality of privacy providers, the probability of a data breach is effectively arbitrarily small, even when the probability of having a data breach in each single privacy provider is still equal to p. Thus, with realistic values of p (e.g., p smaller than 20%), the probability of having a data breach with, for example, ten privacy providers is several orders of magnitude smaller than with a single privacy provider. In an example, for p equal to 0.05 on a yearly basis (i.e., on average one data breach every 20 years), with ten privacy providers, the probability of having a data breach is approximately 0.0001% (i.e., on average one data breach every 1,000,000 years).

[0107] The use of privacy providers as described in the present embodiments provides, at least, the following advantages:

• Information-theoretical security when combined with OTP encryption.

• Easy implementation with current technology (such as in the use of virtual private networks (VPN)).

• Privacy providers can be off-line during the communication of encrypted data (after the key generation procedure); which can mitigate delays or denial-of-service attacks.

• Highly scalable in terms of the number of users; whereby if new users join, old users are not affected as the privacy providers provide new key tables to the new users.

• Highly scalable in terms of the number of privacy providers; whereby if the number of users increases by a factor of k, the number of privacy providers needed to ensure security, along with the number of key tables for each user, simply scales linearly with k.

• Minimal time delay in the resulting communication due to not requiring privacy provider communication after key generation. In an example experiment for a VPN using two privacy providers, a ping time for communication was 90ms; which is a significant improvement over other approaches.

• Secret sharing reduces failure probability.

• Government enforcement is possible: a government can impose a legal requirement on its own privacy providers to surrender key tables to law enforcement.

• International cooperation can be implemented; for example, if each country owns a privacy provider, then, unless all countries cooperate, the communication remains secure against any individual country.

• A hierarchical design of privacy providers can be implemented such that, for example, each country/major organization has its primary privacy provider and it distributes trust among a set of secondary privacy providers.

[0108] Further, the use of multiple privacy providers provides at least the following advantages:

• No need for the users to trust in the single centralized entity. As privacy providers are operated independently, no single privacy provider is actually able to decrypt Alice and Bob’s communication. In this way, users do not have to share their data with the privacy provider.

• Reduces the probability of a hack or data breach from a single node or a single point of failure by using multiple privacy providers; in addition to providing protection against DDoS attacks.

• Allows users to dynamically choose which privacy providers to use for the communication.

• Does not require messages from Alice to Bob to pass through the privacy providers; thus not requiring the privacy provider(s) to be always online for the communication to take place. Alice and Bob can instruct the privacy providers to provide a certain amount of keys now in expectation of a communication happening later. When the communication happens later, the privacy providers do not have to be online.

o In an example of the present embodiments, Alice can simply send to the privacy providers (i) Bob’s address and (ii) the length of the message that she separately sent to Bob. The Hub then prepares the key instruction (which is as long as the message) and sends it to Bob. In comparison, in other approaches, a centralized entity receives the message from Alice and re-transmits the message to Bob; so the entity has to disadvantageously manage double the traffic.

[0109] Although the invention has been described with reference to certain specific embodiments, various other aspects, advantages and modifications thereof will be apparent to those skilled in the art without departing from the spirit and scope of the invention as outlined in the claims appended hereto. The entire disclosures of all references recited above are incorporated herein by reference.