# Quantization - Reliability versus Secrecy

In real-world realizations, virtually all digital receivers provide a communication party with digital representations of the channel measurements; however, these representations are usually not fully identical at both ends. Non-perfect reciprocity of the measurements, combined with noise, leads to errors in the bit vectors of the preliminary key material. To make the system more robust against non-perfectly symmetric channels, further quantization is applied: for example, fewer quantization stages or artifact-based profile selection and quantization make the system more robust. In all recent works, channel profiles are further quantized into bit vectors to obtain the initial preliminary key material. The overall aim is to obtain, with high probability, synchronized key material that is unknown to others.

An important security question for a given attacker model and a given quantization scheme is therefore: how robust is secure enough? This is particularly interesting because the presence of a passive eavesdropper is a fundamental part of the system model and must be considered. While for channel profiles the mutual information and the secret-key rate are the metrics of choice, after quantization the BER simplifies the analysis: the BER between the measurements of Alice and Bob represents the efficiency, while the BER between the measurements of Bob and Eve represents the security. In addition to the distributed source coding problem, the amount of extracted (worst-case) bit entropy is of importance. As a result, three features are of essential interest for designing new and analyzing previous schemes:

• BER between legitimate nodes (for robustness)
• BER to a potential attacker in a defined vicinity (attacker model)
• the amount of bit-entropy/randomness (for security and efficiency)

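The two roles of the BER above can be made concrete with a short sketch. The bit vectors and the helper `ber` below are our own illustrative choices, not taken from any referenced scheme:

```python
import numpy as np

def ber(bits_a: np.ndarray, bits_b: np.ndarray) -> float:
    """Bit error rate: fraction of positions where the two bit vectors differ."""
    assert bits_a.shape == bits_b.shape
    return float(np.mean(bits_a != bits_b))

# Toy preliminary key material (values are made up for illustration).
alice = np.array([0, 1, 1, 0, 1, 0, 0, 1])
bob   = np.array([0, 1, 0, 0, 1, 0, 1, 1])  # 2 of 8 bits differ from Alice
eve   = np.array([1, 0, 1, 0, 0, 0, 1, 1])  # 4 of 8 bits differ from Alice

print(ber(alice, bob))  # 0.25 -- robustness: should be low
print(ber(alice, eve))  # 0.5  -- security: should be close to 0.5
```

In this toy setting the legitimate link shows few mismatches while the eavesdropper's bit vector is close to a coin flip, which is exactly the separation a good quantization scheme should produce.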
All three features might be sufficiently addressable in a stationary scenario (i.e., one providing a stationary random process) using a single metric: the secrecy rate, or a lower bound on the secrecy capacity. However, because non-stationary environments are virtually always the source of common entropy in relevant scenarios, we introduce blockwise evaluation strategies. Furthermore, we introduce on-line entropy estimation using cryptographically approved mechanisms instead of the Shannon entropy estimators widely known from information-theoretic approaches. Note that several works [1] [2] [3] presented different information reconciliation approaches for detecting and correcting bit errors. Reconciliation represents an alternative and/or an extension to align the initial preliminary key material. However, the performance of secure reconciliation schemes is also limited; we analyze and discuss this fact on the page Information Reconciliation.
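As one example of a cryptographically motivated (rather than Shannon-style) estimator, the following sketch implements a most-common-value min-entropy estimate in the spirit of NIST SP 800-90B (section 6.3.1). It illustrates the idea of conservative, worst-case entropy estimation; it is not necessarily the exact mechanism used for our on-line estimation:

```python
import math
from collections import Counter

def mcv_min_entropy(samples) -> float:
    """Most-common-value min-entropy estimate (bits per sample),
    following the NIST SP 800-90B MCV estimator."""
    n = len(samples)
    p_hat = Counter(samples).most_common(1)[0][1] / n
    # 99% upper confidence bound on the most-common-value probability.
    p_u = min(1.0, p_hat + 2.576 * math.sqrt(p_hat * (1 - p_hat) / (n - 1)))
    return -math.log2(p_u)

bits = [0, 1] * 500        # perfectly balanced toy sequence
print(mcv_min_entropy(bits))   # close to, but below, 1 bit per sample
print(mcv_min_entropy([1] * 100))  # 0.0 -- a constant source has no entropy
```

The confidence bound makes the estimate deliberately pessimistic, which is the appropriate direction for key material.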

## Related Work on Evaluation Methodologies of Quantization Schemes for CRKG

Over the last years, there have been many inspiring publications in the field of channel-based key generation. In particular, the issue of finding an optimal quantization scheme has been studied extensively, and many different approaches focusing on the improvement of individual performance aspects are available. Three fundamental performance metrics have been identified in the literature:

1. Initial Key Generation Rate (IKGR)
2. Bit Error Rate (BER)
3. Randomness

As pointed out in [4] and [5], there is usually a trade-off between these metrics. This can be demonstrated well with the level-crossing scheme proposed in the first section mentioned above. The authors use a guard band to decrease the BER, which also decreases the IKGR, since possibly valuable information is dropped. In order to further decrease the mismatch probability, they employ an excursion detection scheme by which sequences of equal symbols can be detected; hence, redundancy is reduced. A quantization resulting in a high BER has always been considered useless, since identical secret keys are required and a high number of mismatches would increase the effort of reconciliation. As a consequence, robustness has been considered a fundamental quality indication [4] [5] [6] [7] [8].

A common strategy for evaluating the security (robustness versus security) of practice-oriented CRKE schemes is based on channel measurements. In the past, (extensive) channel measurement campaigns have been performed with respect to legitimate participants and potential passive adversaries, followed by an analysis of the goodness of the preliminary key material using off-line statistical tests [4] [5] [6] [9] [10]. This analysis is suitable for environments with stationary statistics; however, such environments are hard to guarantee or to verify in practice. Several prior works, our first approaches included, demonstrated a low probability of success for potential passive attacks on quantization schemes, based on individual experimental measurements, e.g., [11] [7] [4] [5] [12] [6] [9] [10] [13]. However, little attention has been paid to securing a CRKE scheme against active attackers, in particular attackers who operate in low-entropy environments or who have active control over the environment.

Jana and Premnath et al. [5] [14] proposed the first analysis of different quantization schemes that introduces entropy estimations. The authors evaluated different schemes [4] [11] [15] [16] [17] and analyzed their statistical properties off-line by calculating the Shannon entropy. As the entropy of the key material may decrease due to changes in the environment, solely applying such estimations off-line may not be sufficient for security applications. A theoretical analysis of the BER versus Pearson correlation behavior of several quantization schemes [15] [11] [4] [5] [9] was introduced by Guillaume et al. [18]. This work extends those of [5] [14] [18] by (1) analyzing the 'BER vs. Pearson correlation coefficient' behavior of further recent practice-oriented protocols via simulation, (2) introducing a real-world-based evaluation strategy for 10 protocols, (3) experimentally validating the decorrelation behavior of indoor channels, (4) introducing results of on-line entropy estimation for security-level verification, and (5) demonstrating first results of key extraction efficiency using mutual information.

## Basis of Fair Comparison

The design and analysis of quantization schemes for CRKE are complex because the key material is subject to simultaneous reliability and secrecy constraints. Due to the trade-off between the fundamental performance metrics IKGR, BER, and randomness, optimizing a quantization scheme for all of them is not trivial. In particular, robustness, i.e., a low BER, has commonly been pursued when designing a quantization scheme, as we will see later in this chapter. However, tuning the quantization for robustness at all costs heavily affects the system's vulnerability to eavesdropping. In the following, we present a way of evaluating a system's reliability and secrecy constraints in a general manner.

Based on the source model, let us assume the channel measurements $\displaystyle h_{\beta\alpha}$ and $\displaystyle h_{\alpha\beta}$ are taken by two nodes $\displaystyle \alpha$ and $\displaystyle \beta$, respectively. Since the physical channel is reciprocal, $\displaystyle h_{\beta\alpha}$ and $\displaystyle h_{\alpha\beta}$ are correlated; the degree of correlation can be measured by the Pearson correlation coefficient $\displaystyle \rho_{\alpha\beta}$. During the key generation process, $\displaystyle h_{\beta\alpha}$ and $\displaystyle h_{\alpha\beta}$ are fed into quantization blocks to generate the binary sequences $\displaystyle \gamma_\alpha$ and $\displaystyle \gamma_\beta$. In order to evaluate a scheme's robustness and security, we compute the BER between the quantizer outputs of $\displaystyle \alpha$ and $\displaystyle \beta$ as a function of the input correlation. The performance is evaluated using the IKGR as well as statistical testing such as on-line entropy estimation. Figure 6.1 // TODO ADD IMAGE shows, by way of example, how the BER decreases with increasing correlation coefficient $\displaystyle \rho$ for two different quantization schemes.
As can be seen, scheme 1 is less robust than scheme 2, since it causes more mismatches between the two sequences $\displaystyle \gamma_\alpha$ and $\displaystyle \gamma_\beta$. To evaluate each scheme's performance, we use the BER and the IKGR; as functions of the correlation coefficient, these metrics are a good basis for a robustness-versus-security evaluation. Moreover, such a representation helps to analyze the drawbacks of half-duplex transmission, since the implied delay reduces the correlation of the measurements. To evaluate potential statistical defects, e.g., biasing, we evaluate NIST p-values and on-line entropy estimation. Given a correlation coefficient $\displaystyle \rho_0$ measured under real environmental conditions and a required maximum BER $\displaystyle P_{b,req}$, either scheme 1 or scheme 2 can be chosen. However, considering a passive eavesdropper E, the channel measurements of the two legitimate nodes A and B are possibly correlated with E's observation. Thus, we can again use $\displaystyle \rho_{\alpha\beta}$ with $\displaystyle \alpha, \beta \in \{A, B, E\}$ as a quantitative representation of the similarity between the respective channel measurements. As a consequence, the representation given in Figure 6.1 also holds for an attacker who is assumed to know the currently used quantization scheme. Obviously, a robust quantization technique with a low BER is also beneficial for an attacker. In the case of quantization scheme 2, for a correlation coefficient of $\displaystyle \rho_0 = 0.8$, such an attacker's bit sequence would disagree by only 6 %, while A's and B's keys disagree by around 1.7 %. Considering that A and B still have to align all mismatches, the attacker would already know a significant amount of secret bits, making the key generation process very inefficient.
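The "BER over input correlation" view can be reproduced in simulation. The following sketch assumes jointly Gaussian measurements and a simple 1-bit mean-threshold quantizer of our own choosing (not one of the schemes from Figure 6.1); for a zero-mean sign quantizer the theoretical BER is $\arccos(\rho)/\pi$:

```python
import numpy as np

rng = np.random.default_rng(0)

def correlated_gaussians(rho: float, n: int):
    """Two zero-mean Gaussian measurement series with Pearson correlation rho."""
    x = rng.standard_normal(n)
    z = rng.standard_normal(n)
    y = rho * x + np.sqrt(1 - rho**2) * z
    return x, y

def sign_quantizer(h: np.ndarray) -> np.ndarray:
    """1-bit quantizer: threshold at the empirical mean of the series."""
    return (h > h.mean()).astype(int)

for rho in (0.5, 0.8, 0.95):
    h_ab, h_ba = correlated_gaussians(rho, 100_000)
    ber = np.mean(sign_quantizer(h_ab) != sign_quantizer(h_ba))
    # Theory for a zero-mean sign quantizer: BER = arccos(rho) / pi
    print(f"rho={rho:.2f}  BER~{ber:.3f}  theory={np.arccos(rho)/np.pi:.3f}")
```

Sweeping `rho` from an eavesdropper-like value to a near-reciprocal one reproduces the qualitative shape of the curves: the BER falls monotonically as the input correlation grows.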

## Blockwise evaluation: Pros and Cons

In the past, researchers (e.g., [11] [6] [12]) have estimated the reciprocity of their gathered channel profiles using the Pearson correlation coefficient $\displaystyle \rho$, providing a single correlation coefficient for each measurement series. However, because a single correlation value over a large measurement series (e.g., with 1 000 000 channel profiles) is not convincing, we evaluate channel profiles and quantization outputs blockwise; please refer to Figure 6.2 for an illustrative example. Blockwise evaluation gives us the ability to evaluate, for example, the reciprocity over time with a lower uncertainty (in analogy to the time-frequency uncertainty relation of communication engineering). This is advantageous since channel profiles may include static as well as dynamic sequences. Furthermore, most quantization schemes are based on thresholds derived from statistical values such as mean and variance. Dividing the channel profiles into blocks increases the extracted entropy, because slow fading effects then do not contribute predominantly to the key material. Therefore, our evaluation of choice for practice-oriented analysis is blockwise.
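A minimal sketch of the blockwise strategy follows; the block length and the synthetic test data are our own illustrative choices:

```python
import numpy as np

def blockwise_pearson(h_a: np.ndarray, h_b: np.ndarray, block: int) -> np.ndarray:
    """Pearson correlation coefficient per non-overlapping block,
    instead of a single coefficient over the whole series."""
    n = (len(h_a) // block) * block  # drop an incomplete trailing block
    rhos = []
    for i in range(0, n, block):
        a, b = h_a[i:i + block], h_b[i:i + block]
        rhos.append(np.corrcoef(a, b)[0, 1])
    return np.array(rhos)

rng = np.random.default_rng(1)
x = rng.standard_normal(1024)
y = 0.9 * x + np.sqrt(1 - 0.9**2) * rng.standard_normal(1024)

rhos = blockwise_pearson(x, y, block=128)
print(rhos.round(2))  # eight per-block coefficients, all around 0.9
```

On real traces with static and dynamic phases, the per-block coefficients would spread out and reveal exactly when reciprocity is good or poor, which a single global value hides.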

// TODO IMAGE p. 141

To achieve initial key material carrying a maximum of secret information with respect to a passive adversary, the BER between a potential eavesdropper and Alice or Bob should be 0.5. As an example, we present a blockwise quantization output in Figure 6.3. Next, we want to analyze, information-theoretically, statistics such as the entropies $\displaystyle H(X)$, $\displaystyle H(Y)$, and $\displaystyle H(X,Y)$, the mutual informations $\displaystyle I(X;Y)$ and $\displaystyle I(X;Z)$, the equivocation $\displaystyle H(X|Y)$, and the irrelevance $\displaystyle H(Y|X)$. In the analysis of interest, X represents the channel profiles and Y represents the output of the quantization scheme. As mentioned above, we utilize the k-nearest-neighbor estimator (kNNE) of Kraskov et al. to estimate these statistics. However, the kNNE requires 4000 input sets [19], which conflicts with the blockwise evaluation strategy. Moreover, blockwise quantization may map identical input values inconsistently to different output values. The problem can be explained with an example: for Q4 and an estimator width of 8, the sample −54 is inconsistently mapped to 2 and 3, whereas for Q8 it is consistently mapped to 2.
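For intuition, the identities between these quantities, in particular $I(X;Y) = H(X) + H(Y) - H(X,Y)$ and $H(X|Y) = H(X,Y) - H(Y)$, can be checked on discrete toy data with a simple plug-in (histogram) estimator. Note that this is not the kNNE of Kraskov et al. used for the actual analysis; it is a sketch on made-up symbols:

```python
import numpy as np
from collections import Counter

def plugin_entropy(samples) -> float:
    """Plug-in (histogram / maximum-likelihood) Shannon entropy in bits."""
    n = len(samples)
    probs = np.array([c / n for c in Counter(samples).values()])
    return float(-np.sum(probs * np.log2(probs)))

def plugin_mutual_information(xs, ys) -> float:
    """I(X;Y) = H(X) + H(Y) - H(X,Y), all terms plug-in estimates."""
    joint = list(zip(xs, ys))
    return plugin_entropy(xs) + plugin_entropy(ys) - plugin_entropy(joint)

# Uniform 2-bit input symbols; Y is a deterministic function of X
# (here: keep only the least significant bit), so I(X;Y) = H(Y).
xs = [0, 1, 2, 3] * 250
ys = [x % 2 for x in xs]

print(plugin_entropy(xs))                  # 2.0 bits
print(plugin_mutual_information(xs, ys))   # 1.0 bit
```

In this deterministic example the equivocation $H(X|Y) = H(X,Y) - H(Y)$ equals 1 bit: one bit of the channel-profile information is lost by the quantization.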

## References

1. Gilles Brassard and Louis Salvail. Secret-key reconciliation by public discussion. In Tor Helleseth, editor, Advances in Cryptology - EUROCRYPT ’93, Workshop on the Theory and Application of Cryptographic Techniques, Lofthus, Norway, May 23-27, 1993, Proceedings, volume 765 of Lecture Notes in Computer Science, pages 410–423. Springer, 1993.
2. Junxing Zhang, Sneha Kumar Kasera, and Neal Patwari. Mobility assisted secret key generation using wireless link signatures. In INFOCOM 2010. 29th IEEE International Conference on Computer Communications, Joint Conference of the IEEE Computer and Communications Societies, 15-19 March 2010, San Diego, CA, USA, pages 261–265. IEEE, 2010.
3. Yevgeniy Dodis, Rafail Ostrovsky, Leonid Reyzin, and Adam Smith. Fuzzy extractors: How to generate strong keys from biometrics and other noisy data. SIAM J. Comput., 38(1):97–139, 2008.
4. Suhas Mathur, Wade Trappe, Narayan B. Mandayam, Chunxuan Ye, and Alex Reznik. Radio-telepathy: extracting a secret key from an unauthenticated wireless channel. In J. J. Garcia-Luna-Aceves, Raghupathy Sivakumar, and Peter Steenkiste, editors, Proceedings of the 14th Annual International Conference on Mobile Computing and Networking, MOBICOM 2008, San Francisco, California, USA, September 14-19, 2008, pages 128–139. ACM, 2008.
5. Suman Jana, Sriram Nandha Premnath, Mike Clark, Sneha Kumar Kasera, Neal Patwari, and Srikanth V. Krishnamurthy. On the effectiveness of secret key extraction from wireless signal strength in real environments. In Kang G. Shin, Yongguang Zhang, Rajive Bagrodia, and Ramesh Govindan, editors, Proceedings of the 15th Annual International Conference on Mobile Computing and Networking, MOBICOM 2009, Beijing, China, September 20-25, 2009, pages 321–332. ACM, 2009.
6. Neal Patwari, Jessica Croft, Suman Jana, and Sneha Kumar Kasera. High-rate uncorrelated bit extraction for shared secret key generation from channel measurements. IEEE Trans. Mob. Comput., 9(1):17–30, 2010.
7. Babak Azimi-Sadjadi, Aggelos Kiayias, Alejandra Mercado, and Bülent Yener. Robust key generation from signal envelopes in wireless networks. In Peng Ning, Sabrina De Capitani di Vimercati, and Paul F. Syverson, editors, Proceedings of the 2007 ACM Conference on Computer and Communications Security, CCS 2007, Alexandria, Virginia, USA, October 28-31, 2007, pages 401–410. ACM, 2007.
8. Qian Wang, Hai Su, Kui Ren, and Kwangjo Kim. Fast and scalable secret key generation exploiting channel phase randomness in wireless networks. In INFOCOM 2011. 30th IEEE International Conference on Computer Communications, Joint Conference of the IEEE Computer and Communications Societies, 10-15 April 2011, Shanghai, China, pages 1422–1430. IEEE, 2011.
9. A. Ambekar, M. Hassan, and H.D. Schotten. Improving channel reciprocity for effective key management systems. In Signals, Systems, and Electronics (ISSSE), 2012 International Symposium on, pages 1–4, Oct 2012.
10. Christian T. Zenger, Abhijit K. Ambekar, Fredrik Winzer, Thomas Pöppelmann, Hans D. Schotten, and Christof Paar. Preventing scaling of successful attacks: A cross-layer security architecture for resource-constrained platforms. In Berna Ors and Bart Preneel, editors, Cryptography and Information Security in the Balkans - First International Conference, BalkanCryptSec 2014, Istanbul, Turkey, October 16-17, 2014, Revised Selected Papers, volume 9024 of Lecture Notes in Computer Science, pages 103–120. Springer, 2014.
11. T. Aono, K. Higuchi, T. Ohira, B. Komiyama, and H. Sasaoka. Wireless secret key generation exploiting reactance-domain scalar response of multipath fading channels. Antennas and Propagation, IEEE Transactions on, 53(11):3776–3784, Nov 2005.
12. Sana Tmar Ben Hamida, Jean-Benoît Pierrot, and Claude Castelluccia. An adaptive quantization algorithm for secret key generation using radio channel measurements. In Khaldoun Al Agha, Mohamad Badra, and Gregory B. Newby, editors, NTMS 2009, 3rd International Conference on New Technologies, Mobility and Security, 20-23 December 2009, Cairo, Egypt, pages 1–5. IEEE, 2009.
13. Christian T. Zenger, Markus-Julian Chur, Jan-Felix Posielek, Christof Paar, and Gerhard Wunder. A novel key generating architecture for wireless low-resource devices. In Gabriel Ghinita, Razvan Rughinis, and Ahmad-Reza Sadeghi, editors, 2014 International Workshop on Secure Internet of Things, SIoT 2014, Wroclaw, Poland, September 10, 2014, pages 26–34. IEEE, 2014.
14. Sriram Nandha Premnath, Suman Jana, Jessica Croft, Prarthana Lakshmane Gowda, Mike Clark, Sneha Kumar Kasera, Neal Patwari, and Srikanth V. Krishnamurthy. Secret key extraction from wireless signal strength in real environments. IEEE Trans. Mob. Comput., 12(5):917–930, 2013.
15. Michael A. Tope and John C. McEachen. Unconditionally secure communications over fading channels. In Military Communications Conference, 2001. MILCOM 2001. Communications for Network-Centric Operations: Creating the Information Force. IEEE, volume 1, pages 54–58, 2001.
16. 17
17. 103
18. René Guillaume, Andreas Mueller, Christian T. Zenger, Christof Paar, and Andreas Czylwik. Fair comparison and evaluation of quantization schemes for phy-based key generation. OFDM 2014, 2014.
19. A. Kraskov, H. Stögbauer, and P. Grassberger. Estimating mutual information. Physical Review E, 69(6):066138, 2004.