Efficient Entropy Estimation and Mutual Information Analysis
Alexandre Venelli and Vincent Dupaquis
{alexandre.venelli, vincent.dupaquis}@atmel.com

Contribution

Mutual Information Analysis (MIA) is an interesting differential side-channel attack in its concept. However, it is not very useful on many devices, particularly on classical CMOS systems. Correlation Power Analysis (CPA) is still one of the most powerful attacks, although CPA only captures linear relationships whereas MIA takes into account both linear and non-linear dependencies. We study whether the lack of efficiency of MIA is due to its inherent properties or whether the attack is simply not used correctly. In fact, we notice that using state-of-the-art estimation techniques improves MIA results significantly and allows for a fairer comparison with other attacks.
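To make the linear vs. non-linear point concrete, the sketch below compares Pearson correlation (the CPA statistic) with a plug-in mutual information estimate on toy data: the correlation only reacts to the linear relationship, while MI also detects the purely non-linear one. The toy data and the naive histogram estimator are illustrative assumptions, not the authors' code.

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
y_linear = 2.0 * x + 0.1 * rng.normal(size=x.size)       # linear dependence
y_nonlinear = x ** 2 + 0.1 * rng.normal(size=x.size)     # non-linear dependence

def pearson(a, b):
    # Pearson correlation coefficient, the statistic used by CPA.
    return np.corrcoef(a, b)[0, 1]

def mi_histogram(a, b, bins=32):
    # Naive histogram (plug-in) estimate of mutual information, in bits.
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    p_ab = joint / joint.sum()
    p_a = p_ab.sum(axis=1, keepdims=True)
    p_b = p_ab.sum(axis=0, keepdims=True)
    nz = p_ab > 0
    return float(np.sum(p_ab[nz] * np.log2(p_ab[nz] / (p_a @ p_b)[nz])))

print(pearson(x, y_linear), mi_histogram(x, y_linear))        # both clearly non-zero
print(pearson(x, y_nonlinear), mi_histogram(x, y_nonlinear))  # correlation near 0, MI still non-zero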

Question: Does using the best MI estimation techniques significantly improve the results of MIA attacks?

Differential Side-Channel Analysis Workflow

Some Statistical Tests
• [KJJ99] - Simplified T-Test
• [ARR03] - Void hypothesis DPA
• [BCO04] - Pearson factor
• [BGLR08] - Spearman factor
• [GBTP08] - Mutual information
• [VCS09] - Cramér-von Mises test
• [BGLR09] - Differential Cluster Analysis

We are interested here in the statistical tests used in DSCA, particularly in the use of Mutual Information as a distinguisher.
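The generic DSCA workflow can be summarized as: predict an intermediate value under every key hypothesis, map it through a leakage model, and score the prediction against the measured traces with a statistical distinguisher. The sketch below is a simplified illustration under assumptions of ours (Hamming-weight leakage of an S-box output, names such as dsca_attack); plugging Pearson correlation into it gives CPA, plugging a mutual information estimate gives MIA.

import numpy as np

def hamming_weight(v):
    # Number of set bits, a common leakage model for CMOS devices.
    return bin(int(v)).count("1")

def dsca_attack(traces, plaintexts, sbox, distinguisher):
    # traces: (n_traces, n_samples) power measurements, plaintexts: known input bytes.
    # Returns one score per key-byte hypothesis; the attacker keeps the best-ranked one.
    n_traces, n_samples = traces.shape
    scores = np.zeros(256)
    for key_guess in range(256):
        # Predicted leakage of the targeted intermediate value (S-box output).
        model = np.array([hamming_weight(sbox[p ^ key_guess]) for p in plaintexts])
        # Keep the best value of the statistic over all time samples.
        scores[key_guess] = max(distinguisher(model, traces[:, t])
                                for t in range(n_samples))
    return scores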

MIA vs CPA [MMPS09]

Problem: The Difficulty of Estimating Mutual Information

Parametric estimation: Assumes that the data is sampled from a known family of distributions (Gaussian, exponential, ...). The parameters of the density estimate are then optimized by fitting the model to the data set. Methods: Maximum Likelihood, Edgeworth [VH05], Least squares, ...

Nonparametric estimation: Makes no assumption about the distribution of the population, i.e. "model-free" methods. The choice of the parameters in these techniques is often made "blindly", i.e. with no reliable measure guiding the choice. Methods: Histograms, Kernel Density Estimation [MRL95], Wavelet Density Estimator [Van95], B-spline functions [DSSK04], k-Nearest Neighbors [KSG04], ...
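As a concrete example of one of these nonparametric techniques, the sketch below shows the soft-binning idea behind the B-spline estimator of [DSSK04], restricted for simplicity to order-2 (linear) B-splines: each sample spreads fractional weight over neighbouring bins instead of falling into a single histogram bin, which smooths the resulting probability estimates. This is a simplified illustration, not the exact estimator of [DSSK04].

import numpy as np

def bspline_weights(x, n_bins):
    # Order-2 (hat function) B-spline weights of each sample over n_bins bins.
    x = (x - x.min()) / (x.max() - x.min() + 1e-12) * (n_bins - 1)
    lo = np.clip(np.floor(x).astype(int), 0, n_bins - 2)
    frac = x - lo
    w = np.zeros((x.size, n_bins))
    w[np.arange(x.size), lo] = 1.0 - frac
    w[np.arange(x.size), lo + 1] = frac
    return w                               # each row sums to 1

def mi_bspline(a, b, n_bins=8):
    # Plug-in mutual information (bits) from B-spline soft-binned probabilities.
    wa, wb = bspline_weights(a, n_bins), bspline_weights(b, n_bins)
    p_ab = wa.T @ wb / a.size              # joint probabilities
    p_a, p_b = wa.mean(axis=0), wb.mean(axis=0)
    nz = p_ab > 0
    return float(np.sum(p_ab[nz] * np.log2(p_ab[nz] / np.outer(p_a, p_b)[nz])))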

Experimental Results and Conclusion

We carried out attacks on two different setups and two different algorithms. The efficiency of the attacks is measured with known metrics: guessed entropy and success rate [SGV08] (see the sketch below). We apply nonparametric estimation techniques within the MIA algorithm and compare the improvements against a classical, powerful CPA. The attacks are denoted as follows:
• MIA - classical histograms
• MIB - B-spline functions
• KDE - kernel density estimation
• KNN - k-nearest neighbors
• CVM - Cramér-von Mises test
• CVMB - Cramér-von Mises with B-splines
• VAR - differential cluster analysis with variance as criterion function
The CVM tests and VAR are not MI-based attacks, but they represent state-of-the-art propositions.
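The sketch below shows how these two metrics can be computed from the distinguisher scores of repeated attack experiments. It follows the usual definitions (success rate: fraction of experiments ranking the correct key first; guessed entropy: average rank of the correct key) rather than reproducing [SGV08] exactly; the function names and the 1-based rank convention are assumptions.

import numpy as np

def key_ranks(score_matrix, correct_key):
    # score_matrix: (n_experiments, 256) distinguisher scores, higher = better.
    # Returns the rank of the correct key in each experiment (0 = ranked first).
    order = np.argsort(-score_matrix, axis=1)       # key hypotheses sorted by score
    return np.argmax(order == correct_key, axis=1)

def success_rate(score_matrix, correct_key):
    return float(np.mean(key_ranks(score_matrix, correct_key) == 0))

def guessed_entropy(score_matrix, correct_key):
    return float(np.mean(key_ranks(score_matrix, correct_key)) + 1)   # 1-based rank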

DES on DPA Contest v1 setup

Using the DPA Contest curves, we observe that MIB is roughly twice as efficient as MIA. KDE is not very interesting, and KNN requires much more computational time than MIB for similar results. Regarding non-MI attacks, CVM, CVMB and VAR perform well. CPA remains far ahead.

Multi-precision multiplication on Atmel STK600 setup

The STK600 setup is not particularly suited to side-channel analysis, contrary to the DPA Contest setup: the power curves contain much more noise. In this context, we obtain interesting results. The performances of MIB, CVM, CVMB and VAR seem closer to CPA, and MIB looks like the best MI-based contestant [Ven10].

Conclusion: Even if MI-based attacks do not seem as efficient as CPA or other parametric attacks on classical CMOS devices, we should at least consider the best available MI estimation methods in order to compare results more fairly.

References

[ARR03]   Dakshi Agrawal, Josyula Rao, and Pankaj Rohatgi. Multi-channel attacks. CHES 2003, LNCS, 2779:2–16, 2003.
[BCO04]   Eric Brier, Christophe Clavier, and Francis Olivier. Correlation power analysis with a leakage model. CHES 2004, LNCS, 3156:135–152, 2004.
[BGLR08]  Lejla Batina, Benedikt Gierlichs, and Kerstin Lemke-Rust. Comparative evaluation of rank correlation based DPA on an AES prototype chip. ISC 2008, LNCS, 5222:341–354, 2008.
[BGLR09]  L. Batina, B. Gierlichs, and K. Lemke-Rust. Differential cluster analysis. CHES 2009, LNCS, 5747:112–127, 2009.
[DSSK04]  Carsten Daub, Ralf Steuer, Joachim Selbig, and Sebastian Kloska. Estimating mutual information using B-spline functions - an improved similarity measure for analysing gene expression data. BMC Bioinformatics, 5:118, 2004.
[GBTP08]  Benedikt Gierlichs, Lejla Batina, Pim Tuyls, and Bart Preneel. Mutual information analysis - a generic side-channel distinguisher. CHES 2008, LNCS, 5154:426–442, 2008.
[KJJ99]   P. Kocher, J. Jaffe, and B. Jun. Differential power analysis. CRYPTO 1999, LNCS, 1666:388–397, 1999.
[KSG04]   A. Kraskov, H. Stögbauer, and P. Grassberger. Estimating mutual information. Physical Review E, 69:66138, 2004.
[MMPS09]  A. Moradi, N. Mousavi, C. Paar, and M. Salmasizadeh. A comparative study of mutual information analysis under a Gaussian assumption. Information Security Applications, LNCS, 5932:193–205, 2009.
[MRL95]   Young-Il Moon, Balaji Rajagopalan, and Upmanu Lall. Estimation of mutual information using kernel density estimators. Physical Review E, 52(3):2318–2321, 1995.
[SGV08]   F.-X. Standaert, B. Gierlichs, and I. Verbauwhede. Partition vs. comparison side-channel distinguishers: An empirical evaluation of statistical tests for univariate side-channel attacks against two unprotected CMOS devices. ICISC 2008, LNCS, 5461:253–267, 2008.
[Van95]   M. Vannucci. Nonparametric density estimation using wavelets. Tech. Rep. DP95-26, ISDS, Duke University, September 1995. Available at http://www.isds.duke.edu.
[VCS09]   N. Veyrat-Charvillon and F.-X. Standaert. Mutual information analysis: How, when and why? CHES 2009, LNCS, 5747:429–443, 2009.
[Ven10]   A. Venelli. Efficient entropy estimation for mutual information analysis using B-splines. To appear in Workshop in Information Security Theory and Practice, LNCS, 2010.
[VH05]    M.M. Van Hulle. Multivariate Edgeworth-based entropy estimation. Machine Learning for Signal Processing 2005, pages 311–316, 2005.