Privacy-Aware State Estimation based on Obfuscated Transformation and Differential Privacy

With applications to smart grids and supply chain economics


Abstract

With the emergence of many modern automated systems that rely heavily on private data collected from individuals, privacy-preserving data analysis is gaining significant attention in the field of systems and control. In this thesis, we investigate the privacy concerns of such systems arising in the process of state estimation, a well-known and widely studied concept in systems and control. Our work draws motivation from smart grids and supply chain economics; accordingly, we study two different privacy problems in the context of state estimation and rely on obfuscation-based cryptography and differential privacy to address them.

In the first problem, we study the privacy challenges of state estimation in smart grids. Smart grids promise a more reliable, efficient, economically viable, and environmentally friendly electricity infrastructure for the future. State estimation in smart grids plays a vital role in system monitoring, reliable operation, automation, and grid stabilization. However, the power consumption data collected from users during estimation can be privacy-sensitive. Furthermore, malicious entities can exploit the topology of the grid during state estimation to launch attacks without being detected. Motivated by the need for a secure state estimation process, we propose a weighted-least-squares estimation scheme carried out batch-wise at repeated intervals, in which resource-constrained clients utilize a malicious cloud for computation services. We exploit a highly efficient and verifiable obfuscation-based cryptographic solution to perform the computations of the estimation process securely in the presence of a malicious adversary. Simulation results demonstrate a high level of obscurity in both the time and frequency domains, making it difficult for the malicious adversary to infer information about the consumers' original power consumption data or the grid topology from the obfuscated datasets.
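The outsourcing idea above can be illustrated with a minimal sketch. The transformation below (a random invertible change of state coordinates that masks the measurement matrix) is an illustrative stand-in, not the thesis's actual obfuscation scheme, which also protects the measurements themselves and provides verifiability; all variable names and dimensions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy measurement model z = H x + noise; H encodes the grid topology.
n_states, n_meas = 3, 8
H = rng.standard_normal((n_meas, n_states))
x_true = rng.standard_normal(n_states)
z = H @ x_true + 0.01 * rng.standard_normal(n_meas)
W = np.eye(n_meas)  # measurement weights

def wls(H, W, z):
    """Weighted-least-squares estimate: solve (H^T W H) x = H^T W z."""
    return np.linalg.solve(H.T @ W @ H, H.T @ W @ z)

# Client-side obfuscation (illustrative): a random invertible state
# transformation P hides H, and hence the topology, from the cloud.
P = rng.standard_normal((n_states, n_states))
H_masked = H @ P

# The cloud runs WLS on the masked matrix; the client undoes the mask,
# since the masked solution equals P^{-1} times the true estimate.
x_masked = wls(H_masked, W, z)
x_hat = P @ x_masked

# The de-masked estimate matches the estimate computed on the clear data.
assert np.allclose(x_hat, wls(H, W, z))
```

The key property exploited here is that the masked WLS solution is exactly the clear solution in the transformed coordinates, so the client recovers it with a single matrix multiplication while the cloud never sees `H` in the clear.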

Our second problem deals with the challenge of protecting a dynamical supply chain model while releasing the state sequences it generates for data aggregation to an external party that may be adversarial. State samples released with high accuracy for data aggregation and other statistical purposes can also be used to reverse-engineer and estimate sensitive model parameters. Upon identifying the system model, the adversary may even use it to predict sensitive data in the future. Hence, keeping a dynamical process model confidential is crucial for the survival of many industries. Motivated by the need to protect the system model as a trade secret, we propose a mechanism based on differential privacy that renders such model identification techniques ineffective while preserving the utility of the state samples for data aggregation purposes. We deploy differential privacy by generating noise according to the sensitivity of the query and adding it to the state vectors at each time instant. We derive analytical expressions that bound the sensitivity function and estimate the minimum noise level required to guarantee differential privacy. Furthermore, we present numerical analysis and characterize the privacy-utility trade-off that arises when deploying differential privacy. Simulation results demonstrate that, through differential privacy, we achieve a privacy level sufficient to mislead the adversary while still retaining a high utility level of the state samples for data aggregation.
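The noise-adding mechanism described above can be sketched as follows with the standard Laplace mechanism. The model matrix, sensitivity bound, and privacy budget below are illustrative assumptions; the thesis derives the sensitivity bound analytically rather than fixing it by hand.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear dynamical model x_{k+1} = A x_k, where A is the trade secret
# an adversary would try to identify from released state samples.
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
x = np.array([1.0, -0.5])

# Assumed values: Delta is a bound on the query sensitivity and eps is
# the privacy budget; the Laplace scale Delta/eps gives eps-differential
# privacy for the released samples under these assumptions.
Delta, eps = 0.5, 1.0
scale = Delta / eps

released = []
for _ in range(50):
    x = A @ x
    # Perturb each state vector before release, as in the mechanism above.
    released.append(x + rng.laplace(0.0, scale, size=x.shape))

released = np.asarray(released)  # what the external party receives
```

Aggregate statistics of `released` (e.g. its sample mean) stay close to those of the true trajectory, while a least-squares fit of `A` from consecutive noisy samples is degraded, which is exactly the privacy-utility trade-off the abstract refers to.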