Privacy-preserving of system model with perturbed state trajectories using differential privacy: With application to a supply chain network

Journal Article (2019)
Author(s)

Lakshminarayanan Nandakumar (CGI)

Riccardo Ferrari (TU Delft - Team Jan-Willem van Wingerden)

T. Keviczky (TU Delft - Team Tamas Keviczky)

Research Group
Team Jan-Willem van Wingerden
Copyright
© 2019 Lakshminarayanan Nandakumar, Riccardo M.G. Ferrari, T. Keviczky
DOI
https://doi.org/10.1016/j.ifacol.2019.12.173
Publication Year
2019
Language
English
Issue number
20
Volume number
52
Pages (from-to)
309-314
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Releasing state samples generated by a dynamical system model for data aggregation purposes can allow an adversary to perform reverse engineering and estimate sensitive model parameters. Upon identification of the system model, the adversary may even use it to predict sensitive data in the future. Hence, preserving the confidentiality of a dynamical process model is crucial for the survival of many industries. Motivated by the need to protect the system model as a trade secret, we propose a mechanism based on differential privacy that renders such model identification techniques ineffective while preserving the utility of the state samples for data aggregation. We deploy differential privacy by generating noise according to the sensitivity of the query and adding it to the state vectors at each time instant. We derive analytical expressions to bound the sensitivity function and estimate the minimum noise level required to guarantee differential privacy. Furthermore, we present a numerical analysis and characterize the privacy-utility trade-off that arises when deploying differential privacy. Simulation results demonstrate that, through differential privacy, we achieve a privacy level sufficient to mislead the adversary while still retaining a high utility level of the state samples for data aggregation.
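At its core, the mechanism described above amounts to adding Laplace noise, calibrated to the query sensitivity, to each state vector before release. The sketch below illustrates this idea only; the privacy budget EPSILON, the constant SENSITIVITY (standing in for the paper's analytically derived sensitivity bound), the system matrix A, and the helper perturb_state are all illustrative assumptions, not values or code from the paper.

```python
# A minimal sketch of the noise-adding mechanism: Laplace noise with scale
# sensitivity/epsilon is added to each state vector before release, so only
# perturbed samples leave the system.

import numpy as np

EPSILON = 1.0        # privacy budget (assumed value, for illustration)
SENSITIVITY = 2.0    # placeholder for the paper's analytical sensitivity bound


def perturb_state(x: np.ndarray, epsilon: float = EPSILON,
                  sensitivity: float = SENSITIVITY) -> np.ndarray:
    """Add i.i.d. Laplace noise with scale sensitivity/epsilon to a state vector."""
    scale = sensitivity / epsilon
    return x + np.random.laplace(loc=0.0, scale=scale, size=x.shape)


# Example: release a perturbed trajectory of a simple linear system x_{k+1} = A x_k.
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])               # illustrative system matrix (not from the paper)
x = np.array([10.0, 5.0])                # illustrative initial state

released = []
for _ in range(20):
    released.append(perturb_state(x))    # only the noisy state is released
    x = A @ x

released = np.array(released)            # (20, 2) array of perturbed state samples
```

Smaller values of EPSILON enlarge the noise scale, making model identification from the released samples harder but degrading their utility for aggregation, which is exactly the privacy-utility trade-off the paper characterizes.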
