Abstract

Differential privacy is built on the idea that preserving privacy often requires adding noise to a data set, making it harder to identify the records of specific individuals. Adding noise, however, typically reduces the accuracy of data analysis, and differential privacy provides a formal way to quantify this accuracy-privacy trade-off. While injecting random noise makes it more difficult to distinguish between analyses performed on slightly different data sets, it can also reduce the usefulness of the results: if enough noise is added to a very small data set to protect its members, the analyses can become practically useless. The trade-off between utility and privacy, however, becomes more manageable as the size of the data set increases. Along these lines, this paper presents the fundamental notions of sensitivity and privacy budget in differential privacy, the noise mechanisms used to achieve it, its composition properties, the ways in which it can be realized, and the developments in this field to date.
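The noise-calibration idea described above can be illustrated with a minimal sketch of the classic Laplace mechanism (not code from this paper): noise drawn from a Laplace distribution with scale sensitivity/ε is added to a query result, so a smaller privacy budget ε means stronger privacy but noisier answers. The function name and the toy data below are illustrative assumptions.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release true_value with Laplace noise of scale sensitivity/epsilon.

    Smaller epsilon (tighter privacy budget) -> larger noise scale -> less
    accurate but more private output. Laplace samples are drawn via the
    inverse-CDF method using only the standard library.
    """
    rng = rng or random.Random()
    scale = sensitivity / epsilon
    u = rng.random() - 0.5  # uniform on (-0.5, 0.5)
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_value + noise

# Counting query: adding/removing one person changes the count by at most 1,
# so the sensitivity is 1.
ages = [23, 45, 31, 52, 38]
count_over_30 = sum(1 for a in ages if a > 30)

noisy_strict = laplace_mechanism(count_over_30, sensitivity=1, epsilon=0.1)  # strong privacy, large noise
noisy_loose = laplace_mechanism(count_over_30, sensitivity=1, epsilon=2.0)   # weaker privacy, small noise
```

On a data set this small, an ε of 0.1 adds noise comparable to the count itself, which is exactly the "practically useless" regime the abstract warns about; with millions of records the same noise scale would barely affect aggregate statistics.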

Keywords

  • Differential Privacy
  • Sensitivity
  • Privacy Budget
  • Noise Mechanisms
  • Composition Properties
  • Data Privacy
