Title: Differential Privacy in Big Data and Applications
Time: 9:30 AM, August 4, 2015
Venue: Room 210, East Five Building (东五楼)
Abstract:
Privacy preservation in data mining and data release has attracted increasing research interest over the past decades. Differential privacy is an influential privacy notion that offers a rigorous and provable privacy guarantee for data mining and data release. Existing studies on differential privacy assume that records in a data set are sampled independently. However, in real-world applications, records are rarely independent. The relationships among records are referred to as correlated information, and such a data set is called a correlated data set. A differential privacy technique applied to a correlated data set will disclose more information than expected, which is a serious privacy violation. Although recent research has begun to address this violation, a solid solution for correlated data sets is still lacking. Moreover, how to decrease the large amount of noise that differential privacy incurs on correlated data sets has yet to be explored.
To fill this gap, the talk presents an effective correlated differential privacy solution built on two components: a correlated sensitivity and a correlated data releasing mechanism. By taking the correlation levels between records into account, the proposed correlated sensitivity significantly decreases the noise compared with the traditional global sensitivity. The correlated data releasing mechanism, called the correlated iteration mechanism, is designed on an iterative method to answer a large number of queries. Compared with the traditional approach, the proposed solution enhances the privacy guarantee for a correlated data set at a lower accuracy cost. Experimental results show that it outperforms traditional differential privacy in terms of mean square error on large groups of queries, suggesting that correlated differential privacy can retain utility while preserving privacy.
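As background on how a sensitivity notion drives the noise scale, the sketch below contrasts a naive worst-case sensitivity for correlated records with a correlation-weighted one for a simple counting query. It is a minimal illustration only: the toy correlation matrix, the weighting by row sums, and the comparison baseline are assumptions for exposition and do not reproduce the speaker's exact definitions or the correlated iteration mechanism.

```python
import numpy as np

def laplace_release(true_answer, sensitivity, epsilon):
    """Standard Laplace mechanism: noise scale = sensitivity / epsilon."""
    return true_answer + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

# Hypothetical correlation matrix for 4 records (entries in [0, 1]).
# Records 0 and 1 are partially correlated, as are records 2 and 3.
corr = np.array([
    [1.0, 0.6, 0.0, 0.0],
    [0.6, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.3],
    [0.0, 0.0, 0.3, 1.0],
])

# For a counting query, global sensitivity is 1 (changing one record moves the
# count by at most 1). A naive fix for correlated data multiplies it by the
# size of the largest correlated group, here 2.
naive_sensitivity = 1.0 * 2

# A correlation-aware sensitivity instead weights each record's unit
# contribution by its correlation degree, so partially correlated records
# contribute less than a full worst-case record (here 1.6 < 2).
correlated_sensitivity = max(np.abs(corr[i]).sum() for i in range(corr.shape[0]))

true_count, eps = 4, 1.0
print("naive:     ", laplace_release(true_count, naive_sensitivity, eps))
print("correlated:", laplace_release(true_count, correlated_sensitivity, eps))
```

Because the correlation-weighted sensitivity is smaller than the worst-case multiplied sensitivity, the Laplace noise scale shrinks accordingly, which is the intuition behind the reduced accuracy cost claimed in the abstract.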
Bio:
Dr. Tianqing Zhu received the B.Eng. and M.Eng. degrees in Computer Science from Wuhan University, China, in 2000 and 2004, respectively, and the Ph.D. degree in Computer Science from Deakin University, Australia, in 2014. She is currently a continuing teaching scholar in the School of Information Technology, Deakin University, Melbourne, Australia. Before joining Deakin University, she served as a lecturer at Wuhan Polytechnic University, China, from 2004 to 2011. Her research interests include privacy preservation, data mining, and network security. She won the Best Student Paper Award at PAKDD 2014 and was invited to give a tutorial on differential privacy at PAKDD 2015.