Application-specific Privacy for Shared Data

The volume of data available today is a major driving factor for innovation. This data, however, often resides in silos guarded and managed by different parties. Data sharing is crucial for gaining useful insights, improving services and ensuring transparent decision making, yet it raises legitimate privacy concerns. In this thesis, you will implement and validate a method that enables privacy-preserving data sharing. You will implement a deep model architecture which ensures that the data can be used for application-specific discriminative tasks while not leaking information about the defined sensitive classes. In contrast to existing work on preserving privacy in shared data, your method will allow the privacy-utility trade-off to be controlled explicitly. You will test the method on several provided data sets and compare its performance to alternative methods.
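The posting does not prescribe a specific architecture. One common way to make the privacy-utility trade-off explicit (an assumption here, not the required method) is adversarial representation learning: an encoder is trained to minimise the utility task's loss while maximising the loss of an adversary that tries to predict the sensitive classes, with a weight λ setting the balance. A minimal sketch of that combined objective:

```python
def combined_loss(utility_loss, adversary_loss, lam):
    """Illustrative adversarial privacy objective (an assumption, not the
    posting's prescribed method): the encoder minimises the utility loss
    while *maximising* the sensitive-attribute adversary's loss, i.e. it
    minimises  L_util - lam * L_adv.  The weight `lam` is the explicit
    privacy-utility trade-off knob."""
    return utility_loss - lam * adversary_loss

# Toy cross-entropy values, purely for illustration.
l_util, l_adv = 0.40, 1.20
weak_privacy = combined_loss(l_util, l_adv, lam=0.1)    # utility dominates
strong_privacy = combined_loss(l_util, l_adv, lam=1.0)  # privacy dominates
```

Sweeping `lam` from 0 upward traces out the trade-off curve the thesis is expected to explore: at `lam = 0` the model ignores privacy entirely, while large values push the learned representation to carry little information about the sensitive classes, typically at some cost in task accuracy.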

Student Target Groups:

  • Students in ICE
  • Students in Computer Science

Thesis Type:

  • Master Thesis

Goal and Tasks:

  • Literature research on privacy in shared data
  • Discuss, design, implement and test a deep model architecture capable of preserving privacy of shared data
  • Test the obtained model on provided data sets. Compare the obtained results to at least one non-trivial benchmark. Explore the privacy-utility trade-off in different settings
  • Summarize your results in a written report

Recommended Prior Knowledge:

  • Good understanding of deep learning modelling, experience training a model on a GPU
  • Interest in data privacy and in working on a previously unsolved problem
  • Programming skills in Python, experience with PyTorch


Start:

  • a.s.a.p.