The volume of data available these days is a major driving factor for innovation. This data, however, often resides in silos guarded and managed by various parties. Data sharing is crucial for gaining useful insights, improving services, and ensuring transparency of decision making, yet it raises legitimate privacy concerns. In this thesis, you will implement and validate a method that enables privacy-preserving data sharing. You will implement a deep model architecture which ensures that the shared data can be used for application-specific discriminative tasks while not leaking information about defined sensitive classes. In contrast to existing work on preserving privacy in shared data, your method will allow explicit control of the privacy-utility trade-off. You will test the method on several provided data sets and compare its performance to alternative methods.
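To make the underlying idea concrete: the goal is a transformation of the data under which a utility label stays predictable while a defined sensitive label does not, with one parameter controlling how aggressively the sensitive information is removed. The thesis itself calls for a deep architecture; the following is only a minimal linear toy sketch of that principle, in which all names, the synthetic data, and the projection-based suppression mechanism are illustrative assumptions rather than the method to be implemented.

```python
import numpy as np

def fit_logistic(z, t, lr=0.5, steps=300):
    """Plain gradient-descent logistic regression; returns training accuracy.
    Stands in for both the utility classifier and the adversary probing
    for the sensitive label."""
    w = np.zeros(z.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(z @ w)))
        w -= lr * z.T @ (p - t) / len(t)
    p = 1.0 / (1.0 + np.exp(-(z @ w)))
    return float(np.mean((p > 0.5) == t))

def sanitize(x, s, lam):
    """Toy 'privacy-utility knob': remove a lam-fraction of the data's
    projection onto the direction most correlated with the sensitive
    label s. lam=0 shares the raw data; lam=1 fully suppresses that
    direction. (Illustrative assumption, not the thesis method.)"""
    w = x.T @ (s - s.mean())          # empirical cross-covariance direction
    w = w / np.linalg.norm(w)
    return x - lam * np.outer(x @ w, w)

# Synthetic data: utility label y lives on feature 0,
# sensitive label s lives on feature 1.
rng = np.random.default_rng(0)
x = rng.standard_normal((1000, 2))
y = (x[:, 0] > 0).astype(float)   # task we want to keep solvable
s = (x[:, 1] > 0).astype(float)   # attribute we want to hide

for lam in (0.0, 1.0):
    z = sanitize(x, s, lam)
    print(f"lam={lam}: utility acc={fit_logistic(z, y):.2f}, "
          f"adversary acc={fit_logistic(z, s):.2f}")
```

With `lam=0` both labels remain easy to predict; with `lam=1` the adversary drops toward chance level while the utility task is essentially unaffected, since here the two labels live on independent features. The deep architecture developed in the thesis addresses the harder, realistic setting where utility and sensitive information are entangled and a purely linear projection would not suffice.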