One of the tools is an internal video-blurring utility that masks sensitive objects in a playing video, such as someone's face, tattoos, or vehicle number plates. The other is a computational tool that lets researchers run calculations over data without personally accessing the sensitive information in it.
Privacy Enhancing Technologies
Google is one of the few companies working on new-age Privacy Enhancing Technologies (PETs), which let people process sensitive data without directly accessing it. While it is good that Google has been putting such technologies to work for those who need them, the company is now open-sourcing two of them so that anyone can use them.

The first is an internal video blur tool that lets anyone blur a part of a video while it is playing. Google says this will help videographers “save time in blurring objects from a video while knowing that the underlying ML algorithm can perform detection across a video with high accuracy.” The tool belongs to a project called Magritte, which is now public on GitHub. It uses machine learning to detect objects in a running video and blur them as soon as they appear, masking items such as vehicle number plates, tattoos, and, in some cases, people's faces.

The second is a Fully Homomorphic Encryption (FHE) Transpiler, which allows developers to perform computations on encrypted data without ever accessing it directly. This is much needed in sensitive areas like financial services, healthcare, and government, “where a robust security guarantee is of the highest importance,” since it lets people run analytics over sensitive data without touching the underlying information.

Both tools are part of Google's Protected Computing initiative, which aims to transform “how, when and where data is processed to technically ensure its privacy and safety”.
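To make the detect-and-blur idea behind a tool like Magritte more concrete, here is a minimal sketch, not Google's actual code, that uses OpenCV's bundled Haar-cascade face detector to find faces in each frame of a video and replace them with a blurred copy. The file names and detector choice are assumptions for illustration only.

```python
# Illustrative detect-and-blur sketch (NOT Magritte's implementation):
# detect faces per frame with OpenCV's Haar cascade, blur each detection.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture("input.mp4")              # hypothetical input video
fps = cap.get(cv2.CAP_PROP_FPS) or 30
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter(
    "blurred.mp4", cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height)
)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Bounding boxes of detected faces in the current frame.
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Replace each detected region with a heavily blurred version of itself.
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(
            frame[y:y + h, x:x + w], (51, 51), 0
        )
    out.write(frame)

cap.release()
out.release()
```

A production tool would track objects across frames and handle many object classes, but the basic loop of detect, blur, write is the same.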
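The core idea behind the FHE Transpiler, computing on data while it stays encrypted, can be shown with a toy example. The sketch below uses textbook RSA's multiplicative homomorphism; it is not FHE, not secure, and not how Google's transpiler works, but it demonstrates that a party holding only ciphertexts can perform a computation whose result decrypts correctly.

```python
# Toy illustration of computing on ciphertexts (NOT FHE, NOT secure,
# NOT Google's FHE Transpiler): textbook RSA is multiplicatively homomorphic.
p, q = 61, 53                      # tiny demo primes, insecure by design
n = p * q
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

a, b = 7, 12
ca, cb = encrypt(a), encrypt(b)

# The "untrusted" party multiplies the ciphertexts without ever seeing a or b.
c_product = (ca * cb) % n

# The data owner decrypts the result of the computation: 7 * 12 = 84.
assert decrypt(c_product) == (a * b) % n
print(decrypt(c_product))
```

Fully homomorphic schemes extend this property to arbitrary computation, which is what makes it possible to run analytics over encrypted financial or medical records without exposing them.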