Exposing.ai is a research project about the origins and endpoints of biometric image datasets created “in the wild”. The project investigates how photos have unwittingly become part of an information supply chain powering the global biometrics industry.
Research from the Exposing.ai project has been featured in the Financial Times, the New York Times, Nature, a US Government Accountability Office report, the 2020 AI Index, and several academic research papers, and has helped push an urgent discussion about the ethics of dataset collection into public discourse.
DFACE.app
Automatic, private, open-source face redaction web app.
DFACE uses the YOLOv5 neural network object detection framework to run face detection in the web browser, so photos never leave a user's device. It can process up to 1,000 faces per image, detecting faces as small as 10x10 pixels, and applies a choice of redaction effects (color fill, blur, or emoji); it also supports batch processing of multiple images. It is designed for activists and social media users to quickly and privately redact faces in imagery before posting to social media.
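The core redaction step described above, applying a fill or pixelation effect to each detected face region, can be sketched as below. This is a minimal illustration under the assumption that a detector (such as YOLOv5) has already produced bounding boxes; the function name `redact_faces` and its parameters are hypothetical and not DFACE's actual code, which runs in the browser rather than in Python.

```python
import numpy as np

def redact_faces(image, boxes, mode="fill", fill_color=(0, 0, 0), block=10):
    """Redact face regions in an H x W x 3 uint8 image.

    image: numpy array of shape (H, W, 3)
    boxes: list of (x, y, w, h) bounding boxes from a face detector
    mode:  "fill" paints a solid color; "pixelate" coarsens the region
           into block x block squares (illustrative stand-ins for
           DFACE's fill/blur/emoji effects)
    """
    out = image.copy()
    for x, y, w, h in boxes:
        region = out[y:y + h, x:x + w]
        if mode == "fill":
            # Overwrite the whole face region with one color
            region[...] = fill_color
        elif mode == "pixelate":
            # Replace each small block with its mean color,
            # producing a coarse mosaic over the face
            for by in range(0, h, block):
                for bx in range(0, w, block):
                    blk = region[by:by + block, bx:bx + block]
                    blk[...] = blk.mean(axis=(0, 1)).astype(np.uint8)
    return out
```

Because the operation is pure array manipulation on pixels already in memory, the same idea runs entirely client-side in a browser (e.g. on a canvas), which is what lets DFACE guarantee that images are never uploaded.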
Part provocation, part education, Think Privacy is an ongoing campaign to raise awareness about emerging issues in an era of exuberant data collection.
The Think Privacy project officially launched at the New Museum Store in March 2016 in New York City. Several items from this collection are still available at the New Museum Store.