A Memory Just For Me: Experimental Camera

  • Click to use camera & see code
  • Sample Captured Images


    Can you tell the two humans apart? Guess their emotions? Guess what they're doing?

    Design Process

    Currently, many countries around the world are deploying live facial-recognition cameras to monitor their citizens. The best-known example is China, where cameras are installed on nearly every street corner. More recently, in 2020, the UK Metropolitan Police deployed live facial-recognition cameras for 6-8 hours a day in hopes of identifying and catching suspects wanted for violent crimes (Shaw, 2020).

    Purpose of Camera
    Even though governments promote such intrusive technology as necessary for the safety of their citizens, many people disagree with the method. As technology grows, we slowly lose our privacy: governments or hackers can scrape the internet, especially social media, to gather a person's facial and personal data. Hence, I wanted to create a camera that obscures the captured image so that only the owner of the photo can tell, through memory, what the image shows. I want the image to still somewhat keep its original form, so that it is still recognizable to the owner. However, since a person's memory can store only a limited amount of information, I thought it would be a good idea to include the date the photo was taken. Hopefully, it will remind the owner of when and where the image was taken.
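The date-stamp idea can be sketched very minimally. This is my own illustration, not the camera's actual code (which is linked at the top of the page); it only shows how a capture timestamp could become the overlay text, using Python's standard date formatting:

```python
from datetime import datetime

def date_label(capture_time=None):
    """Format a capture timestamp as the text stamped onto the image.

    Illustrative sketch: drawing the label onto the photo is not shown,
    only how the date text itself could be produced.
    """
    if capture_time is None:
        capture_time = datetime.now()  # default to the moment of capture
    return capture_time.strftime("%B %d, %Y")
```

For example, a photo taken on 30 January 2020 would be stamped "January 30, 2020".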

    I want this camera to capture human subjects. It serves as a form of diary for the owner, capturing a period of time that may be meaningful to them. I believe that a person's body language can tell different stories and emotions tied to the moment an image was taken. In my sample images, I captured the moment when I was really excited to see my experimental camera working. If you look closely, there is an image where my flash turned on as I photographed my computer with my iPhone; the flash of light makes it even harder to tell what the captured image is. However, when I look back at that image, I can remember the joy and excitement I felt when I saw my code working. The other person in my captured images is my partner, photographed as he tested my camera. I believe it is really hard to tell the two of us apart, yet my partner and I can tell the difference immediately. These captured images may hold no meaning for others, but they capture an important moment in time for us. Most importantly, they obscure our faces and make it hard for computers to identify us.

    Camera Design Drafts

    I created three designs, with pseudocode, based on the concept described above.




    After sharing my ideas with my classmates, we decided to combine different elements from each design into one. Although design 3 was what I initially wanted to build, I agreed with my classmates that it would not completely obscure a face from facial-recognition technologies, so I decided to scrap that idea. Instead, I combined design 3's black frame and design 2's colored circles with design 1's mosaic-circle image. I thought a black-and-white camera would not be as exciting as a colorful one.
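As a rough illustration of the mosaic-circle part of the combined design, here is a minimal Python sketch. It is my assumption of how such an effect works, not the camera's actual code (which is linked at the top of the page): each block of pixels is reduced to a single circle in the block's average colour, so individual facial features are lost while the overall shape and colours survive.

```python
def mosaic_circles(pixels, cell=8):
    """Reduce a grid of (r, g, b) pixels to one circle per cell x cell block.

    Returns a list of (center_x, center_y, radius, colour) tuples that a
    drawing library could render. Because only block averages survive,
    the result obscures faces while keeping rough shapes and colours.
    """
    height, width = len(pixels), len(pixels[0])
    circles = []
    for y in range(0, height, cell):
        for x in range(0, width, cell):
            # Gather every pixel in this block (clipped at the image edge).
            block = [pixels[j][i]
                     for j in range(y, min(y + cell, height))
                     for i in range(x, min(x + cell, width))]
            n = len(block)
            # Average the block's colour channel by channel.
            avg = tuple(sum(p[k] for p in block) // n for k in range(3))
            circles.append((x + cell // 2, y + cell // 2, cell // 2, avg))
    return circles
```

An 8x8 image of a single colour, for instance, collapses to one circle of that colour; a real webcam frame would collapse to a grid of coloured dots.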

    Reflection

    Through the process of designing an unconventional camera, I learned that we can decide what the camera "sees" and "captures". I had always thought that a camera's purpose was to capture realistic images so that a point in time could be remembered forever. However, meaningful memories do not have to be stored in a picture-perfect image. Our minds can store memories as well as cameras do. Just as we can trigger memories from the past through sound or smell, we can trigger them through obscure images captured at an important point in time.


    This project also made me reflect on how code affects the world. As the coder, I dictate what this camera sees and captures; it cannot do anything beyond the limits I have created. It is scary how a developer's code can dictate how technology interacts with humans and how data can be used. If developers are not cognisant of the ripple effects of their published code, it can create further segregation in society. One example comes from Shalini Kantayya's Netflix documentary Coded Bias, which showed MIT researcher Joy Buolamwini's frustration with computer-vision software that did not recognise her face until she put on a white mask. The data fed to the software did not include enough Black and female subjects, so it failed to recognise those faces. The machine cannot be blamed, as it did not write itself; rather, it is the developer's responsibility to include representative data in their work.

    Citations
    Shaw, Danny. "Met Police to deploy facial recognition cameras." Last modified January 30, 2020. https://www.bbc.com/news/uk-51237665

    Chauvet, Zelda. "Decoding Coded Bias for a more equitable artificial intelligence." Last modified April 21, 2021. https://genevasolutions.news/explorations/geneva-solutions-podcast-resilience-series/decoding-coded-bias-for-a-more-equitable-artificial-intelligence