For the third week, we decided to focus on the individual's point of view of tracking, that is, people tracking themselves. We also started looking at data decay, which we thought could be an interesting starting point for building people's trust in surveillance.
In the current coronavirus pandemic, a 2m radius is considered the safe zone. This gave us a good way into the micro-interaction part of the project: tracking and tracing human contact within that safe zone.
We started exploring different ways data decay could be used to build trust. One idea that resonated with us was that data could be allowed to decay if it was not collected within the 2m radius.
Reading 'Delete: The Virtue of Forgetting in the Digital Age' (Mayer-Schonberger, 2011), we explored the right to be forgotten. 'Do we want a future that is forever unforgiving because it is unforgetting?'
Forgetting is not only central to our human experience; it is important for many other living beings, perhaps for life in general. We also forget as a society, which gives individuals who have failed a second chance.
David Brin's 'The Transparent Society' encouraged us to think about how the government and the people are linked within the surveillance state, and how that feeds back into the way we think.
We started working on an initial idea that linked back to the 2m radius and data decay. When two individuals cross into each other's 2m radius, the system collects information about their location. If neither individual develops COVID symptoms within 30 days, this information self-deletes. If one person tests positive, the system notifies both individuals, and once both participants have recovered, the system again self-deletes the information.
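To make the decay rule concrete for ourselves, here is a minimal sketch of that logic in Python. The names (ContactRecord, DECAY_DAYS, and so on) and the data shape are our own placeholders for this write-up, not part of any real contact-tracing system.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Tuple

DECAY_DAYS = 30  # assumed retention window before contact data self-deletes

@dataclass
class ContactRecord:
    """A single 2m-radius encounter between two anonymous users."""
    user_a: str
    user_b: str
    location: Tuple[float, float]     # (latitude, longitude) at the moment of contact
    timestamp: datetime
    flagged_positive: bool = False    # set if either user later tests positive
    both_recovered: bool = False      # set once both users have recovered

def decay(records: List[ContactRecord], now: datetime) -> List[ContactRecord]:
    """Return only the records that should still be kept.

    A record self-deletes when either:
      * 30 days pass without either user being flagged positive, or
      * one user tested positive, both were notified, and both have recovered.
    """
    kept = []
    for r in records:
        expired = (now - r.timestamp) > timedelta(days=DECAY_DAYS) and not r.flagged_positive
        resolved = r.flagged_positive and r.both_recovered
        if not (expired or resolved):
            kept.append(r)
    return kept
```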
We also looked at a few digital examples of data decay.
For this idea to work, we looked at tracking technologies we could use, researching Bluetooth contact tracing and anonymised routes. Randomly generated contact numbers could help keep users anonymous.
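As a rough illustration of how randomly generated contact numbers could keep users anonymous, the sketch below rotates a random identifier at a fixed interval instead of broadcasting a stable device ID. The rotation interval and token size are assumptions for the sketch, not values from any specific protocol.

```python
import secrets
from datetime import datetime, timedelta

ROTATION_MINUTES = 15   # assumed rotation interval; real protocols vary
TOKEN_BYTES = 16        # assumed identifier length

class RollingContactID:
    """Broadcasts a short-lived random identifier rather than a stable device ID."""

    def __init__(self):
        self._current = secrets.token_hex(TOKEN_BYTES)
        self._issued_at = datetime.utcnow()

    def current_id(self) -> str:
        """Return the current identifier, rotating it once it has expired."""
        if datetime.utcnow() - self._issued_at > timedelta(minutes=ROTATION_MINUTES):
            self._current = secrets.token_hex(TOKEN_BYTES)
            self._issued_at = datetime.utcnow()
        return self._current
```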
Raskar et al.'s 'Apps Gone Rogue' paper talks about maintaining personal privacy in an epidemic. It covers selective broadcasting, unicasting, participatory sharing, and the pull model. Participatory sharing was a very interesting concept, as it requires users to independently seek out the information and assess their own exposure risks through uploads and searches.
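To understand the pull model for ourselves, we sketched how a user might independently assess their exposure risk: the device keeps its own log of observed contact IDs and compares it against a list of IDs voluntarily shared by infected users. The function name and data shapes are our illustration, not the paper's implementation.

```python
from typing import Iterable, Set

def assess_exposure(local_contact_ids: Iterable[str],
                    published_positive_ids: Iterable[str]) -> Set[str]:
    """Pull model: the user fetches the published list of IDs voluntarily
    shared by infected users and checks it against their own local contact
    log. The matching happens entirely on the user's device, so nothing
    about the user needs to be uploaded just to perform the check."""
    return set(local_contact_ids) & set(published_positive_ids)

# Example: two of the IDs this device observed appear in the published list.
exposed = assess_exposure(
    local_contact_ids={"a1f3", "9bc2", "77de"},
    published_positive_ids={"9bc2", "77de", "0c4e"},
)
print(f"Potential exposures: {len(exposed)}")
```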
Feedback
John and Alistair liked the idea of a radius decay. They pointed out that elderly and poorer people may not be able to use this technology, and that we should consider outliers while designing. What about people who cannot maintain social distance, such as in slums in India? We could also think about a digital radius and look at it from an abuser's point of view: how would someone weaponize it?
Takeaway
We probably settled on an idea too quickly. We should have thought about the impacts our design concept would have before designing it. My takeaway is not to jump to an idea so quickly and to always run an abuser test, which would help us think about how to prevent our concept from being weaponized.
References
Gowder, P., 1998. The Transparent Society by David Brin; Data Smog by David Shenk.
Mayer-Schonberger, V., 2011. Delete: The Virtue of Forgetting in the Digital Age. Princeton: Princeton University Press.
Raskar, R. et al., 2020. Apps Gone Rogue: Maintaining Personal Privacy in an Epidemic. arXiv preprint, pp. 1-13. Available at: https://arxiv.org/pdf/2003.08567.pdf