When we design tech products, data collection makes it relatively easy to observe users’ decision making. When we design experiences, we can draw on behavioural economics to analyze how consumers behave and make decisions in, for example, a retail store. Such a study would then inform more conscious decisions about how we design the retail space, the products, the changing rooms and the people we hire.
The ethical concern is that, for the consumer, this data collection happens unconsciously. The retail store doesn’t offer us a cookie at the door and tell us that if we accept it, our data will be collected. As we design off-screen experiences and operating systems become physical, we should consider how to collect this trail of existence in a more ethical, human-centric way. What does the future of safe spaces look like?
Studies show that people are far more honest with Google search than they are even with their best friends. Why are we inclined to collect so much data from strangers we don’t know? Why do we ask people personal questions online that we would never ask in real life?
Dear Data is a year-long, analog data-drawing project by Giorgia Lupi and Stefanie Posavec. By collecting and hand-drawing their personal data and sending it to each other in the form of postcards, they became friends. We wanted to adopt this technique of collecting data as a way to relate to it on a more human level. Participants explored different ways of collecting and visualizing their personal data to better understand themselves.
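The practice above is analog by design, but a minimal sketch can illustrate the same idea digitally: hand-count a small personal metric for a week, then render it as a simple “postcard” chart. The metric and its values here are entirely hypothetical, chosen only to show the shape of the exercise.

```python
# A hypothetical week of self-tracked data, in the spirit of
# Dear Data's hand-counted personal metrics: moments of laughter per day.
laughs_per_day = {
    "Mon": 3, "Tue": 5, "Wed": 2, "Thu": 7, "Fri": 4, "Sat": 6, "Sun": 1,
}

def postcard(data):
    """Render a simple text 'postcard' bar chart, one row per day."""
    return "\n".join(f"{day} {'*' * count}" for day, count in data.items())

print(postcard(laughs_per_day))
```

Because the participant chooses, counts and draws the data themselves, the result stays legible and personal in a way an automatically harvested dataset is not.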
Exploring two women’s friendship across continents through data collected from their everyday lives. Created by Giorgia Lupi and Stefanie Posavec.
An interview with Leigh Gallagher highlighting racism and bias around sexual preference in survey data collection.
Exploring the future of machine learning and how it can be used in social science.
These Microsoft guidelines are aimed at helping you to design a bot that builds trust in the company and service that the bot represents.
The new EU regulation concerning data protection and privacy.
We design for humans, but we need to think of ourselves as part of the experiences we design if we are to achieve overall wellness in the long term.
When the algorithms we design for AI are able to pass as human consciousness, will they adopt a new role as our cohabitants?
To build trust and a relationship with the user, designers need to be held accountable for creating products with clear pathways that support and empower the user.