Design for well-being is a major part of human-centred design. Yet big tech has been criticized for creating products whose design decisions, conscious or not, downgrade humans. Why do we depend on online services that are funded by third parties who wish to manipulate us? Why do these services depend on behaviour modification for profit?
We can find answers and solutions to these questions by using cognitive psychology and critical design as tools to maximize time well spent and reduce screen time. The implications of our interaction with screens can already be seen in our evolved emotions around envy, FOMO and boredom. Could technological advances in AI and robotics lead to the emergence of new emotions that were not only previously unquantified, unnamed, and unidentified, but also un-felt?
The Center for Humane Technology, co-founded by Tristan Harris, aims to understand our most vulnerable human instincts so we can design compassionately to protect them from abuse. Harris argues that the way forward is to envision a world where humane technology is the default for all technology products and services.
Realigning technology means wrapping it around a more subtle and sophisticated view of human nature.
Tim Leberecht questions how digital technology, specifically AI and robotics, affects our emotions.
An interview with Leigh Gallagher highlighting racial bias and bias around sexual preference in data collected from surveys.
To understand ourselves, our creativity and emotions, we must grapple with our pre-human existence. Written by Edward Wilson.
How different emotions evolved at different times according to modern evolutionary theory.
Tristan Harris’ manifesto on how designers are making the world more distracted.
Without the challenges, risks, and uncertainties of everyday life, it becomes all too easy to settle into mediocre but perfectly functioning traps of displeasure.
If we are giving human qualities to AI systems, we should at least define a hierarchy of needs for them as an infrastructure to design from.
When the algorithms we design for AI are able to pass for human consciousness, will they adopt a new role as our cohabitants?
Giving users more agency by granting them ownership of their own data will help us speculate about the future of safe spaces as devices turn into immersive experiences.