Paradoxical Guilt While Cleaning
Review
I was inspired by the post ‘Shame on You’. Her post asks whether robots feel guilty when they take over human jobs or make mistakes. The following is my opinion on her question.
In general, robots have a feature to reboot themselves when there is a malfunction. This action can be interpreted as a process in which the robots become aware of their fault and purify themselves.
Guilt derives from conscience, but do robots have a conscience? This idea is debatable, since conscience is related to one’s autonomy in deciding whether a given action is right or wrong. In this sense, do they have autonomy? Aren’t they just executing programmed actions, which limits their ability to distinguish right from wrong?
With current technology, there is still a long way to go before the complexity of emotions can be implanted into robots, and I would not say the robots around us have emotions. Instead, their expression of emotions is merely a visualisation of success or failure, as well as a catalyst to enhance interactions between humans and robots.
Paradoxical Cleaning Robot
Background
I combined the chosen topic, ‘guiltiness of robots’, with a hypothetical robot addressing an environmental issue. This idea was derived from one of my ongoing projects, beach cleaning robots, and the problems lying in its production process.
Design thinking
Precisely, this robot with AI cleans litter that is visible around us, both indoors and outdoors, and feels guilty when it is not productive enough to offset its impact on nature.
In general, a great deal goes into producing one robot and keeping it running, such as a significant amount of CO2 emissions and water consumption to operate the data centre for AI. Also, the steps to produce robots require diverse materials and entail additional CO2 emissions. In that case, is a robot actually cleaning the environment, or is it polluting nature?
Based on these questions, the robot reflects on whether its performance outweighs the pollution created in producing and maintaining itself.
Challenges
Although it starts from a basic AI model that requires data inputs provided by humans, it will collect more data from its own experiences as time goes by, and the robot will use the accumulated data for self-reflection.
Favourite details
Eventually, when the robot accumulates more penalties than the threshold X, it will reboot; here, rebooting serves as a death sentence.
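The guilt mechanism described above could be sketched roughly as follows. This is a minimal, hypothetical illustration, not part of the actual project: the class name, the per-cycle measurements (litter cleaned vs. CO2 emitted), and the threshold value are all my own assumptions for the sake of the example.

```python
# Hypothetical sketch of the robot's guilt-and-reboot logic.
# All names and numbers here are illustrative assumptions.

class GuiltyCleaningRobot:
    def __init__(self, penalty_threshold):
        self.penalty_threshold = penalty_threshold  # the "X amount" of penalties
        self.penalties = 0
        self.experience = []  # data accumulated from its own cleaning cycles

    def self_reflect(self, litter_cleaned_kg, co2_emitted_kg):
        """Compare cleaning performance against the pollution it caused."""
        self.experience.append((litter_cleaned_kg, co2_emitted_kg))
        if co2_emitted_kg > litter_cleaned_kg:
            # performance did not outweigh its footprint: feel guilty
            self.penalties += 1
        return self.penalties

    def should_reboot(self):
        # rebooting as a "death sentence" once guilt passes the threshold
        return self.penalties > self.penalty_threshold


robot = GuiltyCleaningRobot(penalty_threshold=2)
robot.self_reflect(litter_cleaned_kg=5.0, co2_emitted_kg=1.0)  # productive cycle
robot.self_reflect(litter_cleaned_kg=0.5, co2_emitted_kg=2.0)  # guilty cycle
robot.self_reflect(litter_cleaned_kg=0.2, co2_emitted_kg=3.0)  # guilty cycle
robot.self_reflect(litter_cleaned_kg=0.1, co2_emitted_kg=4.0)  # guilty cycle
print(robot.should_reboot())
```

In this sketch, the comparison of cleaned litter against emitted CO2 stands in for the robot's self-reflection, and crossing the threshold stands in for the final, fatal reboot.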
All rights reserved ©NayoungJung2022