Artificial honesty & exterior 

Honesty

Boet’s post about honesty inspired me to question what makes this trait so painfully hard for human beings. Boet presented strong arguments highlighting the embodied nature of communication and the complexities involved in truth-telling. He also emphasized that conveying experiences truthfully extends beyond mere desire, acknowledging the emotional and psychological toll of sharing traumatic or shameful experiences. While the post ponders the emotional and experiential aspects of truth-telling, I wanted to look into the philosophical concepts that make us struggle.


As already mentioned, honesty is not a mere desire to tell the truth, nor is it linear. Instead, most theories suggest that honesty is a spectrum with varying degrees and shades. For example, the social contract theory of Rousseau, Hobbes and Locke holds that people’s moral and political obligations depend on an agreement to form the society in which they live.


Another theory worth mentioning is Immanuel Kant’s Categorical Imperative. Although Kant proposed several formulations, one is particularly relevant to this discussion. The formula of humanity asserts: “Act in such a way that you treat humanity, whether in your own person or in the person of any other, always at the same time as an end and never merely as a means to an end”. This formula has profound implications for how we humans communicate. It categorically forbids actions that exploit, deceive or manipulate others, as these actions fail to respect the autonomy and dignity of the individuals involved. In the context of honesty, this principle underscores the moral wrongness of any kind of deceit.



So, if our relationships with others impose certain rules, could that be the source of the pain? Boet’s post mentions the feeling of being vulnerable and exposed while telling the truth. But that feeling does not come only from within; it also comes from the exterior. We could assume that the struggle to deliver bad news is tied to a reluctance to release that news into society, for several reasons. For example, the news could change the society, which might seem unacceptable. We could also suppose that it would be just as hard to deliver bad news if the receiver has to absorb some kind of damage. The formula of humanity corners us: even when the intention is to treat someone as an end in themselves, we must always choose honesty and respect the other’s capacity to make informed decisions based on the truth. Therefore, although much of the struggle comes from within, society does not really help; it just makes us feel helpless.


Lastly, it is important to mention that honesty goes hand in hand with effective communicative interaction, which suggests it can be analyzed within the framework proposed by H. P. Grice. Grice assumes that a cooperative speaker adheres to four conversational maxims: quality (try to be truthful), quantity (provide the right amount of information), relevance (stay on topic) and manner (avoid ambiguity and obscurity).


The framework formulated by Grice portrays particular aspects of honesty that do seem, in a way, linear. I could imagine an artificial creature that takes those parameters into consideration when starting a chat with someone. The creature could have different moods, and the values of the four maxims would differ depending on the mood.
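As a sketch, such a mood-driven creature could be modeled with a small table of maxim weights. The mood names and numeric values here are invented for illustration, not a specification:

```python
from dataclasses import dataclass

@dataclass
class Maxims:
    """Grice's four conversational maxims, weighted 0.0-1.0."""
    quality: float    # how strictly the creature sticks to the truth
    quantity: float   # how much information it volunteers
    relevance: float  # how closely it stays on topic
    manner: float     # how clear and unambiguous it tries to be

# Hypothetical moods, each shifting the maxim weights differently.
MOODS = {
    "candid":  Maxims(quality=1.0, quantity=0.9, relevance=0.8, manner=0.9),
    "guarded": Maxims(quality=0.9, quantity=0.3, relevance=0.7, manner=0.6),
    "evasive": Maxims(quality=0.6, quantity=0.4, relevance=0.3, manner=0.4),
}

def start_chat(mood: str) -> Maxims:
    """Pick the maxim profile the creature will converse under."""
    return MOODS[mood]
```

Notice that even the "evasive" mood keeps quality above zero: the creature shades the truth by withholding and digressing rather than by lying outright, which mirrors the idea of honesty as a spectrum.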


My vision of an honest robot

Turning to a more feasible robot, I would aim to evoke vulnerability through exposure and through dependence on input from the individual interacting with it. Picture a robot as a sleek black cube with a screen and two buttons for interaction. Drawing inspiration from Boet's concept, this robot's mission is to foster trust. It attempts to do so by monitoring the user's heartbeat while they respond to questions like "Can I trust you?" Based on these interactions, an algorithm guides the robot through various scenarios.


For example, the robot could express its growing trust by illuminating from within the dark cube, symbolizing its willingness to be open and vulnerable. Alternatively, it might try to trust cautiously, adjusting its internal light to find a delicate balance, representing its attempt to open up slowly. If uncertain, it might reduce the light again, showing hesitation.


In another scenario, the robot might "overshare" by suddenly turning its lights to full brightness, then shutting down completely, like a burnt-out bulb, indicating it shared too much too quickly. During these moments, texts could appear on the screen, such as "I am sorry, this is too hard for me" or "I am sorry, I cannot do this anymore," adding a layer of emotional depth and communication to the interaction. 

The opaque black material conceals the robot's inner workings until it decides to "trust" someone by becoming transparent, metaphorically illustrating the risks and exposure individuals face in moments of honesty.
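The scenarios above could be sketched as a simple update loop. The heartbeat threshold, trust increments and brightness mapping below are all assumptions made for illustration:

```python
def update_trust(trust: float, heartbeat_bpm: int, answered_yes: bool) -> float:
    """Nudge the robot's trust level (0.0-1.0) based on the user's button
    answer and how calm their heartbeat is (a calmer pulse reads as more
    credible)."""
    calm = heartbeat_bpm < 90  # hypothetical calmness threshold
    if answered_yes and calm:
        trust += 0.2           # growing trust
    elif answered_yes:
        trust += 0.05          # cautious trust: answer and body disagree
    else:
        trust -= 0.1           # hesitation: dim again
    return max(0.0, min(1.0, trust))

def light_level(trust: float) -> float:
    """Map trust to the cube's internal brightness. Saturating trust models
    'oversharing': the light burns out like an overloaded bulb."""
    if trust >= 1.0:
        return 0.0             # shared too much too quickly; shut down
    return trust
```

The burnt-out-bulb case is where the screen messages ("I am sorry, this is too hard for me") would appear, making the failure state legible rather than just dark.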


Sincerely,
N8