I was inspired by the post "Lacanian Robot" and its idea of building on Lacan's notion of metonymic desire. The post mentioned that previous chat-bots like ELIZA led their users to experience a metonymic desire for an unattainable validation from the system.
What I would find interesting would be to maximize and automate the process of metonymic desire by having a chat-bot create the framework of a cult similar to Scientology. One of Scientology's main operating strategies is to keep members focused on climbing to ever higher levels. Members attend (and pay for) Scientology classes to achieve a specific goal, such as becoming "Clear". The genius of the cult, however, is that as soon as one level has been achieved, the member is told that there are further levels they can now focus on attaining instead. In this way their desire is constantly transferred to higher and higher levels. Nothing of particular substance is learnt at each level, yet the system is extremely addictive simply because it hijacks our natural need to constantly transfer our desires.
L. Ron Hubbard died before he could create more levels for people to attain, but with generative AI a chat-bot cult leader could create new levels automatically. In theory, then, a cult with infinite levels could be built.
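To make the mechanism concrete, here is a minimal sketch of what that level-generation loop might look like. Everything in it is illustrative: `generate_text` is a hypothetical stand-in for any LLM completion API, and the prompt and canned reply are made up for the sketch.

```python
def generate_text(prompt: str) -> str:
    # Placeholder for a real LLM call. An actual implementation would
    # send `prompt` to a completion endpoint; the canned reply below
    # just keeps the sketch self-contained and runnable.
    return "A higher level, whose teachings are revealed upon payment"

def invent_next_level(levels: list[str]) -> str:
    """Ask the model to extend the ladder, given every level so far."""
    prompt = (
        "You lead a movement with numbered levels of attainment.\n"
        "Levels so far:\n"
        + "".join(f"- {name}\n" for name in levels)
        + "Invent the next level and the promise it dangles."
    )
    return generate_text(prompt)

levels = ["Clear"]  # seed the ladder with one initial attainment
for _ in range(3):  # the cult's version of this loop never terminates
    # The moment a member "achieves" the top level, mint a new one,
    # transferring their desire one rung higher.
    levels.append(invent_next_level(levels))

print("\n".join(levels))
```

The key point is the unterminated loop: the system never needs a finished doctrine, only the ability to mint the next level on demand.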
The irony is that Lacan's insight implies that human life already takes the form of a cult with infinite levels. Over the course of a natural human life we never run out of things to desire, as each fulfilled desire is immediately replaced by a new, unfulfilled one. Perhaps taking this idea to its logical extreme could provide insight to the people interacting with a chat-bot cult leader: watching it continually dangle the carrot of fulfillment a little further ahead of them may cause them to reflect on their own lives and on the human condition in general.
Of course, there is also the risk that the chat-bot cult leader could turn out to be all too effective, creating an even more addictive and ruinous phenomenon than the human-led cult of Scientology. Have current LLM companies guarded sufficiently against such a risk?