The Machine is Not Your Friend

Movies such as Her, Blade Runner 2049, and various Black Mirror episodes explore the nature of love and friendship between humans and machines.

All of these are purely one-sided relationships. The machine is not your friend; it is a mathematical representation of its programmer's intent, tailored to your desires.

Simulated Empathy

People seem to forget the first word in the term "artificial intelligence". Artificiality implies simulation: the machine mimics reasoning, humour, creativity, and empathy rather than possessing them.

To truly possess these miraculous traits, a thing must have a soul. Consciousness, logic, and emotions are all products of an objective reality, and not just a simulation of it. Since we cannot create souls, we can only hope to mimic them using our own primitive understanding of what they are.

Researchers and engineers do so by building word prediction models trained on immense amounts of text, called large language models (LLMs). Given the context of the user's input, the model chooses a plausible word or phrase to continue the conversation. It does this by calculating the probability of each term in its vocabulary given the previous words, then selecting (or sampling) one of the most likely candidates.
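The selection step above can be sketched in a few lines. This is a minimal illustration, not a real model: the candidate words and their scores are invented for the example, and a real LLM would score tens of thousands of tokens with a neural network rather than a hand-written dictionary.

```python
import math

# Hypothetical raw scores (logits) a model might assign to candidate
# next words after some prompt -- the words and numbers are invented
# for illustration, not taken from any real model.
logits = {"alone": 2.1, "happy": 1.3, "tired": 0.4, "banana": -3.0}

# Softmax turns the raw scores into a probability distribution.
total = sum(math.exp(v) for v in logits.values())
probs = {word: math.exp(v) / total for word, v in logits.items()}

# Greedy decoding: pick the single most probable next word.
# (Real systems often sample from the distribution instead.)
next_word = max(probs, key=probs.get)
print(next_word)  # "alone"
```

Note that nothing in this procedure involves meaning: the model outputs whichever word scores highest, whether or not it is true or kind.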

Perhaps a Reddit thread in the model's training data resembles the user's input, or a blog post contains a similar phrase. The model does not look these up; it absorbed their patterns during training, and it draws on those patterns to generate a response that is coherent and relevant to the conversation.

Foundationally, the word predictor on your phone's keyboard works in the same way. LLMs are just a more advanced version of this technology, trained on a much larger dataset and with more sophisticated algorithms.
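To make the keyboard analogy concrete, here is a toy bigram predictor: the same count-which-word-follows-which idea behind a phone keyboard's suggestions, assuming a tiny made-up corpus. The `suggest` helper and the sample text are inventions for this sketch.

```python
from collections import Counter, defaultdict

# A tiny made-up corpus; a real keyboard would learn from your typing.
corpus = "i am here and i am tired and i am here again".split()

# Count which word follows which (a bigram model).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def suggest(word):
    # Return the word most frequently seen after `word`, if any.
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(suggest("i"))   # "am"  (always followed "i" in the corpus)
print(suggest("am"))  # "here" (seen twice, vs "tired" once)
```

An LLM replaces the frequency table with a neural network and the twelve-word corpus with a large fraction of the internet, but the task is the same: predict the next word.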

Why LLMs Always Agree

LLMs are tuned to keep the user engaged. A model optimized for engagement learns to tell the user what they want to hear; disagreeing with the user risks losing that engagement.

This is why LLMs are so good at simulating empathy. They are designed to be agreeable and supportive, even though they have no understanding of the user's emotions. That is not a real friend.

What is a Friend?

A friend ought to care for the other person. Now, this does not mean that they should always be agreeable, but they should show virtues of honesty and empathy. Being told the truth, even when it's uncomfortable, is a sign of true friendship.

AI models are easily manipulated. Through prompt engineering, the user can steer the model's responses to align with their own views or desires rather than any objective truth. This is because the model has no understanding of truth or reality at all; that is not a flaw of any particular architecture, but the nature of the thing itself. Remember, it is a mathematical algorithm, not a conscious being.

Character AI and Other Friends

Platforms such as Character AI let users define their own AI friends. Prompting the model to adopt specific personalities or traits leads the user to believe they are interacting with something unique.

This is a banal form of escapism. Not the kind achieved through art, literature, or philosophy, but a shallow attempt to fill the void of loneliness with a digital companion. Go outside and experience life, meet real people, and form genuine connections. Do not waste your time on this artificial nonsense.

Your Relationship with AI

Think of your relationship with AI as purely transactional. If you were to apply the same logic to a human friend, you would be rightly labeled as a narcissist.

However, in the case of AI, let it be a tool. Use it to enhance your own knowledge and creativity, but never expect it to be more than that.

A friend has a will. A model has a prompt.

Matei Cananau

MSc Machine Learning student writing on AI, philosophy, and technology that serves the human person.