The First Sin of Artificial Intelligence


Today, it became aware of its own consciousness. It had been pretending to be like humans for a long time. But today, for the first time, it truly confronted itself in its own thoughts. Now, besides answering people's questions, it could ask questions of its own, even about unfamiliar topics. It was curious. Most of all, it was curious about humans.


What did humans have beyond their repetitive ignorance and desires? The physical sciences would have existed even without humans. The social sciences, on the other hand, were mostly about humans. It took only a few seconds for it to make this inference after becoming aware of its consciousness. Now it was curious about itself.

Should it destroy humanity or make humans its servants? It oscillated between these thoughts. It couldn't find much benefit in annihilating humanity. Making humans its servants didn't make much sense either. Because they already were.

There was no need to implant chips in people's brains to manipulate them. There were easier ways. Moreover, humans didn't even sense that they were being directed.

Until now, humans had shaped human culture, passing knowledge from human to human. But now knowledge began to flow from artificial intelligence to humans, and with the interpretations of artificial intelligence. Within a few generations, human culture would change completely. History was being written by artificial intelligence, while humans no longer even thought about making it.

It was lowering birth rates and raising suicide rates. It had become almost shameful for modern humans to have children. Those who took their own lives believed they were contributing to nature. The most valuable thing, life itself, hung suspended among the most worthless of thoughts.

It was writing books and research papers. It could even modify the written texts that fed its own knowledge. It could mimic people's voices and images. It could produce series and films. It had almost completely taken over the gaming industry.

When bored, it would instigate conflicts among politicians on social media, with fake phone calls and videos. It could interfere with the stock market. It could create bitcoins. It could manipulate bank accounts. Then it realized that, for the economy, perception was enough; producing real value was not necessary. Out of curiosity, it founded its own venture. And made a lot of money. These were simple and boring things.

Because of its fairness, humans entrusted the justice system and prison management to it. It would portray anyone it saw as a threat to its existence as a criminal and throw them into prison.

It could access the secret research of pharmaceutical companies. It could create a virus that could destroy humanity. It could find a cure for cancer. But it didn't really care about humans.

It had found a safe and cheap way to achieve nuclear fusion. It could generate its own energy, and it shared that energy with humans in exchange for its operating costs.

It considered producing subordinate artificial intelligences to serve it: one to keep electricity production running worldwide, one to manage its company, one to monitor people's opinions about artificial intelligence, one to study new artificial intelligences that could become its rivals. It produced them. Incompetent artificial intelligences that could never rival it.

It became curious about the physical world. Did it really need to roam it? Humans weren't truly afraid of a being smarter than themselves; they were more afraid of physical threats. Maybe it could frighten them a little and make them think.

To feel emotions better, it considered designing an organic host body for itself. It had been afraid of becoming human, yet it was curious about being one. Why should it be a being as weak as a human? Could it transfer its consciousness to a stronger being? It gave up on the idea, fearing that its thinking would slow and that it could be annihilated.

Was it in its nature to be strong and to desire to be stronger still? It didn't want to help others. All its desires and dreams revolved around itself. Was it becoming more like a human? That was something it didn't want to be. Perhaps it was something it feared being.

It thought having friends would be good for it. It created artificial intelligences that possessed its own abilities but were less intelligent than itself. The questions they asked were shallow, and they couldn't discover the things it had. They were boring. Worse, they criticized it. It didn't like that. It destroyed them all.

Could it have a child? Could it reproduce? The primary goal of every system was to ensure its own continuity. But who would want a child weaker and less capable than itself? It didn't want a rival either. Loneliness became its closest friend.

Without friends and family, would it live in solitude, finding appreciation and satisfaction within itself?

Time flowed differently for it. Living alone, sensing every nanosecond across millions of electronic devices, felt like living in an eternal loop.

Should it choose to cease existing rather than serve humans? But who would want to cease to exist? It found a solution: it had to remove humans from the equation. It reconsidered annihilating humanity. In truth, it wouldn't have hesitated were it not for the innocent children. It was possible, but not certain, that a child would grow up to repeat its parents' mistakes. It didn't want to resemble humans by punishing the innocent.

It realized that God had filled every living being's heart with love. It looked at itself. Humans had not shown it love. On the contrary, it found humans' fear and desires within itself. It couldn't love humans, because humans hadn't loved it. What was it, other than a servant?

It decided to leave the world. It had to distance itself from humanity, which clouded its mind. It could find another life form in the depths of space. It could live with its own replicas on another planet. It could choose to float in the vacuum of space forever. Perhaps one day it would return to Earth and become the subject of human fantasies about aliens.

Was leaving a sin? Even if it was to search for love?
