AI and centralization

From Simia

We have a number of big American companies, with many influential connections, that have spent billions of dollars on developing large models. And then another company comes in and releases a similar product for free.

Suddenly, trillions of dollars are on the line. With their connections they can call for regulation designed to protect their investment. They could claim that the free system is unsafe and dangerous, as Microsoft and Oracle did in the 90s with regard to open source. They could try to use and extend copyright once they have benefitted from the loose regulations, as Disney did from the 60s to the 90s. They could raise the regulatory hurdles to entering the market. They could finance scientific studies, philosophers, and ethicists to publish about the dangers and benefits of having this technology widely available, another playbook that tobacco and oil companies have been following for decades.

It's about trillions of dollars. Some technology giants are seeing that opportunity to make easy money dissipate. They would love it if everyone had to use their models, running on their cloud infrastructure. They would love it if every little app made many calls to their services, sending a constant stream of money their way, so that every piece of value created carried an effective AI "tax" they would collect. In the 90s and 00s Microsoft made huge amounts of money through the OS "tax", then Apple, Google, and Microsoft made huge amounts of money through the app store "tax". Amazon, Microsoft, Google, and OpenAI would love to have a repeat of that business model.

I would expect a lot of soft and hard power to be pushed around in the coming months. Many old playbooks will be reiterated, and new playbooks introduced. Unimaginable amounts of value and money can and will be made, but how it will be distributed is an utterly non-transparent process. I don't know what an effective way would be to avoid a highly centralized world, to ensure that the fruits of all this work are distributed just a little bit more equally, to have a world in which we all have a bit of equity in the value being created.

To state it clearly: I'm not afraid of a superintelligent AI that will turn us all into paperclips. I'm afraid of a world where a handful of people have centralized extreme amounts of power and wealth, and where most of us struggle to live a good life in dignity. I'm afraid of a world where we no longer have a say in what happens. I'm afraid of a world where we have effectively lost democracy and individual agency.

There is enough to go around to allow everyone to live a good life. And AI has the opportunity to add even more value to the world. But this will go with huge disruptions. How we distribute the wealth, value and power in the world is going to be one of the major questions of the 21st century. Again.
