This… might be my most controversial post, simply because I’m mentioning the word AI. But I’m going to try to present substantial evidence for why I’ve come to this conclusion.

So a lot of people have the idea that once the AI bubble pops, so to speak, AI is just going to magically disappear. This is… a dubious understanding of AI. When people hear about AI, they picture something like crypto mining: a datacenter churning through precious energy and coolant (often water) to produce a product. This is only half of the truth.

A simplification of AI is that it is an algorithm that takes in various inputs, does something to them internally, and spits out an output. The idea that a model burns a lot of computing power is actually the exact opposite of the reason AI was developed. In fact, the purpose of AI is to reduce the computation necessary for tasks. This can be clearly seen when neural networks are used for pattern recognition in protein folding for drug discovery, particle classification in physics, and spell check in word processors. Without AI, the computer would need to calculate every possible outcome for each of these scenarios, which ends up being an exponential problem that quickly becomes unrealistic to compute. AI serves as a way to cut this computation down by reducing the complexity of the task.
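To make the search-versus-model contrast concrete, here is a toy sketch (the “protein,” its states, and the energy function are all made up for illustration): a brute-force search must score every one of an exponentially growing number of conformations, while a model that has already learned the pattern answers in a single evaluation.

```python
from itertools import product

def score(conformation):
    # Toy "energy" function standing in for a real physics calculation.
    # Its maximum is the conformation where every residue is in state 1.
    return -sum((s - 1) ** 2 for s in conformation)

def brute_force_best(n, states=3):
    # Exhaustive search: evaluates states**n candidate conformations.
    return max(product(range(states), repeat=n), key=score)

def model_best(n):
    # Stand-in for a trained model: one cheap evaluation, no search.
    return tuple(1 for _ in range(n))

n = 8
assert brute_force_best(n) == model_best(n)
print(3 ** n, "candidates searched vs 1 model evaluation")  # 6561 vs 1
```

At 8 residues the brute force is still feasible (3^8 = 6,561 candidates); at 100 residues it is not, which is exactly the gap a learned approximation is built to close.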

This means AI is often not exactly correct, but it is close enough to be useful. The protein folding prediction need not be perfect; it just needs to be good enough to point researchers toward promising leads. The particle classification need not be perfect, since experiments with particle-accelerator detectors already have margins for error built in. A spellchecker need not be perfect, since it assumes a human is deciding whether its corrections make sense. AI was designed to reduce computation, save resources, and make things cheaper.

So… then how is AI eating up so much energy and draining so much drinkable water? The cost of training. Surprisingly capable LLMs can run on a single consumer GPU like an RTX 4090 with pretty decent results. Making the model in the first place, however, takes an enormous amount of energy. Essentially, AI is like an imaginary factory in your computer that takes in some stuff and spits something out. The costly part is building the factory, not running it. Building the “factory” typically costs somewhere between $10k and $100M, while running it costs only on the order of $1k to $5k. So what does this mean? It means that since many factories have already been built, the millions have already been spent. The factory exists, and everyone can get one.

All it takes is for someone to download their own factory and start using it, just as if they had bought one, except that lots of researchers are handing out their factories for free. Even if every company quit training AIs tomorrow, they already have working models that won’t be deleted. These models can be used indefinitely, and at a fraction of the price it took to create them.

AI isn’t going anywhere and we’re gonna have to figure out how to live with it.

