
Let me be clear here: AI isn’t dying; we know that.

It just doesn’t.


Not exactly surprising, given that you’re trying to compete with multiple server blades operating in real time.

I’m sorry, your RTX 4090 just ain’t going to cut it, my friend.

And to be clear, that’s a four-year-old speaker at this point, not exactly cutting-edge tech.


In my mind, local AI feels like it falls very much in the same camp as that.

That’s not necessarily a bad thing or the end of the world either.

There is a silver lining for us off-the-grid folk, and it all hinges on GPU manufacturers.


Naturally, AI programming, particularly machine learning, predominantly operates through parallel computing.

However, developing a GPU from the ground up takes time. A long time.

For a brand-new architecture, we’re talking several years.

All of that happens prior to the thawing of the most recent AI winter and the arrival of ChatGPT.
