AI capabilities are growing faster than hardware: Can decentralisation close the gap?


AI capabilities have exploded over the past two years, with large language models (LLMs) such as ChatGPT, and image generators such as DALL-E and Midjourney, becoming everyday tools. As you read this article, generative AI programs are responding to emails, writing marketing copy, recording songs, and creating images from simple prompts. 

What’s even more remarkable is the rate at which both individuals and companies are embracing the AI ecosystem. A recent survey by McKinsey revealed that the share of companies that have adopted generative AI in at least one business function doubled within a year to 65%, up from 33% at the beginning of 2023. 

However, like most technological advancements, this nascent area of innovation is not without challenges. Training and running AI programs is a resource-intensive endeavour, and as things stand, big tech appears to have the upper hand, which creates the risk of AI centralisation. 

The computational limitations of AI development 

According to an article by the World Economic Forum, demand for AI compute is accelerating; the computational power required to sustain AI development is currently growing at an annual rate of between 26% and 36%.
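
To put those growth rates in perspective, here is a rough back-of-the-envelope sketch (the framing in terms of doubling time is mine, not the World Economic Forum's):

import math

# Rough sketch: how often does compute demand double at the annual growth
# rates cited above? Doubling time = ln(2) / ln(1 + growth rate).
for annual_growth in (0.26, 0.36):
    doubling_time_years = math.log(2) / math.log(1 + annual_growth)
    print(f"{annual_growth:.0%} annual growth -> doubles roughly every "
          f"{doubling_time_years:.1f} years")

# 26% annual growth -> doubles roughly every 3.0 years
# 36% annual growth -> doubles roughly every 2.3 years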

Another recent study by Epoch AI confirms this trajectory, with projections showing that it will soon cost billions of dollars to train or run AI programs. 

“The cost of the largest AI training runs is growing by a factor of two to three per year since 2016, and that puts billion-dollar price tags on the horizon by 2027, maybe sooner,” notes Epoch AI staff researcher Ben Cottier. 
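
To see how quickly a two-to-threefold annual increase compounds, here is a hedged sketch; the roughly $100 million baseline for a 2023 frontier training run is my own illustrative assumption, not a figure from Epoch AI:

# Project training-run costs forward at 2x and 3x per year from an assumed
# ~$100M frontier run in 2023, and see when they cross $1 billion.
baseline_cost_usd = 100e6   # assumption for illustration only
baseline_year = 2023

for growth_factor in (2, 3):
    cost, year = baseline_cost_usd, baseline_year
    while cost < 1e9:
        cost *= growth_factor
        year += 1
    print(f"At {growth_factor}x per year, costs pass $1B around {year}")

# At 2x per year, costs pass $1B around 2027
# At 3x per year, costs pass $1B around 2026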

In my opinion, we’re already at this point. Microsoft invested $10 billion in OpenAI last year and, more recently, news emerged that the two companies are planning to build a data center that will host a supercomputer powered by millions of specialised chips. The cost? A whopping $100 billion, ten times the initial investment. 

Well, Microsoft is not the only big tech company on a spending spree to boost its AI computing resources. Other companies in the AI arms race, including Alphabet and Nvidia, are directing significant funding to AI research and development. 

While the results may well justify the money being invested, it is hard to ignore that AI development is currently a ‘big tech’ sport. Only deep-pocketed companies can fund AI projects to the tune of tens or hundreds of billions of dollars. 

It begs the question: what can be done to avoid the same pitfalls that Web2 innovations now face as a result of a handful of companies controlling innovation? 

Stanford HAI’s Vice Director and Faculty Director of Research, James Landay, is one of the experts who has previously weighed in on this scenario. According to Landay, the rush for GPU resources, together with big tech companies prioritising in-house use of their AI computational power, will drive up demand for computing and ultimately push stakeholders to develop cheaper hardware solutions.

In China, the government is already stepping up to support AI startups following the chip wars with the US, which have limited Chinese companies’ access to crucial chips. Local governments within China announced subsidies earlier this year, pledging computing vouchers for AI startups worth between $140,000 and $280,000, an effort aimed at reducing the costs associated with computing power.

Decentralising AI computing costs

Looking at the current state of AI computing, one theme is constant: the industry is centralised. Big tech companies control the majority of the computing power as well as the AI programs themselves. The more things change, the more they stay the same. 

On the brighter side, this time things might actually change for good, thanks to decentralised computing infrastructures such as the Qubic Layer 1 blockchain. This L1 blockchain uses a mining mechanism dubbed useful Proof-of-Work (uPoW); unlike Bitcoin’s typical PoW, which expends energy for the sole purpose of securing the network, Qubic’s uPoW directs its computational power to productive AI tasks such as training neural networks. 

In simpler terms, Qubic is decentralising the sourcing of AI computational power by moving away from the current paradigm, in which innovators are limited to the hardware they own or rent from big tech. Instead, this L1 taps into its network of miners, which could run into the tens of thousands, to provide computational power. 
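
The details of Qubic's protocol are beyond the scope of this article, but the general idea of a 'useful' proof-of-work loop can be sketched roughly as follows. This is a simplified, hypothetical illustration of the concept, not Qubic's actual implementation; the task structure, verification step, and reward logic are all assumptions.

# Hypothetical sketch of a "useful proof-of-work" loop: instead of hashing for
# its own sake, a miner pulls an AI training task, does the work, and submits
# a verifiable result. This is NOT Qubic's actual protocol; it only illustrates
# the general concept described above.
from dataclasses import dataclass
import random

@dataclass
class TrainingTask:
    task_id: int
    weights: list[float]   # current model parameters for this shard

def compute_update(task: TrainingTask) -> list[float]:
    # Stand-in for real training work (e.g. a gradient step on a data shard).
    return [w - 0.01 * random.uniform(-1, 1) for w in task.weights]

def verify(task: TrainingTask, update: list[float]) -> bool:
    # In a real network, verification might involve redundant computation or
    # spot checks by other nodes; here we only check the result's shape.
    return len(update) == len(task.weights)

# A miner's loop: fetch a task, do useful work, submit, earn a reward if verified.
task = TrainingTask(task_id=1, weights=[0.5, -0.2, 0.1])
update = compute_update(task)
if verify(task, update):
    print(f"Task {task.task_id}: update accepted, miner rewarded")

In practice, the hard part of any such scheme is verifying that miners really performed the training work they claim, which is where the protocol design carries most of the weight.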

Although somewhat more technical than leaving big tech to handle the backend, a decentralised approach to sourcing AI computing power is more economical. More importantly, it would be fairer if AI innovation were driven by a broader set of stakeholders rather than the handful of players the industry currently relies on. 

What happens if all of them go down? To make matters worse, these tech companies have already proven untrustworthy with life-changing tech advancements. 

Today, most people are up in arms over data privacy violations, not to mention related issues such as societal manipulation. With decentralised AI innovations, it will be easier to keep these developments in check while reducing the cost of entry.  

Conclusion 

AI innovations are just getting started, but access to computational power remains a headwind. Big tech currently controls most of the resources, which is a big challenge to the rate of innovation, not to mention that these same companies could end up with even more power over our data, the digital gold.  

However, with the advent of decentralised infrastructures, the entire AI ecosystem stands a better chance of reducing computational costs and eliminating big tech’s control over one of the most valuable technologies of the 21st century.

