
DeepSeek trained AI model using distillation, now a disruptive force



Chinese artificial intelligence lab DeepSeek roiled markets in January, setting off a massive tech and semiconductor selloff after unveiling AI models that it said were cheaper and more efficient than American ones.

But the underlying fears and breakthroughs that sparked the selling go much deeper than one AI startup. Silicon Valley is now reckoning with a technique in AI development called distillation, one that could upend the AI leaderboard. 

Distillation is a process of extracting knowledge from a larger AI model to create a smaller one. It can allow a small team with virtually no resources to make an advanced model.

A leading tech company invests years and millions of dollars developing a top-tier model from scratch. Then a smaller team such as DeepSeek swoops in and trains its own, more specialized model by asking the larger “teacher” model questions. The process creates a new model that’s nearly as capable as the big company’s model but trains more quickly and efficiently. 
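The teacher-student process described above can be sketched in a few lines of code. The example below is a minimal, illustrative toy (not DeepSeek's actual pipeline): a large frozen "teacher" network is queried on unlabeled inputs, and a much smaller linear "student" is trained to match the teacher's temperature-softened output distribution — the core idea behind distillation.

```python
import numpy as np

# Toy knowledge-distillation sketch. All model shapes, the temperature,
# and the training setup here are illustrative assumptions, not anyone's
# production method.

rng = np.random.default_rng(0)

def softmax(z, T=1.0):
    z = z / T                                # temperature softens the distribution
    z = z - z.max(axis=-1, keepdims=True)    # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Frozen "teacher": a random but fixed 2-layer network over 3 classes.
W1, W2 = rng.normal(size=(4, 16)), rng.normal(size=(16, 3))
def teacher_logits(x):
    return np.tanh(x @ W1) @ W2

# "Student": a single linear layer with far fewer parameters.
Ws = np.zeros((4, 3))

X = rng.normal(size=(256, 4))               # unlabeled inputs; we only query the teacher
T = 2.0                                     # distillation temperature
targets = softmax(teacher_logits(X), T)     # soft labels produced by the teacher

lr = 0.5
for _ in range(500):
    probs = softmax(X @ Ws, T)
    # Gradient of the cross-entropy between teacher soft labels
    # and the student's softened output, w.r.t. the student weights.
    Ws -= lr * X.T @ (probs - targets) / len(X)

# How often the student's top prediction matches the teacher's.
agreement = np.mean(
    softmax(X @ Ws).argmax(-1) == softmax(teacher_logits(X)).argmax(-1)
)
print(f"student/teacher agreement: {agreement:.0%}")
```

Note the key design choice: the student never sees ground-truth labels, only the teacher's soft probability distributions, which is why the approach is so cheap — all it needs is query access to a capable model and unlabeled inputs.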

“This distillation technique is just so extremely powerful and so extremely cheap, and it’s just available to anyone,” said Databricks CEO Ali Ghodsi, adding that he expects to see innovation when it comes to how large language models, or LLMs, are built. “We’re going to see so much competition for LLMs. That’s what’s going to happen in this new era we’re entering.” 

Distillation is now enabling less-capitalized startups and research labs to compete at the cutting edge faster than ever before.

Using this technique, researchers said, they recreated OpenAI's reasoning model for $450 in 19 hours last month. Soon after, researchers at Stanford and the University of Washington trained their own reasoning model in just 26 minutes, using less than $50 in compute credits, they said. Another startup rebuilt OpenAI's newest and flashiest feature, Deep Research, as a 24-hour coding challenge.

DeepSeek didn’t invent distillation, but it woke up the AI world to its disruptive potential. It also ushered in the rise of a new open-source order — a belief that transparency and accessibility drive innovation faster than closed-door research.

“Open source always wins in the tech industry,” said Arvind Jain, CEO of Glean, which makes an AI-powered search engine for enterprises. “You cannot beat the momentum that a successful open-source project is able to actually generate.” 

OpenAI itself has walked back its closed-source strategy in the wake of DeepSeek’s accomplishment.

“Personally I think we have been on the wrong side of history here and need to figure out a different open-source strategy,” OpenAI CEO Sam Altman wrote in an online post on Jan. 31.

The combination of distillation’s newfound traction and open source’s rise in popularity is completely altering the competitive dynamics in AI. 

