From Wikipedia, the free encyclopedia
 


DBRX

Developer(s): Mosaic ML and Databricks team
Initial release: March 27, 2024
Repository: https://github.com/databricks/dbrx
License: Databricks Open License
Website: https://www.databricks.com/blog/introducing-dbrx-new-state-art-open-llm

DBRX is an open-source large language model (LLM) developed by the Mosaic ML team at Databricks and released on March 27, 2024.[1][2][3] It is a mixture-of-experts (MoE) Transformer model with 132 billion parameters in total, of which 36 billion (4 out of 16 experts) are active for each token.[4] The release includes both a base foundation model and an instruction-tuned variant.[5]
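For readers unfamiliar with mixture-of-experts routing, the sketch below is a toy illustration of top-k expert routing, not DBRX's actual implementation: the dimensions, the linear "experts", and the softmax gate are all made up for clarity. It only mirrors the reported ratio of 4 active experts out of 16.

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=4):
    """Toy top-k mixture-of-experts layer (illustrative only).

    A router scores all experts, the k highest-scoring experts run,
    and their outputs are combined with renormalized softmax weights.
    """
    logits = x @ gate_w                     # (n_experts,) router scores
    top = np.argsort(logits)[-k:]           # indices of the k best experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                # softmax renormalized over top-k
    # Only the k selected experts do any work for this token.
    return sum(w * (experts[i] @ x) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
d, n_experts = 8, 16                        # toy sizes, not DBRX's
x = rng.normal(size=d)                      # one token's hidden state
gate_w = rng.normal(size=(d, n_experts))    # router weights
experts = [rng.normal(size=(d, d)) for _ in range(n_experts)]

y = moe_forward(x, gate_w, experts, k=4)    # 4 of 16 experts active
print(y.shape)  # (8,)
```

The point of this design is that per-token compute scales with the active parameters (here 4 experts' worth) rather than the total parameter count, which is how DBRX can hold 132 billion parameters while activating only 36 billion per token.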

DBRX outperforms prominent open-source models such as Meta's LLaMA 2, Mistral AI's Mixtral, and xAI's Grok, as well as closed-source models such as GPT-3.5, on several benchmarks spanning language understanding, programming ability, and mathematics.[4][6][7] As of March 28, 2024, this made DBRX the world's most powerful open-source model.[8]

It was trained over 2.5 months[8] on 3,072 Nvidia H100 GPUs connected by 3.2 terabytes per second of InfiniBand bandwidth, at a training cost of roughly US$10 million.[1]
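As a rough sanity check of these figures (assuming 30-day months; the actual schedule and hardware pricing are not given in the sources), the reported scale works out to about 5.5 million GPU-hours:

```python
# Back-of-envelope estimate from the reported figures (approximate).
gpus = 3072                        # Nvidia H100s
months = 2.5                       # reported training duration
hours = months * 30 * 24           # assume ~30-day months
gpu_hours = gpus * hours           # total GPU-hours consumed
cost_usd = 10_000_000              # reported training cost

print(f"{gpu_hours:,.0f} GPU-hours, ~${cost_usd / gpu_hours:.2f} per GPU-hour")
# 5,529,600 GPU-hours, ~$1.81 per GPU-hour
```

The implied ~$1.81 per GPU-hour is only an arithmetic consequence of the two reported numbers, not a figure stated by Databricks.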

References

  1. ^ "Introducing DBRX: A New State-of-the-Art Open LLM". Databricks. 2024-03-27. Retrieved 2024-03-28.
  2. ^ "New Databricks open source LLM targets custom development | TechTarget". Business Analytics. Retrieved 2024-03-28.
  3. ^ Ghoshal, Anirban (2024-03-27). "Databricks' open-source DBRX LLM beats Llama 2, Mixtral, and Grok". InfoWorld. Retrieved 2024-03-28.
  4. ^ "A New Open Source LLM, DBRX Claims to be the Most Powerful – Here are the Scores". Gizmochina. 2024-03-28.
  5. ^ Wiggers, Kyle (2024-03-27). "Databricks spent $10M on new DBRX generative AI model". TechCrunch. Retrieved 2024-03-29.
  6. ^ "Databricks releases DBRX: open-source LLM that beats GPT-3.5 and Llama 2". Techzine Europe. 2024-03-27. Retrieved 2024-03-28.
  7. ^ "Data and AI company Databricks has launched a general-purpose large language model (LLM) DBRX that out.. - MK". Maeil Business Newspaper (매일경제). 2024-03-28. Retrieved 2024-03-28.
  8. ^ Knight, Will. "Inside the Creation of the World's Most Powerful Open Source AI Model". Wired. ISSN 1059-1028. Retrieved 2024-03-28.

Retrieved from "https://en.wikipedia.org/w/index.php?title=DBRX&oldid=1219823225"

Category: Large language models
This page was last edited on 20 April 2024, at 01:35 (UTC).