Databricks, a data and AI company, has launched DBRX, a large language model (LLM) that outperforms established open-source models on industry benchmarks. DBRX aims to democratize the training of custom LLMs for organizations worldwide, enabling quick customization and deployment.

Developed by Databricks' Mosaic AI team and trained on NVIDIA DGX Cloud, DBRX uses the open-source MegaBlocks project for its mixture-of-experts architecture, delivering leading performance at roughly double the efficiency of competing LLMs. Still, DBRX trails OpenAI's GPT-4 in certain respects: like other LLMs, it may "hallucinate" incorrect answers, and it cannot interpret or generate images. Despite concerns about bias and copyright in its training data, Databricks says it is working to address these issues.

DBRX is free for research and commercial use and can be easily incorporated into existing workflows via GitHub and Hugging Face.

Databricks also recently acquired Lilac, a research company focused on data comprehension and manipulation, to strengthen its generative AI capabilities. The acquisition marks Databricks' expansion beyond data solutions into the broader generative AI space. Together with its investment in Mistral, another generative AI firm, Databricks is positioning itself to provide comprehensive data management and generative AI solutions.
Source link: https://www.techtimes.com/amp/articles/303001/20240327/databricks-unveils-10-million-generative-ai-model-dbrx-what-makes.htm