Snowflake Teams Up with Meta to Host and Optimize New Flagship Model Family in Snowflake Cortex AI
Snowflake’s AI Research Team, in collaboration with the open source community, launches a massive LLM inference and fine-tuning system stack — establishing a new state-of-the-art solution for open source inference and fine-tuning of multi-hundred-billion parameter models.
![Snowflake Teams Up with Meta to Host and Optimize New Flagship Model Family in Snowflake Cortex AI](https://img.easybranches.com/uploads/news/2024/07/snowflake-teams-up-with-meta-to-host-and-optimize-new-flagship-model-family-in-snowflake-cortex-ai.jpg)
No-Headquarters/BOZEMAN, Mont. -- Snowflake (NYSE: SNOW), the AI Data Cloud company, today announced that it will host the Llama 3.1 collection of multilingual open source large language models (LLMs) in Snowflake Cortex AI, enabling enterprises to easily harness the models and build powerful AI applications at scale. The offering includes Meta’s largest and most powerful open source LLM, Llama 3.1 405B, with Snowflake developing and open sourcing the inference system stack to enable real-time, high-throughput inference and further democratize powerful natural language processing and generation applications.

Snowflake’s industry-leading AI Research Team has optimized Llama 3.1 405B for both inference and fine-tuning, supporting a massive 128K context window from day one while delivering real-time inference with up to 3x lower end-to-end latency and 1.4x higher throughput than existing open source solutions. The stack also allows fine-tuning of the massive model on just a single GPU node, cutting cost and complexity for developers and users, all within Cortex AI.
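For readers curious what calling a Cortex-hosted model looks like in practice, the sketch below composes the SQL statement for Snowflake's `SNOWFLAKE.CORTEX.COMPLETE` function. The model identifier `llama3.1-405b` and the exact function signature are assumptions based on Snowflake's Cortex AI conventions; actually running the statement requires a Snowflake session and appropriate privileges, so this example only builds and prints the query.

```python
# Illustrative sketch only: compose (not execute) a Cortex COMPLETE call.
# SNOWFLAKE.CORTEX.COMPLETE and the model name 'llama3.1-405b' are assumed
# from Snowflake Cortex AI naming conventions; verify against your account.

def cortex_complete_sql(model: str, prompt: str) -> str:
    """Return a SQL statement invoking Cortex COMPLETE for the given model."""
    escaped = prompt.replace("'", "''")  # escape single quotes for SQL literals
    return f"SELECT SNOWFLAKE.CORTEX.COMPLETE('{model}', '{escaped}') AS response;"

sql = cortex_complete_sql("llama3.1-405b", "Summarize last quarter's sales notes.")
print(sql)
```

In a real deployment this statement would be submitted through a Snowflake connector or Snowpark session rather than printed.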