Snowflake Cortex AI delivers “simple, effective, and reliable” enterprise AI to thousands of organizations.
Snowflake announced new innovations and enhancements to Snowflake Cortex AI at its annual Snowflake Summit 2024 user conference.
These innovations included chat experiences allowing organizations to develop chatbots in minutes.
Snowflake also said it is further democratizing how any user can customize AI for specific industry use cases through a new no-code interactive interface, access to cutting-edge LLMs, and serverless fine-tuning.
The path to operationalizing models is accelerated with an integrated ML experience through Snowflake ML, enabling developers to create, discover, and govern models and features throughout the ML lifecycle.
Snowflake’s unified platform for generative AI and ML is designed to enable every part of the business to extract more value from its data, with comprehensive security, governance, and control to deliver “responsible and reliable” AI at scale.
Baris Gultekin, Head of AI at Snowflake, said: “Our latest advancements in Snowflake Cortex AI remove barriers to entry so all organizations can harness AI to build powerful AI applications at scale and unlock unique differentiation with their business data in the AI Data Cloud.”
Snowflake is unveiling two new chat features, Snowflake Cortex Analyst and Snowflake Cortex Search – both coming soon in public preview – allowing users to develop chatbots in minutes on their structured and unstructured data, without operational complexity.
Cortex Analyst, built with Meta’s Llama 3 and Mistral Large models, allows businesses to securely build applications on top of their analytical data in Snowflake.
Additionally, Cortex Search leverages industry-leading retrieval and ranking technology from Neeva (acquired by Snowflake in May 2023), along with Snowflake Arctic embeddings, so users can build applications on documents and other text datasets through enterprise-grade hybrid search – a combination of vector and text search – as a service. Snowflake also announced Snowflake Cortex Guard as “coming soon”, leveraging Meta’s Llama Guard.
With Cortex Guard, Snowflake says it further unlocks trustworthy AI for businesses, helping customers ensure available models are safe and usable.
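By way of illustration only (this sketch is not from Snowflake’s announcement), the building block behind these chat experiences is the ability to call a Cortex-hosted model directly from code. The connection details, model choice, and prompt below are placeholders; the sketch assumes the snowflake-ml-python package and a Snowpark session.

    # Hedged sketch: calling a Cortex-hosted LLM from Snowpark Python.
    # Connection details, model choice, and prompt are placeholders.
    from snowflake.snowpark import Session
    from snowflake.cortex import Complete  # ships with snowflake-ml-python

    connection_parameters = {
        "account": "<account>", "user": "<user>", "password": "<password>",
        "warehouse": "<warehouse>", "database": "<db>", "schema": "<schema>",
    }
    session = Session.builder.configs(connection_parameters).create()

    # COMPLETE runs the prompt against a model hosted inside Snowflake,
    # so the data never leaves the account boundary.
    answer = Complete("mistral-large",
                      "Summarize last quarter's top support issues.",
                      session=session)
    print(answer)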
In addition to enabling the easy development of personalized chat experiences, Snowflake offers its customers pre-built AI-driven experiences powered by Snowflake’s world-class models. With Document AI (also coming soon), users can easily extract content such as invoice amounts or contract terms from documents using Snowflake’s industry-leading multi-modal LLM, Snowflake Arctic-TILT, which outperforms GPT-4 and scores highly on the DocVQA benchmark – the standard test for answering questions about visual documents.
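As a rough sketch of what that looks like in practice (not drawn from the announcement, and with hypothetical model build, stage, and database names), a Document AI model build can be queried over staged documents from the same Snowpark session as in the earlier sketch:

    # Hypothetical example: the model build "invoice_model" and the stage
    # "@invoice_stage" are placeholders; the !PREDICT call shape follows
    # Snowflake's Document AI query pattern and should be treated as an assumption.
    rows = session.sql("""
        SELECT
            relative_path,
            doc_ai_db.doc_ai_schema.invoice_model!PREDICT(
                GET_PRESIGNED_URL(@invoice_stage, relative_path), 1) AS extracted
        FROM DIRECTORY(@invoice_stage)
    """).collect()
    for row in rows:
        print(row["RELATIVE_PATH"], row["EXTRACTED"])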
Organizations including Northern Trust are leveraging Document AI to intelligently process documents at scale to reduce operational costs with greater efficiency. Snowflake is also developing its revolutionary text-to-SQL assistant, Snowflake Copilot, which combines the strengths of Mistral Large with Snowflake’s proprietary SQL generation model to accelerate the productivity of every SQL user.
Other initiatives include:
Snowflake AI & ML Studio
Snowflake Cortex AI provides customers with a robust set of cutting-edge models from leading vendors including Google, Meta, Mistral AI, and Reka, in addition to Snowflake’s leading open source LLM, Snowflake Arctic, to accelerate the development of AI.
Snowflake further democratizes how any user can integrate these powerful models into their enterprise data with the new Snowflake AI & ML Studio – an interactive, no-code interface for teams to get started with AI development and bring their AI applications to production faster.
Cortex Fine-Tuning
To help organizations further improve LLM performance and deliver more personalized experiences, Snowflake introduces Cortex Fine-Tuning, accessible through AI & ML Studio or a simple SQL function. This serverless customization is available for a subset of Meta and Mistral AI models. These fine-tuned models can then be easily used via a Cortex AI function, with access managed using Snowflake role-based access controls.
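A minimal sketch of that SQL path, issued from the same Snowpark session as above (the model, table, and column names are assumptions, not details from the announcement):

    # Hedged sketch of serverless fine-tuning via the Cortex FINETUNE SQL function.
    # The training table is assumed to expose PROMPT and COMPLETION columns, and
    # 'mistral-7b' stands in for whichever supported base model is chosen.
    from snowflake.cortex import Complete

    job = session.sql("""
        SELECT SNOWFLAKE.CORTEX.FINETUNE(
            'CREATE',
            'my_db.my_schema.support_assistant',
            'mistral-7b',
            'SELECT prompt, completion FROM my_db.my_schema.train_examples'
        )
    """).collect()[0][0]
    print("Fine-tuning job:", job)

    # Once the job finishes, the tuned model is invoked like any other Cortex model,
    # with access managed through Snowflake role-based access control.
    print(Complete("my_db.my_schema.support_assistant",
                   "How do I rotate my API keys?", session=session))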
Streamline model and feature management with unified, governed MLOps via Snowflake ML
Once ML and LLM models are developed, most organizations struggle to continuously operate them in production on evolving datasets. Snowflake ML brings MLOps capabilities to the AI Data Cloud, so teams can seamlessly discover, manage, and govern their features, models, and metadata throughout the ML lifecycle, from data preprocessing to model management. These centralized MLOps capabilities also integrate with the rest of the Snowflake platform, including Snowflake Notebooks and Snowpark ML, for a simple end-to-end experience.
Snowflake Model Registry and Feature Store
Snowflake’s MLOps suite of capabilities includes the Snowflake Model Registry, which allows users to govern access to and use of all types of AI models so they can deliver more personalized experiences and cost-effective automation with complete confidence and efficiency. Additionally, Snowflake announced the Snowflake Feature Store, an integrated solution for data scientists and ML engineers to create, store, manage, and serve consistent ML features for model training and inference, along with ML Lineage, so teams can track the usage of features, datasets, and models across the end-to-end ML lifecycle.
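For illustration, a hedged sketch using the snowflake-ml-python Model Registry API, with toy data and placeholder database and schema names (none of this comes from the announcement); a companion Python API in snowflake.ml.feature_store covers feature creation and serving in a similar way.

    # Hedged sketch: governing a trained model through the Snowflake Model Registry.
    # Assumes the Snowpark session from the earlier sketch; the database/schema
    # names and the toy churn model are placeholders.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from snowflake.ml.registry import Registry

    features = pd.DataFrame({"tenure_months": [1.0, 24.0, 36.0, 2.0]})
    labels = [1, 0, 0, 1]
    toy_model = LogisticRegression().fit(features, labels)

    registry = Registry(session=session, database_name="ML_DB", schema_name="MODELS")
    model_version = registry.log_model(
        toy_model,
        model_name="churn_classifier",
        version_name="v1",
        sample_input_data=features,
    )
    # Inference runs through the registry-managed version, so role-based access
    # controls govern who can call the model.
    predictions = model_version.run(features, function_name="predict")
    print(predictions)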
New innovations to its single, unified platform were also announced at Snowflake Summit 2024, providing thousands of organizations with increased flexibility and interoperability for their data; new tools that accelerate how developers build in the AI Data Cloud; a new collaboration with NVIDIA that customers and partners can leverage to build custom AI data applications in Snowflake; and the Polaris Catalog, a fully open, vendor-neutral catalog implementation for Apache Iceberg.