CServe Now on Snowflake: Deploy Secure, Optimized LLMs For Less

This groundbreaking solution allows you to self-host LLMs with 81% lower compute costs, enhanced security, and flexible model support.


Today, we are happy to announce that CServe is now available on Snowflake. Our first Snowflake Native App, CServe Mixtral-8x7b LLM, enables Snowflake customers to self-host the Mixtral-8x7b model on Snowpark Container Services.

With CServe’s advanced optimizations, we reduce the compute required to serve this model, enabling you to deploy it on just the medium GPU compute node (GPU_NV_M) of Snowpark Container Services. In other words, with CentML CServe you can self-host Mixtral-8x7b for 81% less.
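For context, GPU_NV_M is one of the GPU instance families available for Snowpark Container Services compute pools. The native app provisions its own pool for you (see the setup steps below), but for reference a pool of that size is declared roughly like this; the pool name here is just a hypothetical example:

-- Hypothetical example: a single-node GPU_NV_M compute pool
CREATE COMPUTE POOL IF NOT EXISTS CSERVE_DEMO_POOL
    MIN_NODES = 1
    MAX_NODES = 1
    INSTANCE_FAMILY = GPU_NV_M;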

In addition to cost savings, self-hosting LLMs offers enterprises the following benefits:

  • Security and Privacy: The model is deployed and served inside your Snowflake account so that your data never leaves your Snowflake security perimeter.
  • Model Flexibility: CentML supports all open-source LLMs and any custom fine-tuned models. For additional model support, please reach out to support@centml.ai.
  • Efficient Resource Utilization: By reducing the computational power needed, we unlock significant cost savings. No need for large SPCS nodes to run very large LLMs.

Be sure to take advantage of the 30-day free trial and experience how CServe on Snowflake can elevate your AI deployment.


Examples

Sentiment Analysis: Return the sentiment of a given text, with support for sentiment scores.

Here are step-by-step instructions on how to reproduce the example from the demo:

1. After installing the CServe Native App, you will have access to all of its functions and procedures. Before using the LLM functions, we need to provision compute resources:

CALL CSERVE.SETUP.CREATE_SERVICE([]);

This will create all necessary Compute Pools and Services for the application to use.

2. You can monitor the spin-up of the Compute Pools and Services using these commands:

-- compute pools
SHOW COMPUTE POOLS;
-- services
CALL CSERVE.SETUP.SERVICE_STATUS('SERVICES.CSERVE_API_MIXTRAL_56B_SERVICE');

3. It takes around 12 minutes for all compute resources to reach the Ready state.
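If you prefer a tabular view while you wait, one option is to filter the SHOW output down to just the pool names and states, for example:

-- Optional convenience: list only the name and state of each compute pool
SHOW COMPUTE POOLS;
SELECT "name", "state"
FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()));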

4. We will be using the “SAMPLE_REVIEWS” table with sample iPad customer reviews and the “CSERVE.SERVICES.COMPLETE_MIXTRAL_56B” function to perform sentiment analysis.
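If you do not have the demo dataset in your account, a small stand-in table is enough to follow along; the rows below are purely hypothetical examples:

-- Hypothetical sample data standing in for the demo's review table
CREATE OR REPLACE TABLE SAMPLE_REVIEWS (REVIEW VARCHAR);

INSERT INTO SAMPLE_REVIEWS (REVIEW) VALUES
    ('The iPad Pro screen is gorgeous and the Apple Pencil feels instant, but the battery drains quickly.'),
    ('Way too expensive for what it offers; my old tablet does everything I need.');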

5. In the following query we are using prompt engineering to get the sentiment score and the key reason for it:

SELECT
    T.REVIEW,
    CSERVE.SERVICES.COMPLETE_MIXTRAL_56B(
        -- MODEL
        'Mixtral-56B',
        -- MESSAGES (chat prompt with a one-shot example)
        [{'role': 'user', 'content': CONCAT('Perform sentiment analysis on the iPad Pro review provided below. Determine the overall sentiment and identify the primary reason contributing to this sentiment. Return your analysis in JSON format. \n\n Example Review: \n\n "I just got the new iPad Pro and am impressed with the processing speed and the quality of the display. However, the cost seems excessively high given the competition." \n\n Example JSON: \n\n {"sentiment": "neutral", "key_reason": "high cost compared to competitors"} \n\n Actual Review: \n\n', T.REVIEW, '\n\n Actual JSON:')}],
        -- OPTIONS
        { 'max_tokens': 35, 'temperature': 0.0 }
    ):choices[0]:messages::VARCHAR AS GENERATED_SENTIMENT
FROM
    SAMPLE_REVIEWS AS T;

We are providing the model name, prompt, max tokens, and temperature as input arguments to the function.

6. Here is one example of a returned result:

{
    "sentiment": "positive",
    "key_reason": "excellent screen quality, responsive Apple Pencil, and good battery life"
}

Since it is a regular Snowflake function, you can perform any additional data transformations on top of its output.
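For example, here is a minimal sketch of such a transformation; it assumes you have materialized the output of the query above into a hypothetical REVIEW_SENTIMENTS table with the same REVIEW and GENERATED_SENTIMENT columns:

-- Turn the generated JSON into regular columns and filter on the extracted sentiment
SELECT
    REVIEW,
    TRY_PARSE_JSON(GENERATED_SENTIMENT):sentiment::VARCHAR  AS SENTIMENT,
    TRY_PARSE_JSON(GENERATED_SENTIMENT):key_reason::VARCHAR AS KEY_REASON
FROM REVIEW_SENTIMENTS
WHERE TRY_PARSE_JSON(GENERATED_SENTIMENT):sentiment::VARCHAR = 'negative';

TRY_PARSE_JSON returns NULL rather than failing if the model ever emits something that is not valid JSON, which keeps the query robust.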

Translation: Translate a given text from any supported language into any other.

The CServe LLM complete function can also be used to translate content into any language:

1. Follow steps 1-3 from the previous example.

2. As a sample data source, we will be using the “Prepper Open Data Bank – Japanese Corporate Data” data provider.

3. Add the data source to your Snowflake account by pressing the “Get” button and specifying a local database name: “JAPANESE_CORPORATE_DATA”.

4. Now we can query the sample data and use the CServe function to translate content:

SELECT
    T.PATENT_FI_DETAIL,
    CSERVE.SERVICES.COMPLETE_MIXTRAL_56B(
        -- MODEL
        'Mixtral-56B',
        -- MESSAGES (translation prompt)
        [{'role': 'user', 'content': CONCAT('Translate this text from Japanese to English: ', T.PATENT_FI_DETAIL, '\n\n Translation:')}],
        -- OPTIONS
        { 'max_tokens': 35, 'temperature': 0.0 }
    ):choices[0]:messages::VARCHAR AS TRANSLATED_TEXT
FROM JAPANESE_CORPORATE_DATA.E_PODB.CORP_PATENT AS T
LIMIT 10;

5. As a result, you will see the original data column alongside its translation.

6. You can apply the same approach to any column and translate it into English.
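Since the language pair is controlled entirely by the prompt, the same pattern also works in other directions. Here is a hedged sketch using a hypothetical PRODUCT_CATALOG table and an English-to-German prompt:

-- Hypothetical example: translate an English column into German with the same function
SELECT
    T.PRODUCT_DESCRIPTION,
    CSERVE.SERVICES.COMPLETE_MIXTRAL_56B(
        'Mixtral-56B',
        [{'role': 'user', 'content': CONCAT('Translate this text from English to German: ', T.PRODUCT_DESCRIPTION, '\n\n Translation:')}],
        { 'max_tokens': 100, 'temperature': 0.0 }
    ):choices[0]:messages::VARCHAR AS TRANSLATED_TEXT
FROM PRODUCT_CATALOG AS T
LIMIT 10;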

Conclusion

CServe’s availability on Snowflake marks a significant step forward in enabling secure, cost-effective, and flexible deployment of large language models. By self-hosting models like Mixtral-8x7b within your Snowflake account, you not only ensure your data remains secure but also benefit from substantial cost savings and efficient resource utilization.

We encourage you to explore the capabilities of CServe on Snowflake. With support for all open-source and custom fine-tuned models, you have the flexibility to tailor AI solutions to your specific needs. Our step-by-step examples on sentiment analysis and translation are designed to help you get started quickly.

Experience the future of AI integration within Snowflake, and let us support you on this exciting journey.

Start your 30-day free trial now!
