Product

Create And Deploy LLM Apps in Less Than 10 Minutes

March 28, 2023

We’re excited to announce that Spellbook is now fully available for self-serve signups! 

Spellbook is Scale’s platform for large language model (LLM) apps. We believe the future of data science and machine learning is people building lightweight applications on top of large language models. 

Speed and infrastructure quality are key: how easy is it to create a new LLM app, how fast can teams experiment, and how quickly can they get feedback on whether an experiment beats a baseline? Once those experiments show signs of life, how efficiently can teams fine-tune them, hone them with expert feedback, and launch them into production? 

I’ll walk through how you can create your own production-ready LLM app in less than 10 minutes. Follow along at https://spellbook.scale.com.

App Creation

As an easy example, let’s get started by creating a new marketing copy generation app that takes an event, size, and demographic and generates an event description. 

App setup page in Spellbook
Example dataset with event, size, and demographic data.

Quickly set up your prompt and see how your prompt runs on multiple rows of data. 
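For a rough sense of what that prompt might look like, here’s a minimal sketch in Python that fills a template from one row of the example dataset. The field names and template wording are illustrative, not Spellbook’s exact format.

# Illustrative only: a prompt template filled in from one row of the example dataset.
# The field names and template text are hypothetical, not Spellbook's exact format.
row = {
    "event": "Tech Careers Mixer",
    "size": "200 attendees",
    "demographic": "early-career engineers",
}

prompt = (
    "Write a short, engaging event description.\n"
    f"Event: {row['event']}\n"
    f"Size: {row['size']}\n"
    f"Demographic: {row['demographic']}\n"
    "Description:"
)
print(prompt)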

Evaluate Your LLM App

Make sure your model is producing good outputs with our human evaluation pipeline. Define evaluation criteria for how a human should determine whether your model’s output is good or bad. Here, we want to ensure our event description is relevant.

Define human evaluation criteria for your model output.

You’ll get your results back in the form of a “Hit rate” along with downloadable results you can use to fine-tune your model!

Receive evaluation results back after humans have finished reviewing.
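If you want to work with those downloadable results yourself, the hit rate is simply the fraction of outputs reviewers marked as good. Here’s a minimal sketch, assuming a hypothetical results.csv export with a label column; the real export format may differ.

# Minimal sketch: compute a hit rate from downloaded evaluation results.
# Assumes a hypothetical results.csv with a "label" column of "good"/"bad" values;
# the actual export format may differ.
import csv

with open("results.csv", newline="") as f:
    rows = list(csv.DictReader(f))

hits = sum(1 for r in rows if r["label"] == "good")
hit_rate = hits / len(rows) if rows else 0.0
print(f"Hit rate: {hit_rate:.1%}")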

Deploying Your App

Once you have a model you’re satisfied with, you can deploy it to production in one click with our deployments feature. 

Create a new deployment for an App Variant.

You’ll receive an API endpoint that you can integrate directly into production, with built-in monitoring! 

Integrate your LLM app into production using our code snippets, and monitor your requests and latency.
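As a rough sketch of what that integration can look like, the snippet below posts a row of inputs to a deployed endpoint over HTTPS. The URL, auth header, and payload fields are placeholders, not Spellbook’s exact API; use the code snippet shown on your deployment page for the real call.

# Illustrative only: calling a deployed LLM app over HTTPS.
# The URL, auth header, and payload fields below are placeholders,
# not Spellbook's exact API; copy the code snippet from your deployment page instead.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder
ENDPOINT = "https://api.example.com/v1/deployments/your-deployment-id"  # placeholder

payload = {
    "event": "Tech Careers Mixer",
    "size": "200 attendees",
    "demographic": "early-career engineers",
}

response = requests.post(
    ENDPOINT,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json())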

As you improve your LLM apps, you can keep swapping them into your deployments from the deployments tab by selecting your preferred variant. This ensures that once you’ve integrated an endpoint, you don’t have to update your code again. 

Swap out your deployments with updated variants as you improve the quality of your LLM app.

Try Spellbook

As LLMs become more powerful, the ability to quickly deploy enterprise-quality LLM apps to production and iterate on them based on user feedback is crucial. Try Spellbook today to create your first apps! 

 

