Build LLM applications with Rust
```rust
// Run a single prompt step against an LLM executor and print the result.
// (This runs inside an async context; the executor comes from a driver
// crate such as llm-chain-openai.)
let exec = Executor::new_default();
let res = Step::for_prompt(prompt!(
    "You are a robot assistant for making personalized greetings",
    "Make a personalized greeting for Joe"
))
.run(&Parameters::new(), &exec)
.await?;
println!("{}", res);
```
Features
A robust and scalable framework for building LLM applications
ChatGPT models
ChatGPT models are supported out of the box. Just deploy and start chatting.
Get started quickly
llm-chain comes with examples ready to be deployed. Just pick the one you need, adapt it, and deploy it.
LLaMA support
Full support for LLaMA and derivative models.
Tools
Give your AI access to the real world with tools. Have your AI run Python scripts, Bash commands, and more.
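Conceptually, a tool is a named action the model can invoke by name with some input. The sketch below illustrates that idea in plain Rust; the `Tool` trait, `EchoTool`, and `dispatch` function are hypothetical illustrations, not llm-chain's actual API.

```rust
// Hypothetical sketch of a tool interface (not llm-chain's real API).
trait Tool {
    fn name(&self) -> &str;
    fn run(&self, input: &str) -> String;
}

// A trivial tool that echoes its input back.
struct EchoTool;

impl Tool for EchoTool {
    fn name(&self) -> &str {
        "echo"
    }
    fn run(&self, input: &str) -> String {
        format!("echo: {input}")
    }
}

// The model names a tool; we look it up and execute it with the
// model-provided input, returning None if no such tool exists.
fn dispatch(tools: &[Box<dyn Tool>], name: &str, input: &str) -> Option<String> {
    tools.iter().find(|t| t.name() == name).map(|t| t.run(input))
}

fn main() {
    let tools: Vec<Box<dyn Tool>> = vec![Box::new(EchoTool)];
    println!("{:?}", dispatch(&tools, "echo", "hello")); // Some("echo: hello")
}
```

In a real chain, the tool's output would be fed back into the next prompt so the model can use the result.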
Summarize text
Our map-reduce chain allows you to easily summarize text.
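The map-reduce pattern splits a long text into chunks, summarizes each chunk independently (map), then combines the partial summaries (reduce). This plain-Rust sketch shows the shape of the pattern; `summarize_chunk` is a hypothetical stand-in for a per-chunk LLM call.

```rust
// Hypothetical stand-in for an LLM call: "summarize" a chunk by
// keeping only its first sentence.
fn summarize_chunk(chunk: &str) -> String {
    chunk.split('.').next().unwrap_or("").trim().to_string()
}

fn map_reduce_summary(text: &str, chunk_size: usize) -> String {
    let words: Vec<&str> = text.split_whitespace().collect();
    // Map step: summarize each fixed-size chunk independently.
    let partials: Vec<String> = words
        .chunks(chunk_size)
        .map(|c| summarize_chunk(&c.join(" ")))
        .collect();
    // Reduce step: combine the partial summaries. A real chain would
    // send this combined text through the LLM once more.
    partials.join(" ")
}

fn main() {
    let text = "Rust is fast. It is memory safe. LLMs are large.";
    println!("{}", map_reduce_summary(text, 6));
}
```

Because each map step is independent, the per-chunk calls can run concurrently, which is what makes the pattern practical for long documents.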
Vector store support
Give your LLMs memory using our vector store system.
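A vector store gives an LLM memory by saving documents as embedding vectors and retrieving the ones most similar to a query embedding. The sketch below shows the core retrieval step (cosine similarity over a toy in-memory store) in plain Rust; it is a conceptual illustration, not llm-chain's vector-store API, and the 2-d "embeddings" stand in for real model-produced vectors.

```rust
// Cosine similarity between two embedding vectors.
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    dot / (na * nb)
}

// Return the stored text whose embedding is most similar to the query.
fn nearest<'a>(store: &'a [(&'a str, Vec<f32>)], query: &[f32]) -> Option<&'a str> {
    store
        .iter()
        .max_by(|(_, a), (_, b)| cosine(a, query).total_cmp(&cosine(b, query)))
        .map(|(text, _)| *text)
}

fn main() {
    // Toy 2-d "embeddings"; a real store holds model-produced vectors.
    let store = vec![
        ("greetings doc", vec![1.0, 0.0]),
        ("weather doc", vec![0.0, 1.0]),
    ];
    println!("{:?}", nearest(&store, &[0.9, 0.1])); // Some("greetings doc")
}
```

The retrieved text is then injected into the prompt, letting the model answer with knowledge that never fit in its context window.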