Transform, enrich and persist lots of data
Load data from various sources, transform it, enrich it with metadata, and persist it with lazy, asynchronous, parallel pipelines.
Swiftide is a Rust-native library for building LLM applications. Large language models are amazing, but they need context to solve real problems. Swiftide allows you to ingest, transform, and index large amounts of data fast, and then query that data so it can be injected into prompts. This process is called Retrieval Augmented Generation (RAG).
Production LLM applications deal with large amounts of data, concurrent LLM requests, and both structured and unstructured data transformations. Rust is great at this. The goal of Swiftide is to let you build indexing and query pipelines easily, experiment and verify, then ship them straight to production.
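To make the indexing side concrete, here is a minimal sketch of an indexing pipeline in the style shown in Swiftide's documentation: load Rust files, chunk them with tree-sitter, enrich them with metadata, embed them, and store them in Qdrant. Module paths, builder methods, and the `then_in_batch` signature vary between Swiftide versions, so treat the exact names below as assumptions rather than a definitive API reference.

```rust
use swiftide::indexing::{
    self,
    loaders::FileLoader,
    transformers::{ChunkCode, Embed, MetadataQACode},
};
// Paths assumed; integrations may be re-exported elsewhere depending on version.
use swiftide::integrations::{openai::OpenAI, qdrant::Qdrant};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let openai = OpenAI::builder()
        .default_embed_model("text-embedding-3-small")
        .default_prompt_model("gpt-4o-mini")
        .build()?;

    // Load Rust sources, chunk them via tree-sitter, add Q&A metadata,
    // embed the chunks, and persist everything to Qdrant.
    indexing::Pipeline::from_loader(FileLoader::new(".").with_extensions(&["rs"]))
        .then_chunk(ChunkCode::try_for_language_and_chunk_size("rust", 10..2048)?)
        .then(MetadataQACode::new(openai.clone()))
        // Older versions take a batch size as the first argument here.
        .then_in_batch(Embed::new(openai.clone()))
        .then_store_with(
            Qdrant::builder()
                .batch_size(50)
                .vector_size(1536)
                .collection_name("swiftide-example")
                .build()?,
        )
        .run()
        .await
}
```

The pipeline is lazy and streaming: nothing runs until `run()` is awaited, and each step processes nodes concurrently as they flow through.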
Transform code and text
Chunk, transform, and augment code with built-in transformers. Swiftide uses tree-sitter to interpret and augment code. Text documents, whether markdown, HTML, or unstructured, are also supported.
Experimental query pipeline
Augment queries with retrieved data using the streaming query pipeline and generate a response.
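A rough sketch of the query pipeline, modelled on the examples in Swiftide's documentation: transform the query, embed it, retrieve matching chunks from the vector store, and generate an answer. The `openai` and `qdrant` values are assumed to be configured as in the indexing sketch above, and the transformer names may differ between versions.

```rust
use swiftide::query::{self, answers, query_transformers};

// Hypothetical setup: `openai` and `qdrant` configured as in the indexing example.
let pipeline = query::Pipeline::default()
    // Break the question into subquestions to improve retrieval coverage.
    .then_transform_query(query_transformers::GenerateSubquestions::from_client(openai.clone()))
    // Embed the (transformed) query so it can be matched against stored vectors.
    .then_transform_query(query_transformers::Embed::from_client(openai.clone()))
    // Retrieve relevant chunks from Qdrant.
    .then_retrieve(qdrant.clone())
    // Generate the final answer from the retrieved context.
    .then_answer(answers::Simple::from_client(openai.clone()));

let answer = pipeline.query("How does the indexing pipeline work?").await?;
println!("{}", answer.answer());
```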
Customizable templated prompts
Customize and bring your own prompts, built on Tera, a Jinja-style templating library.
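For illustration, here is what a Jinja-style prompt template looks like using the Tera crate directly; this is plain Tera, not Swiftide's own prompt wrapper, and the template text is made up for the example.

```rust
use tera::{Context, Tera};

fn build_prompt() -> tera::Result<String> {
    let mut context = Context::new();
    context.insert("question", "What does the indexing pipeline do?");
    context.insert("documents", &vec!["chunk one", "chunk two"]);

    // A Jinja-style template: loops and variable interpolation.
    let template = "\
Answer the question using only the context below.

{% for doc in documents %}- {{ doc }}
{% endfor %}
Question: {{ question }}";

    // Render the template once without autoescaping.
    Tera::one_off(template, &context, false)
}
```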
Many existing integrations
Qdrant, OpenAI, Groq, AWS Bedrock, Redis, FastEmbed, Spider and many more.
Easy to extend
Write your own loaders, transformers, and storage backends by implementing straightforward traits.
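As a sketch of what extending Swiftide can look like, here is a hypothetical custom transformer that tags each node with the length of its chunk. The trait path, method signature, and `Node` fields are assumptions based on Swiftide's documentation and may differ between versions; Swiftide also accepts plain closures in `.then(...)` for quick one-off transformations.

```rust
use anyhow::Result;
use async_trait::async_trait;
use swiftide::indexing::Node;
// Path assumed; the trait may be re-exported under a different module.
use swiftide::traits::Transformer;

// A hypothetical transformer that records each chunk's length as metadata.
#[derive(Clone)]
struct AddChunkLength;

#[async_trait]
impl Transformer for AddChunkLength {
    async fn transform_node(&self, mut node: Node) -> Result<Node> {
        node.metadata
            .insert("chunk_length", node.chunk.len().to_string());
        Ok(node)
    }
}
```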
Written in Rust
Fast, safe, and efficient. Built with Rust’s async and streaming features.
Part of Bosun.ai
Part of Bosun.ai and actively used in production.
Reference
Full API documentation available on docs.rs