Use this SDK to add realtime video, audio, and data features to your Rust app. By connecting to LiveKit Cloud or a self-hosted server, you can quickly build applications such as multimodal AI, live streaming, or video calls with just a few lines of code.
- Receiving tracks
- Publishing tracks
- Data channels
- Simulcast
- SVC codecs (AV1/VP9)
- Adaptive Streaming
- Dynacast
- Hardware video encoding/decoding
  - VideoToolbox for macOS/iOS
Supported Platforms
- Windows
- macOS
- Linux
- iOS
- Android
- `livekit-api`: Server APIs and auth token generation
- `livekit`: LiveKit real-time SDK
- `livekit-ffi`: Internal crate, used to generate bindings for other languages
- `livekit-protocol`: LiveKit protocol generated code
When adding the SDK as a dependency to your project, make sure to add the necessary rustflags to your cargo config; otherwise, linking may fail. Also refer to the list of supported platform toolkits.
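The cargo config mentioned above lives in a `.cargo/config.toml` file at your project root. As a sketch of where the flags go (the exact rustflags are target-specific and documented by the SDK; the value below is a placeholder, not the actual required flag):

```toml
# .cargo/config.toml at your project root.
# The rustflags required by the SDK depend on your target platform;
# consult the SDK documentation for the actual values.
# "<flag-required-by-livekit>" is a placeholder.
[target.x86_64-unknown-linux-gnu]
rustflags = ["-C", "link-arg=<flag-required-by-livekit>"]
```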
Currently, Tokio is required to use this SDK; however, we plan to make the async executor runtime-agnostic.
```rust
use livekit_api::access_token;
use std::env;

fn create_token() -> Result<String, access_token::AccessTokenError> {
    let api_key = env::var("LIVEKIT_API_KEY").expect("LIVEKIT_API_KEY is not set");
    let api_secret = env::var("LIVEKIT_API_SECRET").expect("LIVEKIT_API_SECRET is not set");

    let token = access_token::AccessToken::with_api_key(&api_key, &api_secret)
        .with_identity("rust-bot")
        .with_name("Rust Bot")
        .with_grants(access_token::VideoGrants {
            room_join: true,
            room: "my-room".to_string(),
            ..Default::default()
        })
        .to_jwt();

    token
}
```
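The value returned by `to_jwt()` is a standard JWT: three base64url-encoded segments (header, payload, signature) joined by dots. As a std-only illustration of that structure (the helper and the token strings below are hypothetical examples, not part of the SDK, and no signature verification is performed):

```rust
// Illustrative helper (not part of the LiveKit SDK): split a JWT into
// its header, payload, and signature segments. Returns None if the
// string does not have the three-part "a.b.c" shape.
fn jwt_segments(token: &str) -> Option<(&str, &str, &str)> {
    let mut parts = token.splitn(3, '.');
    match (parts.next(), parts.next(), parts.next()) {
        (Some(h), Some(p), Some(s)) if !h.is_empty() && !p.is_empty() && !s.is_empty() => {
            Some((h, p, s))
        }
        _ => None,
    }
}

fn main() {
    // Placeholder segments, not a real signed token.
    let token = "eyJhbGci.eyJpc3M.c2ln";
    assert_eq!(jwt_segments(token), Some(("eyJhbGci", "eyJpc3M", "c2ln")));
    assert_eq!(jwt_segments("not-a-jwt"), None);
}
```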
```rust
use livekit_api::services::room::{CreateRoomOptions, RoomClient};

#[tokio::main]
async fn main() {
    let room_service = RoomClient::new("http://localhost:7880").unwrap();

    let room = room_service
        .create_room("my_room", CreateRoomOptions::default())
        .await
        .unwrap();

    println!("Created room: {:?}", room);
}
```
```rust
use livekit::prelude::*;

#[tokio::main]
async fn main() -> Result<()> {
    let (room, mut room_events) = Room::connect(&url, &token).await?;

    while let Some(event) = room_events.recv().await {
        match event {
            RoomEvent::TrackSubscribed { track, publication, participant } => {
                // ...
            }
            _ => {}
        }
    }

    Ok(())
}
```
```rust
// ...
use futures::StreamExt; // this trait is required for iterating on audio & video frames
use livekit::prelude::*;

match event {
    RoomEvent::TrackSubscribed { track, publication, participant } => {
        match track {
            RemoteTrack::Audio(audio_track) => {
                let rtc_track = audio_track.rtc_track();
                let mut audio_stream = NativeAudioStream::new(rtc_track);
                tokio::spawn(async move {
                    // Receive the audio frames in a new task
                    while let Some(audio_frame) = audio_stream.next().await {
                        log::info!("received audio frame - {audio_frame:#?}");
                    }
                });
            }
            RemoteTrack::Video(video_track) => {
                let rtc_track = video_track.rtc_track();
                let mut video_stream = NativeVideoStream::new(rtc_track);
                tokio::spawn(async move {
                    // Receive the video frames in a new task
                    while let Some(video_frame) = video_stream.next().await {
                        log::info!("received video frame - {video_frame:#?}");
                    }
                });
            }
        }
    }
    _ => {}
}
```
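The audio frames received above carry raw PCM samples. As a std-only sketch of a typical per-frame computation you might run inside such a loop (the `rms` helper is illustrative, not part of the SDK), here is how loudness could be measured over a buffer of signed 16-bit samples:

```rust
// Illustrative helper (not part of the LiveKit SDK): compute the RMS
// amplitude of a buffer of signed 16-bit PCM samples, the kind of
// data an audio frame's sample buffer contains.
fn rms(samples: &[i16]) -> f64 {
    if samples.is_empty() {
        return 0.0;
    }
    let sum_sq: f64 = samples.iter().map(|&s| (s as f64) * (s as f64)).sum();
    (sum_sq / samples.len() as f64).sqrt()
}

fn main() {
    // 480 samples = 10 ms of mono audio at 48 kHz.
    let silent = [0i16; 480];
    assert_eq!(rms(&silent), 0.0);

    // RMS of a constant full-scale signal equals the sample value itself.
    let full_scale = [i16::MAX; 480];
    assert!((rms(&full_scale) - i16::MAX as f64).abs() < 1e-9);
}
```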
- basic room: simple example connecting to a room.
- wgpu_room: complete example app with video rendering using wgpu and egui.
- mobile: mobile app targeting iOS and Android.
- play_from_disk: publish audio from a WAV file.
- save_to_disk: save received audio to a WAV file.
LiveKit aims to provide an open source, end-to-end WebRTC stack that works everywhere. We have two goals in mind with this SDK:
Build a standalone, cross-platform LiveKit client SDK for Rustaceans.
Build a common core for other platform-specific SDKs (e.g. Unity, Unreal, iOS, Android)
Regarding (2), we've already developed a number of client SDKs for several platforms and encountered a few challenges in the process:
There's a significant amount of business/control logic in our signaling protocol and WebRTC. Currently, this logic needs to be implemented in every new platform we support.
Interactions with media devices and encoding/decoding are specific to each platform and framework.
For multi-platform frameworks (e.g. Unity, Flutter, React Native), the aforementioned tasks proved to be extremely painful.
Thus, we posited that a Rust SDK (something we wanted to build anyway) encapsulating all of our business logic and platform-specific APIs into a clean set of abstractions could also serve as the foundation for our other SDKs!
We'll first use it as a basis for our Unity SDK (under development), but over time, it will power our other SDKs, as well.
| LiveKit Ecosystem | |
|---|---|
| Realtime SDKs | React Components · Browser · Swift Components · iOS/macOS/visionOS · Android · Flutter · React Native · Rust · Node.js · Python · Unity (web) · Unity (beta) |
| Server APIs | Node.js · Golang · Ruby · Java/Kotlin · Python · Rust · PHP (community) |
| Agents Frameworks | Python · Playground |
| Services | LiveKit server · Egress · Ingress · SIP |
| Resources | Docs · Example apps · Cloud · Self-hosting · CLI |