Available now on fal.ai

Lucy Realtime 2
Transform the World Live


Pure Diffusion. Real-Time Performance.

Emergent Physics

No Depth Maps. No 3D. Just Learning.

Lucy 2 is a pure diffusion model. Physical behavior emerges from learned visual dynamics, not engineered geometry. Cloth separates and folds when a jacket is unzipped, helmets reveal hair when removed, and fingers respect contact geometry. The model learned these interactions by watching how the world evolves in video.

Infinite Streaming

Zero Drift, Zero Degradation

Smart History Augmentation trains the model on its own imperfect outputs and penalizes quality drift. Lucy 2 recognizes implausible states and self-corrects, pulling generation back toward stable, high-fidelity output. Streams run for hours without identity collapse.
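The core idea can be sketched in toy form: roll a model forward on its own outputs, then score how far the autoregressive rollout drifts from a clean target. Everything below is illustrative only. `noisy_step`, `drift_penalty`, and the scalar "frames" are stand-ins; Decart's actual training objective is not public.

```python
# Toy sketch of the Smart History Augmentation idea (illustrative only):
# run a model autoregressively on its own imperfect outputs, then
# penalize drift from the clean target so self-correction is rewarded.

def noisy_step(frame, drift=0.05):
    """Stand-in 'model': predicts the next frame with a small bias."""
    return frame + drift

def rollout(start, steps):
    """Autoregressive rollout: each prediction feeds the next step."""
    frames = [start]
    for _ in range(steps):
        frames.append(noisy_step(frames[-1]))
    return frames

def drift_penalty(frames, target):
    """Penalty that grows as generated frames wander from the target."""
    return sum(abs(f - target) for f in frames) / len(frames)

clean_target = 1.0
frames = rollout(clean_target, steps=10)
loss = drift_penalty(frames, clean_target)
print(round(loss, 3))
```

A model trained without this term never sees its own mistakes, so small artifacts compound; adding the penalty during training teaches it to pull a wandering rollout back toward the target.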

Built for Speed

30fps at 1080p, Near-Zero Latency

Every stage of the pipeline is optimized to stay within real-time bounds: mega-kernels minimize kernel-launch overhead, the model architecture is shaped by cycle-level hardware benchmarks, and a custom WebRTC pipeline handles bidirectional video transport with minimal buffering worldwide.


For Developers

A few lines of code.
Real-time video.

fal.ai handles the infrastructure: fast inference, auto-scaling, and a developer-friendly API. No GPUs to manage.

  • Serverless: scales to zero, scales to millions
  • Pay per second, no minimums
  • Python and JavaScript SDKs, plus REST API
import fal_client

# Authenticate by setting the FAL_KEY environment variable to your API key.
connection = fal_client.realtime.connect(
    "decart/lucy-realtime-2/realtime",
    on_result=lambda result: print(result),
)

# The payload schema is model-specific; see the API docs for parameters.
connection.send({})

# See API docs for full WebRTC streaming setup

FAQ

Common questions about Lucy Realtime 2

What can I do with Lucy Realtime 2?

Lucy 2 supports real-time character swaps, virtual try-on (clothing, accessories, outfits), product placement, motion control, and full environment transformation. All transformations are guided by text prompts and reference images while the video streams live at 30fps in 1080p.

How does Lucy 2 handle physics and 3D structure?

Lucy 2 is a pure diffusion model. It does not use depth maps, 3D meshes, or hybrid pipelines. Physical behavior emerges from learned visual dynamics. For example, unzipping a jacket causes the cloth to separate and fold naturally, and removing a helmet cleanly separates the object and reveals the hair beneath. The model learned these behaviors by observing how the world evolves in video, without explicit physics engines.

Does the video quality degrade over time?

No. Most autoregressive video models degrade as small artifacts compound frame by frame. Lucy 2 uses a technique called Smart History Augmentation: during training, the model is exposed to its own imperfect outputs and penalized when quality drifts. This teaches it to self-correct, so streams can run for hours without identity collapse or visual degradation.

What makes Lucy 2 run in real time?

Real-time performance comes from optimizations at every layer: mega-kernels that reduce launch overhead and keep model activations close to tensor cores, a custom model architecture tailored via cycle-level microbenchmarks of the accelerator, and a custom WebRTC pipeline for minimal buffering and bidirectional video transport worldwide. See the Decart blog post for full technical details.

Can I use Lucy 2 for robotics?

Yes. Lucy 2 is designed as a real-time data augmentation and simulation engine for robotics. A single real-world demonstration can be expanded into thousands of plausible variations with different textures, lighting conditions, object geometries, and backgrounds, while preserving physical consistency. This enables more robust visual-language-action (VLA) and imitation learning policies.
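The expansion is combinatorial: each augmentation axis multiplies the number of variants derived from one demonstration. A minimal sketch of the bookkeeping (the axis names and values here are hypothetical, not Lucy 2's actual parameters; the real conditioning inputs are text prompts and reference images):

```python
import itertools

# Hypothetical augmentation axes for a single real-world demonstration.
textures = ["wood grain", "brushed metal", "matte plastic"]
lighting = ["dim indoor", "bright daylight", "warm evening"]
backgrounds = ["workbench", "kitchen counter"]

# Cross every axis: one demonstration becomes texture x lighting x
# background visually distinct but physically consistent variants.
variants = [
    {"texture": t, "lighting": l, "background": b}
    for t, l, b in itertools.product(textures, lighting, backgrounds)
]
print(len(variants))
```

With a handful of values per axis the variant count grows multiplicatively, which is how a single demonstration scales into thousands of training examples for VLA and imitation learning policies.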

How much does Lucy Realtime 2 cost on fal.ai?

Pricing is $0.02 per second with no minimums or subscriptions. You only pay for what you stream.
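At that rate, cost scales linearly with stream length, which makes budgeting a quick calculation:

```python
PRICE_PER_SECOND = 0.02  # dollars per streamed second (fal.ai pricing)

def stream_cost(seconds):
    """Total cost in dollars for a stream of the given length."""
    return seconds * PRICE_PER_SECOND

print(round(stream_cost(60), 2))       # one minute
print(round(stream_cost(60 * 60), 2))  # one hour
```

So a one-minute stream costs $1.20 and a full hour costs $72, with no minimum spend.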

How do I get started with the API?

Install the fal.ai SDK (Python or JavaScript), grab an API key from your dashboard, and make your first request. The API is serverless, so there are no GPUs to manage and no infrastructure to set up. Check the API documentation for all available parameters and WebRTC streaming setup.

Can I use Lucy Realtime 2 for commercial projects?

Yes. Output generated through the fal.ai API can be used in commercial projects. Check fal.ai's terms of service for full details on usage rights and licensing.

Ready to transform?

Start transforming live video with Lucy Realtime 2 on fal.ai.