FAQ
Does the editor upload my media?
No. All editing and export run locally in your browser. Projects and imported files are stored in IndexedDB under a private database. You can inspect and clear storage at `/storage`. Network calls are limited to optional stock media (Pexels/Tenor), the AI proxy, and transcription.
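If you want to poke at the underlying storage outside the in-app `/storage` page, here is a minimal sketch; note that it clears every IndexedDB database for the origin, since the editor's database name isn't specified in this FAQ:

```ts
// List and delete IndexedDB databases for this origin from the devtools console.
// This wipes all locally stored projects and media; prefer the /storage page.
async function clearLocalEditorStorage(): Promise<void> {
  const dbs = await indexedDB.databases(); // supported in modern Chromium/Firefox/Safari
  for (const db of dbs) {
    if (db.name) {
      console.log("Deleting IndexedDB database:", db.name);
      indexedDB.deleteDatabase(db.name);
    }
  }
}
```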
How are projects persisted?
Projects auto‑save to IndexedDB. Use Export JSON (`.klippy.json`) to save a snapshot, or Export Bundle (`.klippy.bundle.json[.gz]`) to include assets. Importing a bundle rewrites `fileIds` and rehydrates media.
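For illustration only (the actual `.klippy.bundle.json` schema isn't documented in this FAQ), an import conceptually deserializes the bundle and assigns fresh IDs so imported assets can't collide with files already in IndexedDB:

```ts
// Hypothetical bundle shapes; the real schema may differ.
interface BundledAsset {
  fileId: string;     // id that timeline clips reference
  name: string;
  mimeType: string;
  dataBase64: string; // asset bytes inlined in the bundle
}

interface ProjectBundle {
  project: unknown;   // the project snapshot (.klippy.json contents)
  assets: BundledAsset[];
}

// Give every imported asset a fresh id and return the old-to-new mapping so
// clip references can be rewritten to match.
function remapFileIds(bundle: ProjectBundle): Map<string, string> {
  const idMap = new Map<string, string>();
  for (const asset of bundle.assets) {
    idMap.set(asset.fileId, crypto.randomUUID());
  }
  return idMap;
}
```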
Which export engines are used?
- Desktop: FFmpeg WebAssembly (Standard tab) for MP4/WebM/GIF with full overlays and audio mix.
- Mobile: MediaBunny (preferred) or MediaRecorder/WebCodecs fallback (Transparent tab UI is optimized for this).
- Transparency: Canvas pipeline → WebM with alpha (MP4 does not support alpha; conversion composites on background).
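The selection between these paths roughly follows device and transparency requirements; a sketch (names here are illustrative, not the editor's actual API):

```ts
// Rough engine-selection logic implied by the list above. The "canvas-alpha"
// path covers the transparent WebM pipeline; everything else is a plain export.
type ExportEngine = "ffmpeg-wasm" | "mediabunny" | "webcodecs-or-mediarecorder" | "canvas-alpha";

function pickExportEngine(opts: { isMobile: boolean; transparent: boolean; hasMediaBunny: boolean }): ExportEngine {
  if (opts.transparent) return "canvas-alpha"; // WebM with alpha; MP4 can't carry it
  if (opts.isMobile) {
    return opts.hasMediaBunny ? "mediabunny" : "webcodecs-or-mediarecorder";
  }
  return "ffmpeg-wasm"; // desktop Standard tab
}
```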
Why is export slow on my machine?
- Ensure cross‑origin isolation (COOP/COEP) so FFmpeg can multi‑thread. See `public/_headers` and `vercel.json`, plus the check sketched after this list.
- Try 720p and the Fast speed mode; use content‑only duration; close other heavy tabs.
- On mobile, prefer the Transparent tab (MediaBunny/WebCodecs) and avoid FFmpeg.
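A quick client-side check for whether isolation actually took effect (a sketch; the editor may surface this in its own UI):

```ts
// Multi-threaded FFmpeg WASM needs SharedArrayBuffer, which browsers only
// expose when the page is cross-origin isolated (COOP + COEP headers present).
export function canUseMultithreadedFfmpeg(): boolean {
  const isolated = typeof crossOriginIsolated !== "undefined" && crossOriginIsolated;
  const hasSharedArrayBuffer = typeof SharedArrayBuffer !== "undefined";
  if (!isolated || !hasSharedArrayBuffer) {
    console.warn("Page is not cross-origin isolated; FFmpeg will run single-threaded.");
  }
  return isolated && hasSharedArrayBuffer;
}
```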
Can I export transparent video?
Yes. Use the Transparent tab to export WebM with alpha. MP4 (H.264) has no alpha channel; conversion to MP4 will composite on a background color.
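As a rough illustration of the canvas route (assuming a browser whose WebM/VP9 recording preserves alpha; the editor's actual pipeline drives frames explicitly rather than recording in real time):

```ts
// Record a canvas that contains transparent pixels into a WebM blob.
// Whether alpha survives depends on the browser's VP8/VP9 encoder.
function recordCanvasToWebm(canvas: HTMLCanvasElement, durationMs: number): Promise<Blob> {
  return new Promise((resolve, reject) => {
    const stream = canvas.captureStream(30); // 30 fps capture
    const recorder = new MediaRecorder(stream, { mimeType: "video/webm;codecs=vp9" });
    const chunks: BlobPart[] = [];
    recorder.ondataavailable = (e) => chunks.push(e.data);
    recorder.onstop = () => resolve(new Blob(chunks, { type: "video/webm" }));
    recorder.onerror = () => reject(new Error("canvas recording failed"));
    recorder.start();
    setTimeout(() => recorder.stop(), durationMs);
  });
}
```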
I get 403 “Unauthorized origin”. How do I fix it?
Set `NEXT_PUBLIC_SITE_URL` to your deployed origin and include it in `ALLOWED_ORIGINS` (www and non‑www). For Electron or `file://` contexts, set `ALLOW_NULL_ORIGIN=true` if you trust the shell. This check applies to `/api/*` routes (Pexels/Tenor/Meme/AI/Transcription).
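Conceptually, the check behaves something like this sketch (not the project's actual middleware code), driven by the environment variables above:

```ts
// Allow-list check for the Origin header on /api/* requests.
function isAllowedOrigin(origin: string | null): boolean {
  const allowed = (process.env.ALLOWED_ORIGINS ?? "")
    .split(",")
    .map((o) => o.trim())
    .filter(Boolean);
  // Electron and file:// pages send no Origin (or the literal string "null").
  if (origin === null || origin === "null") {
    return process.env.ALLOW_NULL_ORIGIN === "true";
  }
  return allowed.includes(origin) || origin === process.env.NEXT_PUBLIC_SITE_URL;
}
```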
Tenor/Pexels searches fail. What should I check?
- Set `TENOR_API_KEY` and `PEXELS_API_KEY` on the server (not public).
- Confirm the CSP `connect-src` allows `tenor.googleapis.com` and `api.pexels.com` (the middleware config does this by default).
- Large media previews stream via `/api/*-proxy`; ensure those routes aren’t blocked by your host.
How does transcription (captions) work?
Use `/api/transcription` to proxy audio to your Cloudflare Worker (Whisper) or a custom service. The client extracts audio (mono 16 kHz WAV) and uploads it. The Worker returns `{ text, segments[] }`, which the editor turns into a caption track. You can switch to OpenAI Whisper or a local Faster‑Whisper server by changing the server route. See the “Captioning” docs.
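A client-side sketch of that round trip (the upload field name and any segment fields beyond what's stated above are assumptions):

```ts
// Upload extracted mono 16 kHz WAV audio and map returned segments to captions.
// Field names for segment timing (start/end in seconds) are assumed here.
interface TranscriptSegment { start: number; end: number; text: string }
interface TranscriptResponse { text: string; segments: TranscriptSegment[] }

async function transcribeToCaptions(wav: Blob): Promise<TranscriptSegment[]> {
  const form = new FormData();
  form.append("audio", wav, "audio.wav"); // field name is an assumption
  const res = await fetch("/api/transcription", { method: "POST", body: form });
  if (!res.ok) throw new Error(`Transcription failed with status ${res.status}`);
  const data: TranscriptResponse = await res.json();
  return data.segments; // each segment becomes one caption on the caption track
}
```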
Does it work on mobile?
Yes, for basic editing and mobile export, but heavy desktop features (e.g., the FFmpeg Standard export) are restricted. The Transparent tab and mobile export panel are tuned for iOS/Android with conservative defaults.
What’s the maximum project size/length?
Browser memory limits vary. For long 1080p+ exports, FFmpeg WASM can run out of memory; the app will try segmented or MediaBunny paths. If you need guaranteed long‑form renders, see the Remotion Lambda section (server rendering).
Where are logs and errors?
- Export dialog shows a live log and progress. Errors are mapped to clearer messages where possible.
- API errors are returned as JSON; `/api/health` shows whether keys/tokens are configured (a quick probe is sketched after this list).
- Enable Sentry (optional) to collect client/server errors in production.
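For example, from a small script (the response shape isn't documented here, so this just prints whatever comes back; replace the URL with your deployment):

```ts
// Probe the health route and print its status and JSON body.
async function checkHealth(baseUrl: string): Promise<void> {
  const res = await fetch(`${baseUrl}/api/health`);
  console.log("status:", res.status);
  console.log(await res.json());
}

// e.g. checkHealth("https://your-deployment.example.com");
```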
Can I host this anywhere?
It’s a Next.js app; you can deploy it on Vercel or any Node host. Requirements: serve `/wasm/*` with the `application/wasm` MIME type, include COOP/COEP on editor routes for best performance, and set the required environment variables. Cloudflare Pages is supported via `public/_headers`.
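On a plain Node/Next.js host where neither `public/_headers` nor `vercel.json` applies, COOP/COEP can be added from the Next config. A sketch, assuming Next.js 15+ for `next.config.ts` (you may also want to scope the headers to editor routes only):

```ts
// next.config.ts sketch: add cross-origin isolation headers so FFmpeg WASM can
// use SharedArrayBuffer. Scoping to specific routes is a deployment choice.
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  async headers() {
    return [
      {
        source: "/:path*",
        headers: [
          { key: "Cross-Origin-Opener-Policy", value: "same-origin" },
          { key: "Cross-Origin-Embedder-Policy", value: "require-corp" },
        ],
      },
    ];
  },
};

export default nextConfig;
```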
Which codecs are supported?
H.264 (MP4) and VP8/VP9 (WebM). HEVC/AV1 hardware encoders are not used. For professional alpha or intermediates, extend the engine to target ProRes 4444/APNG/WebP as needed (some support exists in internal utilities).
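If you want to see what the current browser reports for these container/codec combinations, a quick probe (this checks `MediaRecorder` support, which is only one of the paths the editor uses):

```ts
// Log which recording MIME types the browser claims to support.
const candidates = [
  "video/mp4;codecs=avc1.42E01E", // H.264 baseline in MP4
  "video/webm;codecs=vp8",
  "video/webm;codecs=vp9",
];
for (const type of candidates) {
  console.log(type, MediaRecorder.isTypeSupported(type));
}
```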
Why do I see a black first frame sometimes?
The engines warm up video decoders and seek before encoding, but some sources may still need a frame or two to produce a decoded image. Try short fades at t=0 or trim a few frames from the head. The app retries the first frame to reduce the chance of a black frame.
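If you are debugging a particular source, one common trick is to wait for an actually presented frame before capturing; a sketch using `requestVideoFrameCallback` (Chromium and recent Safari; older TypeScript DOM libs may need an extra type declaration):

```ts
// Seek and resolve only once the browser has presented a decoded frame,
// so a subsequent canvas draw doesn't capture an empty (black) image.
function seekAndWaitForFrame(video: HTMLVideoElement, timeSec: number): Promise<void> {
  return new Promise((resolve) => {
    video.requestVideoFrameCallback(() => resolve());
    video.currentTime = timeSec;
  });
}
```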
What Node version should I use?
Node 18–20 LTS is recommended. Pin the engines field in `package.json` (e.g., `"engines": {"node": ">=18 <21"}`) to avoid CI drift.
I cannot export MP4 with transparency. Is that a bug?
No — H.264 does not support an alpha channel. Use WebM with alpha (Transparent tab) or export to a professional alpha codec in a custom pipeline.
Is there a general copy/paste between projects?
No global clipboard. Use Duplicate (D), templates, or export/import .klippy JSON/bundles to move work between timelines.
How do I render on a server?
Export a project bundle and render it with a Remotion Lambda renderer (separate repo). The editor ships with documentation, not the renderer itself. See Rendering > Server rendering.
Do I need a Remotion license?
This project uses Remotion building blocks (for example caption parsing and optional Remotion‑based rendering paths you may choose to integrate). Licensing depends on how you use Remotion in your product. The summary below is not legal advice — always consult the official Remotion docs and license terms:
- Client‑only exports (FFmpeg WebAssembly in the browser) do not involve Remotion rendering services and typically don't require a Remotion rendering license.
- Embedding the Remotion Player or rendering compositions using `@remotion/lambda` / Remotion Cloud is subject to Remotion’s license and pricing. Commercial/production use may require purchasing a license or a paid plan.
- If you build a SaaS that renders videos for end‑users using Remotion, review the current license options and terms.
See Remotion's licensing page for the latest details: Licensing.