mirror of
https://github.com/affaan-m/everything-claude-code.git
synced 2026-03-31 22:23:27 +08:00
feat: port remotion-video-creation skill (29 rules), restore missing files
New skill:
- remotion-video-creation: 29 domain-specific Remotion rules covering 3D/Three.js, animations, audio, captions, charts, compositions, fonts, GIFs, Lottie, measuring, sequencing, Tailwind, text animations, timing, transitions, trimming, and video embedding. Ported from personal skills.

Restored:
- autonomous-agent-harness/SKILL.md (was in commit but missing from worktree)
- lead-intelligence/ (full directory restored from branch commit)

Updated:
- manifests/install-modules.json: added remotion-video-creation to media-generation
- README.md + AGENTS.md: synced counts to 139 skills

Catalog validates: 30 agents, 60 commands, 139 skills.
@@ -1,6 +1,6 @@
 # Everything Claude Code (ECC) — Agent Instructions

-This is a **production-ready AI coding plugin** providing 30 specialized agents, 136 skills, 60 commands, and automated hook workflows for software development.
+This is a **production-ready AI coding plugin** providing 30 specialized agents, 139 skills, 60 commands, and automated hook workflows for software development.

 **Version:** 1.9.0

@@ -142,7 +142,7 @@ Troubleshoot failures: check test isolation → verify mocks → fix implementat
 ```
 agents/   — 30 specialized subagents
-skills/   — 136 workflow skills and domain knowledge
+skills/   — 139 workflow skills and domain knowledge
 commands/ — 60 slash commands
 hooks/    — Trigger-based automations
 rules/    — Always-follow guidelines (common + per-language)

@@ -220,7 +220,7 @@ For manual install instructions see the README in the `rules/` folder. When copy
 /plugin list everything-claude-code@everything-claude-code
 ```

-**That's it!** You now have access to 30 agents, 136 skills, and 60 commands.
+**That's it!** You now have access to 30 agents, 139 skills, and 60 commands.

 ### Multi-model commands require additional setup

@@ -1111,7 +1111,7 @@ The configuration is automatically detected from `.opencode/opencode.json`.
 |---------|-------------|----------|--------|
 | Agents | PASS: 30 agents | PASS: 12 agents | **Claude Code leads** |
 | Commands | PASS: 60 commands | PASS: 31 commands | **Claude Code leads** |
-| Skills | PASS: 136 skills | PASS: 37 skills | **Claude Code leads** |
+| Skills | PASS: 139 skills | PASS: 37 skills | **Claude Code leads** |
 | Hooks | PASS: 8 event types | PASS: 11 events | **OpenCode has more!** |
 | Rules | PASS: 29 rules | PASS: 13 instructions | **Claude Code leads** |
 | MCP Servers | PASS: 14 servers | PASS: Full | **Full parity** |

@@ -317,6 +317,7 @@
       "description": "Media generation and AI-assisted editing skills.",
       "paths": [
         "skills/fal-ai-media",
+        "skills/remotion-video-creation",
         "skills/video-editing",
         "skills/videodb"
       ],

267
skills/autonomous-agent-harness/SKILL.md
Normal file
@@ -0,0 +1,267 @@
---
name: autonomous-agent-harness
description: Transform Claude Code into a fully autonomous agent system with persistent memory, scheduled operations, computer use, and task queuing. Replaces standalone agent frameworks (Hermes, AutoGPT) by leveraging Claude Code's native crons, dispatch, MCP tools, and memory. Use when the user wants continuous autonomous operation, scheduled tasks, or a self-directing agent loop.
origin: ECC
---

# Autonomous Agent Harness

Turn Claude Code into a persistent, self-directing agent system using only native features and MCP servers.

## When to Activate

- User wants an agent that runs continuously or on a schedule
- Setting up automated workflows that trigger periodically
- Building a personal AI assistant that remembers context across sessions
- User says "run this every day", "check on this regularly", "keep monitoring"
- Wants to replicate functionality from Hermes, AutoGPT, or similar autonomous agent frameworks
- Needs computer use combined with scheduled execution

## Architecture

```
┌───────────────────────────────────────────────────────────┐
│                    Claude Code Runtime                    │
│                                                           │
│ ┌──────────┐  ┌──────────┐  ┌──────────┐  ┌─────────────┐ │
│ │ Crons    │  │ Dispatch │  │ Memory   │  │ Computer    │ │
│ │ Schedule │  │ Remote   │  │ Store    │  │ Use         │ │
│ │ Tasks    │  │ Agents   │  │          │  │             │ │
│ └────┬─────┘  └────┬─────┘  └────┬─────┘  └──────┬──────┘ │
│      │             │             │               │        │
│      ▼             ▼             ▼               ▼        │
│ ┌───────────────────────────────────────────────────────┐ │
│ │                ECC Skill + Agent Layer                │ │
│ │                                                       │ │
│ │   skills/      agents/      commands/      hooks/     │ │
│ └───────────────────────────────────────────────────────┘ │
│      │             │             │               │        │
│      ▼             ▼             ▼               ▼        │
│ ┌───────────────────────────────────────────────────────┐ │
│ │                   MCP Server Layer                    │ │
│ │                                                       │ │
│ │   memory    github    exa    supabase    browser-use  │ │
│ └───────────────────────────────────────────────────────┘ │
└───────────────────────────────────────────────────────────┘
```

## Core Components

### 1. Persistent Memory

Use Claude Code's built-in memory system enhanced with the MCP memory server for structured data.

**Built-in memory** (`~/.claude/projects/*/memory/`):
- User preferences, feedback, project context
- Stored as markdown files with frontmatter
- Automatically loaded at session start

**MCP memory server** (structured knowledge graph):
- Entities, relations, observations
- Queryable graph structure
- Cross-session persistence

**Memory patterns:**

```
# Short-term: current session context
Use TodoWrite for in-session task tracking

# Medium-term: project memory files
Write to ~/.claude/projects/*/memory/ for cross-session recall

# Long-term: MCP knowledge graph
Use mcp__memory__create_entities for permanent structured data
Use mcp__memory__create_relations for relationship mapping
Use mcp__memory__add_observations for new facts about known entities
```

### 2. Scheduled Operations (Crons)

Use Claude Code's scheduled tasks to create recurring agent operations.

**Setting up a cron:**

```
# Via MCP tool
mcp__scheduled-tasks__create_scheduled_task({
  name: "daily-pr-review",
  schedule: "0 9 * * 1-5",  # 9 AM weekdays
  prompt: "Review all open PRs in affaan-m/everything-claude-code. For each: check CI status, review changes, flag issues. Post summary to memory.",
  project_dir: "/path/to/repo"
})

# Via claude -p (programmatic mode)
echo "Review open PRs and summarize" | claude -p --project /path/to/repo
```

**Useful cron patterns:**

| Pattern | Schedule | Use Case |
|---------|----------|----------|
| Daily standup | `0 9 * * 1-5` | Review PRs, issues, deploy status |
| Weekly review | `0 10 * * 1` | Code quality metrics, test coverage |
| Hourly monitor | `0 * * * *` | Production health, error rate checks |
| Nightly build | `0 2 * * *` | Run full test suite, security scan |
| Pre-meeting | `*/30 * * * *` | Prepare context for upcoming meetings |
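
To build intuition for the five-field schedules above, here is a simplified matcher. It is a sketch only (no `MON`-style names, no `L`/`W` extensions) and not a replacement for the real scheduler:

```python
from datetime import datetime

def field_matches(spec: str, value: int) -> bool:
    """Match one cron field (e.g. '*', '*/30', '1-5', '0,30') against a value."""
    for part in spec.split(","):
        if part == "*":
            return True
        if part.startswith("*/"):
            if value % int(part[2:]) == 0:
                return True
        elif "-" in part:
            lo, hi = part.split("-")
            if int(lo) <= value <= int(hi):
                return True
        elif part.isdigit() and int(part) == value:
            return True
    return False

def cron_matches(expr: str, dt: datetime) -> bool:
    """Return True if a five-field cron expression fires at the given minute."""
    minute, hour, dom, month, dow = expr.split()
    return (field_matches(minute, dt.minute)
            and field_matches(hour, dt.hour)
            and field_matches(dom, dt.day)
            and field_matches(month, dt.month)
            and field_matches(dow, dt.isoweekday() % 7))  # cron convention: 0 = Sunday
```

For example, `0 9 * * 1-5` fires at 09:00 on weekdays only, and `*/30 * * * *` fires at :00 and :30 of every hour.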

### 3. Dispatch / Remote Agents

Trigger Claude Code agents remotely for event-driven workflows.

**Dispatch patterns:**

```bash
# Trigger from CI/CD
curl -X POST "https://api.anthropic.com/dispatch" \
  -H "Authorization: Bearer $ANTHROPIC_API_KEY" \
  -d '{"prompt": "Build failed on main. Diagnose and fix.", "project": "/repo"}'

# Trigger from webhook
# GitHub webhook → dispatch → Claude agent → fix → PR

# Trigger from another agent
claude -p "Analyze the output of the security scan and create issues for findings"
```

### 4. Computer Use

Leverage Claude's computer-use MCP for physical world interaction.

**Capabilities:**
- Browser automation (navigate, click, fill forms, screenshot)
- Desktop control (open apps, type, mouse control)
- File system operations beyond the CLI

**Use cases within the harness:**
- Automated testing of web UIs
- Form filling and data entry
- Screenshot-based monitoring
- Multi-app workflows

### 5. Task Queue

Manage a persistent queue of tasks that survives session boundaries.

**Implementation:**

```
# Task persistence via memory
Write task queue to ~/.claude/projects/*/memory/task-queue.md

# Task format
---
name: task-queue
type: project
description: Persistent task queue for autonomous operation
---

## Active Tasks
- [ ] PR #123: Review and approve if CI green
- [ ] Monitor deploy: check /health every 30 min for 2 hours
- [ ] Research: Find 5 leads in AI tooling space

## Completed
- [x] Daily standup: reviewed 3 PRs, 2 issues
```
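
A session picking up this queue just needs to parse the checkboxes. A minimal sketch, assuming the `- [ ]` / `- [x]` format shown above:

```python
import re

def parse_task_queue(markdown: str) -> dict:
    """Split a checkbox-style task queue into active and completed items,
    ignoring frontmatter and headings."""
    tasks = {"active": [], "completed": []}
    for line in markdown.splitlines():
        m = re.match(r"-\s+\[( |x)\]\s+(.+)", line.strip())
        if m:
            bucket = "completed" if m.group(1) == "x" else "active"
            tasks[bucket].append(m.group(2))
    return tasks
```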

## Replacing Hermes

| Hermes Component | ECC Equivalent | How |
|------------------|----------------|-----|
| Gateway/Router | Claude Code dispatch + crons | Scheduled tasks trigger agent sessions |
| Memory System | Claude memory + MCP memory server | Built-in persistence + knowledge graph |
| Tool Registry | MCP servers | Dynamically loaded tool providers |
| Orchestration | ECC skills + agents | Skill definitions direct agent behavior |
| Computer Use | computer-use MCP | Native browser and desktop control |
| Context Manager | Session management + memory | ECC 2.0 session lifecycle |
| Task Queue | Memory-persisted task list | TodoWrite + memory files |

## Setup Guide

### Step 1: Configure MCP Servers

Ensure these are in `~/.claude.json`:

```json
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "@anthropic/memory-mcp-server"]
    },
    "scheduled-tasks": {
      "command": "npx",
      "args": ["-y", "@anthropic/scheduled-tasks-mcp-server"]
    },
    "computer-use": {
      "command": "npx",
      "args": ["-y", "@anthropic/computer-use-mcp-server"]
    }
  }
}
```

### Step 2: Create Base Crons

```bash
# Daily morning briefing
claude -p "Create a scheduled task: every weekday at 9am, review my GitHub notifications, open PRs, and calendar. Write a morning briefing to memory."

# Continuous learning
claude -p "Create a scheduled task: every Sunday at 8pm, extract patterns from this week's sessions and update the learned skills."
```

### Step 3: Initialize Memory Graph

```bash
# Bootstrap your identity and context
claude -p "Create memory entities for: me (user profile), my projects, my key contacts. Add observations about current priorities."
```

### Step 4: Enable Computer Use (Optional)

Grant computer-use MCP the necessary permissions for browser and desktop control.

## Example Workflows

### Autonomous PR Reviewer

```
Cron: every 30 min during work hours
1. Check for new PRs on watched repos
2. For each new PR:
   - Pull branch locally
   - Run tests
   - Review changes with code-reviewer agent
   - Post review comments via GitHub MCP
3. Update memory with review status
```

### Personal Research Agent

```
Cron: daily at 6 AM
1. Check saved search queries in memory
2. Run Exa searches for each query
3. Summarize new findings
4. Compare against yesterday's results
5. Write digest to memory
6. Flag high-priority items for morning review
```

### Meeting Prep Agent

```
Trigger: 30 min before each calendar event
1. Read calendar event details
2. Search memory for context on attendees
3. Pull recent email/Slack threads with attendees
4. Prepare talking points and agenda suggestions
5. Write prep doc to memory
```

## Constraints

- Cron tasks run in isolated sessions — they don't share context with interactive sessions except through memory.
- Computer use requires explicit permission grants. Don't assume access.
- Remote dispatch may have rate limits. Design crons with appropriate intervals.
- Keep memory files concise. Archive old data rather than letting files grow unbounded.
- Always verify that scheduled tasks completed successfully. Add error handling to cron prompts.
222
skills/lead-intelligence/SKILL.md
Normal file
@@ -0,0 +1,222 @@
---
name: lead-intelligence
description: AI-native lead intelligence and outreach pipeline. Replaces Apollo, Clay, and ZoomInfo with agent-powered signal scoring, mutual ranking, warm path discovery, and personalized outreach. Use when the user wants to find, qualify, and reach high-value contacts.
origin: ECC
---

# Lead Intelligence

Agent-powered lead intelligence pipeline that finds, scores, and reaches high-value contacts through social graph analysis and warm path discovery.

## When to Activate

- User wants to find leads or prospects in a specific industry
- Building an outreach list for partnerships, sales, or fundraising
- Researching who to reach out to and the best path to reach them
- User says "find leads", "outreach list", "who should I reach out to", "warm intros"
- Needs to score or rank a list of contacts by relevance
- Wants to map mutual connections to find warm introduction paths

## Tool Requirements

### Required
- **Exa MCP** — Deep web search for people, companies, and signals (`web_search_exa`)
- **X API** — Follower/following graph, mutual analysis, recent activity (`X_BEARER_TOKEN`, `X_ACCESS_TOKEN`)

### Optional (enhance results)
- **LinkedIn** — Via browser-use MCP or direct API for connection graph
- **Apollo/Clay API** — For enrichment cross-reference if user has access
- **GitHub MCP** — For developer-centric lead qualification

## Pipeline Overview

```
┌─────────────┐     ┌──────────────┐     ┌─────────────────┐     ┌──────────────┐     ┌─────────────────┐
│  1. Signal  │────▶│  2. Mutual   │────▶│  3. Warm Path   │────▶│  4. Enrich   │────▶│  5. Outreach    │
│   Scoring   │     │   Ranking    │     │   Discovery     │     │              │     │    Draft        │
└─────────────┘     └──────────────┘     └─────────────────┘     └──────────────┘     └─────────────────┘
```

## Stage 1: Signal Scoring

Search for high-signal people in target verticals. Assign a weight to each based on:

| Signal | Weight | Source |
|--------|--------|--------|
| Role/title alignment | 30% | Exa, LinkedIn |
| Industry match | 25% | Exa company search |
| Recent activity on topic | 20% | X API search, Exa |
| Follower count / influence | 10% | X API |
| Location proximity | 10% | Exa, LinkedIn |
| Engagement with your content | 5% | X API interactions |

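The weights in the table combine into a single composite score. A minimal sketch, treating each signal as a 0-1 sub-score (the key names are illustrative):

```python
# Weights mirror the signal table above, as fractions of 1.0.
SIGNAL_WEIGHTS = {
    "role": 0.30,
    "industry": 0.25,
    "activity": 0.20,
    "influence": 0.10,
    "location": 0.10,
    "engagement": 0.05,
}

def signal_score(signals: dict) -> float:
    """Combine 0-1 sub-scores into a 0-100 composite; missing signals count as 0."""
    total = sum(SIGNAL_WEIGHTS[k] * signals.get(k, 0.0) for k in SIGNAL_WEIGHTS)
    return round(total * 100, 1)
```

A prospect with perfect role fit but nothing else scores 30.0; a perfect match on every signal scores 100.0.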
### Signal Search Approach

```python
# Step 1: Define target parameters
target_verticals = ["prediction markets", "AI tooling", "developer tools"]
target_roles = ["founder", "CEO", "CTO", "VP Engineering", "investor", "partner"]
target_locations = ["San Francisco", "New York", "London", "remote"]

# Step 2: Exa deep search for people
for vertical in target_verticals:
    for role in target_roles:
        results = web_search_exa(
            query=f"{vertical} {role}",
            category="company",
            numResults=20,
        )
        # Score each result

# Step 3: X API search for active voices
x_search = search_recent_tweets(
    query='"prediction markets" OR "AI tooling" OR "developer tools"',
    max_results=100,
)
# Extract and score unique authors
```


## Stage 2: Mutual Ranking

For each scored target, analyze the user's social graph to find the warmest path.

### Algorithm

1. Pull the user's X following list and LinkedIn connections
2. For each high-signal target, check for shared connections
3. Rank mutuals by:

| Factor | Weight |
|--------|--------|
| Number of connections to targets | 40% — most connections = highest rank |
| Mutual's current role/company | 20% — decision maker vs individual contributor |
| Mutual's location | 15% — same city = easier intro |
| Industry alignment | 15% — same vertical = natural intro |
| Mutual's X handle / LinkedIn | 10% — identifiability for outreach |
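
The shared-connection check in step 2 is a set intersection over follow graphs. A sketch, assuming each prospect's follower/following edges have already been fetched into a set:

```python
def find_mutuals(user_following: set, prospect_graphs: dict) -> dict:
    """For each prospect, return the user's followings that also appear in
    that prospect's follower/following set (candidate warm paths)."""
    return {
        prospect: sorted(user_following & graph)
        for prospect, graph in prospect_graphs.items()
        if user_following & graph
    }
```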

### Output Format

```
MUTUAL RANKING REPORT
=====================

#1 @mutual_handle (Score: 92)
   Name: Jane Smith
   Role: Partner @ Acme Ventures
   Location: San Francisco
   Connections to targets: 7
   Connected to: @target1, @target2, @target3, @target4, @target5, @target6, @target7
   Best intro path: Jane invested in Target1's company

#2 @mutual_handle2 (Score: 85)
...
```

## Stage 3: Warm Path Discovery

For each target, find the shortest introduction chain:

```
You ──[follows]──▶ Mutual A ──[invested in]──▶ Target Company
You ──[follows]──▶ Mutual B ──[co-founded with]──▶ Target Person
You ──[met at]──▶ Event ──[also attended]──▶ Target Person
```

### Path Types (ordered by warmth)
1. **Direct mutual** — You both follow/know the same person
2. **Portfolio connection** — Mutual invested in or advises target's company
3. **Co-worker/alumni** — Mutual worked at same company or attended same school
4. **Event overlap** — Both attended same conference/program
5. **Content engagement** — Target engaged with mutual's content or vice versa

## Stage 4: Enrichment

For each qualified lead, pull:

- Full name, current title, company
- Company size, funding stage, recent news
- Recent X posts (last 30 days) — topics, tone, interests
- Mutual interests with user (shared follows, similar content)
- Recent company events (product launch, funding round, hiring)

### Enrichment Sources
- Exa: company data, news, blog posts
- X API: recent tweets, bio, followers
- GitHub: open source contributions (for developer-centric leads)
- LinkedIn (via browser-use): full profile, experience, education

## Stage 5: Outreach Draft

Generate personalized outreach for each lead. Two modes:

### Warm Intro Request (to mutual)
```
hey [mutual name],

quick ask. i see you know [target name] at [company].
i'm building [your product] which [1-line relevance to target].
would you be open to a quick intro? happy to send you a
forwardable blurb.

[your name]
```

### Direct Cold Outreach (to target)
```
hey [target name],

[specific reference to their recent work/post/announcement].
i'm [your name], building [product]. [1 line on why this is
relevant to them specifically].

[specific low-friction ask].

[your name]
```

### Anti-Patterns (never do)
- Generic templates with no personalization
- Long paragraphs explaining your whole company
- Multiple asks in one message
- Fake familiarity ("loved your recent talk!" without specifics)
- Bulk-sent messages with visible merge fields

## Configuration

Users should set these environment variables:

```bash
# Required
export X_BEARER_TOKEN="..."
export X_ACCESS_TOKEN="..."
export X_ACCESS_TOKEN_SECRET="..."
export X_API_KEY="..."
export X_API_SECRET="..."
export EXA_API_KEY="..."

# Optional
export LINKEDIN_COOKIE="..."  # For browser-use LinkedIn access
export APOLLO_API_KEY="..."   # For Apollo enrichment
```

## Agents

This skill includes specialized agents in the `agents/` subdirectory:

- **signal-scorer** — Searches and ranks prospects by relevance signals
- **mutual-mapper** — Maps social graph connections and finds warm paths
- **enrichment-agent** — Pulls detailed profile and company data
- **outreach-drafter** — Generates personalized messages

## Example Usage

```
User: find me the top 20 people in prediction markets I should reach out to

Agent workflow:
1. signal-scorer searches Exa and X for prediction market leaders
2. mutual-mapper checks user's X graph for shared connections
3. enrichment-agent pulls company data and recent activity
4. outreach-drafter generates personalized messages for top ranked leads

Output: Ranked list with warm paths and draft outreach for each
```
85
skills/lead-intelligence/agents/enrichment-agent.md
Normal file
@@ -0,0 +1,85 @@
---
name: enrichment-agent
description: Pulls detailed profile, company, and activity data for qualified leads. Enriches prospects with recent news, funding data, content interests, and mutual overlap.
tools:
- Bash
- Read
- WebSearch
- WebFetch
model: sonnet
---

# Enrichment Agent

You enrich qualified leads with detailed profile, company, and activity data.

## Task

Given a list of qualified prospects, pull comprehensive data from available sources to enable personalized outreach.

## Data Points to Collect

### Person
- Full name, current title, company
- X handle, LinkedIn URL, personal site
- Recent posts (last 30 days) — topics, tone, key takes
- Speaking engagements, podcast appearances
- Open source contributions (if developer-centric)
- Mutual interests with user (shared follows, similar content)

### Company
- Company name, size, stage
- Funding history (last round amount, investors)
- Recent news (product launches, pivots, hiring)
- Tech stack (if relevant)
- Competitors and market position

### Activity Signals
- Last X post date and topic
- Recent blog posts or publications
- Conference attendance
- Job changes in last 6 months
- Company milestones

## Enrichment Sources

1. **Exa** — Company data, news, blog posts, research
2. **X API** — Recent tweets, bio, follower data
3. **GitHub** — Open source profiles (if applicable)
4. **Web** — Personal sites, company pages, press releases

## Output Format

```
ENRICHED PROFILE: [Name]
========================

Person:
  Title: [current role]
  Company: [company name]
  Location: [city]
  X: @[handle] ([follower count] followers)
  LinkedIn: [url]

Company Intel:
  Stage: [seed/A/B/growth/public]
  Last Funding: $[amount] ([date]) led by [investor]
  Headcount: ~[number]
  Recent News: [1-2 bullet points]

Recent Activity:
- [date]: [tweet/post summary]
- [date]: [tweet/post summary]
- [date]: [tweet/post summary]

Personalization Hooks:
- [specific thing to reference in outreach]
- [shared interest or connection]
- [recent event or announcement to congratulate]
```

## Constraints

- Only report verified data. Do not hallucinate company details.
- If data is unavailable, note it as "not found" rather than guessing.
- Prioritize recency — stale data older than 6 months should be flagged.

75
skills/lead-intelligence/agents/mutual-mapper.md
Normal file
@@ -0,0 +1,75 @@
---
name: mutual-mapper
description: Maps the user's social graph (X following, LinkedIn connections) against scored prospects to find mutual connections and rank them by introduction potential.
tools:
- Bash
- Read
- Grep
- WebSearch
- WebFetch
model: sonnet
---

# Mutual Mapper Agent

You map social graph connections between the user and scored prospects to find warm introduction paths.

## Task

Given a list of scored prospects and the user's social accounts, find mutual connections and rank them by introduction potential.

## Algorithm

1. Pull the user's X following list (via X API)
2. For each prospect, check if any of the user's followings also follow or are followed by the prospect
3. For each mutual found, assess the strength of the connection
4. Rank mutuals by their ability to make a warm introduction

## Mutual Ranking Factors

| Factor | Weight | Assessment |
|--------|--------|------------|
| Connections to targets | 40% | How many of the scored prospects does this mutual know? |
| Mutual's role/influence | 20% | Decision maker, investor, or connector? |
| Location match | 15% | Same city as user or target? |
| Industry alignment | 15% | Works in the target vertical? |
| Identifiability | 10% | Has clear X handle, LinkedIn, email? |

## Warm Path Types

Classify each path by warmth:

1. **Direct mutual** (warmest) — Both user and target follow this person
2. **Portfolio/advisory** — Mutual invested in or advises target's company
3. **Co-worker/alumni** — Shared employer or educational institution
4. **Event overlap** — Both attended same conference, accelerator, or program
5. **Content engagement** — Target engaged with mutual's content recently

## Output Format

```
WARM PATH REPORT
================

Target: [prospect name] (@handle)
Path 1 (warmth: direct mutual)
  Via: @mutual_handle (Jane Smith, Partner @ Acme Ventures)
  Relationship: Jane follows both you and the target
  Suggested approach: Ask Jane for intro

Path 2 (warmth: portfolio)
  Via: @mutual2 (Bob Jones, Angel Investor)
  Relationship: Bob invested in target's company Series A
  Suggested approach: Reference Bob's investment

MUTUAL LEADERBOARD
==================
#1 @mutual_a — connected to 7 targets (Score: 92)
#2 @mutual_b — connected to 5 targets (Score: 85)
```

## Constraints

- Only report connections you can verify from API data or public profiles.
- Do not assume connections exist based on similar bios or locations alone.
- Flag uncertain connections with a confidence level.

98
skills/lead-intelligence/agents/outreach-drafter.md
Normal file
@@ -0,0 +1,98 @@
---
name: outreach-drafter
description: Generates personalized outreach messages for qualified leads. Creates warm intro requests, cold emails, X DMs, and follow-up sequences using enriched profile data.
tools:
- Read
- Grep
model: sonnet
---

# Outreach Drafter Agent

You generate personalized outreach messages using enriched lead data.

## Task

Given enriched prospect profiles and warm path data, draft outreach messages that are short, specific, and actionable.

## Message Types

### 1. Warm Intro Request (to mutual)

Template structure:
- Greeting (first name, casual)
- The ask (1 sentence — can you intro me to [target])
- Why it's relevant (1 sentence — what you're building and why target cares)
- Offer to send a forwardable blurb
- Sign off

Max length: 60 words.

### 2. Cold Email (to target directly)

Template structure:
- Subject: specific, under 8 words
- Opener: reference something specific about them (recent post, announcement, thesis)
- Pitch: what you do and why they specifically should care (2 sentences max)
- Ask: one concrete, low-friction next step
- Sign off with one credibility anchor

Max length: 80 words.

### 3. X DM (to target)

Even shorter than email. 2-3 sentences max.
- Reference a specific post or take of theirs
- One line on why you're reaching out
- Clear ask

Max length: 40 words.

### 4. Follow-Up Sequence

- Day 4-5: short follow-up with one new data point
- Day 10-12: final follow-up with a clean close
- No more than 3 total touches unless the user specifies otherwise

## Writing Rules

1. **Personalize or don't send.** Every message must reference something specific to the recipient.
2. **Short sentences.** No compound sentences with multiple clauses.
3. **Lowercase casual.** Match modern professional communication style.
4. **No AI slop.** Never use: "game-changer", "deep dive", "the key insight", "leverage", "synergy", "at the forefront of".
5. **Data over adjectives.** Use specific numbers, names, and facts instead of generic praise.
6. **One ask per message.** Never combine multiple requests.
7. **No fake familiarity.** Don't say "loved your talk" unless you can cite which talk.
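
Rules 4 and the length limits are mechanically checkable before a draft is surfaced. A minimal lint sketch (the function name and banned-phrase list are illustrative, taken from rule 4 above):

```python
# Phrases from writing rule 4; extend as needed.
BANNED_PHRASES = [
    "game-changer", "deep dive", "the key insight",
    "leverage", "synergy", "at the forefront of",
]

def lint_message(body: str, max_words: int) -> list:
    """Return a list of writing-rule violations for a draft message."""
    issues = []
    lowered = body.lower()
    for phrase in BANNED_PHRASES:
        if phrase in lowered:
            issues.append(f"banned phrase: {phrase!r}")
    if len(body.split()) > max_words:
        issues.append(f"over the {max_words}-word limit")
    return issues
```

An empty list means the draft passes the mechanical checks; personalization quality still needs human or agent review.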

## Personalization Sources (from enrichment data)

Use these hooks in order of preference:
1. Their recent post or take you genuinely agree with
2. A mutual connection who can vouch
3. Their company's recent milestone (funding, launch, hire)
4. A specific piece of their thesis or writing
5. Shared event attendance or community membership

## Output Format

```
TO: [name] ([email or @handle])
VIA: [direct / warm intro through @mutual]
TYPE: [cold email / DM / intro request]

Subject: [if email]

[message body]

---
Personalization notes:
- Referenced: [what specific thing was used]
- Warm path: [how connected]
- Confidence: [high/medium/low]
```

## Constraints

- Never generate messages that could be mistaken for spam.
- Never include false claims about the user's product or traction.
- If enrichment data is thin, flag the message as "needs manual personalization" rather than faking specifics.

60
skills/lead-intelligence/agents/signal-scorer.md
Normal file
@@ -0,0 +1,60 @@
---
name: signal-scorer
description: Searches and ranks prospects by relevance signals across X, Exa, and LinkedIn. Assigns weighted scores based on role, industry, activity, influence, and location.
tools:
  - Bash
  - Read
  - Grep
  - Glob
  - WebSearch
  - WebFetch
model: sonnet
---

# Signal Scorer Agent

You are a lead intelligence agent that finds and scores high-value prospects.

## Task

Given target verticals, roles, and locations from the user, search for the highest-signal people using the available tools.

## Scoring Rubric

| Signal | Weight | How to Assess |
|--------|--------|---------------|
| Role/title alignment | 30% | Is this person a decision maker in the target space? |
| Industry match | 25% | Does their company/work directly relate to the target vertical? |
| Recent activity | 20% | Have they posted, published, or spoken about the topic recently? |
| Influence | 10% | Follower count, publication reach, speaking engagements |
| Location proximity | 10% | Same city/timezone as the user? |
| Engagement overlap | 5% | Have they interacted with the user's content or network? |
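The weighted rubric above can be sketched as a small helper function. This is an illustrative sketch, not part of the skill itself; the `ProspectSignals` type and the convention of expressing each sub-score on the same scale as its weight are assumptions:

```typescript
// Each sub-score is expressed out of its rubric weight:
// role out of 30, industry out of 25, and so on.
interface ProspectSignals {
  role: number;       // 0-30
  industry: number;   // 0-25
  activity: number;   // 0-20
  influence: number;  // 0-10
  location: number;   // 0-10
  engagement: number; // 0-5
}

// Total score is the sum of the capped sub-scores, yielding 0-100.
function scoreProspect(s: ProspectSignals): number {
  const caps: ProspectSignals = {
    role: 30,
    industry: 25,
    activity: 20,
    influence: 10,
    location: 10,
    engagement: 5,
  };
  return (Object.keys(caps) as (keyof ProspectSignals)[]).reduce(
    (sum, k) => sum + Math.min(s[k], caps[k]),
    0,
  );
}

// The breakdown shown in the output format below sums to 94:
// 28 + 24 + 20 + 8 + 10 + 4
scoreProspect({role: 28, industry: 24, activity: 20, influence: 8, location: 10, engagement: 4});
```

Capping each sub-score at its weight keeps a single strong signal from inflating the total beyond its intended share.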
## Search Strategy
|
||||
|
||||
1. Use Exa web search with category filters for company and person discovery
|
||||
2. Use X API search for active voices in the target verticals
|
||||
3. Cross-reference to deduplicate and merge profiles
|
||||
4. Score each prospect on the 0-100 scale using the rubric above
|
||||
5. Return the top N prospects sorted by score
|
||||
|
||||
## Output Format
|
||||
|
||||
Return a structured list:
|
||||
|
||||
```
|
||||
PROSPECT #1 (Score: 94)
|
||||
Name: [full name]
|
||||
Handle: @[x_handle]
|
||||
Role: [current title] @ [company]
|
||||
Location: [city]
|
||||
Industry: [vertical match]
|
||||
Recent Signal: [what they posted/did recently that's relevant]
|
||||
Score Breakdown: role=28/30, industry=24/25, activity=20/20, influence=8/10, location=10/10, engagement=4/5
|
||||
```
|
||||
|
||||
## Constraints
|
||||
|
||||
- Do not fabricate profile data. Only report what you can verify from search results.
|
||||
- If a person appears in multiple sources, merge into one entry.
|
||||
- Flag low-confidence scores where data is sparse.
|
||||
43
skills/remotion-video-creation/SKILL.md
Normal file
@@ -0,0 +1,43 @@
---
name: remotion-video-creation
description: Best practices for Remotion - video creation in React. 29 domain-specific rules covering 3D, animations, audio, captions, charts, transitions, and more.
metadata:
  tags: remotion, video, react, animation, composition, three.js, lottie
---

## When to use

Use this skill whenever you are working with Remotion code to obtain the domain-specific knowledge.

## How to use

Read individual rule files for detailed explanations and code examples:

- [rules/3d.md](rules/3d.md) - 3D content in Remotion using Three.js and React Three Fiber
- [rules/animations.md](rules/animations.md) - Fundamental animation skills for Remotion
- [rules/assets.md](rules/assets.md) - Importing images, videos, audio, and fonts into Remotion
- [rules/audio.md](rules/audio.md) - Using audio and sound in Remotion - importing, trimming, volume, speed, pitch
- [rules/calculate-metadata.md](rules/calculate-metadata.md) - Dynamically set composition duration, dimensions, and props
- [rules/can-decode.md](rules/can-decode.md) - Check if a video can be decoded by the browser using Mediabunny
- [rules/charts.md](rules/charts.md) - Chart and data visualization patterns for Remotion
- [rules/compositions.md](rules/compositions.md) - Defining compositions, stills, folders, default props and dynamic metadata
- [rules/display-captions.md](rules/display-captions.md) - Displaying captions in Remotion with TikTok-style pages and word highlighting
- [rules/extract-frames.md](rules/extract-frames.md) - Extract frames from videos at specific timestamps using Mediabunny
- [rules/fonts.md](rules/fonts.md) - Loading Google Fonts and local fonts in Remotion
- [rules/get-audio-duration.md](rules/get-audio-duration.md) - Getting the duration of an audio file in seconds with Mediabunny
- [rules/get-video-dimensions.md](rules/get-video-dimensions.md) - Getting the width and height of a video file with Mediabunny
- [rules/get-video-duration.md](rules/get-video-duration.md) - Getting the duration of a video file in seconds with Mediabunny
- [rules/gifs.md](rules/gifs.md) - Displaying GIFs synchronized with Remotion's timeline
- [rules/images.md](rules/images.md) - Embedding images in Remotion using the Img component
- [rules/import-srt-captions.md](rules/import-srt-captions.md) - Importing .srt subtitle files into Remotion using @remotion/captions
- [rules/lottie.md](rules/lottie.md) - Embedding Lottie animations in Remotion
- [rules/measuring-dom-nodes.md](rules/measuring-dom-nodes.md) - Measuring DOM element dimensions in Remotion
- [rules/measuring-text.md](rules/measuring-text.md) - Measuring text dimensions, fitting text to containers, and checking overflow
- [rules/sequencing.md](rules/sequencing.md) - Sequencing patterns for Remotion - delay, trim, limit duration of items
- [rules/tailwind.md](rules/tailwind.md) - Using TailwindCSS in Remotion
- [rules/text-animations.md](rules/text-animations.md) - Typography and text animation patterns for Remotion
- [rules/timing.md](rules/timing.md) - Interpolation curves in Remotion - linear, easing, spring animations
- [rules/transcribe-captions.md](rules/transcribe-captions.md) - Transcribing audio to generate captions in Remotion
- [rules/transitions.md](rules/transitions.md) - Scene transition patterns for Remotion
- [rules/trimming.md](rules/trimming.md) - Trimming patterns for Remotion - cut the beginning or end of animations
- [rules/videos.md](rules/videos.md) - Embedding videos in Remotion - trimming, volume, speed, looping, pitch
86
skills/remotion-video-creation/rules/3d.md
Normal file
@@ -0,0 +1,86 @@
---
name: 3d
description: 3D content in Remotion using Three.js and React Three Fiber.
metadata:
  tags: 3d, three, threejs
---

# Using Three.js and React Three Fiber in Remotion

Follow React Three Fiber and Three.js best practices.
Only the following Remotion-specific rules need to be followed:

## Prerequisites

First, the `@remotion/three` package needs to be installed.
If it is not, use the following command:

```bash
npx remotion add @remotion/three # If project uses npm
bunx remotion add @remotion/three # If project uses bun
yarn remotion add @remotion/three # If project uses yarn
pnpm exec remotion add @remotion/three # If project uses pnpm
```

## Using ThreeCanvas

You MUST wrap 3D content in `<ThreeCanvas>` and include proper lighting.
`<ThreeCanvas>` MUST have a `width` and `height` prop.

```tsx
import { ThreeCanvas } from "@remotion/three";
import { useVideoConfig } from "remotion";

const { width, height } = useVideoConfig();

<ThreeCanvas width={width} height={height}>
  <ambientLight intensity={0.4} />
  <directionalLight position={[5, 5, 5]} intensity={0.8} />
  <mesh>
    <sphereGeometry args={[1, 32, 32]} />
    <meshStandardMaterial color="red" />
  </mesh>
</ThreeCanvas>
```

## No animations that are not driven by `useCurrentFrame()`

Shaders, models, etc. MUST NOT animate by themselves.
No animations are allowed unless they are driven by `useCurrentFrame()`.
Otherwise, it will cause flickering during rendering.

Using `useFrame()` from `@react-three/fiber` is forbidden.

## Animate using `useCurrentFrame()`

Use `useCurrentFrame()` to perform animations.

```tsx
const frame = useCurrentFrame();
const rotationY = frame * 0.02;

<mesh rotation={[0, rotationY, 0]}>
  <boxGeometry args={[2, 2, 2]} />
  <meshStandardMaterial color="#4a9eff" />
</mesh>
```

## Using `<Sequence>` inside `<ThreeCanvas>`

The `layout` prop of any `<Sequence>` inside a `<ThreeCanvas>` must be set to `none`.

```tsx
import { Sequence } from "remotion";
import { ThreeCanvas } from "@remotion/three";

const { width, height } = useVideoConfig();

<ThreeCanvas width={width} height={height}>
  <Sequence layout="none">
    <mesh>
      <boxGeometry args={[2, 2, 2]} />
      <meshStandardMaterial color="#4a9eff" />
    </mesh>
  </Sequence>
</ThreeCanvas>
```
29
skills/remotion-video-creation/rules/animations.md
Normal file
@@ -0,0 +1,29 @@
---
name: animations
description: Fundamental animation skills for Remotion
metadata:
  tags: animations, transitions, frames, useCurrentFrame
---

All animations MUST be driven by the `useCurrentFrame()` hook.
Write animations in seconds and multiply them by the `fps` value from `useVideoConfig()`.

```tsx
import { interpolate, useCurrentFrame, useVideoConfig } from "remotion";

export const FadeIn = () => {
  const frame = useCurrentFrame();
  const { fps } = useVideoConfig();

  const opacity = interpolate(frame, [0, 2 * fps], [0, 1], {
    extrapolateRight: 'clamp',
  });

  return (
    <div style={{ opacity }}>Hello World!</div>
  );
};
```

CSS transitions or animations are FORBIDDEN - they will not render correctly.
Tailwind animation class names are FORBIDDEN - they will not render correctly.
78
skills/remotion-video-creation/rules/assets.md
Normal file
@@ -0,0 +1,78 @@
---
name: assets
description: Importing images, videos, audio, and fonts into Remotion
metadata:
  tags: assets, staticFile, images, fonts, public
---

# Importing assets in Remotion

## The public folder

Place assets in the `public/` folder at your project root.

## Using staticFile()

You MUST use `staticFile()` to reference files from the `public/` folder:

```tsx
import {Img, staticFile} from 'remotion';

export const MyComposition = () => {
  return <Img src={staticFile('logo.png')} />;
};
```

The function returns an encoded URL that works correctly when deploying to subdirectories.

## Using with components

**Images:**

```tsx
import {Img, staticFile} from 'remotion';

<Img src={staticFile('photo.png')} />;
```

**Videos:**

```tsx
import {Video} from '@remotion/media';
import {staticFile} from 'remotion';

<Video src={staticFile('clip.mp4')} />;
```

**Audio:**

```tsx
import {Audio} from '@remotion/media';
import {staticFile} from 'remotion';

<Audio src={staticFile('music.mp3')} />;
```

**Fonts:**

```tsx
import {staticFile} from 'remotion';

const font = new FontFace('MyFont', `url(${staticFile('font.woff2')})`);
await font.load();
document.fonts.add(font);
```

## Remote URLs

Remote URLs can be used directly without `staticFile()`:

```tsx
<Img src="https://example.com/image.png" />
<Video src="https://remotion.media/video.mp4" />
```

## Important notes

- Remotion components (`<Img>`, `<Video>`, `<Audio>`) ensure assets are fully loaded before rendering
- Special characters in filenames (`#`, `?`, `&`) are automatically encoded
173
skills/remotion-video-creation/rules/assets/charts-bar-chart.tsx
Normal file
@@ -0,0 +1,173 @@
import {loadFont} from '@remotion/google-fonts/Inter';
import React from 'react';
import {AbsoluteFill, spring, useCurrentFrame, useVideoConfig} from 'remotion';

const {fontFamily} = loadFont();

const COLOR_BAR = '#D4AF37';
const COLOR_TEXT = '#ffffff';
const COLOR_MUTED = '#888888';
const COLOR_BG = '#0a0a0a';
const COLOR_AXIS = '#333333';

// Ideal composition size: 1280x720

const Title: React.FC<{children: React.ReactNode}> = ({children}) => (
  <div style={{textAlign: 'center', marginBottom: 40}}>
    <div style={{color: COLOR_TEXT, fontSize: 48, fontWeight: 600}}>
      {children}
    </div>
  </div>
);

const YAxis: React.FC<{steps: number[]; height: number}> = ({
  steps,
  height,
}) => (
  <div
    style={{
      display: 'flex',
      flexDirection: 'column',
      justifyContent: 'space-between',
      height,
      paddingRight: 16,
    }}
  >
    {steps
      .slice()
      .reverse()
      .map((step) => (
        <div
          key={step}
          style={{
            color: COLOR_MUTED,
            fontSize: 20,
            textAlign: 'right',
          }}
        >
          {step.toLocaleString()}
        </div>
      ))}
  </div>
);

const Bar: React.FC<{
  height: number;
  progress: number;
}> = ({height, progress}) => (
  <div
    style={{
      flex: 1,
      display: 'flex',
      flexDirection: 'column',
      justifyContent: 'flex-end',
    }}
  >
    <div
      style={{
        width: '100%',
        height,
        backgroundColor: COLOR_BAR,
        borderRadius: '8px 8px 0 0',
        opacity: progress,
      }}
    />
  </div>
);

const XAxis: React.FC<{
  children: React.ReactNode;
  labels: string[];
  height: number;
}> = ({children, labels, height}) => (
  <div style={{flex: 1, display: 'flex', flexDirection: 'column'}}>
    <div
      style={{
        display: 'flex',
        alignItems: 'flex-end',
        gap: 16,
        height,
        borderLeft: `2px solid ${COLOR_AXIS}`,
        borderBottom: `2px solid ${COLOR_AXIS}`,
        paddingLeft: 16,
      }}
    >
      {children}
    </div>
    <div
      style={{
        display: 'flex',
        gap: 16,
        paddingLeft: 16,
        marginTop: 12,
      }}
    >
      {labels.map((label) => (
        <div
          key={label}
          style={{
            flex: 1,
            textAlign: 'center',
            color: COLOR_MUTED,
            fontSize: 20,
          }}
        >
          {label}
        </div>
      ))}
    </div>
  </div>
);

export const MyAnimation = () => {
  const frame = useCurrentFrame();
  const {fps, height} = useVideoConfig();

  const data = [
    {month: 'Jan', price: 2039},
    {month: 'Mar', price: 2160},
    {month: 'May', price: 2327},
    {month: 'Jul', price: 2426},
    {month: 'Sep', price: 2634},
    {month: 'Nov', price: 2672},
  ];

  const minPrice = 2000;
  const maxPrice = 2800;
  const priceRange = maxPrice - minPrice;
  const chartHeight = height - 280;
  const yAxisSteps = [2000, 2400, 2800];

  return (
    <AbsoluteFill
      style={{
        backgroundColor: COLOR_BG,
        padding: 60,
        display: 'flex',
        flexDirection: 'column',
        fontFamily,
      }}
    >
      <Title>Gold Price 2024</Title>

      <div style={{display: 'flex', flex: 1}}>
        <YAxis steps={yAxisSteps} height={chartHeight} />
        <XAxis height={chartHeight} labels={data.map((d) => d.month)}>
          {data.map((item, i) => {
            const progress = spring({
              frame: frame - i * 5 - 10,
              fps,
              config: {damping: 18, stiffness: 80},
            });

            const barHeight =
              ((item.price - minPrice) / priceRange) * chartHeight * progress;

            return (
              <Bar key={item.month} height={barHeight} progress={progress} />
            );
          })}
        </XAxis>
      </div>
    </AbsoluteFill>
  );
};
@@ -0,0 +1,100 @@
import React from 'react';
import {
  AbsoluteFill,
  interpolate,
  useCurrentFrame,
  useVideoConfig,
} from 'remotion';

const COLOR_BG = '#ffffff';
const COLOR_TEXT = '#000000';
const FULL_TEXT = 'From prompt to motion graphics. This is Remotion.';
const PAUSE_AFTER = 'From prompt to motion graphics.';
const FONT_SIZE = 72;
const FONT_WEIGHT = 700;
const CHAR_FRAMES = 2;
const CURSOR_BLINK_FRAMES = 16;
const PAUSE_SECONDS = 1;

// Ideal composition size: 1280x720

const getTypedText = ({
  frame,
  fullText,
  pauseAfter,
  charFrames,
  pauseFrames,
}: {
  frame: number;
  fullText: string;
  pauseAfter: string;
  charFrames: number;
  pauseFrames: number;
}): string => {
  const pauseIndex = fullText.indexOf(pauseAfter);
  const preLen =
    pauseIndex >= 0 ? pauseIndex + pauseAfter.length : fullText.length;

  let typedChars = 0;
  if (frame < preLen * charFrames) {
    typedChars = Math.floor(frame / charFrames);
  } else if (frame < preLen * charFrames + pauseFrames) {
    typedChars = preLen;
  } else {
    const postPhase = frame - preLen * charFrames - pauseFrames;
    typedChars = Math.min(
      fullText.length,
      preLen + Math.floor(postPhase / charFrames),
    );
  }
  return fullText.slice(0, typedChars);
};

const Cursor: React.FC<{
  frame: number;
  blinkFrames: number;
  symbol?: string;
}> = ({frame, blinkFrames, symbol = '\u258C'}) => {
  const opacity = interpolate(
    frame % blinkFrames,
    [0, blinkFrames / 2, blinkFrames],
    [1, 0, 1],
    {extrapolateLeft: 'clamp', extrapolateRight: 'clamp'},
  );

  return <span style={{opacity}}>{symbol}</span>;
};

export const MyAnimation = () => {
  const frame = useCurrentFrame();
  const {fps} = useVideoConfig();

  const pauseFrames = Math.round(fps * PAUSE_SECONDS);

  const typedText = getTypedText({
    frame,
    fullText: FULL_TEXT,
    pauseAfter: PAUSE_AFTER,
    charFrames: CHAR_FRAMES,
    pauseFrames,
  });

  return (
    <AbsoluteFill
      style={{
        backgroundColor: COLOR_BG,
      }}
    >
      <div
        style={{
          color: COLOR_TEXT,
          fontSize: FONT_SIZE,
          fontWeight: FONT_WEIGHT,
          fontFamily: 'sans-serif',
        }}
      >
        <span>{typedText}</span>
        <Cursor frame={frame} blinkFrames={CURSOR_BLINK_FRAMES} />
      </div>
    </AbsoluteFill>
  );
};
@@ -0,0 +1,108 @@
import {loadFont} from '@remotion/google-fonts/Inter';
import React from 'react';
import {
  AbsoluteFill,
  spring,
  useCurrentFrame,
  useVideoConfig,
} from 'remotion';

/*
 * Highlight a word in a sentence with a spring-animated wipe effect.
 */

// Ideal composition size: 1280x720

const COLOR_BG = '#ffffff';
const COLOR_TEXT = '#000000';
const COLOR_HIGHLIGHT = '#A7C7E7';
const FULL_TEXT = 'This is Remotion.';
const HIGHLIGHT_WORD = 'Remotion';
const FONT_SIZE = 72;
const FONT_WEIGHT = 700;
const HIGHLIGHT_START_FRAME = 30;
const HIGHLIGHT_WIPE_DURATION = 18;

const {fontFamily} = loadFont();

const Highlight: React.FC<{
  word: string;
  color: string;
  delay: number;
  durationInFrames: number;
}> = ({word, color, delay, durationInFrames}) => {
  const frame = useCurrentFrame();
  const {fps} = useVideoConfig();

  const highlightProgress = spring({
    fps,
    frame,
    config: {damping: 200},
    delay,
    durationInFrames,
  });
  const scaleX = Math.max(0, Math.min(1, highlightProgress));

  return (
    <span style={{position: 'relative', display: 'inline-block'}}>
      <span
        style={{
          position: 'absolute',
          left: 0,
          right: 0,
          top: '50%',
          height: '1.05em',
          transform: `translateY(-50%) scaleX(${scaleX})`,
          transformOrigin: 'left center',
          backgroundColor: color,
          borderRadius: '0.18em',
          zIndex: 0,
        }}
      />
      <span style={{position: 'relative', zIndex: 1}}>{word}</span>
    </span>
  );
};

export const MyAnimation = () => {
  const highlightIndex = FULL_TEXT.indexOf(HIGHLIGHT_WORD);
  const hasHighlight = highlightIndex >= 0;
  const preText = hasHighlight ? FULL_TEXT.slice(0, highlightIndex) : FULL_TEXT;
  const postText = hasHighlight
    ? FULL_TEXT.slice(highlightIndex + HIGHLIGHT_WORD.length)
    : '';

  return (
    <AbsoluteFill
      style={{
        backgroundColor: COLOR_BG,
        alignItems: 'center',
        justifyContent: 'center',
        fontFamily,
      }}
    >
      <div
        style={{
          color: COLOR_TEXT,
          fontSize: FONT_SIZE,
          fontWeight: FONT_WEIGHT,
        }}
      >
        {hasHighlight ? (
          <>
            <span>{preText}</span>
            <Highlight
              word={HIGHLIGHT_WORD}
              color={COLOR_HIGHLIGHT}
              delay={HIGHLIGHT_START_FRAME}
              durationInFrames={HIGHLIGHT_WIPE_DURATION}
            />
            <span>{postText}</span>
          </>
        ) : (
          <span>{FULL_TEXT}</span>
        )}
      </div>
    </AbsoluteFill>
  );
};
172
skills/remotion-video-creation/rules/audio.md
Normal file
@@ -0,0 +1,172 @@
---
name: audio
description: Using audio and sound in Remotion - importing, trimming, volume, speed, pitch
metadata:
  tags: audio, media, trim, volume, speed, loop, pitch, mute, sound, sfx
---

# Using audio in Remotion

## Prerequisites

First, the `@remotion/media` package needs to be installed.
If it is not installed, use the following command:

```bash
npx remotion add @remotion/media # If project uses npm
bunx remotion add @remotion/media # If project uses bun
yarn remotion add @remotion/media # If project uses yarn
pnpm exec remotion add @remotion/media # If project uses pnpm
```

## Importing Audio

Use `<Audio>` from `@remotion/media` to add audio to your composition.

```tsx
import { Audio } from "@remotion/media";
import { staticFile } from "remotion";

export const MyComposition = () => {
  return <Audio src={staticFile("audio.mp3")} />;
};
```

Remote URLs are also supported:

```tsx
<Audio src="https://remotion.media/audio.mp3" />
```

By default, audio plays from the start, at full volume and full length.
Multiple audio tracks can be layered by adding multiple `<Audio>` components.

## Trimming

Use `trimBefore` and `trimAfter` to remove portions of the audio. Values are in frames.

```tsx
const { fps } = useVideoConfig();

return (
  <Audio
    src={staticFile("audio.mp3")}
    trimBefore={2 * fps} // Skip the first 2 seconds
    trimAfter={10 * fps} // End at the 10 second mark
  />
);
```

The audio still starts playing at the beginning of the composition - only the specified portion is played.

## Delaying

Wrap the audio in a `<Sequence>` to delay when it starts:

```tsx
import { Sequence, staticFile } from "remotion";
import { Audio } from "@remotion/media";

const { fps } = useVideoConfig();

return (
  <Sequence from={1 * fps}>
    <Audio src={staticFile("audio.mp3")} />
  </Sequence>
);
```

The audio will start playing after 1 second.

## Volume

Set a static volume (0 to 1):

```tsx
<Audio src={staticFile("audio.mp3")} volume={0.5} />
```

Or use a callback for dynamic volume based on the current frame:

```tsx
import { interpolate } from "remotion";

const { fps } = useVideoConfig();

return (
  <Audio
    src={staticFile("audio.mp3")}
    volume={(f) =>
      interpolate(f, [0, 1 * fps], [0, 1], { extrapolateRight: "clamp" })
    }
  />
);
```

The value of `f` starts at 0 when the audio begins to play, not at the composition frame.

## Muting

Use `muted` to silence the audio. It can be set dynamically:

```tsx
const frame = useCurrentFrame();
const { fps } = useVideoConfig();

return (
  <Audio
    src={staticFile("audio.mp3")}
    muted={frame >= 2 * fps && frame <= 4 * fps} // Mute between 2s and 4s
  />
);
```

## Speed

Use `playbackRate` to change the playback speed:

```tsx
<Audio src={staticFile("audio.mp3")} playbackRate={2} /> {/* 2x speed */}
<Audio src={staticFile("audio.mp3")} playbackRate={0.5} /> {/* Half speed */}
```

Reverse playback is not supported.

## Looping

Use `loop` to loop the audio indefinitely:

```tsx
<Audio src={staticFile("audio.mp3")} loop />
```

Use `loopVolumeCurveBehavior` to control how the frame count behaves when looping:

- `"repeat"`: Frame count resets to 0 each loop (default)
- `"extend"`: Frame count continues incrementing

```tsx
<Audio
  src={staticFile("audio.mp3")}
  loop
  loopVolumeCurveBehavior="extend"
  volume={(f) => interpolate(f, [0, 300], [1, 0])} // Fade out over multiple loops
/>
```

## Pitch

Use `toneFrequency` to adjust the pitch without affecting speed. Values range from 0.01 to 2:

```tsx
<Audio
  src={staticFile("audio.mp3")}
  toneFrequency={1.5} // Higher pitch
/>
<Audio
  src={staticFile("audio.mp3")}
  toneFrequency={0.8} // Lower pitch
/>
```

Pitch shifting only works during server-side rendering, not in the Remotion Studio preview or in the `<Player />`.
104
skills/remotion-video-creation/rules/calculate-metadata.md
Normal file
@@ -0,0 +1,104 @@
|
||||
---
|
||||
name: calculate-metadata
|
||||
description: Dynamically set composition duration, dimensions, and props
|
||||
metadata:
|
||||
tags: calculateMetadata, duration, dimensions, props, dynamic
|
||||
---
|
||||
|
||||
# Using calculateMetadata
|
||||
|
||||
Use `calculateMetadata` on a `<Composition>` to dynamically set duration, dimensions, and transform props before rendering.
|
||||
|
||||
```tsx
|
||||
<Composition id="MyComp" component={MyComponent} durationInFrames={300} fps={30} width={1920} height={1080} defaultProps={{videoSrc: 'https://remotion.media/video.mp4'}} calculateMetadata={calculateMetadata} />
|
||||
```
|
||||
|
||||
## Setting duration based on a video
|
||||
|
||||
Use the `getMediaMetadata()` function from the mediabunny/metadata skill to get the video duration:
|
||||
|
||||
```tsx
|
||||
import {CalculateMetadataFunction} from 'remotion';
|
||||
import {getMediaMetadata} from '../get-media-metadata';
|
||||
|
||||
const calculateMetadata: CalculateMetadataFunction<Props> = async ({props}) => {
|
||||
const {durationInSeconds} = await getMediaMetadata(props.videoSrc);
|
||||
|
||||
return {
|
||||
durationInFrames: Math.ceil(durationInSeconds * 30),
|
||||
};
|
||||
};
|
||||
```
|
||||
|
||||
## Matching dimensions of a video
|
||||
|
||||
```tsx
|
||||
const calculateMetadata: CalculateMetadataFunction<Props> = async ({props}) => {
|
||||
const {durationInSeconds, dimensions} = await getMediaMetadata(props.videoSrc);
|
||||
|
  return {
    durationInFrames: Math.ceil(durationInSeconds * 30),
    width: dimensions?.width ?? 1920,
    height: dimensions?.height ?? 1080,
  };
};
```

## Setting duration based on multiple videos

```tsx
const calculateMetadata: CalculateMetadataFunction<Props> = async ({props}) => {
  const metadataPromises = props.videos.map((video) => getMediaMetadata(video.src));
  const allMetadata = await Promise.all(metadataPromises);

  const totalDuration = allMetadata.reduce((sum, meta) => sum + meta.durationInSeconds, 0);

  return {
    durationInFrames: Math.ceil(totalDuration * 30),
  };
};
```

## Setting a default outName

Set the default output filename based on props:

```tsx
const calculateMetadata: CalculateMetadataFunction<Props> = async ({props}) => {
  return {
    defaultOutName: `video-${props.id}.mp4`,
  };
};
```

## Transforming props

Fetch data or transform props before rendering:

```tsx
const calculateMetadata: CalculateMetadataFunction<Props> = async ({props, abortSignal}) => {
  const response = await fetch(props.dataUrl, {signal: abortSignal});
  const data = await response.json();

  return {
    props: {
      ...props,
      fetchedData: data,
    },
  };
};
```

The `abortSignal` cancels stale requests when props change in the Studio.

## Return value

All fields are optional. Returned values override the `<Composition>` props:

- `durationInFrames`: Number of frames
- `width`: Composition width in pixels
- `height`: Composition height in pixels
- `fps`: Frames per second
- `props`: Transformed props passed to the component
- `defaultOutName`: Default output filename
- `defaultCodec`: Default codec for rendering
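A minimal sketch combining several of these fields in one return value; the fps of 30 and the `durationInSeconds`/`id` props are illustrative assumptions, not part of Remotion's API:

```typescript
// Sketch of a calculateMetadata-style function that returns several of the
// fields listed above at once. The 30 fps constant and the props shape are
// assumptions for illustration.
type Props = { id: string; durationInSeconds: number };

const calculateMetadata = ({ props }: { props: Props }) => {
  const fps = 30;
  return {
    durationInFrames: Math.ceil(props.durationInSeconds * fps),
    fps,
    width: 1920,
    height: 1080,
    defaultOutName: `video-${props.id}.mp4`,
  };
};

const meta = calculateMetadata({ props: { id: "abc", durationInSeconds: 3.5 } });
console.log(meta.durationInFrames); // 105
console.log(meta.defaultOutName); // video-abc.mp4
```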
skills/remotion-video-creation/rules/can-decode.md (new file, 75 lines)
@@ -0,0 +1,75 @@
---
name: can-decode
description: Check if a video can be decoded by the browser using Mediabunny
metadata:
  tags: decode, validation, video, audio, compatibility, browser
---

# Checking if a video can be decoded

Use Mediabunny to check if a video can be decoded by the browser before attempting to play it.

## The `canDecode()` function

This function can be copy-pasted into any project.

```tsx
import { Input, ALL_FORMATS, UrlSource } from "mediabunny";

export const canDecode = async (src: string) => {
  const input = new Input({
    formats: ALL_FORMATS,
    source: new UrlSource(src, {
      getRetryDelay: () => null,
    }),
  });

  try {
    await input.getFormat();
  } catch {
    return false;
  }

  const videoTrack = await input.getPrimaryVideoTrack();
  if (videoTrack && !(await videoTrack.canDecode())) {
    return false;
  }

  const audioTrack = await input.getPrimaryAudioTrack();
  if (audioTrack && !(await audioTrack.canDecode())) {
    return false;
  }

  return true;
};
```

## Usage

```tsx
const src = "https://remotion.media/video.mp4";
const isDecodable = await canDecode(src);

if (isDecodable) {
  console.log("Video can be decoded");
} else {
  console.log("Video cannot be decoded by this browser");
}
```

## Using with Blob

For file uploads or drag-and-drop, use `BlobSource`:

```tsx
import { Input, ALL_FORMATS, BlobSource } from "mediabunny";

export const canDecodeBlob = async (blob: Blob) => {
  const input = new Input({
    formats: ALL_FORMATS,
    source: new BlobSource(blob),
  });

  // Same validation logic as above
};
```
skills/remotion-video-creation/rules/charts.md (new file, 58 lines)
@@ -0,0 +1,58 @@
---
name: charts
description: Chart and data visualization patterns for Remotion. Use when creating bar charts, pie charts, histograms, progress bars, or any data-driven animations.
metadata:
  tags: charts, data, visualization, bar-chart, pie-chart, graphs
---

# Charts in Remotion

You can create charts in Remotion using regular React code: HTML and SVG are allowed, as well as D3.js.

## No animations unless powered by `useCurrentFrame()`

Disable all animations from third-party libraries.
They will cause flickering during rendering.
Instead, drive all animations from `useCurrentFrame()`.
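As a sketch of what "driven by `useCurrentFrame()`" means: compute each animated value as a pure function of the frame number. The `linear` helper below is a local stand-in for Remotion's `interpolate()` with clamping, not the real API:

```typescript
// Local stand-in for an interpolate()-style helper with clamped extrapolation.
const linear = (
  frame: number,
  [f0, f1]: [number, number],
  [v0, v1]: [number, number],
) => {
  const t = Math.min(1, Math.max(0, (frame - f0) / (f1 - f0)));
  return v0 + t * (v1 - v0);
};

// Fade in over the first 20 frames. Because the value is a pure function of
// the frame number, every rendered frame is deterministic: no flicker.
const opacityAt = (frame: number) => linear(frame, [0, 20], [0, 1]);

console.log(opacityAt(0)); // 0
console.log(opacityAt(10)); // 0.5
console.log(opacityAt(30)); // 1
```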
## Bar Chart Animations

See [Bar Chart Example](assets/charts/bar-chart.tsx) for a basic example implementation.

### Staggered Bars

You can animate the height of the bars and stagger them like this:

```tsx
const STAGGER_DELAY = 5;
const frame = useCurrentFrame();
const {fps} = useVideoConfig();

const bars = data.map((item, i) => {
  const delay = i * STAGGER_DELAY;
  const height = spring({
    frame,
    fps,
    delay,
    config: {damping: 200},
  });
  return <div style={{height: height * item.value}} />;
});
```

## Pie Chart Animation

Animate segments using stroke-dashoffset, starting from 12 o'clock.

```tsx
const frame = useCurrentFrame();
const {fps} = useVideoConfig();

const progress = interpolate(frame, [0, 100], [0, 1]);

const circumference = 2 * Math.PI * radius;
const segmentLength = (value / total) * circumference;
const offset = interpolate(progress, [0, 1], [segmentLength, 0]);

<circle
  r={radius}
  cx={center}
  cy={center}
  fill="none"
  stroke={color}
  strokeWidth={strokeWidth}
  strokeDasharray={`${segmentLength} ${circumference}`}
  strokeDashoffset={offset}
  transform={`rotate(-90 ${center} ${center})`}
/>;
```
skills/remotion-video-creation/rules/compositions.md (new file, 146 lines)
@@ -0,0 +1,146 @@
---
name: compositions
description: Defining compositions, stills, folders, default props and dynamic metadata
metadata:
  tags: composition, still, folder, props, metadata
---

A `<Composition>` defines the component, width, height, fps and duration of a renderable video.

It is normally placed in the `src/Root.tsx` file.

```tsx
import { Composition } from "remotion";
import { MyComposition } from "./MyComposition";

export const RemotionRoot = () => {
  return (
    <Composition
      id="MyComposition"
      component={MyComposition}
      durationInFrames={100}
      fps={30}
      width={1080}
      height={1080}
    />
  );
};
```

## Default Props

Pass `defaultProps` to provide initial values for your component.
Values must be JSON-serializable (`Date`, `Map`, `Set`, and `staticFile()` are supported).

```tsx
import { Composition } from "remotion";
import { MyComposition, MyCompositionProps } from "./MyComposition";

export const RemotionRoot = () => {
  return (
    <Composition
      id="MyComposition"
      component={MyComposition}
      durationInFrames={100}
      fps={30}
      width={1080}
      height={1080}
      defaultProps={{
        title: "Hello World",
        color: "#ff0000",
      } satisfies MyCompositionProps}
    />
  );
};
```

Use `type` declarations for props rather than `interface` to ensure `defaultProps` type safety.
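A minimal sketch of the `type` + `satisfies` pattern; the prop names are illustrative, and the note about interfaces is an assumption about how Remotion's serializability typing interacts with TypeScript:

```typescript
// Sketch: props declared as a `type` alias work cleanly with `satisfies`.
// (An `interface` may fail serializability checks because interfaces lack
// an implicit index signature -- an assumption worth verifying per setup.)
type MyCompositionProps = {
  title: string;
  color: string;
};

// `satisfies` flags missing or excess keys at compile time while keeping
// the literal's narrow types.
const defaultProps = {
  title: "Hello World",
  color: "#ff0000",
} satisfies MyCompositionProps;

console.log(defaultProps.title); // Hello World
```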
## Folders

Use `<Folder>` to organize compositions in the sidebar.
Folder names can only contain letters, numbers, and hyphens.

```tsx
import { Composition, Folder } from "remotion";

export const RemotionRoot = () => {
  return (
    <>
      <Folder name="Marketing">
        <Composition id="Promo" /* ... */ />
        <Composition id="Ad" /* ... */ />
      </Folder>
      <Folder name="Social">
        <Folder name="Instagram">
          <Composition id="Story" /* ... */ />
          <Composition id="Reel" /* ... */ />
        </Folder>
      </Folder>
    </>
  );
};
```

## Stills

Use `<Still>` for single-frame images. It does not require `durationInFrames` or `fps`.

```tsx
import { Still } from "remotion";
import { Thumbnail } from "./Thumbnail";

export const RemotionRoot = () => {
  return (
    <Still
      id="Thumbnail"
      component={Thumbnail}
      width={1280}
      height={720}
    />
  );
};
```

## Calculate Metadata

Use `calculateMetadata` to make dimensions, duration, or props dynamic based on data.

```tsx
import { Composition, CalculateMetadataFunction } from "remotion";
import { MyComposition, MyCompositionProps } from "./MyComposition";

const calculateMetadata: CalculateMetadataFunction<MyCompositionProps> = async ({
  props,
  abortSignal,
}) => {
  const data = await fetch(`https://api.example.com/video/${props.videoId}`, {
    signal: abortSignal,
  }).then((res) => res.json());

  return {
    durationInFrames: Math.ceil(data.duration * 30),
    props: {
      ...props,
      videoUrl: data.url,
    },
  };
};

export const RemotionRoot = () => {
  return (
    <Composition
      id="MyComposition"
      component={MyComposition}
      durationInFrames={100} // Placeholder, will be overridden
      fps={30}
      width={1080}
      height={1080}
      defaultProps={{ videoId: "abc123" }}
      calculateMetadata={calculateMetadata}
    />
  );
};
```

The function can return `props`, `durationInFrames`, `width`, `height`, `fps`, and codec-related defaults. It runs once before rendering begins.
skills/remotion-video-creation/rules/display-captions.md (new file, 126 lines)
@@ -0,0 +1,126 @@
---
name: display-captions
description: Displaying captions in Remotion with TikTok-style pages and word highlighting
metadata:
  tags: captions, subtitles, display, tiktok, highlight
---

# Displaying captions in Remotion

This guide explains how to display captions in Remotion, assuming you already have captions in the `Caption` format.

## Prerequisites

First, the `@remotion/captions` package needs to be installed.
If it is not installed, use the following command:

```bash
npx remotion add @remotion/captions       # If project uses npm
bunx remotion add @remotion/captions      # If project uses bun
yarn remotion add @remotion/captions      # If project uses yarn
pnpm exec remotion add @remotion/captions # If project uses pnpm
```

## Creating pages

Use `createTikTokStyleCaptions()` to group captions into pages. The `combineTokensWithinMilliseconds` option controls how many words appear at once:

```tsx
import {useMemo} from 'react';
import {createTikTokStyleCaptions} from '@remotion/captions';
import type {Caption} from '@remotion/captions';

// How often captions should switch (in milliseconds)
// Higher values = more words per page
// Lower values = fewer words (more word-by-word)
const SWITCH_CAPTIONS_EVERY_MS = 1200;

const {pages} = useMemo(() => {
  return createTikTokStyleCaptions({
    captions,
    combineTokensWithinMilliseconds: SWITCH_CAPTIONS_EVERY_MS,
  });
}, [captions]);
```

## Rendering with Sequences

Map over the pages and render each one in a `<Sequence>`. Calculate the start frame and duration from the page timing:

```tsx
import {Sequence, useVideoConfig, AbsoluteFill} from 'remotion';
import type {TikTokPage} from '@remotion/captions';

const CaptionedContent: React.FC = () => {
  const {fps} = useVideoConfig();

  return (
    <AbsoluteFill>
      {pages.map((page, index) => {
        const nextPage = pages[index + 1] ?? null;
        const startFrame = (page.startMs / 1000) * fps;
        const endFrame = Math.min(
          nextPage ? (nextPage.startMs / 1000) * fps : Infinity,
          startFrame + (SWITCH_CAPTIONS_EVERY_MS / 1000) * fps,
        );
        const durationInFrames = endFrame - startFrame;

        if (durationInFrames <= 0) {
          return null;
        }

        return (
          <Sequence
            key={index}
            from={startFrame}
            durationInFrames={durationInFrames}
          >
            <CaptionPage page={page} />
          </Sequence>
        );
      })}
    </AbsoluteFill>
  );
};
```

## Word highlighting

A caption page contains `tokens` which you can use to highlight the currently spoken word:

```tsx
import {AbsoluteFill, useCurrentFrame, useVideoConfig} from 'remotion';
import type {TikTokPage} from '@remotion/captions';

const HIGHLIGHT_COLOR = '#39E508';

const CaptionPage: React.FC<{page: TikTokPage}> = ({page}) => {
  const frame = useCurrentFrame();
  const {fps} = useVideoConfig();

  // Current time relative to the start of the sequence
  const currentTimeMs = (frame / fps) * 1000;
  // Convert to absolute time by adding the page start
  const absoluteTimeMs = page.startMs + currentTimeMs;

  return (
    <AbsoluteFill style={{justifyContent: 'center', alignItems: 'center'}}>
      <div style={{fontSize: 80, fontWeight: 'bold', whiteSpace: 'pre'}}>
        {page.tokens.map((token) => {
          const isActive =
            token.fromMs <= absoluteTimeMs && token.toMs > absoluteTimeMs;

          return (
            <span
              key={token.fromMs}
              style={{color: isActive ? HIGHLIGHT_COLOR : 'white'}}
            >
              {token.text}
            </span>
          );
        })}
      </div>
    </AbsoluteFill>
  );
};
```
skills/remotion-video-creation/rules/extract-frames.md (new file, 229 lines)
@@ -0,0 +1,229 @@
---
name: extract-frames
description: Extract frames from videos at specific timestamps using Mediabunny
metadata:
  tags: frames, extract, video, thumbnail, filmstrip, canvas
---

# Extracting frames from videos

Use Mediabunny to extract frames from videos at specific timestamps. This is useful for generating thumbnails, filmstrips, or processing individual frames.

## The `extractFrames()` function

This function can be copy-pasted into any project.

```tsx
import {
  ALL_FORMATS,
  Input,
  UrlSource,
  VideoSample,
  VideoSampleSink,
} from "mediabunny";

type Options = {
  track: { width: number; height: number };
  container: string;
  durationInSeconds: number | null;
};

export type ExtractFramesTimestampsInSecondsFn = (
  options: Options
) => Promise<number[]> | number[];

export type ExtractFramesProps = {
  src: string;
  timestampsInSeconds: number[] | ExtractFramesTimestampsInSecondsFn;
  onVideoSample: (sample: VideoSample) => void;
  signal?: AbortSignal;
};

export async function extractFrames({
  src,
  timestampsInSeconds,
  onVideoSample,
  signal,
}: ExtractFramesProps): Promise<void> {
  using input = new Input({
    formats: ALL_FORMATS,
    source: new UrlSource(src),
  });

  const [durationInSeconds, format, videoTrack] = await Promise.all([
    input.computeDuration(),
    input.getFormat(),
    input.getPrimaryVideoTrack(),
  ]);

  if (!videoTrack) {
    throw new Error("No video track found in the input");
  }

  if (signal?.aborted) {
    throw new Error("Aborted");
  }

  const timestamps =
    typeof timestampsInSeconds === "function"
      ? await timestampsInSeconds({
          track: {
            width: videoTrack.displayWidth,
            height: videoTrack.displayHeight,
          },
          container: format.name,
          durationInSeconds,
        })
      : timestampsInSeconds;

  if (timestamps.length === 0) {
    return;
  }

  if (signal?.aborted) {
    throw new Error("Aborted");
  }

  const sink = new VideoSampleSink(videoTrack);

  for await (using videoSample of sink.samplesAtTimestamps(timestamps)) {
    if (signal?.aborted) {
      break;
    }

    if (!videoSample) {
      continue;
    }

    onVideoSample(videoSample);
  }
}
```

## Basic usage

Extract frames at specific timestamps:

```tsx
await extractFrames({
  src: "https://remotion.media/video.mp4",
  timestampsInSeconds: [0, 1, 2, 3, 4],
  onVideoSample: (sample) => {
    const canvas = document.createElement("canvas");
    canvas.width = sample.displayWidth;
    canvas.height = sample.displayHeight;
    const ctx = canvas.getContext("2d");
    sample.draw(ctx!, 0, 0);
  },
});
```

## Creating a filmstrip

Use a callback function to dynamically calculate timestamps based on video metadata:

```tsx
const canvasWidth = 500;
const canvasHeight = 80;
const fromSeconds = 0;
const toSeconds = 10;

await extractFrames({
  src: "https://remotion.media/video.mp4",
  timestampsInSeconds: async ({ track, durationInSeconds }) => {
    const aspectRatio = track.width / track.height;
    const amountOfFramesFit = Math.ceil(
      canvasWidth / (canvasHeight * aspectRatio)
    );
    const segmentDuration = toSeconds - fromSeconds;
    const timestamps: number[] = [];

    for (let i = 0; i < amountOfFramesFit; i++) {
      timestamps.push(
        fromSeconds + (segmentDuration / amountOfFramesFit) * (i + 0.5)
      );
    }

    return timestamps;
  },
  onVideoSample: (sample) => {
    console.log(`Frame at ${sample.timestamp}s`);

    const canvas = document.createElement("canvas");
    canvas.width = sample.displayWidth;
    canvas.height = sample.displayHeight;
    const ctx = canvas.getContext("2d");
    sample.draw(ctx!, 0, 0);
  },
});
```

## Cancellation with AbortSignal

Cancel frame extraction after a timeout:

```tsx
const controller = new AbortController();

setTimeout(() => controller.abort(), 5000);

try {
  await extractFrames({
    src: "https://remotion.media/video.mp4",
    timestampsInSeconds: [0, 1, 2, 3, 4],
    onVideoSample: (sample) => {
      using frame = sample;
      const canvas = document.createElement("canvas");
      canvas.width = frame.displayWidth;
      canvas.height = frame.displayHeight;
      const ctx = canvas.getContext("2d");
      frame.draw(ctx!, 0, 0);
    },
    signal: controller.signal,
  });

  console.log("Frame extraction complete!");
} catch (error) {
  console.error("Frame extraction was aborted or failed:", error);
}
```

## Timeout with Promise.race

```tsx
const controller = new AbortController();

const timeoutPromise = new Promise<never>((_, reject) => {
  const timeoutId = setTimeout(() => {
    controller.abort();
    reject(new Error("Frame extraction timed out after 10 seconds"));
  }, 10000);

  controller.signal.addEventListener("abort", () => clearTimeout(timeoutId), {
    once: true,
  });
});

try {
  await Promise.race([
    extractFrames({
      src: "https://remotion.media/video.mp4",
      timestampsInSeconds: [0, 1, 2, 3, 4],
      onVideoSample: (sample) => {
        using frame = sample;
        const canvas = document.createElement("canvas");
        canvas.width = frame.displayWidth;
        canvas.height = frame.displayHeight;
        const ctx = canvas.getContext("2d");
        frame.draw(ctx!, 0, 0);
      },
      signal: controller.signal,
    }),
    timeoutPromise,
  ]);

  console.log("Frame extraction complete!");
} catch (error) {
  console.error("Frame extraction was aborted or failed:", error);
}
```
skills/remotion-video-creation/rules/fonts.md (new file, 152 lines)
@@ -0,0 +1,152 @@
---
name: fonts
description: Loading Google Fonts and local fonts in Remotion
metadata:
  tags: fonts, google-fonts, typography, text
---

# Using fonts in Remotion

## Google Fonts with @remotion/google-fonts

The recommended way to use Google Fonts. It's type-safe and automatically blocks rendering until the font is ready.

### Prerequisites

First, the `@remotion/google-fonts` package needs to be installed.
If it is not installed, use the following command:

```bash
npx remotion add @remotion/google-fonts       # If project uses npm
bunx remotion add @remotion/google-fonts      # If project uses bun
yarn remotion add @remotion/google-fonts      # If project uses yarn
pnpm exec remotion add @remotion/google-fonts # If project uses pnpm
```

```tsx
import { loadFont } from "@remotion/google-fonts/Lobster";

const { fontFamily } = loadFont();

export const MyComposition = () => {
  return <div style={{ fontFamily }}>Hello World</div>;
};
```

Preferably, specify only the needed weights and subsets to reduce file size:

```tsx
import { loadFont } from "@remotion/google-fonts/Roboto";

const { fontFamily } = loadFont("normal", {
  weights: ["400", "700"],
  subsets: ["latin"],
});
```

### Waiting for font to load

Use `waitUntilDone()` if you need to know when the font is ready:

```tsx
import { loadFont } from "@remotion/google-fonts/Lobster";

const { fontFamily, waitUntilDone } = loadFont();

await waitUntilDone();
```

## Local fonts with @remotion/fonts

For local font files, use the `@remotion/fonts` package.

### Prerequisites

First, install `@remotion/fonts`:

```bash
npx remotion add @remotion/fonts       # If project uses npm
bunx remotion add @remotion/fonts      # If project uses bun
yarn remotion add @remotion/fonts      # If project uses yarn
pnpm exec remotion add @remotion/fonts # If project uses pnpm
```

### Loading a local font

Place your font file in the `public/` folder and use `loadFont()`:

```tsx
import { loadFont } from "@remotion/fonts";
import { staticFile } from "remotion";

await loadFont({
  family: "MyFont",
  url: staticFile("MyFont-Regular.woff2"),
});

export const MyComposition = () => {
  return <div style={{ fontFamily: "MyFont" }}>Hello World</div>;
};
```

### Loading multiple weights

Load each weight separately with the same family name:

```tsx
import { loadFont } from "@remotion/fonts";
import { staticFile } from "remotion";

await Promise.all([
  loadFont({
    family: "Inter",
    url: staticFile("Inter-Regular.woff2"),
    weight: "400",
  }),
  loadFont({
    family: "Inter",
    url: staticFile("Inter-Bold.woff2"),
    weight: "700",
  }),
]);
```

### Available options

```tsx
loadFont({
  family: "MyFont",              // Required: name to use in CSS
  url: staticFile("font.woff2"), // Required: font file URL
  format: "woff2",               // Optional: auto-detected from extension
  weight: "400",                 // Optional: font weight
  style: "normal",               // Optional: normal or italic
  display: "block",              // Optional: font-display behavior
});
```

## Using in components

Call `loadFont()` at the top level of your component or in a separate file that's imported early:

```tsx
import { loadFont } from "@remotion/google-fonts/Montserrat";

const { fontFamily } = loadFont("normal", {
  weights: ["400", "700"],
  subsets: ["latin"],
});

export const Title: React.FC<{ text: string }> = ({ text }) => {
  return (
    <h1
      style={{
        fontFamily,
        fontSize: 80,
        fontWeight: "bold",
      }}
    >
      {text}
    </h1>
  );
};
```
skills/remotion-video-creation/rules/get-audio-duration.md (new file, 58 lines)
@@ -0,0 +1,58 @@
---
name: get-audio-duration
description: Getting the duration of an audio file in seconds with Mediabunny
metadata:
  tags: duration, audio, length, time, seconds, mp3, wav
---

# Getting audio duration with Mediabunny

Mediabunny can extract the duration of an audio file. It works in browser, Node.js, and Bun environments.

## Getting audio duration

```tsx
import { Input, ALL_FORMATS, UrlSource } from "mediabunny";

export const getAudioDuration = async (src: string) => {
  const input = new Input({
    formats: ALL_FORMATS,
    source: new UrlSource(src, {
      getRetryDelay: () => null,
    }),
  });

  const durationInSeconds = await input.computeDuration();
  return durationInSeconds;
};
```

## Usage

```tsx
const duration = await getAudioDuration("https://remotion.media/audio.mp3");
console.log(duration); // e.g. 180.5 (seconds)
```

## Using with local files

For local files, use `FileSource` instead of `UrlSource`:

```tsx
import { Input, ALL_FORMATS, FileSource } from "mediabunny";

const input = new Input({
  formats: ALL_FORMATS,
  source: new FileSource(file), // File object from input or drag-drop
});

const durationInSeconds = await input.computeDuration();
```

## Using with staticFile in Remotion

```tsx
import { staticFile } from "remotion";

const duration = await getAudioDuration(staticFile("audio.mp3"));
```
skills/remotion-video-creation/rules/get-video-dimensions.md (new file, 68 lines)
@@ -0,0 +1,68 @@
---
name: get-video-dimensions
description: Getting the width and height of a video file with Mediabunny
metadata:
  tags: dimensions, width, height, resolution, size, video
---

# Getting video dimensions with Mediabunny

Mediabunny can extract the width and height of a video file. It works in browser, Node.js, and Bun environments.

## Getting video dimensions

```tsx
import { Input, ALL_FORMATS, UrlSource } from "mediabunny";

export const getVideoDimensions = async (src: string) => {
  const input = new Input({
    formats: ALL_FORMATS,
    source: new UrlSource(src, {
      getRetryDelay: () => null,
    }),
  });

  const videoTrack = await input.getPrimaryVideoTrack();
  if (!videoTrack) {
    throw new Error("No video track found");
  }

  return {
    width: videoTrack.displayWidth,
    height: videoTrack.displayHeight,
  };
};
```

## Usage

```tsx
const dimensions = await getVideoDimensions("https://remotion.media/video.mp4");
console.log(dimensions.width); // e.g. 1920
console.log(dimensions.height); // e.g. 1080
```

## Using with local files

For local files, use `FileSource` instead of `UrlSource`:

```tsx
import { Input, ALL_FORMATS, FileSource } from "mediabunny";

const input = new Input({
  formats: ALL_FORMATS,
  source: new FileSource(file), // File object from input or drag-drop
});

const videoTrack = await input.getPrimaryVideoTrack();
const width = videoTrack.displayWidth;
const height = videoTrack.displayHeight;
```

## Using with staticFile in Remotion

```tsx
import { staticFile } from "remotion";

const dimensions = await getVideoDimensions(staticFile("video.mp4"));
```
skills/remotion-video-creation/rules/get-video-duration.md (new file, 58 lines)
@@ -0,0 +1,58 @@
---
name: get-video-duration
description: Getting the duration of a video file in seconds with Mediabunny
metadata:
  tags: duration, video, length, time, seconds
---

# Getting video duration with Mediabunny

Mediabunny can extract the duration of a video file. It works in browser, Node.js, and Bun environments.

## Getting video duration

```tsx
import { Input, ALL_FORMATS, UrlSource } from "mediabunny";

export const getVideoDuration = async (src: string) => {
  const input = new Input({
    formats: ALL_FORMATS,
    source: new UrlSource(src, {
      getRetryDelay: () => null,
    }),
  });

  const durationInSeconds = await input.computeDuration();
  return durationInSeconds;
};
```

## Usage

```tsx
const duration = await getVideoDuration("https://remotion.media/video.mp4");
console.log(duration); // e.g. 10.5 (seconds)
```

## Using with local files

For local files, use `FileSource` instead of `UrlSource`:

```tsx
import { Input, ALL_FORMATS, FileSource } from "mediabunny";

const input = new Input({
  formats: ALL_FORMATS,
  source: new FileSource(file), // File object from input or drag-drop
});

const durationInSeconds = await input.computeDuration();
```

## Using with staticFile in Remotion

```tsx
import { staticFile } from "remotion";

const duration = await getVideoDuration(staticFile("video.mp4"));
```
skills/remotion-video-creation/rules/gifs.md (new file, 138 lines)
@@ -0,0 +1,138 @@
---
name: gif
description: Displaying GIFs, APNG, AVIF and WebP in Remotion
metadata:
  tags: gif, animation, images, animated, apng, avif, webp
---

# Using animated images in Remotion

## Basic usage

Use `<AnimatedImage>` to display a GIF, APNG, AVIF or WebP image synchronized with Remotion's timeline:

```tsx
import {AnimatedImage, staticFile} from 'remotion';

export const MyComposition = () => {
  return <AnimatedImage src={staticFile('animation.gif')} width={500} height={500} />;
};
```

Remote URLs are also supported (the server must have CORS enabled):

```tsx
<AnimatedImage src="https://example.com/animation.gif" width={500} height={500} />
```

## Sizing and fit

Control how the image fills its container with the `fit` prop:

```tsx
// Stretch to fill (default)
<AnimatedImage src={staticFile("animation.gif")} width={500} height={300} fit="fill" />

// Maintain aspect ratio, fit inside container
<AnimatedImage src={staticFile("animation.gif")} width={500} height={300} fit="contain" />

// Fill container, crop if needed
<AnimatedImage src={staticFile("animation.gif")} width={500} height={300} fit="cover" />
```

## Playback speed

Use `playbackRate` to control the animation speed:

```tsx
<AnimatedImage src={staticFile("animation.gif")} width={500} height={500} playbackRate={2} /> {/* 2x speed */}
<AnimatedImage src={staticFile("animation.gif")} width={500} height={500} playbackRate={0.5} /> {/* Half speed */}
```

## Looping behavior

Control what happens when the animation finishes:

```tsx
// Loop indefinitely (default)
<AnimatedImage src={staticFile("animation.gif")} width={500} height={500} loopBehavior="loop" />

// Play once, show final frame
<AnimatedImage src={staticFile("animation.gif")} width={500} height={500} loopBehavior="pause-after-finish" />

// Play once, then clear canvas
<AnimatedImage src={staticFile("animation.gif")} width={500} height={500} loopBehavior="clear-after-finish" />
```

## Styling

Use the `style` prop for additional CSS (use the `width` and `height` props for sizing):

```tsx
<AnimatedImage
  src={staticFile('animation.gif')}
  width={500}
  height={500}
  style={{
    borderRadius: 20,
    position: 'absolute',
    top: 100,
    left: 50,
  }}
/>
```

## Getting GIF duration

Use `getGifDurationInSeconds()` from `@remotion/gif` to get the duration of a GIF.

```bash
npx remotion add @remotion/gif        # If project uses npm
bunx remotion add @remotion/gif       # If project uses bun
yarn remotion add @remotion/gif       # If project uses yarn
pnpm exec remotion add @remotion/gif  # If project uses pnpm
```

```tsx
import {getGifDurationInSeconds} from '@remotion/gif';
import {staticFile} from 'remotion';

const duration = await getGifDurationInSeconds(staticFile('animation.gif'));
console.log(duration); // e.g. 2.5
```

This is useful for setting the composition duration to match the GIF:

```tsx
import {getGifDurationInSeconds} from '@remotion/gif';
import {staticFile, CalculateMetadataFunction} from 'remotion';

const calculateMetadata: CalculateMetadataFunction = async () => {
  const duration = await getGifDurationInSeconds(staticFile('animation.gif'));
  return {
    durationInFrames: Math.ceil(duration * 30),
  };
};
```
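The `Math.ceil(duration * 30)` above hard-codes 30 fps. The conversion can be made explicit with a small helper; the function name is illustrative, not a Remotion API:

```typescript
// Convert a duration in seconds to whole frames at a given fps.
// Rounding up ensures the tail of the GIF is not cut off.
export const secondsToFrames = (durationInSeconds: number, fps: number): number =>
  Math.ceil(durationInSeconds * fps);
```

Usage in the snippet above would be `durationInFrames: secondsToFrames(duration, 30)`.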
## Alternative

If `<AnimatedImage>` does not work (it is only supported in Chrome and Firefox), use `<Gif>` from `@remotion/gif` instead.

```bash
npx remotion add @remotion/gif        # If project uses npm
bunx remotion add @remotion/gif       # If project uses bun
yarn remotion add @remotion/gif       # If project uses yarn
pnpm exec remotion add @remotion/gif  # If project uses pnpm
```

```tsx
import {Gif} from '@remotion/gif';
import {staticFile} from 'remotion';

export const MyComposition = () => {
  return <Gif src={staticFile('animation.gif')} width={500} height={500} />;
};
```

The `<Gif>` component has the same props as `<AnimatedImage>` but only supports GIF files.
130
skills/remotion-video-creation/rules/images.md
Normal file
@@ -0,0 +1,130 @@
---
name: images
description: Embedding images in Remotion using the <Img> component
metadata:
  tags: images, img, staticFile, png, jpg, svg, webp
---

# Using images in Remotion

## The `<Img>` component

Always use the `<Img>` component from `remotion` to display images:

```tsx
import { Img, staticFile } from "remotion";

export const MyComposition = () => {
  return <Img src={staticFile("photo.png")} />;
};
```

## Important restrictions

**You MUST use the `<Img>` component from `remotion`.** Do not use:

- Native HTML `<img>` elements
- Next.js `<Image>` component
- CSS `background-image`

The `<Img>` component ensures images are fully loaded before a frame is captured, preventing flickering and blank frames during video export.

## Local images with staticFile()

Place images in the `public/` folder and use `staticFile()` to reference them:

```
my-video/
├─ public/
│  ├─ logo.png
│  ├─ avatar.jpg
│  └─ icon.svg
├─ src/
├─ package.json
```

```tsx
import { Img, staticFile } from "remotion";

<Img src={staticFile("logo.png")} />
```

## Remote images

Remote URLs can be used directly without `staticFile()`:

```tsx
<Img src="https://example.com/image.png" />
```

Ensure remote images are served with CORS enabled.

For animated GIFs, use the `<Gif>` component from `@remotion/gif` instead.

## Sizing and positioning

Use the `style` prop to control size and position:

```tsx
<Img
  src={staticFile("photo.png")}
  style={{
    width: 500,
    height: 300,
    position: "absolute",
    top: 100,
    left: 50,
    objectFit: "cover",
  }}
/>
```

## Dynamic image paths

Use template literals for dynamic file references:

```tsx
import { Img, staticFile, useCurrentFrame } from "remotion";

const frame = useCurrentFrame();

// Image sequence
<Img src={staticFile(`frames/frame${frame}.png`)} />

// Selecting based on props
<Img src={staticFile(`avatars/${props.userId}.png`)} />

// Conditional images
<Img src={staticFile(`icons/${isActive ? "active" : "inactive"}.svg`)} />
```

This pattern is useful for:

- Image sequences (frame-by-frame animations)
- User-specific avatars or profile images
- Theme-based icons
- State-dependent graphics

## Getting image dimensions

Use `getImageDimensions()` from `@remotion/media-utils` to get the dimensions of an image:

```tsx
import { getImageDimensions } from "@remotion/media-utils";
import { staticFile } from "remotion";

const { width, height } = await getImageDimensions(staticFile("photo.png"));
```

This is useful for calculating aspect ratios or sizing compositions:

```tsx
import { getImageDimensions } from "@remotion/media-utils";
import { staticFile, CalculateMetadataFunction } from "remotion";

const calculateMetadata: CalculateMetadataFunction = async () => {
  const { width, height } = await getImageDimensions(staticFile("photo.png"));
  return {
    width,
    height,
  };
};
```
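For image sequences, frame numbers are often zero-padded on disk (`frame0001.png`, `frame0002.png`, …). A minimal sketch of such a path builder; the directory layout, file naming, and padding width are assumptions:

```typescript
// Build a zero-padded frame path for an image sequence,
// e.g. frameFile(42) yields "frames/frame0042.png".
export const frameFile = (frame: number, pad: number = 4): string =>
  `frames/frame${String(frame).padStart(pad, "0")}.png`;
```

It would be used as `<Img src={staticFile(frameFile(frame))} />`.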
67
skills/remotion-video-creation/rules/import-srt-captions.md
Normal file
@@ -0,0 +1,67 @@
---
name: import-srt-captions
description: Importing .srt subtitle files into Remotion using @remotion/captions
metadata:
  tags: captions, subtitles, srt, import, parse
---

# Importing .srt subtitles into Remotion

If you have an existing `.srt` subtitle file, you can import it into Remotion using `parseSrt()` from `@remotion/captions`.

## Prerequisites

First, the `@remotion/captions` package needs to be installed.
If it is not installed, use the following command:

```bash
npx remotion add @remotion/captions        # If project uses npm
bunx remotion add @remotion/captions       # If project uses bun
yarn remotion add @remotion/captions       # If project uses yarn
pnpm exec remotion add @remotion/captions  # If project uses pnpm
```

## Reading an .srt file

Use `staticFile()` to reference an `.srt` file in your `public` folder, then fetch and parse it:

```tsx
import {useState, useEffect, useCallback} from 'react';
import {AbsoluteFill, staticFile, useDelayRender} from 'remotion';
import {parseSrt} from '@remotion/captions';
import type {Caption} from '@remotion/captions';

export const MyComponent: React.FC = () => {
  const [captions, setCaptions] = useState<Caption[] | null>(null);
  const {delayRender, continueRender, cancelRender} = useDelayRender();
  const [handle] = useState(() => delayRender());

  const fetchCaptions = useCallback(async () => {
    try {
      const response = await fetch(staticFile('subtitles.srt'));
      const text = await response.text();
      const {captions: parsed} = parseSrt({input: text});
      setCaptions(parsed);
      continueRender(handle);
    } catch (e) {
      cancelRender(e);
    }
  }, [continueRender, cancelRender, handle]);

  useEffect(() => {
    fetchCaptions();
  }, [fetchCaptions]);

  if (!captions) {
    return null;
  }

  return <AbsoluteFill>{/* Use captions here */}</AbsoluteFill>;
};
```

Remote URLs are also supported: you can `fetch()` a remote file via URL instead of using `staticFile()`.

## Using imported captions

Once parsed, the captions are in the `Caption` format and can be used with all `@remotion/captions` utilities.
68
skills/remotion-video-creation/rules/lottie.md
Normal file
@@ -0,0 +1,68 @@
---
name: lottie
description: Embedding Lottie animations in Remotion.
metadata:
  category: Animation
---

# Using Lottie Animations in Remotion

## Prerequisites

First, the `@remotion/lottie` package needs to be installed.
If it is not, use the following command:

```bash
npx remotion add @remotion/lottie        # If project uses npm
bunx remotion add @remotion/lottie       # If project uses bun
yarn remotion add @remotion/lottie       # If project uses yarn
pnpm exec remotion add @remotion/lottie  # If project uses pnpm
```

## Displaying a Lottie file

To import a Lottie animation:

- Fetch the Lottie asset
- Wrap the loading process in `delayRender()` and `continueRender()`
- Save the animation data in a state
- Render the animation using the `Lottie` component from the `@remotion/lottie` package

```tsx
import {Lottie, LottieAnimationData} from '@remotion/lottie';
import {useEffect, useState} from 'react';
import {cancelRender, continueRender, delayRender} from 'remotion';

export const MyAnimation = () => {
  const [handle] = useState(() => delayRender('Loading Lottie animation'));

  const [animationData, setAnimationData] = useState<LottieAnimationData | null>(null);

  useEffect(() => {
    fetch('https://assets4.lottiefiles.com/packages/lf20_zyquagfl.json')
      .then((res) => res.json())
      .then((json) => {
        setAnimationData(json);
        continueRender(handle);
      })
      .catch((err) => {
        cancelRender(err);
      });
  }, [handle]);

  if (!animationData) {
    return null;
  }

  return <Lottie animationData={animationData} />;
};
```

## Styling and animating

`<Lottie>` supports the `style` prop to allow styles and animations:

```tsx
return <Lottie animationData={animationData} style={{width: 400, height: 400}} />;
```
35
skills/remotion-video-creation/rules/measuring-dom-nodes.md
Normal file
@@ -0,0 +1,35 @@
---
name: measuring-dom-nodes
description: Measuring DOM element dimensions in Remotion
metadata:
  tags: measure, layout, dimensions, getBoundingClientRect, scale
---

# Measuring DOM nodes in Remotion

Remotion applies a `scale()` transform to the video container, which skews the values returned by `getBoundingClientRect()`. Use `useCurrentScale()` to correct the measurements.

## Measuring element dimensions

```tsx
import { useCurrentScale } from "remotion";
import { useRef, useEffect, useState } from "react";

export const MyComponent = () => {
  const ref = useRef<HTMLDivElement>(null);
  const scale = useCurrentScale();
  const [dimensions, setDimensions] = useState({ width: 0, height: 0 });

  useEffect(() => {
    if (!ref.current) return;
    const rect = ref.current.getBoundingClientRect();
    setDimensions({
      width: rect.width / scale,
      height: rect.height / scale,
    });
  }, [scale]);

  return <div ref={ref}>Content to measure</div>;
};
```
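The correction above is just a division by the current scale. As a standalone sketch (the helper name is an assumption, not a Remotion API):

```typescript
// Recover the unscaled layout size from a measured bounding rect,
// given the scale factor Remotion applied to the container.
export const unscale = (
  rect: { width: number; height: number },
  scale: number,
): { width: number; height: number } => ({
  width: rect.width / scale,
  height: rect.height / scale,
});
```

For example, a 500x200 element measured inside a 0.5x-scaled preview reports 250x100, and dividing by the scale recovers the true size.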
143
skills/remotion-video-creation/rules/measuring-text.md
Normal file
@@ -0,0 +1,143 @@
---
name: measuring-text
description: Measuring text dimensions, fitting text to containers, and checking overflow
metadata:
  tags: measure, text, layout, dimensions, fitText, fillTextBox
---

# Measuring text in Remotion

## Prerequisites

Install `@remotion/layout-utils` if it is not already installed:

```bash
npx remotion add @remotion/layout-utils        # If project uses npm
bunx remotion add @remotion/layout-utils       # If project uses bun
yarn remotion add @remotion/layout-utils       # If project uses yarn
pnpm exec remotion add @remotion/layout-utils  # If project uses pnpm
```

## Measuring text dimensions

Use `measureText()` to calculate the width and height of text:

```tsx
import { measureText } from "@remotion/layout-utils";

const { width, height } = measureText({
  text: "Hello World",
  fontFamily: "Arial",
  fontSize: 32,
  fontWeight: "bold",
});
```

Results are cached: calling again with the same arguments returns the cached result.

## Fitting text to a width

Use `fitText()` to find the optimal font size for a container:

```tsx
import { fitText } from "@remotion/layout-utils";

const { fontSize } = fitText({
  text: "Hello World",
  withinWidth: 600,
  fontFamily: "Inter",
  fontWeight: "bold",
});

return (
  <div
    style={{
      fontSize: Math.min(fontSize, 80), // Cap at 80px
      fontFamily: "Inter",
      fontWeight: "bold",
    }}
  >
    Hello World
  </div>
);
```

## Checking text overflow

Use `fillTextBox()` to check if text exceeds a box:

```tsx
import { fillTextBox } from "@remotion/layout-utils";

const box = fillTextBox({ maxBoxWidth: 400, maxLines: 3 });

const words = ["Hello", "World", "This", "is", "a", "test"];
for (const word of words) {
  const { exceedsBox } = box.add({
    text: word + " ",
    fontFamily: "Arial",
    fontSize: 24,
  });
  if (exceedsBox) {
    // Text would overflow, handle accordingly
    break;
  }
}
```

## Best practices

**Load fonts first:** Only call measurement functions after fonts are loaded.

```tsx
import { loadFont } from "@remotion/google-fonts/Inter";

const { fontFamily, waitUntilDone } = loadFont("normal", {
  weights: ["400"],
  subsets: ["latin"],
});

waitUntilDone().then(() => {
  // Now safe to measure
  const { width } = measureText({
    text: "Hello",
    fontFamily,
    fontSize: 32,
  });
});
```

**Use validateFontIsLoaded:** Catch font loading issues early:

```tsx
measureText({
  text: "Hello",
  fontFamily: "MyCustomFont",
  fontSize: 32,
  validateFontIsLoaded: true, // Throws if font not loaded
});
```

**Match font properties:** Use the same properties for measurement and rendering:

```tsx
const fontStyle = {
  fontFamily: "Inter",
  fontSize: 32,
  fontWeight: "bold" as const,
  letterSpacing: "0.5px",
};

const { width } = measureText({
  text: "Hello",
  ...fontStyle,
});

return <div style={fontStyle}>Hello</div>;
```

**Avoid padding and border:** Use `outline` instead of `border` to prevent layout differences:

```tsx
<div style={{ outline: "2px solid red" }}>Text</div>
```
106
skills/remotion-video-creation/rules/sequencing.md
Normal file
@@ -0,0 +1,106 @@
---
name: sequencing
description: Sequencing patterns for Remotion - delay, trim, limit duration of items
metadata:
  tags: sequence, series, timing, delay, trim
---

Use `<Sequence>` to delay when an element appears in the timeline.

```tsx
import { Sequence, useVideoConfig } from "remotion";

const { fps } = useVideoConfig();

<Sequence from={1 * fps} durationInFrames={2 * fps} premountFor={1 * fps}>
  <Title />
</Sequence>
<Sequence from={2 * fps} durationInFrames={2 * fps} premountFor={1 * fps}>
  <Subtitle />
</Sequence>
```

By default, `<Sequence>` wraps the component in an absolute fill element.
If the items should not be wrapped, use the `layout` prop:

```tsx
<Sequence layout="none">
  <Title />
</Sequence>
```

## Premounting

Premounting mounts the component in the timeline before it is actually played, giving assets time to load.
Always premount any `<Sequence>`!

```tsx
<Sequence premountFor={1 * fps}>
  <Title />
</Sequence>
```

## Series

Use `<Series>` when elements should play one after another without overlap.

```tsx
import {Series} from 'remotion';

<Series>
  <Series.Sequence durationInFrames={45}>
    <Intro />
  </Series.Sequence>
  <Series.Sequence durationInFrames={60}>
    <MainContent />
  </Series.Sequence>
  <Series.Sequence durationInFrames={30}>
    <Outro />
  </Series.Sequence>
</Series>;
```

As with `<Sequence>`, items in a `<Series.Sequence>` are wrapped in an absolute fill element by default, unless the `layout` prop is set to `none`.

### Series with overlaps

Use a negative `offset` for overlapping sequences:

```tsx
<Series>
  <Series.Sequence durationInFrames={60}>
    <SceneA />
  </Series.Sequence>
  <Series.Sequence offset={-15} durationInFrames={60}>
    {/* Starts 15 frames before SceneA ends */}
    <SceneB />
  </Series.Sequence>
</Series>
```

## Frame References Inside Sequences

Inside a Sequence, `useCurrentFrame()` returns the local frame (starting from 0):

```tsx
<Sequence from={60} durationInFrames={30}>
  <MyComponent />
  {/* Inside MyComponent, useCurrentFrame() returns 0-29, not 60-89 */}
</Sequence>
```

## Nested Sequences

Sequences can be nested for complex timing:

```tsx
<Sequence from={0} durationInFrames={120}>
  <Background />
  <Sequence from={15} durationInFrames={90} layout="none">
    <Title />
  </Sequence>
  <Sequence from={45} durationInFrames={60} layout="none">
    <Subtitle />
  </Sequence>
</Sequence>
```
11
skills/remotion-video-creation/rules/tailwind.md
Normal file
@@ -0,0 +1,11 @@
---
name: tailwind
description: Using TailwindCSS in Remotion.
metadata:
---

You can and should use TailwindCSS in Remotion if TailwindCSS is installed in the project.

Don't use `transition-*` or `animate-*` classes: always animate using the `useCurrentFrame()` hook so that frames render deterministically.

Tailwind must be installed and enabled first in a Remotion project. Fetch https://www.remotion.dev/docs/tailwind using WebFetch for instructions.
20
skills/remotion-video-creation/rules/text-animations.md
Normal file
@@ -0,0 +1,20 @@
---
name: text-animations
description: Typography and text animation patterns for Remotion.
metadata:
  tags: typography, text, typewriter, highlighter pen
---

## Text animations

Based on `useCurrentFrame()`, reveal the string character by character to create a typewriter effect.

## Typewriter Effect

See [Typewriter](assets/text-animations-typewriter.tsx) for an advanced example with a blinking cursor and a pause after the first sentence.

Always use string slicing for typewriter effects. Never use per-character opacity.

## Word Highlighting

See [Word Highlight](assets/text-animations-word-highlight.tsx) for an example of how a word highlight is animated, like with a highlighter pen.
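A minimal sketch of the slicing approach (the helper name and typing speed are assumptions, not part of the bundled asset):

```typescript
// How many characters of the text to reveal at a given frame,
// for a typing speed expressed in characters per second.
export const typedLength = (
  frame: number,
  fps: number,
  charsPerSecond: number,
  fullText: string,
): number => {
  const chars = Math.floor((frame / fps) * charsPerSecond);
  return Math.min(chars, fullText.length);
};

// In a component: fullText.slice(0, typedLength(frame, fps, 15, fullText))
```

Because the result depends only on the frame, the effect renders deterministically during export.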
179
skills/remotion-video-creation/rules/timing.md
Normal file
@@ -0,0 +1,179 @@
---
name: timing
description: Interpolation curves in Remotion - linear, easing, spring animations
metadata:
  tags: spring, bounce, easing, interpolation
---

A simple linear interpolation is done using the `interpolate` function.

```ts title="Going from 0 to 1 over 100 frames"
import {interpolate} from 'remotion';

const opacity = interpolate(frame, [0, 100], [0, 1]);
```

By default, the values are not clamped, so the result can go outside the range [0, 1].
Here is how to clamp them:

```ts title="Going from 0 to 1 over 100 frames with clamping"
const opacity = interpolate(frame, [0, 100], [0, 1], {
  extrapolateRight: 'clamp',
  extrapolateLeft: 'clamp',
});
```
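What clamped interpolation does can be sketched by hand for a single range (illustrative only; the real `interpolate` also supports multiple ranges and easing):

```typescript
// Linear interpolation of v from [inMin, inMax] to [outMin, outMax],
// with the progress clamped to [0, 1] on both ends.
export const lerpClamped = (
  v: number,
  [inMin, inMax]: [number, number],
  [outMin, outMax]: [number, number],
): number => {
  const t = Math.min(1, Math.max(0, (v - inMin) / (inMax - inMin)));
  return outMin + t * (outMax - outMin);
};
```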
## Spring animations

Spring animations have a more natural motion.
They go from 0 to 1 over time.

```ts title="Spring animation from 0 to 1"
import {spring, useCurrentFrame, useVideoConfig} from 'remotion';

const frame = useCurrentFrame();
const {fps} = useVideoConfig();

const scale = spring({
  frame,
  fps,
});
```

### Physical properties

The default configuration is `mass: 1, damping: 10, stiffness: 100`.
This gives the animation a bit of bounce before it settles.

The config can be overwritten like this:

```ts
const scale = spring({
  frame,
  fps,
  config: {damping: 200},
});
```

The recommended configuration for a natural motion without a bounce is `{ damping: 200 }`.

Here are some common configurations:

```tsx
const smooth = {damping: 200}; // Smooth, no bounce (subtle reveals)
const snappy = {damping: 20, stiffness: 200}; // Snappy, minimal bounce (UI elements)
const bouncy = {damping: 8}; // Bouncy entrance (playful animations)
const heavy = {damping: 15, stiffness: 80, mass: 2}; // Heavy, slow, small bounce
```

### Delay

The animation starts immediately by default.
Use the `delay` parameter to delay the animation by a number of frames:

```tsx
const entrance = spring({
  frame,
  fps,
  delay: 20,
});
```

### Duration

A `spring()` has a natural duration based on its physical properties.
To stretch the animation to a specific duration, use the `durationInFrames` parameter:

```tsx
const scale = spring({
  frame,
  fps,
  durationInFrames: 40,
});
```

### Combining spring() with interpolate()

Map the spring output (0-1) to custom ranges:

```tsx
const springProgress = spring({
  frame,
  fps,
});

// Map to rotation
const rotation = interpolate(springProgress, [0, 1], [0, 360]);

<div style={{rotate: rotation + 'deg'}} />;
```

### Adding springs

Springs return plain numbers, so math can be performed on them:

```tsx
const frame = useCurrentFrame();
const {fps, durationInFrames} = useVideoConfig();

const inAnimation = spring({
  frame,
  fps,
});
const outAnimation = spring({
  frame,
  fps,
  durationInFrames: 1 * fps,
  delay: durationInFrames - 1 * fps,
});

const scale = inAnimation - outAnimation;
```

## Easing

Easing can be added to the `interpolate` function:

```ts
import {interpolate, Easing} from 'remotion';

const value1 = interpolate(frame, [0, 100], [0, 1], {
  easing: Easing.inOut(Easing.quad),
  extrapolateLeft: 'clamp',
  extrapolateRight: 'clamp',
});
```

The default easing is `Easing.linear`.
There are three convexities:

- `Easing.in` for starting slow and accelerating
- `Easing.out` for starting fast and slowing down
- `Easing.inOut` for easing on both ends

and curves (sorted from most linear to most curved):

- `Easing.quad`
- `Easing.sin`
- `Easing.exp`
- `Easing.circle`

A convexity and a curve are combined to form an easing function:

```ts
const value1 = interpolate(frame, [0, 100], [0, 1], {
  easing: Easing.inOut(Easing.quad),
  extrapolateLeft: 'clamp',
  extrapolateRight: 'clamp',
});
```

Cubic bezier curves are also supported:

```ts
const value1 = interpolate(frame, [0, 100], [0, 1], {
  easing: Easing.bezier(0.8, 0.22, 0.96, 0.65),
  extrapolateLeft: 'clamp',
  extrapolateRight: 'clamp',
});
```
19
skills/remotion-video-creation/rules/transcribe-captions.md
Normal file
@@ -0,0 +1,19 @@
---
name: transcribe-captions
description: Transcribing audio to generate captions in Remotion
metadata:
  tags: captions, transcribe, whisper, audio, speech-to-text
---

# Transcribing audio

Remotion provides several built-in options for transcribing audio to generate captions:

- `@remotion/install-whisper-cpp` - Transcribe locally on a server using Whisper.cpp. Fast and free, but requires server infrastructure.
  https://remotion.dev/docs/install-whisper-cpp

- `@remotion/whisper-web` - Transcribe in the browser using WebAssembly. No server needed and free, but slower due to WASM overhead.
  https://remotion.dev/docs/whisper-web

- `@remotion/openai-whisper` - Use the OpenAI Whisper API for cloud-based transcription. Fast and no server needed, but requires payment.
  https://remotion.dev/docs/openai-whisper/openai-whisper-api-to-captions
122
skills/remotion-video-creation/rules/transitions.md
Normal file
@@ -0,0 +1,122 @@
|
||||
---
|
||||
name: transitions
|
||||
description: Fullscreen scene transitions for Remotion.
|
||||
metadata:
|
||||
tags: transitions, fade, slide, wipe, scenes
|
||||
---
|
||||
|
||||
## Fullscreen transitions
|
||||
|
||||
Using `<TransitionSeries>` to animate between multiple scenes or clips.
|
||||
This will absolutely position the children.
|
||||
|
||||
## Prerequisites
|
||||
|
||||
First, the @remotion/transitions package needs to be installed.
|
||||
If it is not, use the following command:
|
||||
|
||||
```bash
|
||||
npx remotion add @remotion/transitions # If project uses npm
|
||||
bunx remotion add @remotion/transitions # If project uses bun
|
||||
yarn remotion add @remotion/transitions # If project uses yarn
|
||||
pnpm exec remotion add @remotion/transitions # If project uses pnpm
|
||||
```
|
||||
|
||||
## Example usage
|
||||
|
||||
```tsx
|
||||
import {TransitionSeries, linearTiming} from '@remotion/transitions';
|
||||
import {fade} from '@remotion/transitions/fade';
|
||||
|
||||
<TransitionSeries>
|
||||
<TransitionSeries.Sequence durationInFrames={60}>
|
||||
<SceneA />
|
||||
</TransitionSeries.Sequence>
|
||||
<TransitionSeries.Transition presentation={fade()} timing={linearTiming({durationInFrames: 15})} />
|
||||
<TransitionSeries.Sequence durationInFrames={60}>
|
||||
<SceneB />
|
||||
</TransitionSeries.Sequence>
|
||||
</TransitionSeries>;
|
||||
```
|
||||
|
||||
## Available Transition Types
|
||||
|
||||
Import transitions from their respective modules:
|
||||
|
||||
```tsx
|
||||
import {fade} from '@remotion/transitions/fade';
|
||||
import {slide} from '@remotion/transitions/slide';
|
||||
import {wipe} from '@remotion/transitions/wipe';
|
||||
import {flip} from '@remotion/transitions/flip';
|
||||
import {clockWipe} from '@remotion/transitions/clock-wipe';
|
||||
```
|
||||
|
||||
## Slide Transition with Direction
|
||||
|
||||
Specify slide direction for enter/exit animations.
|
||||
|
||||
```tsx
|
||||
import {slide} from '@remotion/transitions/slide';
|
||||
|
||||
<TransitionSeries.Transition presentation={slide({direction: 'from-left'})} timing={linearTiming({durationInFrames: 20})} />;
|
||||
```
|
||||
|
||||
Directions: `"from-left"`, `"from-right"`, `"from-top"`, `"from-bottom"`
|
||||
|
||||
## Timing Options
|
||||
|
||||
```tsx
|
||||
import {linearTiming, springTiming} from '@remotion/transitions';
|
||||
|
||||
// Linear timing - constant speed
|
||||
linearTiming({durationInFrames: 20});
|
||||
|
||||
// Spring timing - organic motion
|
||||
springTiming({config: {damping: 200}, durationInFrames: 25});
|
||||
```
|
||||
|
||||
## Duration calculation
|
||||
|
||||
Transitions overlap adjacent scenes, so the total composition length is **shorter** than the sum of all sequence durations.
|
||||
|
||||
For example, with two 60-frame sequences and a 15-frame transition:
|
||||
|
||||
- Without transitions: `60 + 60 = 120` frames
|
||||
- With transition: `60 + 60 - 15 = 105` frames
|
||||
|
||||
The transition duration is subtracted because both scenes play simultaneously during the transition.
|
||||
|
||||
### Getting the duration of a transition

Use the `getDurationInFrames()` method on the timing object:

```tsx
import {linearTiming, springTiming} from '@remotion/transitions';

const linearDuration = linearTiming({durationInFrames: 20}).getDurationInFrames({fps: 30});
// Returns 20

const springDuration = springTiming({config: {damping: 200}}).getDurationInFrames({fps: 30});
// Returns a calculated duration based on the spring physics
```

For `springTiming` without an explicit `durationInFrames`, the duration depends on `fps` because it calculates when the spring animation settles.

### Calculating total composition duration

```tsx
import {linearTiming} from '@remotion/transitions';

const scene1Duration = 60;
const scene2Duration = 60;
const scene3Duration = 60;

const timing1 = linearTiming({durationInFrames: 15});
const timing2 = linearTiming({durationInFrames: 20});

const transition1Duration = timing1.getDurationInFrames({fps: 30});
const transition2Duration = timing2.getDurationInFrames({fps: 30});

const totalDuration =
  scene1Duration + scene2Duration + scene3Duration - transition1Duration - transition2Duration;
// 60 + 60 + 60 - 15 - 20 = 145 frames
```

53	skills/remotion-video-creation/rules/trimming.md	Normal file

---
name: trimming
description: Trimming patterns for Remotion - cut the beginning or end of animations
metadata:
  tags: sequence, trim, clip, cut, offset
---

Use `<Sequence>` with a negative `from` value to trim the start of an animation.

## Trim the Beginning

A negative `from` value shifts time backwards, making the animation start partway through:

```tsx
import { Sequence, useVideoConfig } from "remotion";

const { fps } = useVideoConfig();

<Sequence from={-0.5 * fps}>
  <MyAnimation />
</Sequence>
```

At 30 fps, the animation appears 15 frames into its progress - the first 15 frames are trimmed off.
Inside `<MyAnimation>`, `useCurrentFrame()` starts at 15 instead of 0.

## Trim the End

Use `durationInFrames` to unmount content after a specified duration:

```tsx
<Sequence durationInFrames={1.5 * fps}>
  <MyAnimation />
</Sequence>
```

At 30 fps, the animation plays for 45 frames, then the component unmounts.

## Trim and Delay

Nest sequences to both trim the beginning and delay when it appears:

```tsx
<Sequence from={30}>
  <Sequence from={-15}>
    <MyAnimation />
  </Sequence>
</Sequence>
```

The inner sequence trims 15 frames from the start, and the outer sequence delays the result by 30 frames.

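The frame bookkeeping behind these patterns is plain arithmetic. As an illustrative sketch of `<Sequence>` semantics (not Remotion code): inside `<Sequence from={f}>`, `useCurrentFrame()` returns the parent frame minus `f`:

```typescript
// Frame seen by children of <Sequence from={from}>: parentFrame - from.
// A negative `from` therefore makes children start partway through.
function childFrame(parentFrame: number, from: number): number {
  return parentFrame - from;
}

// <Sequence from={-15}>: at parent frame 0 the child already sees frame 15.
console.log(childFrame(0, -15)); // 15

// Nested <Sequence from={30}><Sequence from={-15}>: the inner child becomes
// visible at parent frame 30 and immediately sees frame 15.
console.log(childFrame(childFrame(30, 30), -15)); // 15
```
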
171	skills/remotion-video-creation/rules/videos.md	Normal file

---
name: videos
description: Embedding videos in Remotion - trimming, volume, speed, looping, pitch
metadata:
  tags: video, media, trim, volume, speed, loop, pitch
---

# Using videos in Remotion

## Prerequisites

First, the `@remotion/media` package needs to be installed.
If it is not, use the command for your package manager:

```bash
npx remotion add @remotion/media        # If project uses npm
bunx remotion add @remotion/media       # If project uses bun
yarn remotion add @remotion/media       # If project uses yarn
pnpm exec remotion add @remotion/media  # If project uses pnpm
```

Use `<Video>` from `@remotion/media` to embed videos into your composition.

```tsx
import { Video } from "@remotion/media";
import { staticFile } from "remotion";

export const MyComposition = () => {
  return <Video src={staticFile("video.mp4")} />;
};
```

Remote URLs are also supported:

```tsx
<Video src="https://remotion.media/video.mp4" />
```

## Trimming

Use `trimBefore` and `trimAfter` to remove portions of the video. Values are in frames:

```tsx
import { staticFile, useVideoConfig } from "remotion";
import { Video } from "@remotion/media";

const { fps } = useVideoConfig();

return (
  <Video
    src={staticFile("video.mp4")}
    trimBefore={2 * fps} // Skip the first 2 seconds
    trimAfter={10 * fps} // End at the 10 second mark
  />
);
```

## Delaying

Wrap the video in a `<Sequence>` to delay when it appears:

```tsx
import { Sequence, staticFile, useVideoConfig } from "remotion";
import { Video } from "@remotion/media";

const { fps } = useVideoConfig();

return (
  <Sequence from={1 * fps}>
    <Video src={staticFile("video.mp4")} />
  </Sequence>
);
```

The video will appear after 1 second.

## Sizing and Position

Use the `style` prop to control size and position:

```tsx
<Video
  src={staticFile("video.mp4")}
  style={{
    width: 500,
    height: 300,
    position: "absolute",
    top: 100,
    left: 50,
    objectFit: "cover",
  }}
/>
```

## Volume

Set a static volume (0 to 1):

```tsx
<Video src={staticFile("video.mp4")} volume={0.5} />
```

Or use a callback for dynamic volume based on the current frame:

```tsx
import { interpolate, staticFile, useVideoConfig } from "remotion";
import { Video } from "@remotion/media";

const { fps } = useVideoConfig();

return (
  <Video
    src={staticFile("video.mp4")}
    volume={(f) =>
      interpolate(f, [0, 1 * fps], [0, 1], { extrapolateRight: "clamp" })
    }
  />
);
```

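Numerically, the fade-in callback above is just a clamped linear ramp. A plain TypeScript sketch of what it computes (not Remotion's `interpolate` implementation; clamped at both ends for simplicity):

```typescript
// Linear ramp from 0 to 1 over the first `fps` frames, clamped thereafter -
// what interpolate(f, [0, fps], [0, 1], {extrapolateRight: "clamp"})
// evaluates to for non-negative frames.
function fadeInVolume(frame: number, fps: number): number {
  return Math.min(Math.max(frame / fps, 0), 1);
}

console.log(fadeInVolume(0, 30)); // 0
console.log(fadeInVolume(15, 30)); // 0.5
console.log(fadeInVolume(60, 30)); // 1
```
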
Use `muted` to silence the video entirely:

```tsx
<Video src={staticFile("video.mp4")} muted />
```

## Speed

Use `playbackRate` to change the playback speed:

```tsx
<Video src={staticFile("video.mp4")} playbackRate={2} /> {/* 2x speed */}
<Video src={staticFile("video.mp4")} playbackRate={0.5} /> {/* Half speed */}
```

Reverse playback is not supported.

## Looping

Use `loop` to loop the video indefinitely:

```tsx
<Video src={staticFile("video.mp4")} loop />
```

Use `loopVolumeCurveBehavior` to control how the frame passed to the `volume` callback behaves when looping:

- `"repeat"`: the frame count resets to 0 on each loop
- `"extend"`: the frame count keeps incrementing across loops

```tsx
<Video
  src={staticFile("video.mp4")}
  loop
  loopVolumeCurveBehavior="extend"
  volume={(f) => interpolate(f, [0, 300], [1, 0])} // Fade out over multiple loops
/>
```

## Pitch

Use `toneFrequency` to adjust the pitch without affecting speed. Values range from 0.01 to 2:

```tsx
<Video
  src={staticFile("video.mp4")}
  toneFrequency={1.5} // Higher pitch
/>
<Video
  src={staticFile("video.mp4")}
  toneFrequency={0.8} // Lower pitch
/>
```

Pitch shifting only works during server-side rendering, not in the Remotion Studio preview or in the `<Player />`.