Fragments – Tiny Fragments making your feed full ✨
TL;DR
Fragments is a web app and Chrome extension that helps creators capture, organize, and search video moments, using Mux for video processing and Supabase for the backend. It features AI transcription, GIF previews, and privacy controls, transforming video consumption into a searchable knowledge base.
Key Takeaways
- Fragments enables creators to capture short video clips (up to 60 seconds) with a keyboard shortcut, organizing them into a searchable library with tags and notes.
- The tool leverages Mux for video processing, including AI-powered transcription, thumbnail generation, GIF previews, and secure streaming, enhancing user experience without infrastructure headaches.
- Built with Next.js, Supabase, and Tailwind CSS, it offers features like real-time updates, NSFW detection, public/private sharing, and a neobrutalist UI design for a distinctive aesthetic.
- Privacy is prioritized through Row Level Security in Supabase, ensuring users control access to their fragments, with optional public sharing and age verification for flagged content.
- Future plans include expanding to other browsers, adding AI summaries, collaboration features, and a mobile app to further support creators in building a visual second brain.
Tags
Category Submission: Best Use of Mux
Note: We'd also like our project to be considered under the Show and Tell track. This is a submission for DEV's Worldwide Show and Tell Challenge Presented by Mux.
Participants: @neilblaze & @achalbajpai
Ever lose that perfect video moment because you forgot where you saw it? That frame that could've sparked your next big idea – gone. Your brain isn't a hard drive. That's why we built Fragments! ✨
Video ▶️
– OR –
What we built
Fragments is your visual second brain, built for creators who believe great work is shaped by intentional consumption. It pairs a web app with a Chrome extension so you can capture, organize, and rediscover the video moments that spark insight! Ideas don't appear in isolation – they emerge from moments you notice, save, and revisit over time. In many creative fields, a short video clip can convey more insight than pages of text. ⚡
💡 The whole idea behind Fragments is that great creators aren't just skilled at making – they're careful about what they consume. We built a tool that makes capturing those fleeting "aha!" moments as effortless as a keyboard shortcut. Whether you're a designer spotting a slick animation, a developer watching a tutorial, or a researcher collecting interview clips – Fragments ensures nothing gets lost!
Try it out here: https://fragmentsofmux.vercel.app
How it works
Users install our Chrome extension and sign up via Supabase Auth with Google OAuth. Once authenticated, they can capture any screen content with a simple Alt+Shift shortcut. The recording (max 60 seconds) gets uploaded directly to Mux for processing. Mux handles video storage, streaming, thumbnail generation, and AI-powered transcription. Users add tags, notes, and categories while saving. The dashboard provides a beautiful gallery view with GIF previews, full-text search across titles, tags, and transcripts, and detailed analytics per fragment. Everything syncs in real time via the Supabase Postgres database.
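Under the hood, the extension records with the MediaRecorder API and then PUTs the file straight to a Mux Direct Upload URL, so video bytes never touch our server. Here's a hypothetical sketch of that path (the /api/uploads endpoint name and recording settings are assumptions, not the literal extension code):
// Hypothetical capture -> upload path (endpoint name and settings assumed)
async function captureAndUpload(): Promise<void> {
  // Record the current screen/tab; Fragments caps recordings at 60 seconds
  const stream = await navigator.mediaDevices.getDisplayMedia({ video: true, audio: true });
  const recorder = new MediaRecorder(stream, { mimeType: "video/webm" });
  const chunks: Blob[] = [];
  recorder.ondataavailable = (e) => chunks.push(e.data);
  const stopped = new Promise<void>((resolve) => (recorder.onstop = () => resolve()));

  recorder.start();
  setTimeout(() => recorder.stop(), 60_000); // hard 60-second cap

  await stopped;
  stream.getTracks().forEach((t) => t.stop());
  const blob = new Blob(chunks, { type: "video/webm" });

  // Ask the backend for a Mux Direct Upload URL (created server-side via muxService.createUpload)
  const { uploadUrl } = await fetch("/api/uploads", { method: "POST" }).then((r) => r.json());

  // PUT the recording straight to Mux; our server never sees the video bytes
  await fetch(uploadUrl, { method: "PUT", body: blob });
}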
Codebase / App Repository
- Fragments: github.com/neilblaze/fragments [Open Source]
- Fragments Extension: github.com/Neilblaze/fragments/releases/tag/v1.0.0
Fragments
Fragments is a web app and Chromium extension designed for creators who believe that great work is shaped by intentional consumption. Ideas do not appear in isolation. They emerge from moments you notice, save, and revisit over time. In many creative fields, a short video clip can convey more insight than pages of text.
Fragments helps you capture, organize, and rediscover those moments so they are available when you need them most. Built with Mux and Supabase for DEV's Worldwide Show and Tell Challenge 2025.
Tech Stack
- Next.js 15
- Mux (video processing, streaming, and AI features)
- Supabase (Authentication and Database)
- Tailwind CSS + Shadcn UI
Prerequisites
- Node.js 20 or higher
- Mux account and API tokens
- Supabase project and service role key
Env Configuration
Create a .env file in the root directory with the following variables.
NEXT_PUBLIC_SUPABASE_URL=your_supabase_url
NEXT_PUBLIC_SUPABASE_ANON_KEY=your_supabase_anon_key
SUPABASE_SERVICE_ROLE_KEY=your_supabase_service_role_key
MUX_TOKEN_ID=your_mux_token_id
MUX_TOKEN_SECRET=your_mux_token_secret
MUX_WEBHOOK_SECRET=your_mux_webhook_secret
DASHBOARD_URL=http://localhost:3000
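In the app, these variables get wired into the Mux and Supabase SDK clients roughly like this (a sketch, not the exact file layout):
// Sketch: wiring the env vars into the Mux and Supabase SDK clients (file layout assumed)
import Mux from "@mux/mux-node";
import { createClient } from "@supabase/supabase-js";

// Server-side Mux client used for direct uploads, assets, and webhook handling
export const mux = new Mux({
  tokenId: process.env.MUX_TOKEN_ID!,
  tokenSecret: process.env.MUX_TOKEN_SECRET!,
});

// Service-role Supabase client for server routes and webhook handlers (bypasses RLS)
export const supabaseAdmin = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY!
);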
Database Setup
- Open your Supabase…
Features
- Chrome Extension with invisible capture (Alt+Shift shortcut, max 60s)
- Mux Video Processing for streaming, thumbnails, and GIF previews
- AI-Powered Transcription via Mux's auto-generated captions
- Deep Search across titles, tags, notes, and transcripts (see the query sketch after this list)
- Neobrutalist UI with RetroUI-inspired design (Tailwind + ShadCN)
- Google OAuth via Supabase Authentication
- Public/Private Sharing with NSFW detection and age verification
- Reddit-style Voting (upvotes/downvotes) on community fragments
- View Analytics tracking per fragment
- MP4 Downloads via Mux Static Renditions
- Tag Management with popular tag suggestions
- NSFW Restrictions with NSFW.js on every fragment
- Responsive Design that works on desktop and mobile
- Real-time Updates via Supabase subscriptions
- Privacy-first with Row Level Security (RLS) policies
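To make the Deep Search feature concrete, here's a rough sketch of the kind of query it boils down to (the notes and transcript column names are illustrative assumptions):
// Rough sketch of the deep-search query (column names assumed for illustration)
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!
);

export async function searchFragments(query: string) {
  const pattern = `%${query}%`;
  const { data, error } = await supabase
    .from("fragments")
    .select("*")
    // case-insensitive match against the title, notes, or stored transcript
    // (tags can be matched separately as an array column)
    .or([`title.ilike.${pattern}`, `notes.ilike.${pattern}`, `transcript.ilike.${pattern}`].join(","));
  if (error) throw error;
  return data;
}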
Privacy & Security 🔒
Fragments deals with personal video captures, which can be sensitive. We've implemented Row Level Security (RLS) policies in Supabase, ensuring users can only access their own fragments. Public sharing is explicit and opt-in. NSFW content is detected and requires age verification to view. All API calls are authenticated, and video processing happens securely through Mux's infrastructure.
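Concretely, our API routes talk to Supabase with the signed-in user's token rather than the service role key, so the RLS policies decide which rows come back. A minimal sketch, assuming supabase-js v2:
// Minimal sketch: a per-request client that carries the user's JWT, so RLS applies
import { createClient } from "@supabase/supabase-js";

export function supabaseForUser(accessToken: string) {
  return createClient(
    process.env.NEXT_PUBLIC_SUPABASE_URL!,
    process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!,
    { global: { headers: { Authorization: `Bearer ${accessToken}` } } }
  );
}

// Example: this only ever returns the caller's own private fragments,
// because the RLS policies filter rows against auth.uid() on the server.
// const { data } = await supabaseForUser(token).from("fragments").select("*");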
Background
Here's the thing – creators consume thousands of videos but only a handful of moments truly matter. Those 10-second clips that demonstrate a technique, explain a concept, or inspire an idea. But where do they go?
The problem with current solutions:
Bookmarks get lost, notes lack context, and rewatching entire videos to find that one moment is painful. Creators need a system that captures moments in context, makes them searchable, and surfaces them when relevant. It's 2026, and we're still losing inspiration to forgotten browser tabs.
The core problem is that video content is hard to search and organize. Text notes can be searched, but video moments require watching. Until now.
| Traditional Approach | Fragments Solution |
|---|---|
| 🔴 Bookmarking full videos | 🟢 Capture only the moment that matters |
| 🔴 Text notes without visual context | 🟢 Video clips with searchable transcripts |
| 🔴 Scattered across multiple apps | 🟢 Single organized library |
| 🔴 Can't search video content | 🟢 Full-text search across transcripts |
| 🔴 No way to preview quickly | 🟢 Looping GIF previews in gallery |
| 🔴 No sharing workflow | 🟢 Public/private with community feed |
Fragments transforms passive video consumption into an active, searchable knowledge base. Capture what matters, search by what was said, and build your visual second brain!
Fragments changes the game by making video moments as searchable as text, as organized as notes, and as shareable as links! 💪
Why Mux?
Video processing is make-or-break for a tool like Fragments. Mux gives us everything we need without the headaches.
Mux handles video upload, processing, streaming, thumbnail generation, GIF creation, and AI transcription, all through a single, elegant API. This lets us focus on the user experience instead of video infrastructure.
// Mux Integration in Fragments 🎬
export const muxService = {
async createUpload(corsOrigin?: string) {
const mux = getMux();
const upload = await mux.video.uploads.create({
cors_origin: corsOrigin || "*",
new_asset_settings: {
playback_policy: ["public"],
static_renditions: [{ resolution: "highest" }],
input: [{
generated_subtitles: [{
language_code: "en",
name: "English (auto)",
}],
}],
},
});
return { uploadUrl: upload.url, uploadId: upload.id };
},
getThumbnailUrl(playbackId: string, time = 0) {
return `https://image.mux.com/${playbackId}/thumbnail.png?time=${time}`;
},
getGifUrl(playbackId: string, start = 0, end = 5) {
return `https://image.mux.com/${playbackId}/animated.gif?start=${start}&end=${end}`;
},
};
Mux powers our entire video stack:
- Upload & Processing – Direct uploads from the extension
- Streaming – HLS playback via @mux/mux-player-react
- Thumbnails & GIFs – Instant preview generation
- AI Transcription – Auto-generated captions searchable in our database
- Static Renditions – MP4 downloads for users
- Analytics – View counts and engagement metrics
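On the transcription side, Mux exposes each auto-generated subtitle track as a WebVTT file on stream.mux.com, which we flatten into plain text for search. A simplified sketch (the transcript column and the IDs coming from the video.asset.track.ready webhook are assumptions):
// Simplified sketch: flatten Mux's generated captions into searchable text
// (the `transcript` column and the webhook-derived IDs are assumptions)
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY!
);

export async function storeTranscript(fragmentId: string, playbackId: string, trackId: string) {
  // Mux serves each text track as WebVTT
  const vtt = await fetch(`https://stream.mux.com/${playbackId}/text/${trackId}.vtt`).then((r) => r.text());

  // Drop the WEBVTT header, cue numbers, and timestamps; keep only the spoken lines
  const text = vtt
    .split("\n")
    .filter((line) => line.trim() && !line.startsWith("WEBVTT") && !line.includes("-->") && !/^\d+$/.test(line.trim()))
    .join(" ");

  await supabase.from("fragments").update({ transcript: text }).eq("id", fragmentId);
}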
(Screenshots: Mux dashboard – Data Overview, Assets, Engagement, Metrics, Error Logs, and Views metrics)
We dove deep into Mux's API for webhooks, asset management, and playback customization. Building a video-first app was a learning curve – we had to understand encoding, streaming protocols, and optimal UX for video galleries. Mux made it manageable!
Use of Mux (Additional Prize Category) 🎬
As mentioned before, Fragments utilizes 7 distinct Mux features beyond just video hosting. Here's a deep dive into our implementation:
1. Direct Uploads (Extension → Mux)
We use Mux Direct Uploads to enable our Chrome extension to upload screen recordings directly to Mux without routing through our server.
// Chrome extension uploads directly to Mux
const upload = await mux.video.uploads.create({
cors_origin: "*",
new_asset_settings: {
playback_policy: ["public"],
static_renditions: [{ resolution: "highest" }],
input: [{
generated_subtitles: [{ language_code: "en", name: "English (auto)" }],
}],
},
});
// Extension uses upload.url to PUT video directly
2. Mux Player React
We use @mux/mux-player-react for seamless HLS playback with built-in controls, customizable theming, and analytics tracking.
<MuxPlayer
playbackId={playbackId}
metadata={{
video_id: playbackId,
video_title: title,
player_name: "Fragments Dashboard",
}}
accentColor="#ff6101"
streamType="on-demand"
/>
3. Dynamic Thumbnails
Mux Image API generates thumbnails on-the-fly for our gallery cards:
// Thumbnails at any timestamp
`https://image.mux.com/${playbackId}/thumbnail.png?time=${time}&width=640`
4. Animated GIF Previews
We create looping GIF previews for gallery hover states using Mux's GIF endpoint:
// 5-second looping previews
`https://image.mux.com/${playbackId}/animated.gif?start=0&end=5&fps=15&width=320`
5. AI-Powered Auto-Transcription
We enable auto-generated subtitles during asset creation. The transcripts are searchable in our database:
input: [{
generated_subtitles: [{
language_code: "en",
name: "English (auto)",
}],
}]
6. Static Renditions (MP4 Downloads)
Users can download fragments as MP4 files via Static Renditions:
// Download URL for users
`https://stream.mux.com/${playbackId}/highest.mp4?download=${filename}`
7. Webhooks + NSFW Moderation Pipeline
Our most creative use of Mux! We listen to Mux Webhooks and trigger content moderation:
// Webhook handler: video.asset.ready
case "video.asset.ready": {
// Update fragment status
await supabase.from("fragments").update({
mux_asset_id: assetId,
mux_playback_id: playbackId,
status: "ready",
thumbnail_url: `https://image.mux.com/${playbackId}/thumbnail.png`,
});
// Trigger NSFW detection using Mux thumbnails!
fetch("/api/moderate", {
method: "POST",
body: JSON.stringify({ fragmentId }),
});
}
The NSFW moderation leverages Mux thumbnails – we extract frames at multiple timestamps and run them through OpenAI's omni-moderation-latest model:
// Moderation uses Mux thumbnail API for frame extraction
const thumbnailUrls = [
`https://image.mux.com/${playbackId}/thumbnail.png?time=1&width=640`,
`https://image.mux.com/${playbackId}/thumbnail.png?time=5&width=640`,
`https://image.mux.com/${playbackId}/thumbnail.png?time=10&width=640`,
];
// Each frame is analyzed for NSFW content
const results = await openai.moderations.create({
model: "omni-moderation-latest",
input: [{ type: "image_url", image_url: { url: thumbnailUrl } }],
});
If flagged, the fragment is automatically marked as NSFW and requires age verification to view publicly. Note that after exhausting our OpenAI credits, we rolled this feature back to NSFW.js, which runs an ultra-lightweight on-device MobileNet V2 model (4.2 MB) and works flawlessly!
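For reference, running one of those Mux thumbnails through NSFW.js looks roughly like this – a browser-side sketch with an assumed threshold, not our exact moderation code:
// Browser-side sketch: classify a Mux thumbnail with NSFW.js (threshold is an assumption)
import "@tensorflow/tfjs";
import * as nsfwjs from "nsfwjs";

export async function isProbablyNsfw(playbackId: string, threshold = 0.7): Promise<boolean> {
  const model = await nsfwjs.load(); // loads the small MobileNet-based model

  const img = new Image();
  img.crossOrigin = "anonymous";
  img.src = `https://image.mux.com/${playbackId}/thumbnail.png?time=5&width=640`;
  await img.decode();

  // Predictions are { className, probability } for Drawing / Hentai / Neutral / Porn / Sexy
  const predictions = await model.classify(img);
  return predictions.some(
    (p) => (p.className === "Porn" || p.className === "Hentai") && p.probability >= threshold
  );
}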
Design 🎨
We were heavily inspired by Neobrutalist design – bold borders, stark shadows, and intentional imperfection. Our UI uses RetroUI components combined with Tailwind CSS and ShadCN UI for a distinctive aesthetic.
- Capture: Minimal, invisible UI that doesn't interrupt flow
- Organize: Tags, categories, and notes for context
- Search: Find by what was said, not just titles
- Share: Public community feed with voting
We focused on making the gallery feel like a creative workspace – GIF previews loop automatically, cards have subtle hover effects, and the search experience is fast and intuitive.
CREDITS
- Design Resources: RetroUI, Neobrutalist principles
- Icons: Lucide React, Hugeicons
- Typography: Syne, Space Mono, Geist Mono
Challenges we ran into
Building a screen capture extension + video platform brought some interesting technical challenges.
The biggest headache was getting screen capture to work reliably across different websites and tab contexts. Chrome's Manifest V3 has strict security policies, and coordinating the capture → upload → process → display flow required careful state management.
Real-time video processing meant handling async webhooks from Mux and updating the UI accordingly. We implemented polling for status updates while waiting for transcription to complete.
Performance optimization for the gallery was crucial โ loading dozens of GIF previews without janky scrolling required lazy loading and careful memory management.
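The fix boiled down to only rendering a card's animated GIF while it's actually on screen. An illustrative sketch with IntersectionObserver (not the exact component from the app):
// Illustrative sketch (not the exact component): load a card's GIF only while it's on screen
import { useEffect, useRef, useState } from "react";

export function LazyGifPreview({ gifUrl, posterUrl, alt }: { gifUrl: string; posterUrl: string; alt: string }) {
  const ref = useRef<HTMLImageElement>(null);
  const [visible, setVisible] = useState(false);

  useEffect(() => {
    const el = ref.current;
    if (!el) return;
    // Start loading shortly before the card scrolls into view, drop it when it leaves
    const observer = new IntersectionObserver(([entry]) => setVisible(entry.isIntersecting), {
      rootMargin: "200px",
    });
    observer.observe(el);
    return () => observer.disconnect();
  }, []);

  // Off-screen cards fall back to the static Mux thumbnail, so dozens of GIFs
  // aren't decoding in memory at once
  return <img ref={ref} src={visible ? gifUrl : posterUrl} alt={alt} loading="lazy" />;
}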
We're really proud of creating a capture experience that feels invisible and a gallery that makes rediscovery delightful! :)
What's next? 🚀
Fragments has serious potential to become the go-to visual knowledge base for creators. We've built the foundation, and there's so much more to explore!
What we're building next:
- Browser Integration: Support for Firefox and Edge
- Collections: Group related fragments into themed collections
- AI Summaries: Auto-generated descriptions for each fragment
- Collaboration: Share collections with teams
- Mobile App: Capture from mobile screens
- API Access: Let developers build on top of Fragments
We're excited to expand Mux integration, improve search accuracy, and build a thriving creator community!
End Notes 🙏🏻
Huge thanks to DEV for hosting this challenge, the Mux team for incredible video infrastructure and documentation, and to the open-source community for inspiration!