Bard Group · Venture 01

AI-powered media intelligence that transforms raw footage into a structured, searchable, editorially-aware knowledge base.
The one thing no computer will ever replicate is human creativity. The instinct for story, the feel for a moment, the editorial judgement that turns footage into something people care about. PUCK takes everything else off your plate.
Whether you're producing documentaries, running live events, powering newsrooms, or managing content at scale, PUCK takes on the heavy lifting and gives your people the one thing no amount of talent can buy: time.
Time to think. Time to be creative. Time to tell the story properly.
The Philosophy
Between footage capture and the first creative decision, there are days, sometimes weeks, of grinding administrative work. Logging, transcribing, tagging, reviewing, organising. Essential work. But not creative work.
Without PUCK
Days or weeks of manual footage logging before editing begins
Assistant editors exhausted by admin before they reach creative work
Great moments buried in hours of unreviewed footage
Every new production starts from zero with no institutional memory
Content created channel by channel, each requiring separate passes
With PUCK
Searchable, annotated knowledge base delivered within hours
Editors start cutting with full context on day one
Every significant moment found, ranked, and contextualised automatically
A compounding database that recognises recurring talent, locations, and assets
One ingest feeds every downstream channel simultaneously
"We ran 27 hours of footage through PUCK overnight. By morning, every team member was fully across every frame. That's PUCKing magic." 
What PUCK Does
PUCK operates across the entire media lifecycle, from the moment footage lands on storage to the moment content reaches your audience.
Transcribes all dialogue, analyses every frame across three AI tiers, tracks people and objects across entire productions, and synthesises everything into structured editorial output. Footage is ground truth. PUCK discovers the story from the material itself.
Integrates external data sources (fight results, match statistics, competitor records) and uses multi-channel audio corroboration to surface genuine highlights. Four independent audio channels must agree before a moment is flagged.
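The agreement rule can be sketched roughly as follows. The channel names, time window, and matching test here are illustrative assumptions, not PUCK's actual implementation:

```python
# Sketch of multi-channel audio corroboration: a moment is flagged only
# when all four channels report a peak inside the same time window.
# Channel names, window size, and the scoring rule are assumptions.

def corroborated_moments(channel_peaks, window=2.0, required=4):
    """Return peak times (seconds) that every channel independently agrees on."""
    if len(channel_peaks) < required:
        return []
    reference, *others = channel_peaks.values()
    flagged = []
    for t in reference:
        # every other channel must have a peak within `window` seconds of t
        if all(any(abs(t - u) <= window for u in peaks) for peaks in others):
            flagged.append(t)
    return flagged

peaks = {
    "crowd_mic":  [12.1, 340.5, 612.0],
    "commentary": [12.4, 341.0, 500.2],
    "arena_feed": [11.9, 340.2, 612.3],
    "ref_mic":    [12.0, 340.8],
}
print(corroborated_moments(peaks))  # only the moments all four channels agree on
```

A single noisy channel (a crowd roar, a dropped mic) can't flag a moment on its own, which is what keeps false positives out of the highlight reel.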
The same knowledge base that powers editorial decisions feeds directly into publication workflows. Website updates, social posts, ad creative, LinkedIn content, live broadcast graphics. All generated from a single source of truth.
PUCK in Action
Documentary
e.g. 6-day shoot, 27+ hours of footage, multiple locations and crew
Traditionally, the team waits days, sometimes weeks, before editing can begin. An assistant editor watches every clip, takes notes, builds bins, logs timecodes. Only then can the editor start making creative decisions.
With PUCK, footage lands on the NAS at the end of each shoot day and the pipeline runs overnight. By the next morning, every clip is transcribed, visually analysed, and tagged. People are tracked across days and locations. A structured editorial digest, delivered as a PDF, gives every team member a complete picture of what was shot, what the standout moments are, who appeared where, and what the emerging narrative threads look like. The editor, the producer, the director, the social team. Everyone reads the digest over coffee and is fully across 27 hours of footage in minutes, without watching a single frame.
The editor opens Premiere to find sequences already built, media linked, and the best moments flagged with editorial context. They start cutting with more insight than any human team could have assembled in a week.
Live Sports
e.g. Formula 1: practice, qualifying, race day across a full weekend
A single F1 race weekend generates a staggering volume of content. Onboard cameras, pit lane footage, team radio, press conferences, fan reactions, podium celebrations. All spread across three days. Traditionally, media, marketing, and social teams work in parallel silos, each scrubbing through hours of footage.
PUCK ingests the entire weekend as a single production. The multi-channel audio corroboration system identifies key moments. The overtake into Turn 1, the pit stop that changed the race, the radio message that told the story of a driver's frustration. Vision analysis finds the reaction shots, the team celebrations, the split-second near-miss.
From that single ingest, PUCK generates layered content across every channel simultaneously: a ranked highlight package ready for broadcast; sports-style race articles with career statistics and context; short-form social clips optimised per platform (the decisive overtake for X, a carousel of the top 5 moments for Instagram, a behind-the-scenes narrative for TikTok); a long-form race review for YouTube; a single-driver story arc following their weekend from P15 on Friday to a podium on Sunday; LinkedIn posts for sponsors linking real moments to brand activations; and website results pages updated with data, imagery, and editorial context.
Every team gets a massive head start, working from the same knowledge base, with consistent data and editorial context across every touchpoint.
News & Media
e.g. a modern newsroom producing video-first journalism across broadcast, web, and social, daily
A modern newsroom never stops. Field crews file footage from multiple stories simultaneously. Correspondents send interview rushes from three time zones. A breaking story needs a package for the evening bulletin, a web cut, social clips, and a push notification, all within the hour.
PUCK sits at the centre of the operation as a persistent intelligence layer. Every piece of footage that enters the newsroom, whether it's a two-minute phone clip from a breaking scene or a full day's rushes from an embedded correspondent, is ingested, transcribed, visually analysed, and added to a searchable archive that spans every story the newsroom has ever covered.
For breaking news, PUCK delivers a structured brief within minutes: who is in the frame, what's being said, what the most significant visual moments are, and draft copy for web and social, ready for the desk to verify, sharpen, and publish. The social team gets platform-native cuts. The web team gets an article draft with embedded video markers.
For longer-form work, the compounding database is transformative. That forty-hour investigation backlog? PUCK processes it overnight. By morning, the team has a complete knowledge base: every interview transcribed and cross-referenced, every person tracked, every recurring detail mapped. The journalist can search every word ever spoken on camera and find the exact frame where the document was visible on screen. In seconds, not days.
The entity registry means the newsroom builds institutional memory. A politician from last year's story is recognised automatically in today's footage. A location from an earlier investigation is flagged when it appears again. The archive doesn't just store footage. It understands it.
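A persistent registry of that kind can be sketched as below. In practice recognition would come from face matching rather than a resolved name; the schema and matching rule here are illustrative assumptions:

```python
# Minimal sketch of a persistent entity registry with institutional
# memory across productions. The keying on a resolved name is an
# assumption; real recognition would be visual.

class EntityRegistry:
    def __init__(self):
        self.entities = {}  # name -> list of (production, clip) sightings

    def record(self, name, production, clip):
        """Log a sighting. Returns True if this entity appeared in an
        earlier production, i.e. it is recognised from the archive."""
        seen_before = any(p != production for p, _ in self.entities.get(name, []))
        self.entities.setdefault(name, []).append((production, clip))
        return seen_before

reg = EntityRegistry()
reg.record("Politician A", "story-2024", "clip_091")  # first sighting: new entity
reg.record("Politician A", "story-2025", "clip_014")  # recognised from last year
```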
One ingest. Ten content streams. Zero humans watching footage on fast-forward. What the PUCK.
How It Works
Each phase builds on the previous. The system tracks completion automatically. Processing can be paused, resumed, and monitored from any device.
Footage is scanned from network storage, catalogued with technical metadata, and registered in the database.
Audio is extracted, screened for speech via Voice Activity Detection, and transcribed with full-text search indexing.
Technical metadata is probed (timecodes, creation timestamps) and 1080p proxies are generated for fast local processing.
Clips are classified by content type: interview, action, B-roll, drone. A per-second structural scan extracts face counts, motion levels, shot types, and scene boundaries.
Frames are extracted at a density matching the content: dense for fast action, sparse for static interviews, audio-driven for dialogue.
Three-tier visual analysis: a local model describes every frame; a frame curator scores each against editorial signals; only the 10-25% flagged as editorially significant reach the premium AI tier.
All data unified into multicam timelines, entity-aware block summaries, editorial digests, and export-ready Premiere Pro projects.
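The phases above can be sketched as a resumable, sequential pipeline. The phase names mirror the list; the completion-tracking model is an assumption:

```python
# Sketch of the phased pipeline described above. Each phase builds on
# the previous; completion is tracked so processing can be paused and
# resumed. The state model here is an assumption, not PUCK's internals.

PHASES = [
    "ingest",      # scan storage, catalogue technical metadata
    "transcribe",  # VAD screening, speech-to-text, search indexing
    "proxy",       # probe timecodes, generate 1080p proxies
    "classify",    # content type + per-second structural scan
    "extract",     # content-adaptive frame extraction
    "analyse",     # three-tier visual analysis
    "synthesise",  # timelines, digests, Premiere Pro exports
]

class Production:
    def __init__(self, name):
        self.name = name
        self.done = []  # completed phases, in order

    def run(self, until=None):
        """Run the remaining phases; pass `until` to pause after a phase."""
        for phase in PHASES[len(self.done):]:
            self.done.append(phase)
            if phase == until:
                break  # paused; a later run() resumes from here
        return self.done

p = Production("doc-series")
p.run(until="classify")  # pause mid-pipeline after classification
p.run()                  # resume and finish the remaining phases
```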
The Key Innovation
Every frame gets seen. Every moment gets scored. Only the editorially significant ones get deep analysis. Nothing is missed, and the output is ready in hours, not weeks.
Every second of footage analysed instantly. Face detection, motion, shot classification, scene boundaries.
Natural language descriptions of every extracted frame. Full visual understanding across the entire production.
Only frames crossing editorial significance (reaction shots, emotional shifts, decisive moments) receive deep storytelling-level interpretation.
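The curation step can be sketched as a scoring pass over the fully-described frame tier. The signals, weights, and budget below are illustrative assumptions, not PUCK's real model:

```python
# Sketch of tiered frame curation: every frame gets scored against
# editorial signals, and only the top fraction (the 10-25% in the text
# above) is escalated to the premium model. Weights are assumptions.

def curate(frames, budget=0.25):
    """Rank frames by editorial significance; return the top `budget` fraction."""
    def score(f):
        return (2.0 * f.get("faces", 0)          # people in frame
                + 1.5 * f.get("motion", 0.0)     # action intensity
                + 3.0 * f.get("scene_change", 0))  # narrative turning point
    ranked = sorted(frames, key=score, reverse=True)
    cutoff = max(1, int(len(ranked) * budget))
    return ranked[:cutoff]  # only these frames reach the premium tier

frames = [
    {"id": 1, "faces": 0, "motion": 0.1, "scene_change": 0},
    {"id": 2, "faces": 2, "motion": 0.8, "scene_change": 1},  # reaction shot
    {"id": 3, "faces": 1, "motion": 0.2, "scene_change": 0},
    {"id": 4, "faces": 0, "motion": 0.0, "scene_change": 0},
]
flagged = curate(frames)  # frame 2 wins: faces + motion + scene change
```

The premium spend is capped by the budget, not by the length of the footage, which is what keeps cost flat while coverage stays total.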
Same budget. 10x faster. 5x the output. That's PUCKing clever. 
The Model
The result is not fewer people. It is the same people doing better work, faster, with more insight and more creative freedom than they have ever had.
PUCK: Surfaces key conversations, flags emotional peaks, maps who said what and when
Your team: Finds the moments that serve the narrative, already knowing where they are
PUCK: Presents an editorial brief: reaction shots, energy shifts, visual turning points
Your team: Builds sequences from a position of insight, using shots they never knew existed
PUCK: Delivers structured cast and entity maps: who appears where, how often, alongside whom
Your team: Makes editorial decisions about whose story to tell and how to structure it
PUCK: Delivers highlight packages with editorial rationale for each moment
Your team: Chooses pacing, music, and narrative arc with full creative control
PUCK: Drafts articles, social posts, website content, and ad copy, shaped to each platform
Your team: Reviews, refines, adds editorial voice, and publishes
Market Applications
PUCK is designed for any team that works with footage at scale.
Footage logging in hours not weeks. Searchable knowledge base across multi-day shoots. Entity tracking across series.
Real-time highlight detection. Automated articles and results pages. Post-event content packages.
Rapid processing of field footage. Automated rough-cut assembly. Cross-referencing across stories.
Photo and video library management at scale. Automatic tagging, categorisation, and content generation.
Statistical overlays, graphics data, and narrative context injected into live production.
Season-long tracking. Automated website and social presence. Sponsorship activation material.
Client deliverables accelerated. Assistant editor workload reduced to creative review.
Campaign material from production footage. Platform-optimised ad variants tied to real moments.
Proven in Production
PUCK is processing real footage for real broadcasts, right now.
Documentary
A five-episode documentary series (27+ hours of footage across 6 shooting days) fully processed with transcripts, visual analysis, entity tracking, editorial digests, and Premiere Pro exports.
Live Events
Multiple NHRL combat robotics livestreams (4+ hours each) processed with multi-channel audio hit detection, automated Top 10 highlights, and sports-style articles.
Road Trip Documentary
A five-day road trip documentary processed through vision and editorial synthesis, with entity tracking across locations and days.
Entity Intelligence
A persistent entity registry tracking people, robots, and locations across productions, recognising returning subjects automatically.
Not a concept. Not a pitch deck. Running in production, right now. Oh PUCK, it actually works.
Your team's creativity is irreplaceable. PUCK handles everything that isn't, so they can focus on what matters.
Talk to us about PUCK.
Let humans do what only humans can.
Let PUCK do the rest.