Learn how HeyGen and HubSpot 10x video output, localize content, and use AI avatars to create on-brand campaigns, ads, and personalized videos at scale.
In this session of HeyGen Bootcamp, Holly Xiao (Head of Marketing at HeyGen) and Oscar Eduardo Estrada (AI Video Specialist at HubSpot) walked through a simple but powerful idea:
Video doesn’t have to be a painful production line. With AI, it can be a system you run.
This blog breaks down that session and shows how teams, from scrappy one-person operations to global enterprises, are using HeyGen to scale video marketing without burning out their people or their budgets.
Why video marketing feels so hard right now
Holly started with the reality everyone in marketing is living:
- There’s more demand for content than ever
- Every channel wants video first
- Budgets and headcount? Not so much
According to Gartner, average marketing budgets sit around 7.7% of company revenue and haven’t bounced back to pre-pandemic levels. The mandate from leadership is blunt: “Do more with the same team (or less).”
And video is where that pressure hits hardest.
It’s not just “make a brand video.” It’s:
- Sizzle reels
- Product demos
- Social clips
- Webinars and trainings
- Customer stories
- Personalized lifecycle videos
…often in multiple languages.
Meanwhile, buyers have changed too.
The attention game has flipped
The old playbook: Buy impressions, blast emails, and statistically something works.
The new reality: You don’t win because you shout the loudest. You win because you show up with:
- The most relevant message
- In a format native to the platform
- At the exact moment someone cares
And that format is almost always video.
A few truths Holly highlighted:
- Your buyers are not reading long product pages on their phones at 10 pm
- They are watching:
  - Short explainers
  - Quick demos
  - Snackable tutorials
  - Customer stories
- Usually on mute while multitasking
If you’re not showing up on video frequently, you’re functionally invisible for most of the buyer journey.
So the question isn’t, “Should we do more video?”
You’re already convinced of that.
The real question is: How do we do video at the speed the market expects without blowing up budget, timelines, or sanity?
Why traditional video production doesn’t scale
If you map out a typical video production process, it’s a lot:
- Strategy and scripting
- Creative direction
- Talent and scheduling
- Location and set
- Filming (with all the reshoots and surprises)
- Post-production and motion graphics
- Localization
- Final approvals and distribution
And that’s for one video, in one language.
Even for companies that do have solid production resources, Holly called out three repeating pain points:
- Bottlenecks: One or two “video people” become the choke point for every request
- Latency: Weeks (or months) from idea to final file, which is too slow when competitors can respond in days
- Opportunity cost:
  - Campaigns you never launch because “we don’t have video”
  - Concepts you kill because production is too heavy
  - Channels you underinvest in because you can’t feed them enough content
If you’re a one-person marketing team, it’s even more brutal: it’s you, a ring light, and 27 takes of you tripping over your own script.
Which leads to the real constraint: Demand for video is exponential. Your production capacity is linear.
Working harder won’t close that gap.
Changing the system will.
Rethinking the system: Separating message from filming
The turning point Holly described is simple but powerful:
Instead of asking, “How do we squeeze one more campaign out of this tired process?”
Ask: “What would it take to 10x our video output and speed?”
That sounds impossible if you assume:
- Every video = bespoke shoot
- Every update = reshoot
- Every language = separate production
But if you separate:
- The creative from the filming
- The message from the on-screen talent
Then AI becomes a new kind of production model.
Not a gimmick, just a different way of doing the same job.
That’s where HeyGen comes in.
HeyGen as your “always-on” production layer
Holly summed up HeyGen like this: “Video production for the rest of us.”
HeyGen is built for:
- Teams without blockbuster studio budgets
- Marketers who aren’t camera comfortable
- Orgs where video demand massively outpaces headcount
With HeyGen, you can:
- Create video from scripts, slides, prompts, or existing footage
- Use AI avatars (digital twins or stock) instead of live shoots
- Personalize videos at scale
- Localize into 175+ languages and dialects
- Keep brand control across teams and markets
And you do it all in the browser without a camera or crew.
Let’s break down the core pillars Holly shared.
Pillar 1: Create – Three ways to go from idea to video
HeyGen gives you three main starting points:
1. Type or paste a script
Perfect when you already have written content:
- Email copy
- Blog posts
- Landing page copy
- Internal announcements
You drop your script into AI Studio, pick an avatar, choose a layout, and you’ve got a first cut.
You can also:
- Upload a PDF or slide deck and have HeyGen auto-create a video using your slides as background
- Edit the whole thing scene-by-scene inside AI Studio
Great for:
- Training content
- Product explainers
- Onboarding modules
2. Upload assets (decks, PDFs, existing visuals)
If you already have content you love, you don’t start from zero.
HeyGen can:
- Use your slides as backgrounds
- Pull key points into the script
- Build a video around your existing materials
3. Use Video Agent: prompt to video
Video Agent is HeyGen’s “give me a brief, get a video” flow.
You:
- Describe the video you want (e.g. “30-second TikTok explaining X for creators”)
- Optionally upload brand assets (logos, product shots, screenshots)
- Let Video Agent auto-generate:
  - Script
  - Avatar
  - Layout
  - B-roll
  - Music
From there, you click “edit in studio” and tweak everything: visuals, voice, timing, captions, backgrounds.
Video Agent gets you 80–90% of the way.
AI Studio gets you from “good” to “exactly right.”
Pillar 2: Personalize – Video that behaves like email
Think of HeyGen personalization like email merge fields but for video.
Instead of just: “Hi {{first_name}}”
You can create:
- “Hi {{first_name}}, I saw you signed up from {{company_name}}…”
- “Because you’re in {{industry}}, we picked these 3 use cases for you…”
Tokens can control:
- Spoken lines (what the avatar says)
- On-screen text
- Images or scenes for different segments
All of this can be orchestrated:
- Via API (see the sketch after this list)
- Or through native integrations like HubSpot, where:
  - A CRM trigger (e.g. new signup, demo request, new opportunity)
  - Automatically generates and sends a personalized video
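If you want a feel for what that orchestration looks like in code, here is a minimal Python sketch, assuming a webhook-style trigger hands you a contact record. The HeyGen endpoint path, auth header, and payload fields are placeholders for illustration, not the documented API schema; check HeyGen’s API reference and your HubSpot workflow setup for the real shapes.

```python
import requests

HEYGEN_API_KEY = "YOUR_API_KEY"  # assumption: API-key auth; see HeyGen's API docs

def render_script(template: str, contact: dict) -> str:
    """Fill {{token}} placeholders with CRM fields, like merge fields in email."""
    script = template
    for key, value in contact.items():
        script = script.replace("{{" + key + "}}", value)
    return script

TEMPLATE = (
    "Hi {{first_name}}, I saw you signed up from {{company_name}}. "
    "Because you're in {{industry}}, we picked three use cases for you."
)

def send_personalized_video(contact: dict) -> None:
    script = render_script(TEMPLATE, contact)
    # Hypothetical request: the endpoint path and payload fields below are
    # placeholders, not HeyGen's documented schema.
    response = requests.post(
        "https://api.heygen.com/v2/video/generate",  # placeholder path
        headers={"X-Api-Key": HEYGEN_API_KEY},
        json={"script": script, "avatar_id": "your_avatar_id"},
    )
    response.raise_for_status()
    print("Video job submitted:", response.json())

# Example of what a CRM trigger (e.g. a HubSpot workflow webhook) might pass in:
send_personalized_video({
    "first_name": "Audrey",
    "company_name": "Acme Co",
    "industry": "SaaS",
})
```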
Use cases Holly mentioned:
- Event registration follow-ups (“Hey Audrey, thanks for joining bootcamp…”)
- Lifecycle nurture sequences
- Sales outreach at scale
- Onboarding and expansion campaigns
The big win: Marketing keeps control over messaging and brand, while sales and CS get the benefits of “personal video at scale” without recording a thousand takes.
Pillar 3: Localize – The easiest win to start with
If you’re completely new to AI video, localization is the perfect first experiment.
With HeyGen translation and dubbing, you can:
- Take an existing video
- Translate it into up to 10 languages at once (see the sketch after this list)
- Preserve:
  - The original speaker’s voice characteristics
  - Lip sync (with video dubbing)
  - Timing
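As a rough illustration of how that batch translation could be driven programmatically, here is a minimal Python sketch. The endpoint path, payload fields, and language labels are assumptions for illustration only; the real request shape lives in HeyGen’s API reference, and the same workflow can be run entirely in the UI.

```python
import requests

HEYGEN_API_KEY = "YOUR_API_KEY"  # assumption: API-key auth

SOURCE_VIDEO_URL = "https://example.com/campaign-hero.mp4"  # an already-vetted video
TARGET_LANGUAGES = ["Spanish", "French", "German", "Portuguese", "Japanese"]

def submit_translation_job(video_url: str, language: str) -> str:
    """Submit one dubbing/translation job and return its job id.

    The endpoint path and payload fields below are placeholders for
    illustration; the real request shape is in HeyGen's API reference.
    """
    response = requests.post(
        "https://api.heygen.com/v2/video_translate",  # placeholder path
        headers={"X-Api-Key": HEYGEN_API_KEY},
        json={"video_url": video_url, "output_language": language},
    )
    response.raise_for_status()
    return response.json().get("job_id", "")

if __name__ == "__main__":
    jobs = {lang: submit_translation_job(SOURCE_VIDEO_URL, lang) for lang in TARGET_LANGUAGES}
    for lang, job_id in jobs.items():
        print(f"{lang}: job {job_id} submitted")
```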
Holly shared several examples:
Clara
- The CEO used his digital twin to deliver quarterly updates
- Easy to publish on social and reuse across channels
- Fast to update when metrics or talking points change
Trivago
- Localized a single hero ad into 30 markets
- The same on-screen talent (“Mr. Trivago”) appears to speak each language fluently
- Lip sync and mannerisms remain intact
McDonald’s “Grandma McFlurry” campaign
- Real people spoke from the heart in one language
- HeyGen translated and lip synced those videos into others
- Families could see heartfelt messages in their native language
- The emotional impact stayed, the language changed
HubSpot YouTube localization
Oscar shared how HubSpot:
- Took their best-performing English YouTube videos
- Used HeyGen to localize them into Spanish and other languages
- Combined HeyGen dubbing with in-language editors for thumbnails and motion graphics
It’s a simple formula with huge upside:
- Same content
- More markets
- Better experience than just captions
Real-world marketing use cases Holly highlighted
There are countless ways to use AI video in marketing, but Holly called out a “sweet spot” set for most teams:
- Executive communications: Town halls, investor updates, CEO newsletters, change management.
- Localization: Campaigns, product launches, always-on content across regions.
- Marketing campaigns & social: Performance ads, TikToks, Reels, LinkedIn thought leadership.
- Product demos & onboarding: In-product walkthroughs, user education, customer training.
- Personalized videos: For prospects, customers, partners, or internal stakeholders.
The pattern across all of them: What used to be blocked by camera time, reshoots, or language constraints becomes something you can actually iterate on.
HubSpot’s story: From “one more reshoot” to an AI-first system
The second half of the session was a fireside-style chat between Holly and Oscar from HubSpot.
Here are the biggest takeaways.
1. Video is embedded across HubSpot’s entire funnel
Oscar shared where video shows up today:
- Top of funnel
  - Brand campaigns
  - YouTube channels
  - Social ads
- Middle of funnel
  - Webinars
  - Customer education
  - Academy content
- Bottom of funnel
  - Conversion-focused ads
  - Product walkthroughs
  - Activation and onboarding sequences
Demand keeps rising, especially for:
- Localization
- Self-serve tools so non-video teams can create their own assets
2. Their old workflow looked like everyone else’s
Before HeyGen, a typical video meant:
- Research and scripting
- Talent coordination
- Studio booking
- Filming
- Post-production
- Localization (if budget/time allowed)
It was:
- Rigid – hard to fix things late
- Slow – weeks to months
- Expensive – especially if anything changed
A great example Oscar shared:
- They filmed a big product launch video with a senior executive
- Motion graphics done, edit done, everything ready
- Then the product name changed
- Rescheduling the executive, rebuilding the set, and redoing everything was painful and slow
With HeyGen, the next time this happened, they:
- Created an avatar of the executive
- Updated the script
- Regenerated those lines with the same look and voice
- Avoided a full reshoot
That one “fire drill” became a turning point.
3. Where HeyGen is “business as usual” for HubSpot now
Oscar called out three core use cases that are now standard:
1) Ads at scale
- Previously: 5–15 video variations per campaign
- Now: 100+ ad variants using HeyGen
- They can test:
  - Different avatars
  - Hooks and angles
  - Value props and CTAs
- Result: much faster creative testing and optimization (see the variant sketch after this list)
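For a sense of how a few creative building blocks multiply into 100+ variants, here is a small Python sketch that crosses hooks, value props, CTAs, and avatars into distinct ad scripts. All of the copy and avatar IDs are invented placeholders; the point is the combinatorics, not HubSpot’s actual creative.

```python
from itertools import product

# Placeholder creative inputs; swap in your own hooks, value props, CTAs, and avatars.
hooks = [
    "Still editing videos by hand?",
    "Your competitors ship video daily.",
    "What if one script became 30 ads?",
]
value_props = [
    "launch campaigns in hours, not weeks",
    "localize into new markets overnight",
    "test messaging without booking a studio",
]
ctas = ["Start a free trial", "Book a demo", "See it in action"]
avatars = ["avatar_a", "avatar_b", "avatar_c", "avatar_d"]

variants = [
    {
        "avatar_id": avatar,
        "script": f"{hook} With AI video you can {value_prop}. {cta} today.",
    }
    for hook, value_prop, cta, avatar in product(hooks, value_props, ctas, avatars)
]

# 3 hooks x 3 value props x 3 CTAs x 4 avatars = 108 distinct ad scripts,
# each of which could become a Video Agent brief or an API request.
print(len(variants))           # 108
print(variants[0]["script"])   # inspect one variant
```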
2) Reshoots and updates
- Product names change
- Messaging evolves
- Last-minute legal tweaks appear
Instead of rebooking executives and studio time, they:
- Update the script
- Regenerate lines with the avatar
- Patch into existing edits
3) Localization
- HubSpot serves six main languages: English, French, Portuguese, Spanish, German, Japanese
- Demand for localized video is growing fast
- HeyGen lets them:
  - Localize hero content efficiently
  - Serve global audiences with native audio, not just subtitles
4. How they decide what needs a “real” shoot vs. AI
Oscar’s rule of thumb:
- If the content requires complex physical movement or interaction (e.g. walking across a room, manipulating props, physical comedy), it’s a good candidate for a traditional production
- If it’s primarily talking head, educational, or message-driven, it’s a perfect fit for HeyGen
They also constantly experiment to push the boundary of what AI video can handle and adjust their workflows as capabilities improve.
Handling skepticism: Trust, disclosure, and experiments
Both Holly and Oscar acknowledged: there is skepticism around AI video and avatars.
The key is how you use it.
HubSpot’s approach:
- Treat every new use case as an experiment, not gospel
- A/B test:
  - Different disclosures (“I’m an AI avatar”, lower-third labels, etc.)
  - Different formats (real person vs. avatar)
- Let data—not opinions—decide
Holly added that in HeyGen’s own research, viewers are more open to AI than many marketers assume, as long as you’re transparent.
Brands like Coca-Cola openly note when an ad was created with AI.
That kind of honesty builds trust instead of eroding it.
Practical advice if you’re just getting started
Holly and Oscar left the group with some simple, non-intimidating ways to begin.
1. Start with localization
- Pick one existing video that already performs well
- Translate it into one or two key languages with HeyGen
- Measure:
  - Watch time
  - Completion rates
  - Feedback from local teams
This is usually:
- High value (new markets, better experience)
- Low risk (content is already vetted)
- Fast to ship (minutes instead of weeks)
2. Then test a simple avatar use case
For example:
- A leadership update
- A product announcement
- A short ad or social post
Keep it small, gather feedback, and iterate.
3. Run a small pilot for leadership buy-in
Oscar’s formula:
- Get approval for a pilot, not a permanent change
- Pick a use case with clear business value (e.g. ads, localized content, reshoots).
- Compare (a quick back-of-the-envelope sketch follows this list):
  - Time saved
  - Cost avoided
  - Performance (watch time, CTR, conversions)
- Bring those numbers back to leadership
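To make that comparison concrete, here is a trivial back-of-the-envelope calculation. The hours and costs are purely illustrative assumptions, not HubSpot’s figures; plug in your own pilot measurements.

```python
# Illustrative pilot numbers; replace with your own measurements.
traditional = {"hours": 120, "cost": 15000}  # e.g. scripting, studio, talent, post
ai_workflow = {"hours": 24, "cost": 2500}    # e.g. scripting, avatar generation, edits

time_saved_pct = (1 - ai_workflow["hours"] / traditional["hours"]) * 100
cost_avoided = traditional["cost"] - ai_workflow["cost"]

print(f"Production time saved: {time_saved_pct:.0f}%")  # 80%
print(f"Cost avoided per video: ${cost_avoided:,}")     # $12,500
```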
It’s much easier to get buy-in with:
“We saved 80% of production time and maintained performance”
…than with “We think this might be cool.”
Where to go from here
If you’ve made it this far, you probably recognize yourself in at least one of these realities:
- You’re under pressure to deliver more video with the same (or less) budget
- Your current production process can’t keep up with the demand
- You know you should be localizing and personalizing, but it feels out of reach
HeyGen doesn’t magically make strategy, storytelling, or taste obsolete.
What it does is remove the production drag between:
“We have a great idea” and “We have a ready-to-publish video.”
A simple way to start today:
- Take one existing video you like
- Translate it into another language with HeyGen
- Or: create your first digital twin and record a short internal update
- Or: generate a few ad variants with Video Agent and test them on social
Once you feel how fast that loop is, it’s hard to go back.
Video stops being a bottleneck.
It becomes a system you can actually scale.