Epic Games officially shut down the MetaHuman Creator web application in late 2025. If you've been away from MetaHuman development for a while, the landscape has changed significantly. The browser-based character creation tool that launched MetaHuman into mainstream game development is gone. In its place is a fully integrated in-editor experience inside Unreal Engine 5.7, new cross-engine licensing that opens MetaHuman assets to Blender and Maya for the first time, and a series of technical improvements that remove long-standing frustrations.
This guide covers everything: what happened, why it happened, how to migrate existing MetaHuman projects, the new in-editor workflow, cross-engine use in Blender (including automation with the Blender MCP Server), batch configuration with the Unreal MCP Server, and performance optimization for shipping MetaHumans in actual games.
Let's start with what changed and why.
What Happened to MetaHuman Creator
MetaHuman Creator launched in April 2021 as a browser-based application that let developers create photorealistic digital humans in minutes. It was impressive technology — you could sculpt faces, choose body types, select hairstyles, and download fully rigged characters ready for Unreal Engine.
But the web app had persistent problems:
Workflow friction. Creating a character in the browser, downloading it, importing it into your UE5 project, and then making adjustments was a multi-step process with frequent points of failure. Downloads were large (often 2-4GB per character), the import process was finicky, and any changes required going back to the browser, re-downloading, and re-importing.
Limited customization. The web app offered a preset-based system with blend sliders. You could create convincing characters within the system's range, but pushing beyond that range — unusual proportions, stylized features, non-human modifications — was impossible. Game developers frequently needed to break out of the web app's constraints, which required manual mesh editing that could break the MetaHuman rig.
Dependency on cloud infrastructure. The web app required a constant internet connection, and Epic's servers occasionally had downtime. More critically, the character creation computation happened on Epic's cloud GPUs, which was expensive for Epic to maintain and created capacity issues during usage spikes.
The A-pose limitation. MetaHumans were always created and delivered in A-pose, which caused persistent issues with clothing fit, especially around the shoulders and armpits. The A-pose was a legacy decision from the original MetaHuman pipeline and was one of the most frequently requested changes.
Height and proportion restrictions. The web app imposed strict height and proportion ranges based on photogrammetry reference data. This made sense for photorealistic humans but was restrictive for games that needed larger-than-life heroes, child characters, or stylized proportions.
In mid-2025, Epic announced the transition from the web app to a fully in-editor experience. The web app entered a sunset period, and as of early 2026, it is no longer available. All MetaHuman creation now happens inside Unreal Engine 5.7.
The New In-Editor MetaHuman Workflow in UE5.7
The in-editor MetaHuman tools in UE5.7 are a significant improvement over the web app in every dimension. Here's what's different.
MetaHuman Plugin
The MetaHuman Plugin is now a first-party UE5.7 plugin (enabled by default in new projects). It replaces the separate "MetaHuman Creator" application with tools that live directly in the editor.
To access it: Open the MetaHuman panel from Window → MetaHuman → MetaHuman Editor. This opens a docked panel that provides the full character creation and editing experience within the UE5 editor.
A-Pose Removed
One of the most significant technical changes: MetaHumans in UE5.7 are authored and delivered in a neutral rest pose instead of A-pose. This was a fundamental pipeline change that required re-rigging the base MetaHuman skeleton, but the benefits are substantial:
- Clothing fits correctly by default. No more shoulder distortion or armpit stretching on imported garments.
- Blend shapes work more predictably. The neutral pose provides a more natural starting point for facial and body blend shapes.
- Third-party clothing assets integrate more easily. Most game-ready clothing assets are authored for T-pose or neutral pose, not A-pose. The mismatch was a constant source of friction.
If you have existing MetaHumans in A-pose: The migration tool (covered below) handles the re-posing automatically. Your animation assets remain compatible — the runtime skeleton is unchanged; only the authoring pose changed.
Height and Proportion Restrictions Lifted
The new in-editor tools allow a much wider range of body proportions:
- Height range: 120cm to 240cm (previously 155cm to 195cm)
- Proportion scaling: Individual limb and torso scaling with full rig support
- Non-standard builds: Heroic proportions, child characters, and exaggerated builds are now supported with proper skeletal deformation
This opens MetaHuman to genres and art styles that were previously out of reach. A fantasy RPG with a 7-foot barbarian and a 4-foot halfling can use MetaHuman for both characters, with the same rig and animation compatibility.
Face Sculpting Improvements
The in-editor face sculpting tools go beyond the web app's slider-based approach:
- Direct mesh manipulation. Sculpt the face mesh directly with brush tools, similar to ZBrush or Blender's sculpt mode. The underlying MetaHuman topology is preserved, so the rig continues to work.
- Region-based editing. Select facial regions (forehead, nose bridge, jaw, etc.) and adjust them independently with precise numerical control.
- Asymmetry support. Real faces aren't symmetrical. The new tools allow controlled asymmetry that was difficult to achieve with the web app's bilateral slider approach.
- DNA file editing. The MetaHuman DNA file (which stores the facial rig configuration) is now directly editable in the editor's property panel. Advanced users can modify rig behavior without external tools.
Hair and Groom System
MetaHuman hair in UE5.7 uses an updated groom system:
- Strand-based hair remains the high-quality option, now with improved physics simulation and lower runtime cost (approximately 30% reduction in GPU time compared to the 5.4 implementation).
- Cards-based hair is the performance option, suitable for mid-range hardware and large character counts. The cards generation from strand data has been improved to produce more natural-looking results.
- Auto-LOD for hair. The groom system now generates LODs automatically — strand hair transitions to cards at distance, and cards simplify at further distances. This was previously a manual process, and one that frequently broke.
Clothing System
The new MetaHuman clothing workflow integrates with UE5.7's Chaos Cloth system:
- In-editor clothing simulation preview. See how clothing drapes and moves in the MetaHuman editor, not just in Play mode.
- Clothing preset library. A set of base garments (shirts, pants, jackets, dresses) that can be customized with materials and pattern variations.
- Custom garment import. Import your own clothing meshes from Blender or Maya, and the editor will automatically generate cloth simulation setup based on the mesh topology.
Step-by-Step Migration Guide
If you have existing MetaHuman characters created with the web app, here's how to migrate them to the new in-editor system.
Step 1: Back Up Everything
Before you start, make a complete backup of your project. The migration process modifies MetaHuman assets in place, and while it's been stable in our testing, a backup is essential.
```shell
# Back up Content/MetaHumans/, any Blueprint or Level assets that reference
# MetaHuman characters, and the project's .uproject file.
# Run from the project root; adjust paths to match your project layout.
BACKUP_DIR="../MetaHumanBackup_$(date +%Y%m%d)"
mkdir -p "$BACKUP_DIR"
cp -r Content/MetaHumans "$BACKUP_DIR/"
cp ./*.uproject "$BACKUP_DIR/"
```
Step 2: Update to UE5.7
The in-editor MetaHuman tools require UE5.7. If you're on an earlier version, you'll need to upgrade your project first. This is a standard engine version migration — follow Epic's migration guide for your source version.
Key UE5.7 changes that affect MetaHuman projects:
- The MetaHuman plugin is now `MetaHumanEditor` instead of `MetaHumanCreator`
- The `MetaHumanComponent` class has been refactored — check your C++ code for API changes
- The DNA asset format has been updated to version 3.0
Step 3: Run the MetaHuman Migration Tool
In UE5.7, go to Window → MetaHuman → Migrate Characters. This opens the migration wizard.
What the migration tool does:
- Scans your project for all MetaHuman assets (DNA files, skeletal meshes, groom assets, material instances).
- Converts A-pose to neutral pose. Re-poses the base skeletal mesh and updates all morph targets to use the new rest pose.
- Updates the DNA file to version 3.0 format. This adds new rig properties and updates the facial animation data to work with the new sculpting tools.
- Regenerates LODs. Hair, face, and body LODs are regenerated using the new LOD algorithms.
- Updates material instances. MetaHuman material instances are updated to use the new shader model, which includes improved skin subsurface scattering and eye rendering.
The process takes approximately 2-5 minutes per character depending on complexity. For projects with many MetaHumans, plan accordingly.
Step 4: Verify Animation Compatibility
After migration, your existing animations should continue to work without changes. The runtime skeleton is the same — only the authoring pose changed. However, verify:
- Facial animations: Play through your facial animation sequences and look for any subtle differences in expression shapes. The morph target re-posing can occasionally shift the intensity of certain expressions.
- Body animations: Test full-body animations, especially those with arm movements near the shoulders. The A-pose to neutral-pose change affects the default shoulder position.
- Clothing physics: If you have cloth simulation on existing characters, the new rest pose may require re-tuning constraint values.
Step 5: Re-Export Modified Characters (If Needed)
If you had previously exported MetaHumans to Blender or Maya for custom modifications, you'll need to re-export using the new pipeline (covered in the cross-engine section below). The old FBX export path still works but doesn't include the new DNA format benefits.
Step 6: Test Performance
The new MetaHuman shader model and LOD system may change your performance profile. Run your existing performance tests and compare:
- GPU time for character rendering (should be roughly equivalent or slightly better)
- Hair simulation cost (should be ~30% lower for strand hair)
- Memory usage (new LOD system may use more memory for LOD storage but less for runtime rendering)
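A small script makes this before/after comparison systematic. This is a sketch with illustrative metric names and a hypothetical 10% tolerance, not a real UE5.7 API — feed it whatever numbers your profiling captures produce:

```python
# Compare pre- and post-migration benchmark captures and flag regressions.
# Metric names ("gpu_ms", etc.) and the tolerance are illustrative assumptions.

def compare_benchmarks(before, after, tolerance_pct=10.0):
    """Return (metric, pct_change) pairs that regressed by more than
    tolerance_pct. Positive change = got worse (higher cost)."""
    regressions = []
    for metric, old_value in before.items():
        new_value = after.get(metric)
        if new_value is None or old_value == 0:
            continue
        pct_change = (new_value - old_value) / old_value * 100.0
        if pct_change > tolerance_pct:
            regressions.append((metric, round(pct_change, 1)))
    return regressions

# Example capture: GPU ms, strand hair ms, memory MB for one character
before = {"gpu_ms": 6.2, "hair_ms": 1.8, "memory_mb": 310}
after  = {"gpu_ms": 6.0, "hair_ms": 1.3, "memory_mb": 350}
print(compare_benchmarks(before, after))  # memory grew ~12.9% -> flagged
```

Run it against each character configuration you benchmarked; a flagged memory increase is expected (per the LOD storage note above), but a flagged GPU or hair regression warrants investigation.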
Cross-Engine Licensing: MetaHuman in Blender and Maya
This is one of the most significant changes in the MetaHuman ecosystem. Epic has introduced cross-engine licensing that allows MetaHuman assets to be used in Blender 4.2+ and Maya 2025+ for the first time.
What's Allowed
Under the new licensing terms:
- Modification in DCC tools. You can export MetaHuman characters to Blender or Maya for custom modifications, including mesh editing, texture painting, custom blend shapes, and clothing creation.
- Rendering in DCC tools. MetaHuman characters can be rendered in Blender's Cycles/EEVEE or Maya's Arnold for cinematics, promotional material, or film/TV production.
- Non-game commercial use. MetaHuman characters can now be used in architectural visualization, product visualization, and film production through DCC tools.
What's Not Allowed
- Redistribution of MetaHuman source assets (DNA files, base mesh topology) outside of compiled game builds or rendered output.
- Use in competing real-time engines (you can't import MetaHumans into Unity or Godot for a shipped game — the cross-engine license covers DCC tools, not other game engines).
- Modification that removes the MetaHuman identity (you can't extract the underlying technology or rig system for use in a non-MetaHuman pipeline).
MetaHuman in Blender: Setup and Workflow
The MetaHuman Blender plugin (available from Epic's developer portal) enables import and export of MetaHuman characters in Blender 4.2 and newer, including Blender 5.0.
Installation:
- Download the MetaHuman Blender addon from the Epic Developer Portal
- In Blender, go to Edit → Preferences → Add-ons → Install
- Select the downloaded .zip file
- Enable the "MetaHuman Bridge" addon
Exporting from UE5.7 to Blender:
- In the MetaHuman Editor, select your character
- Click Export → Blender Format
- Choose export options (mesh, rig, textures, groom data)
- The export creates a .blend file with the full character setup
What you get in Blender:
- The complete mesh with proper topology
- The MetaHuman skeleton (as a Blender armature)
- Face rig controls (translated to Blender bone constraints)
- Hair groom data (as Blender curves)
- All textures and materials (translated to Blender shader nodes)
- Morph targets (as Blender shape keys)
What you can modify:
- Mesh sculpting and retopology
- Custom texture painting
- Additional blend shapes / shape keys
- Custom clothing authored in Blender
- Hair styling modifications
- Material and shader tweaks
Importing back to UE5.7:
- In Blender, use File → Export → MetaHuman (.mhx)
- In UE5.7, use the MetaHuman Editor's Import function
- The editor validates your changes and integrates them into the MetaHuman system
Using Blender MCP Server for MetaHuman Modifications
The Blender MCP Server with its 212 tools across 22 categories can significantly accelerate MetaHuman modification workflows in Blender. Here's how.
Batch character variations. If you need multiple character variations (different NPCs based on the same MetaHuman base), you can use the MCP server to script the variations:
- Adjust shape key values to create unique facial features per NPC
- Swap material assignments for different skin tones, eye colors, hair colors
- Scale body proportions within specified ranges for height/build variety
- Apply and randomize clothing combinations from a prepared wardrobe set
Instead of manually adjusting dozens of sliders per character, describe the variation parameters and let the MCP server generate 10, 50, or 100 unique NPC configurations.
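The variation step itself reduces to parameter sampling. The sketch below generates reproducible NPC parameter sets; the parameter names and ranges are illustrative assumptions, and in practice the MCP server would consume something like this and apply the values to shape keys, material slots, and armature scale in the .blend file:

```python
# Generate N unique NPC parameter sets from one base character.
# All parameter names/ranges here are hypothetical examples.
import random

def generate_npc_variants(base_name, count, seed=42):
    rng = random.Random(seed)  # seeded, so the batch is reproducible
    skin_tones = ["tone_a", "tone_b", "tone_c", "tone_d"]
    hair_styles = ["short", "long", "curly", "bald"]
    variants = []
    for i in range(count):
        variants.append({
            "name": f"{base_name}_npc_{i:02d}",
            "height_cm": round(rng.uniform(150, 200), 1),
            "jaw_width": round(rng.uniform(-0.5, 0.5), 2),    # shape key value
            "nose_length": round(rng.uniform(-0.5, 0.5), 2),  # shape key value
            "skin_tone": rng.choice(skin_tones),
            "hair_style": rng.choice(hair_styles),
        })
    return variants

variants = generate_npc_variants("guard_base", 20)
print(len(variants), variants[0]["name"])
```

The fixed seed matters: if you regenerate the batch after tweaking one range, the other 19 characters stay identical, so review work isn't thrown away.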
Automated mesh modifications. The MCP server can handle repetitive mesh operations that are tedious to do manually:
- Adding scars, wrinkles, or other surface detail as sculpted displacement
- Modifying ear, nose, or jaw geometry across a batch of characters
- Adjusting body proportions for different character archetypes
- Generating LOD variants with controlled polygon reduction
Hair and groom operations. MetaHuman hair in Blender exists as curve objects, and the MCP server can manipulate curve data:
- Adjust hair length, density, and distribution
- Create hair color variations
- Modify groom shape for different hairstyles
- Generate particle system variants for cards-based hair fallbacks
Material and texture workflows. The MCP server excels at material operations:
- Batch-create skin material variants with different tones and textures
- Set up procedural freckle, mole, or blemish patterns
- Adjust subsurface scattering parameters across character sets
- Generate texture atlases for performance optimization
Example workflow — creating 20 unique NPCs from 3 base MetaHumans:
- Export 3 base MetaHumans from UE5.7 to Blender
- Use the Blender MCP Server to define variation parameters (face shape ranges, skin tone ranges, hair style options, clothing combinations)
- The MCP server generates 20 unique combinations, each as a separate collection in the .blend file
- Review the variations and make manual adjustments where needed
- Export all 20 back to UE5.7 using the MetaHuman Bridge
This process takes hours instead of days. The MCP server handles the mechanical work while you make the creative decisions.
Batch Configuration with Unreal MCP Server
Once your MetaHumans are in UE5.7, the Unreal MCP Server helps with the project-side configuration and management.
Character Setup Automation
Setting up a MetaHuman in a game involves more than just importing the asset. You need to configure:
- Animation Blueprint assignments
- LOD settings and screen size thresholds
- Physics asset configurations
- Collision profile settings
- Cloth simulation parameters
- Groom binding settings
- Material parameter overrides for runtime customization
The Unreal MCP Server can batch-apply these settings across all MetaHumans in your project. Define your standard configuration once, then apply it to every character, with per-character overrides where needed.
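The "define once, override per character" pattern is simple to express. This sketch uses hypothetical setting keys as stand-ins for the list above; the MCP server (or an editor Python script) would translate each merged dictionary into actual asset edits:

```python
# Standard MetaHuman configuration with per-character overrides.
# Keys and values are illustrative, not real UE5.7 property names.

STANDARD_CONFIG = {
    "anim_blueprint": "/Game/Characters/ABP_MetaHuman_Base",
    "lod_screen_sizes": [1.0, 0.4, 0.15, 0.05],
    "collision_profile": "Pawn",
    "cloth_enabled": True,
}

def build_character_config(overrides=None):
    config = dict(STANDARD_CONFIG)   # copy of the shared defaults
    config.update(overrides or {})   # per-character overrides win
    return config

# Background crowd characters never need cloth sim or LOD 0
crowd = build_character_config({"cloth_enabled": False,
                                "lod_screen_sizes": [0.4, 0.15, 0.05]})
print(crowd["cloth_enabled"], len(crowd["lod_screen_sizes"]))
```

Keeping the shared defaults in one place means a project-wide change (say, a new base Animation Blueprint) is a one-line edit followed by a re-apply, rather than touching every character asset by hand.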
Level Integration
Placing MetaHumans in levels with correct references, animation assignments, and interaction setups is repetitive work. The MCP server can:
- Place MetaHuman characters at specified locations with proper rotation and scale
- Assign animation sequences or Animation Blueprint overrides per instance
- Wire up Blueprint interaction components (especially useful with the Blueprint Template Library interaction and dialogue systems)
- Configure AI behavior tree references for NPC characters
- Set up look-at targets and gaze behavior
Performance Profiling
The MCP server can run automated performance profiling for MetaHuman-heavy scenes:
- Place test configurations with varying numbers of MetaHumans (1, 5, 10, 20, 50)
- Run frame time captures for each configuration
- Generate reports showing the per-character cost breakdown (mesh, hair, cloth, animation)
- Identify performance bottlenecks across your character cast
Integrating MetaHumans with Blueprint Template Library
MetaHumans look incredible, but they need gameplay systems to function as actual game characters. The Blueprint Template Library provides the systems side of character integration.
Dialogue System Integration
The Blueprint Template Library's dialogue system includes MetaHuman-aware features:
- Facial animation triggers. Dialogue nodes can trigger specific MetaHuman facial animations or blend shape adjustments based on the emotional content of the dialogue line.
- Lip sync integration. The dialogue system's audio playback hooks into UE5.7's lip sync pipeline, which drives MetaHuman facial animation in real time.
- Camera framing. Dialogue sequences can trigger camera cuts to close-up shots that showcase MetaHuman facial detail. Combined with the Cinematic Spline Tool, you can create cinematic conversation sequences with smooth camera transitions between speakers.
- Portrait rendering. The dialogue UI supports character portraits rendered from the MetaHuman model, ensuring visual consistency between the dialogue box portrait and the in-world character.
NPC Character Setup
For NPCs using MetaHuman models, the Blueprint Template Library provides:
- Health and combat system. Assign health pools, damage reactions, and death animations to MetaHuman NPCs. The damage system supports hit location detection, which can trigger region-specific MetaHuman reactions (face hits trigger facial pain expressions, body hits trigger stumble animations).
- Quest system integration. MetaHuman NPCs can serve as quest givers, with the quest system managing available/active/completed states and driving appropriate dialogue and behavior.
- Interaction system. The interaction framework handles player-to-NPC interaction prompts, range detection, and input handling. MetaHumans respond to player proximity with gaze tracking and idle animation variations.
- Stats and progression. If NPCs have gameplay stats (for companion characters or RPG systems), the stats template integrates with MetaHuman to drive visual indicators and behavioral changes.
MetaHuman Performance Optimization
MetaHumans are expensive to render. A single high-quality MetaHuman with strand hair can cost 4-8ms of GPU time, which is 25-50% of a 60fps frame budget. For games that ship MetaHumans, performance optimization is critical.
LOD Strategy
UE5.7's auto-LOD for MetaHumans generates 4 LOD levels by default:
- LOD 0: Full quality. Strand hair, full facial rig, all morph targets active. Use for close-up dialogue and cutscenes.
- LOD 1: Reduced quality. Cards-based hair, simplified facial rig (50% of morph targets), simplified cloth. Use for mid-range gameplay distances.
- LOD 2: Low quality. Simplified cards hair, no facial animation, static clothing. Use for background characters.
- LOD 3: Minimum quality. Billboard or highly simplified mesh. Use for crowd scenes.
Customize LOD transitions based on your game:
For a first-person RPG with dialogue close-ups:
- LOD 0 for the active dialogue partner (1 character)
- LOD 1 for nearby NPCs (3-5 characters)
- LOD 2 for visible NPCs at medium distance (5-15 characters)
- LOD 3 for distant NPCs (20+ characters)
For a third-person action game:
- LOD 0 for the player character (always)
- LOD 1 for combat targets and active NPCs (2-5 characters)
- LOD 2 for other visible characters (5-20 characters)
- LOD 3 rarely needed (characters at LOD 2 distance are usually culled)
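The two assignment schemes above boil down to a role- and distance-based picker. The distance thresholds here are illustrative assumptions; in a real project you would tune the per-LOD screen-size thresholds on the asset rather than hardcode distances, but the decision logic is the same:

```python
# Role- and distance-based LOD assignment (thresholds are illustrative).

def pick_lod(role, distance_m):
    if role in ("player", "dialogue_partner"):
        return 0                      # always full quality
    if distance_m < 10:
        return 1                      # nearby / active NPCs
    if distance_m < 40:
        return 2                      # visible at medium distance
    return 3                          # distant / crowd characters

print(pick_lod("dialogue_partner", 2))  # 0
print(pick_lod("npc", 25))              # 2
```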
Hair Optimization
Hair is typically the most expensive part of a MetaHuman. Optimization strategies:
Strand hair (LOD 0 only):
- Reduce strand count if the default is too expensive. 50,000 strands is typical for a MetaHuman; reducing to 25,000 saves significant GPU time with minimal visual impact except in extreme close-ups.
- Disable hair physics simulation when the character isn't in motion. Static hair that only simulates during movement saves CPU and GPU.
- Limit strand hair to one character at a time (the dialogue partner or focused character).
Cards hair (LOD 1-2):
- Use fewer cards with higher-resolution textures rather than many cards with low-resolution textures. This is counterintuitive but produces better results because the texture filtering handles detail better than geometric density.
- Enable depth pre-pass for hair cards to reduce overdraw.
Skin Rendering Optimization
MetaHuman skin uses a multi-layer subsurface scattering shader that's more expensive than standard materials:
- Reduce subsurface samples at LOD 1 and below. The visual difference is subtle at gameplay distances.
- Disable micro-detail normal maps at LOD 2+. These add skin pore detail that's invisible beyond close-up range.
- Use shared material instances where possible. Characters with similar skin tones can share base material instances, reducing draw calls.
Animation Optimization
MetaHuman facial animation is driven by a complex set of morph targets (blend shapes). The full face rig has over 700 morph targets, which is expensive to evaluate every frame.
- Reduce active morph targets at distance. LOD 1 should use ~300 morph targets (removing subtle microexpression targets). LOD 2 should use ~50 morph targets (basic expressions only). LOD 3 should use zero morph targets.
- Update animation at reduced tick rates for background characters. A character at LOD 2 doesn't need facial animation updates every frame — every 3rd or 5th frame is sufficient.
- Disable body simulation (cloth, secondary motion) for characters outside the player's view frustum. UE5.7 does this automatically for completely occluded characters, but you can be more aggressive for characters at the screen edges.
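The reduced-tick-rate idea can be sketched as a per-LOD update gate. The interval values are illustrative: LOD 0-1 tick every frame, LOD 2 every third frame, LOD 3 never (no facial animation at minimum quality):

```python
# Gate facial animation updates by LOD. Interval 0 = never update.
# Interval values are illustrative tuning parameters, not engine defaults.

UPDATE_INTERVAL = {0: 1, 1: 1, 2: 3, 3: 0}

def should_update_face(frame_number, lod):
    interval = UPDATE_INTERVAL.get(lod, 0)
    if interval == 0:
        return False
    return frame_number % interval == 0

# A LOD 2 character animates its face on frames 0, 3, 6, 9, ...
updates = [f for f in range(10) if should_update_face(f, 2)]
print(updates)  # [0, 3, 6, 9]
```

Stagger the frame offsets between characters (e.g. by character index) so that all LOD 2 faces don't update on the same frame and spike that frame's cost.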
Memory Optimization
Each MetaHuman consumes significant memory:
- Textures: 200-400 MB per character at full quality (face textures, body textures, clothing textures, hair textures)
- Mesh data: 50-100 MB per character (including all LODs and morph targets)
- Groom data: 20-80 MB per character (strand definitions, physics data)
Strategies:
- Use texture streaming aggressively. Only the active LOD's textures need to be fully loaded.
- Share body textures where possible (characters with the same body type can share underlying mesh textures, with face textures being unique).
- Limit the total number of unique MetaHumans loaded simultaneously. Use a streaming pool approach where characters are loaded/unloaded based on proximity.
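The streaming-pool approach is essentially a capacity-capped nearest-first set. This sketch computes which characters to load and unload each update; the load/unload actions themselves are stand-ins for engine-side streaming requests:

```python
# Proximity-based streaming pool: keep at most `capacity` unique
# MetaHumans fully loaded, always preferring the nearest ones.

def update_streaming_pool(loaded, characters_by_distance, capacity=5):
    """loaded: set of currently loaded character names.
    characters_by_distance: list of (name, distance_m) pairs.
    Returns (to_load, to_unload) as sets of names."""
    nearest = sorted(characters_by_distance, key=lambda c: c[1])[:capacity]
    wanted = {name for name, _ in nearest}
    return wanted - loaded, loaded - wanted

loaded = {"guard_01", "merchant", "old_quest_giver"}
nearby = [("guard_01", 3.0), ("merchant", 8.0), ("bandit_a", 12.0),
          ("bandit_b", 14.0), ("villager", 30.0), ("old_quest_giver", 95.0)]
to_load, to_unload = update_streaming_pool(loaded, nearby, capacity=4)
print(sorted(to_load), sorted(to_unload))  # loads bandits, drops quest giver
```

In practice you would add hysteresis (a small distance margin before unloading) so a character hovering near the capacity boundary doesn't thrash in and out of memory.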
Platform Budgets
Here are realistic per-frame budgets for MetaHuman rendering on different hardware:
| Platform | Max LOD 0 Characters | Max LOD 1 Characters | Max Total Visible | Notes |
|---|---|---|---|---|
| RTX 4070+ | 2 | 5-8 | 30-50 | Comfortable headroom |
| RTX 3060 / RX 6700 | 1 | 3-5 | 15-25 | Tight budget |
| Steam Deck | 1 | 2-3 | 8-12 | Very tight, needs careful LOD tuning |
| PS5 / Xbox Series X | 1-2 | 4-6 | 20-35 | Hardware-specific optimizations help |
| Xbox Series S | 1 | 2-3 | 10-15 | Similar to Steam Deck budget |
These are rough guidelines for 60fps targets. If you're targeting 30fps, roughly double the character counts. If your game has other expensive rendering features (ray tracing, dense vegetation, large draw distances), reduce these budgets accordingly.
Troubleshooting Common Migration Issues
Even with the automated migration wizard, some projects encounter issues. Here are the most common problems and their solutions.
Issue 1: Facial Expressions Look Different After Migration
The A-pose to neutral pose conversion recalculates morph target deltas. In most cases, the expressions are identical, but some characters — particularly those with heavily customized jaw or neck regions — may show subtle differences in expression intensity.
Fix: Open the MetaHuman Editor for the affected character, navigate to the Expression Calibration panel, and use the "Compare to Legacy" toggle. This shows the pre-migration expression alongside the current one. Adjust the morph target intensity multipliers until they match. Typically, jaw-open and neck-turn expressions need a 5-10% intensity boost to match the legacy behavior.
Issue 2: Custom Clothing Doesn't Fit After Migration
If you created custom clothing fitted to the A-pose MetaHuman, the neutral rest pose change means the clothing mesh no longer aligns correctly, especially around the shoulders and upper arms.
Fix: Re-bind the clothing mesh to the updated skeleton. In the MetaHuman Editor, use the Clothing → Re-bind to Current Pose option. If the clothing was created in Blender, re-export the MetaHuman in the new neutral pose, re-fit the clothing in Blender (the Blender MCP Server can automate re-fitting across multiple garments), and re-import.
Issue 3: Third-Party Animation Packs Show Shoulder Artifacts
Some animation packs created for the pre-5.7 MetaHuman skeleton may produce shoulder artifacts because they were authored with A-pose assumptions. The runtime skeleton is technically the same, but some animations stored shoulder rotation offsets relative to the A-pose that produce incorrect results with the neutral rest pose.
Fix: Use UE5.7's Animation Retarget system to create a retarget profile that compensates for the pose difference. The MetaHuman migration tool includes a "Legacy Animation Compatibility" retarget preset specifically for this purpose. Apply it to the affected animation assets.
Issue 4: Hair Groom Data Fails to Import
The groom data format changed in UE5.7, and some MetaHumans created with very early versions of the web app (2021-2022) may have groom data in a format that the migration tool can't automatically convert.
Fix: If the automatic migration fails for groom data, you can regenerate the hair. Open the MetaHuman Editor, go to the Hair panel, and select "Regenerate from DNA." This creates new groom data from the MetaHuman's DNA file, which stores the hair style definition independently of the groom mesh data. The result should be visually identical to the original.
Issue 5: Blueprint References Break After Plugin Rename
The plugin rename from MetaHumanCreator to MetaHumanEditor can break Blueprint nodes that reference MetaHuman component classes. UE5.7's core redirect system handles most of these automatically, but custom Blueprints that used the plugin name in string-based references (such as LoadClass calls with hardcoded paths) may fail.
Fix: Search your project for string references to MetaHumanCreator and update them to MetaHumanEditor. The Unreal MCP Server can automate this search-and-replace across all Blueprint and C++ assets in your project, ensuring no references are missed.
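For the text-readable parts of a project (source, config, Python scripts), a simple scanner finds the stragglers; binary .uasset references are better handled by the editor's reference viewer or the MCP server, as noted above. A minimal sketch:

```python
# Find string references to the old plugin name in text-readable files.
# The extension list is an illustrative starting point -- extend as needed.
import os

def find_references(root, needle="MetaHumanCreator",
                    extensions=(".cpp", ".h", ".ini", ".py")):
    hits = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(extensions):
                continue
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    for lineno, line in enumerate(f, 1):
                        if needle in line:
                            hits.append((path, lineno, line.strip()))
            except OSError:
                continue  # unreadable file; skip rather than abort the scan
    return hits
```

Review the hits before replacing: a match in a comment or changelog is harmless, while a match in a `LoadClass` path or config value is the kind that breaks at runtime.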
The Future of Digital Humans in Games
MetaHuman's evolution from a web app to an in-editor tool reflects a broader industry trend: digital humans are becoming a standard game asset type, not a special technology demo.
Where Things Are Heading
Real-time neural rendering for faces. NVIDIA's research into neural face rendering suggests that future digital humans might not be polygon meshes at all — they could be neural representations that render directly from learned face data. This is still research-stage, but the trajectory is clear.
AI-driven facial animation. Instead of authored blend shape animations, future MetaHumans might use AI models that generate facial animation from audio input in real time. The lip sync capabilities in UE5.7 are a step toward this, but full expression-from-audio is coming.
Procedural character generation. The ability to generate complete, unique NPC characters procedurally — face, body, clothing, voice — is approaching viability for background characters. This is relevant for open-world games that need hundreds of unique-looking NPCs.
Cross-platform character portability. As MetaHuman assets become usable across engines and DCC tools, the concept of a "character asset standard" is emerging. A character created in one tool, usable in any tool, is the long-term direction.
What This Means for Indie Developers
For indie developers, the key takeaway is that photorealistic digital humans are no longer reserved for AAA studios with dedicated character teams. The combination of MetaHuman's in-editor tools, MCP automation for batch operations, and template-based gameplay integration makes it feasible for small teams to include high-quality digital characters in their games.
The constraint has shifted from "can we create these characters?" to "can we optimize them for our target platform?" Performance optimization is the real challenge, and it's a solvable engineering problem rather than an art pipeline limitation.
Migration Checklist
Use this checklist for migrating existing MetaHuman projects to UE5.7:
Pre-Migration:
- Full project backup completed
- Documented all MetaHuman characters in the project (names, customizations, linked assets)
- Noted any custom mesh modifications made outside of MetaHuman Creator
- Recorded current performance benchmarks for comparison
Migration:
- Updated project to UE5.7
- Enabled MetaHuman Editor plugin
- Ran Migration Wizard for all characters
- Verified migration completed without errors
Post-Migration Verification:
- Facial animations play correctly
- Body animations play correctly
- Cloth simulation behaves as expected
- Hair renders correctly at all LOD levels
- Material quality is equivalent or improved
- Blueprint/C++ references to MetaHuman components still compile
Performance Verification:
- Frame time comparison with pre-migration benchmarks
- Memory usage comparison
- LOD transitions working at correct screen sizes
- Hair LOD transition (strand → cards) is smooth
Cross-Engine (If Applicable):
- Installed MetaHuman Blender/Maya addon
- Tested export from UE5.7 to DCC tool
- Tested round-trip (export, modify, re-import)
- Verified no data loss in round-trip
Conclusion
The MetaHuman Creator web app served its purpose — it introduced digital humans to a wide audience of game developers and proved that photorealistic characters could be accessible to small teams. But its limitations were real, and the move to in-editor tools in UE5.7 is the right evolution.
The new workflow is better in every measurable way: faster iteration, more customization, no A-pose frustrations, broader proportion ranges, and cross-engine licensing that opens MetaHuman to the wider creative community.
For teams using our tools, the Blender MCP Server makes batch character creation in Blender practical at a scale that would be prohibitive manually. The Unreal MCP Server handles the project-side configuration and management. And the Blueprint Template Library provides the gameplay systems that turn beautiful characters into functional game entities.
MetaHuman in 2026 isn't a technology demo anymore. It's a production tool. Treat it accordingly.