Tutorial
StraySpark · June 22, 2026 · 5 min read
Blender to Unreal Pipeline: The Complete Asset Workflow for Indie Devs
Tags: Blender, Unreal Engine, Pipeline, Indie Dev, 3D Modeling

The Blender-to-Unreal pipeline is the backbone of most indie game art workflows. Blender is free, powerful, and has a massive community. Unreal Engine 5 is the leading real-time engine for high-fidelity games. But getting assets cleanly from one to the other involves more friction than either tool's documentation suggests.

Export settings that produce invisible geometry. Materials that don't translate. Normals that flip. Scale that's off by a factor of 100. Collision meshes that don't register. These aren't bugs — they're the result of two powerful tools having different assumptions about how 3D data should work.

This guide covers the full pipeline: modeling in Blender, exporting correctly, importing into Unreal, converting materials, setting up collision and LODs, and placing your assets at scale across a game world. We'll also look at how AI-assisted workflows with the Blender MCP Server and the Procedural Placement Tool can accelerate each step.

Why Blender + Unreal

Before diving into the workflow, let's address why this particular combination has become the default for indie teams.

Blender's Strengths

  • Free and open source. No license fees, no seat limits, no subscription. For a solo developer or small team, this matters enormously.
  • All-in-one DCC. Modeling, sculpting, UV unwrapping, texturing, rigging, animation, simulation, and rendering in a single application. No need for separate tools.
  • Rapid improvement. Blender 4.x and 5.0 have brought geometry nodes, improved modeling tools, a better outliner, and performance improvements that put it on par with commercial DCCs for game asset work.
  • Massive community. Tutorials, add-ons, asset libraries, and community support are extensive. If you're stuck, someone has solved your problem on Blender Stack Exchange or YouTube.

Unreal's Strengths

  • Nanite and Lumen. Nanite eliminates traditional LOD workflows for eligible meshes. Lumen provides dynamic global illumination. Together, they reduce the technical art burden significantly.
  • World Partition. Building large open worlds is feasible for small teams.
  • Marketplace and ecosystem. Access to high-quality assets, plugins, and tools.
  • Blueprint system. Visual scripting that lets artists and designers create gameplay without C++.

Why Not Use Unreal's Built-in Modeling?

UE5's modeling tools are good for in-engine adjustments — cutting holes, adding chamfers, and tweaking geometry without round-tripping to Blender. But they're not a replacement for a dedicated DCC tool for asset creation. You'll do primary modeling in Blender and save in-engine editing for adjustments and blockout work.

The Traditional Pipeline

The standard Blender-to-Unreal workflow looks like this:

  1. Model in Blender — create the mesh, unwrap UVs, set up material slots
  2. Export from Blender — FBX or USD format with specific settings
  3. Import into Unreal — configure import options for scale, materials, and collision
  4. Set up materials in Unreal — create UE5 materials or material instances
  5. Configure collision — add collision meshes for physics
  6. Generate LODs — create level-of-detail meshes (or enable Nanite)
  7. Place in the world — position the asset in your level

Each step has gotchas. Let's go through them.

Modeling with AI Assistance

Before we get to export and import, let's talk about the modeling phase itself — because it's where most of the time goes.

The Blender MCP Server Approach

The Blender MCP Server connects AI assistants like Claude, Cursor, and Windsurf directly to Blender's Python API. With 212 tools across 22 categories, it lets you describe what you want and have the AI execute Blender operations for you.

What this looks like in practice:

  • "Create a modular wall piece, 4m wide by 3m tall, with a doorway cutout on the left side. Add a second UV channel for lightmaps."
  • "Generate a rocky cliff face using displacement, make it Nanite-ready, and set up three material slots: rock face, moss top, and dirt base."
  • "Take this high-poly sculpt and create a game-ready version: decimate to 5000 triangles, bake normal and AO maps to the low-poly, and set up the UV layout."

The AI handles the Blender operations — creating geometry, setting modifiers, configuring UV maps, managing material slots — while you direct the creative decisions.

Where AI Assistance Shines

Repetitive setup tasks. Creating UV lightmap channels, setting up material slots, configuring export settings, naming conventions — the tedious work that eats time but doesn't require artistic judgment.

Procedural generation. Using Blender's geometry nodes to generate variations (fence posts, rocks, debris) is powerful but has a steep learning curve. Describing the variation rules in natural language and letting the AI build the geometry node graph saves hours of node-graph debugging.

Batch operations. "For every mesh in this collection, add a second UV channel, auto-unwrap it for lightmaps, and rename it to follow the SM_ModularKit_[name] convention." Doing this manually for 50 meshes is an afternoon. With the MCP Server, it's one request.

Learning curve reduction. Blender has a deep interface. New users spend weeks learning hotkeys, menus, and modifier stacks. AI assistance lets you describe what you want and learn the tool through the operations it performs.

Where You Still Need Manual Work

Artistic decisions. AI can create geometry, but the creative vision — proportions, silhouette, visual weight, stylistic choices — is yours.

Fine detail sculpting. Organic sculpting (character faces, creature anatomy, terrain details) requires the kind of iterative, visual feedback loop that works best with direct manipulation.

UV layout optimization. Auto-unwrap gets you 80% of the way. That last 20% — minimizing stretching on important faces, maximizing texture density on visible surfaces, ensuring consistent texel density — is still a manual skill.

Topology cleanup. Game-ready meshes need clean topology for deformation (characters), consistent normals (hard-surface), and reasonable poly counts. AI can decimate and retopologize, but reviewing the result requires a human eye.

Export Settings That Matter

This is where most pipeline problems originate. Blender's FBX exporter has dozens of settings, and the defaults don't match Unreal's expectations.

FBX Export Configuration

Here are the settings that matter, and why:

Scale: 1.0 with "Apply Unit" enabled. Blender uses meters by default. Unreal uses centimeters. Setting scale to 1.0 with "Apply Unit" converts correctly. If your assets import at 1/100th size or 100x size, this setting is wrong.

Forward axis: -Y Forward, Z Up. Blender's default is -Y forward and Z up, which matches Unreal's coordinate system. Don't change these unless you have a specific reason.

Apply Transforms: enabled. This bakes the object's location, rotation, and scale into the mesh data. If disabled, an object rotated 45 degrees in Blender will import at that rotation in Unreal, which is usually not what you want. You want the mesh data to reflect the visual state.

Smoothing: Face. Export smoothing as "Face" rather than "Normals" or "Edge." This preserves your custom normals and smooth/sharp edge markup correctly in Unreal.

Tangent Space: enabled. Export tangent space data for correct normal map rendering in Unreal. Without this, normal maps may look incorrect or flat.

Mesh: Apply Modifiers. Bake all modifiers (subdivision, mirror, array, booleans) into the exported mesh. Unreal can't interpret Blender modifiers — it needs the final geometry.

Armature settings (for animated meshes): Export with "Add Leaf Bones" disabled (Unreal doesn't need them and they clutter the skeleton). Set primary and secondary bone axes to match your rig.
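The same settings can be applied from Blender's Python API, which is handy once export becomes routine. The sketch below uses Blender's bundled FBX export operator; the parameter names are believed correct for recent Blender versions, but verify them against your version's exporter before relying on this (run it from Blender's scripting workspace).

```python
# Sketch: exporting selected objects with the settings discussed above,
# via Blender's Python API. Parameter names follow Blender's bundled FBX
# exporter; confirm them in your Blender version, as they can change.
import bpy

bpy.ops.export_scene.fbx(
    filepath="//SM_WallModular_4x3.fbx",  # "//" = relative to the .blend file
    use_selection=True,          # export only the selected objects
    global_scale=1.0,            # Scale: 1.0
    apply_unit_scale=True,       # "Apply Unit": meters -> centimeters
    axis_forward='-Y',           # -Y Forward
    axis_up='Z',                 # Z Up
    bake_space_transform=True,   # "Apply Transform": bake loc/rot/scale
    use_mesh_modifiers=True,     # bake modifiers into the exported mesh
    mesh_smooth_type='FACE',     # Smoothing: Face
    use_tspace=True,             # export tangent space for normal maps
    add_leaf_bones=False,        # skip leaf bones for skeletal exports
)
```

Keeping these settings in a script instead of re-ticking checkboxes in the export dialog also guarantees every export uses the same configuration.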

USD as an Alternative

Unreal's USD import has matured significantly. USD offers advantages over FBX:

  • Better material preservation — USD materials map more cleanly to Unreal material concepts
  • Stage-based workflow — reference USD files nondestructively
  • Industry standard — better long-term support than the aging FBX format

However, the FBX pipeline is still more battle-tested for game assets. USD is worth evaluating for your project, but FBX remains the safer default for most indie teams in 2026.

Naming Conventions

Establish naming conventions before exporting your first asset. Unreal expects (and rewards) consistent naming:

  • Static meshes: SM_AssetName (e.g., SM_WallModular_4x3)
  • Skeletal meshes: SK_AssetName
  • Textures: T_AssetName_Suffix (e.g., T_WallModular_D for diffuse, _N for normal, _ORM for packed occlusion/roughness/metallic)
  • Materials: M_AssetName or MI_AssetName for instances
  • Collision: UCX_AssetName_01 (convex collision prefix recognized by Unreal's importer)

Name your objects in Blender before export. Renaming hundreds of assets in Unreal after import is tedious.
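A small validation script can catch naming mistakes before export. The helper below is hypothetical (the patterns encode the conventions listed above; adapt them to your own suffixes):

```python
# Hypothetical helper: validate asset names against the conventions above
# before export, so problems surface in Blender rather than in Unreal.
import re

NAME_PATTERNS = {
    "static_mesh":   re.compile(r"^SM_[A-Za-z0-9]+(_[A-Za-z0-9]+)*$"),
    "skeletal_mesh": re.compile(r"^SK_[A-Za-z0-9]+(_[A-Za-z0-9]+)*$"),
    "texture":       re.compile(r"^T_[A-Za-z0-9]+_(D|N|ORM|E)$"),
    "collision":     re.compile(r"^UCX_[A-Za-z0-9]+_\d{2}$"),
}

def check_name(name: str, kind: str) -> bool:
    """Return True if `name` follows the convention for `kind`."""
    return bool(NAME_PATTERNS[kind].match(name))
```

For example, `check_name("SM_WallModular_4x3", "static_mesh")` passes while a bare `WallModular` fails, which is exactly the kind of slip that is cheap to fix in Blender and tedious to fix after import.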

Collision Meshes in Blender

Unreal recognizes collision meshes exported alongside your visual mesh if they follow the naming convention:

  • UCX_MeshName_## — convex collision. The importer creates convex collision from these meshes.
  • UBX_MeshName_## — box collision. Creates an oriented bounding box.
  • USP_MeshName_## — sphere collision. Creates a sphere collider.
  • UCP_MeshName_## — capsule collision. Creates a capsule collider.

Create these as separate objects in Blender, parent them to the visual mesh, and export everything together. Unreal assigns them as collision automatically during import.

Key rule: Collision meshes must be convex. Concave collision meshes don't work with this naming convention. For concave shapes, decompose into multiple convex pieces (UCX_MeshName_01, UCX_MeshName_02, etc.).

Import and Material Conversion

Import Settings

When importing FBX into Unreal:

Import Mesh: enabled. Import the mesh geometry.

Generate Missing Collision: usually disabled. If you exported custom collision (UCX_ meshes), Unreal uses those. If you enable auto-generation, it creates simple collision that may not match your intent. Better to create it explicitly.

Import Materials and Textures: your choice. Unreal can auto-create materials from FBX material data, but the results are basic. For production assets, you'll create materials manually or use material instances from a master material. For quick prototyping, auto-import gets you something visible fast.

Normal Import Method: Import Normals and Tangents. This preserves your custom normals from Blender. If normals look wrong after import, check that you exported normals correctly from Blender (Face smoothing, tangent space enabled).

Transform settings: If you exported with correct scale and axes in Blender, leave the import transform at defaults (scale 1.0, no rotation). If your mesh is the wrong size, fix it in the export settings, not the import settings — fixing at import creates a transform offset that causes problems later.

Material Conversion

Blender's Principled BSDF and Unreal's default material model are both PBR, but they don't map 1:1. Here's what translates and what doesn't:

Direct mapping:

  • Base Color -> Base Color
  • Metallic -> Metallic
  • Roughness -> Roughness
  • Normal Map -> Normal Map (but check the normal map format — Blender uses OpenGL, Unreal uses DirectX. Flip the green channel if normals look inverted.)
  • Emission -> Emissive Color

Indirect mapping:

  • Specular (Blender) -> Specular (Unreal), but the scales differ. Blender's 0.5 default maps to Unreal's 0.5 default, but non-default values need adjustment.
  • Alpha (Blender) -> Opacity or Opacity Mask (Unreal). You need to set the material's blend mode to Translucent or Masked in Unreal — it doesn't auto-detect transparency.
  • Subsurface (Blender) -> Subsurface Profile (Unreal). Completely different implementation. Requires a Subsurface Profile asset in Unreal.

No direct equivalent:

  • Blender's procedural textures (Noise, Voronoi, Wave, etc.) don't export. Bake them to texture maps before export.
  • Blender's shader nodes (Math, ColorRamp, Mapping, etc.) don't translate. Any procedural material needs to be baked to textures or rebuilt in Unreal's material editor.

The Master Material Approach

For production workflows, don't create unique materials for every asset. Create a master material (or a few master materials) and use material instances:

  1. Create a master material in Unreal with parameters for Base Color, Normal, ORM (packed Occlusion/Roughness/Metallic), Emissive, and any global settings (tiling, detail textures, wind animation for foliage).
  2. Create material instances from the master for each asset or asset family. Set the texture parameters to point to your imported textures.
  3. Assign material instances to imported meshes, replacing the auto-generated materials.

This approach gives you global control (change the master material, all instances update) and keeps the material library organized.

Texture Packing

Unreal prefers packed textures to reduce texture samples:

  • ORM texture: Occlusion (R), Roughness (G), Metallic (B) in a single RGB texture. Blender exports these as separate maps. Pack them in an image editor or with a Blender compositing node setup.
  • Normal maps: Export as 16-bit for better quality. Ensure DirectX format (green channel up).
  • Base Color: sRGB color space. Don't include ambient occlusion in the base color — that goes in the ORM texture.

Packing textures saves GPU memory (three channels in one texture vs. three separate textures) and reduces sampling cost.
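Conceptually, channel packing is just interleaving three grayscale maps into one RGB image. A toy sketch, operating on flat lists of 0-255 pixel values (in practice you would use an image editor, Blender's compositor, or an image library):

```python
# Toy sketch of ORM channel packing: three grayscale maps (as flat lists
# of 0-255 values) combined into one RGB texture, pixel by pixel.
def pack_orm(occlusion, roughness, metallic):
    """Zip per-pixel grayscale values into (R, G, B) = (AO, Rough, Metal)."""
    if not (len(occlusion) == len(roughness) == len(metallic)):
        raise ValueError("all three maps must have the same pixel count")
    return list(zip(occlusion, roughness, metallic))
```

The master material then reads the R, G, and B channels of the packed texture back out into the Occlusion, Roughness, and Metallic inputs.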

Collision and LODs

Collision Strategies

If you didn't export collision meshes from Blender, you have several options in Unreal:

Auto-generated convex collision. In the Static Mesh editor, Collision > Auto Convex Collision. This generates convex hulls that approximate the mesh shape. Adjustable accuracy/complexity tradeoff.

Simple collision shapes. Add box, sphere, capsule, or convex collision directly in the Static Mesh editor. For most game objects (crates, barrels, furniture), simple shapes are faster and more predictable than mesh-accurate collision.

Complex as Simple. Uses the render mesh for collision. Extremely expensive for runtime physics queries. Only use for very simple meshes or meshes that absolutely need precise collision (terrain, architecture that the player navigates closely).

Best practice: Use the simplest collision that produces acceptable gameplay. Players don't notice that a barrel's collision is a cylinder instead of matching every bump on the mesh. They do notice when collision checks cause frame rate drops.

LOD Setup

For meshes that aren't Nanite-eligible, you need level-of-detail meshes:

Option 1: Generate LODs in Unreal. The Static Mesh editor can auto-generate LODs by reducing triangle count at each level. Quick and works acceptably for most props. Reduction quality is decent but not perfect — auto-generated LODs sometimes collapse important silhouette details.

Option 2: Create LODs in Blender. Model explicit LOD meshes (or use the Decimate modifier) and export them as separate meshes. Name them MeshName_LOD0, MeshName_LOD1, etc. Import them as LOD levels in the Static Mesh editor. More control, more work.

Option 3: Use Nanite. Enable Nanite on the mesh and skip traditional LODs entirely. Nanite handles continuous level-of-detail automatically, with no pop-in and no artist-authored LODs. This is the best option for static meshes that meet Nanite's requirements.

Nanite eligibility: Nanite targets opaque static meshes. It does not support translucency or skeletal meshes. Masked materials and world-position-offset animation gained Nanite support in recent UE5 versions, but with limitations. Foliage with alpha cutout is a gray area — Nanite foliage works in recent UE5 versions, with some constraints on alpha masking quality.

LOD Distances

If using traditional LODs, configure screen-size thresholds for each LOD level:

  • LOD 0 (full detail): Screen size > 0.5 (mesh takes up more than half the screen)
  • LOD 1 (50% reduction): Screen size 0.2–0.5
  • LOD 2 (75% reduction): Screen size 0.05–0.2
  • LOD 3 (90% reduction): Screen size < 0.05

These are starting points. Profile and adjust based on your specific meshes and how visible the LOD transitions are.
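The threshold logic above amounts to a simple lookup from normalized screen size to LOD index. A sketch using the starting-point thresholds (tune the numbers per mesh after profiling):

```python
# Sketch: picking an LOD index from normalized screen size, using the
# starting-point thresholds above. Unreal does this internally; this
# just makes the mapping explicit.
LOD_THRESHOLDS = [0.5, 0.2, 0.05]  # LOD0 above 0.5, LOD1 above 0.2, ...

def select_lod(screen_size: float) -> int:
    """Return the LOD index for a mesh occupying `screen_size` of the screen."""
    for lod, threshold in enumerate(LOD_THRESHOLDS):
        if screen_size > threshold:
            return lod
    return len(LOD_THRESHOLDS)  # smallest on screen: last LOD
```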

Placing Assets at Scale

You've modeled, exported, imported, and set up your assets. Now you need to place thousands of them in your world.

Hand Placement vs. Procedural Placement

Hand placement works for unique landmarks, key structures, and hero assets — things that have specific positions for gameplay or narrative reasons.

It doesn't work for environment fill: trees, rocks, grass, debris, ground cover, scattered props. A forest with 50,000 trees can't be hand-placed. A rocky hillside with 10,000 boulders can't either.

The Procedural Placement Tool

The Procedural Placement Tool fills this gap with rule-based scatter:

What it does:

  • Scatters assets across surfaces based on configurable rules (density, slope limits, altitude range, exclusion zones)
  • Uses HISM (Hierarchical Instanced Static Mesh) for maximum rendering performance — 100,000+ instances per second
  • Supports biome zones that define different scatter rules for different world regions
  • Includes spline-based scatter for linear features (river banks, road edges, cliff faces)
  • Integrates with World Partition for large-world streaming

How it fits the Blender-to-Unreal pipeline:

  1. Create asset variations in Blender (5-10 rock variations, 3-4 tree types, ground cover plants)
  2. Export and import into Unreal with correct materials and collision
  3. Create scatter rules in the Procedural Placement Tool: which meshes to use, density, slope constraints, altitude range, minimum spacing
  4. Run the scatter across your landscape
  5. Adjust rules and re-scatter until the environment looks natural

Rule-based variation: Instead of placing each tree individually, define rules: "Scatter oak trees at 0.3 density on slopes below 30 degrees, between 100m and 400m altitude. Mix with birch trees at 0.1 density. Add pine trees above 400m altitude." The tool translates these rules into a natural-looking distribution.
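The oak/birch/pine example above reduces to a table of per-mesh constraints checked against each candidate point. A simplified sketch (the mesh names and rule shape are illustrative, not the tool's actual API; density and spacing handling are omitted):

```python
# Sketch of rule-based scatter filtering over hypothetical candidate
# points sampled on the landscape; slope in degrees, altitude in meters.
RULES = [
    # (mesh, density, max_slope_deg, min_alt_m, max_alt_m)
    ("SM_Tree_Oak",   0.3, 30.0, 100.0, 400.0),
    ("SM_Tree_Birch", 0.1, 30.0, 100.0, 400.0),
    ("SM_Tree_Pine",  0.2, 35.0, 400.0, 900.0),
]

def eligible_meshes(slope_deg, altitude_m):
    """Return the meshes whose constraints a candidate point satisfies."""
    return [mesh for mesh, _density, max_slope, lo, hi in RULES
            if slope_deg <= max_slope and lo <= altitude_m <= hi]
```

A point at 250m on a 20-degree slope is eligible for oak and birch; the same slope at 500m gets pine only. The density values then decide how many eligible points actually receive an instance.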

Combining Manual and Procedural Work

The best environments combine both approaches:

  1. Block out the level with key landmarks and structures (hand-placed)
  2. Define exclusion zones around hand-placed areas so scatter doesn't overlap
  3. Run procedural scatter for forests, ground cover, and environment fill
  4. Hand-adjust problem areas — remove instances that clip through buildings, add specific hero trees at focal points
  5. Iterate — adjust scatter rules, re-run, review, repeat

This workflow scales from small levels (a few minutes of scatter configuration) to massive open worlds (define biomes and let the tool handle the scale).

The Full Workflow

Here's the complete pipeline from empty Blender file to populated game world, summarized:

Phase 1: Asset Creation (Blender)

  1. Model the asset (or use the Blender MCP Server for AI-assisted modeling and setup)
  2. Unwrap UVs — one channel for textures, one for lightmaps
  3. Set up material slots matching your Unreal master material structure
  4. Create collision meshes (UCX_ naming) for game-ready assets
  5. Create LOD meshes if not using Nanite
  6. Texture in Blender, Substance Painter, or your preferred texturing tool
  7. Pack textures (ORM, normal map in DirectX format)

Phase 2: Export (Blender)

  1. Apply all modifiers
  2. Verify naming conventions (SM_ prefix, UCX_ collision meshes, LOD suffixes)
  3. Export FBX with correct settings (scale 1.0, Apply Unit, -Y Forward, Z Up, Face smoothing, tangent space)
  4. Export textures separately if not embedded in FBX

Phase 3: Import (Unreal)

  1. Import FBX with correct settings (import normals/tangents, no auto-generated collision if using custom)
  2. Verify scale, orientation, and pivot point
  3. Enable Nanite if the mesh is eligible
  4. Set up materials using material instances from your master material
  5. Configure collision — verify custom collision or generate simple collision
  6. Set up LODs if not using Nanite

Phase 4: Placement (Unreal)

  1. Hand-place unique and hero assets
  2. Configure the Procedural Placement Tool with scatter rules for environment fill
  3. Define biome zones for large worlds
  4. Run scatter passes (major features, then ground cover, then linear features)
  5. Review, adjust, iterate

Phase 5: Validation

  1. Test collision — walk the player through the environment, verify physics interactions
  2. Check LOD transitions — view from multiple distances, verify no jarring pop-in
  3. Profile performance — check draw calls, memory usage, streaming behavior
  4. Review at target resolution — texel density should be consistent across visible surfaces

Tips and Gotchas

Gotcha: Inverted Normals

Symptom: Faces appear inside-out in Unreal (dark, or visible from the wrong side).

Cause: Normals are flipped in Blender. This often happens with mirrored geometry or boolean operations.

Fix: In Blender, select the mesh, enter Edit Mode, select all faces, Mesh > Normals > Recalculate Outside. Then re-export. In Unreal, you can enable "Two-Sided" on the material as a temporary workaround, but this doubles the draw cost — fix it in Blender.

Gotcha: Scale Mismatch

Symptom: Assets import at 100x or 1/100th the expected size.

Cause: Blender meters vs. Unreal centimeters. The FBX export scale setting or "Apply Unit" checkbox is wrong.

Fix: In Blender's FBX export dialog, set Scale to 1.0 and enable "Apply Unit." If your Blender scene units are set to centimeters (unusual but possible), set Scale to 1.0 and disable "Apply Unit."

Gotcha: Normal Map Green Channel

Symptom: Lighting on surfaces looks like it's pushing in instead of out (bumps appear as dents, or vice versa).

Cause: Blender uses OpenGL normal maps (green channel = Y+). Unreal uses DirectX normal maps (green channel = Y-). The green channel is inverted.

Fix: Flip the green channel in your image editor before importing into Unreal. Or, in Unreal's texture settings, enable "Flip Green Channel" on the normal map texture.
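The conversion itself is a single-channel inversion. A toy sketch on (R, G, B) pixel tuples, just to make the operation concrete (real tools do this on the image file, or you enable the Unreal texture setting instead):

```python
# Sketch: OpenGL -> DirectX normal map conversion by inverting the green
# channel (0-255), leaving red and blue untouched.
def flip_green(pixels):
    """Invert G for each (R, G, B) pixel tuple."""
    return [(r, 255 - g, b) for r, g, b in pixels]
```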

Gotcha: Smoothing Groups

Symptom: Hard edges appear where you expect smooth shading, or vice versa.

Cause: Blender doesn't use smoothing groups natively — it uses auto-smooth with angle-based or edge-marked smooth/sharp. FBX export can misinterpret this.

Fix: In Blender, use "Edge > Mark Sharp" to explicitly control hard edges. In the FBX export dialog, set Smoothing to "Face." In Unreal's import settings, use "Import Normals and Tangents."

Gotcha: Missing Second UV Channel

Symptom: Lightmap baking produces artifacts (overlapping shadows, stretched lighting).

Cause: The mesh doesn't have a second UV channel for lightmaps, or the second UV channel has overlapping UVs.

Fix: In Blender, add a second UV channel (UV Maps panel in Mesh Data properties). Auto-unwrap it with no overlapping islands (Smart UV Project works for lightmap UVs). Name it "UVMap_Lightmap" or similar. In Unreal, set the mesh's "Light Map Coordinate Index" to 1 (the second UV channel). Alternatively, check "Generate Lightmap UVs" on import to have Unreal auto-generate a lightmap UV.

Tip: Batch Export with Blender

If you have dozens of assets to export, automate it. Write a Blender Python script (or use the Blender MCP Server to generate one) that iterates through collections and exports each object as a separate FBX with the correct settings. This ensures consistent settings across all exports and saves time.
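A minimal sketch of such a script, to run inside Blender. The collection name and output directory are placeholders for illustration; the export settings (abbreviated here) would be the same ones covered in the export section:

```python
# Sketch of a batch exporter (run inside Blender's scripting workspace):
# one FBX per object in a named collection. "GameAssets" and the output
# directory are placeholder names; substitute your own.
import os
import bpy

out_dir = bpy.path.abspath("//export")
os.makedirs(out_dir, exist_ok=True)

for obj in bpy.data.collections["GameAssets"].objects:
    bpy.ops.object.select_all(action='DESELECT')
    obj.select_set(True)
    bpy.context.view_layer.objects.active = obj
    bpy.ops.export_scene.fbx(
        filepath=os.path.join(out_dir, f"{obj.name}.fbx"),
        use_selection=True,
        apply_unit_scale=True,
        mesh_smooth_type='FACE',
        use_tspace=True,
    )
```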

Tip: Use Datasmith for Complex Scenes

If you're importing entire scenes (architectural visualization, environment kits with many materials), consider the Datasmith importer. It handles complex material setups and scene hierarchies better than the standard FBX importer for multi-material, multi-object scenes.

Tip: Version Your Assets

Keep your Blender source files organized and version-controlled alongside your Unreal project. When you need to adjust an asset (fix collision, add a LOD, change UVs), having the source file in the same repository means the pipeline is one export away, not a search through backup drives.

Tip: Texel Density Consistency

Maintain consistent texel density across your assets. If a 2m wall uses a 2048x2048 texture, a 4m wall should use a 4096x4096 texture (or tile the 2048 texture twice). Inconsistent texel density is immediately visible in-game — some surfaces look sharp while adjacent surfaces look blurry.
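Texel density is simply texture pixels per meter of surface, and the check is quick arithmetic. A sketch, assuming the UVs use the full 0-1 range:

```python
# Sketch: texel density as texture pixels per meter of surface along one
# axis. Keeping this ratio consistent is what keeps sharpness uniform.
def texel_density(texture_px: int, surface_m: float) -> float:
    """Pixels per meter, assuming UVs span the full 0-1 range."""
    return texture_px / surface_m
```

With the numbers from the example, the 2m wall (2048 texture) and the 4m wall (4096 texture) both land at 1024 px/m, so they match.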

Getting Started

If you're new to the Blender-to-Unreal pipeline:

  1. Start with one asset. Model a simple prop (crate, barrel, rock) in Blender, export it, import it into Unreal, and place it in a level. This end-to-end test reveals your specific pipeline issues before you invest in creating dozens of assets.

  2. Set up your master material early. Create one or two master materials in Unreal before importing production assets. This ensures consistent material quality and makes assigning materials fast.

  3. Establish naming conventions immediately. Renaming 200 assets after the fact is painful. Start with SM_, T_, M_ prefixes from your first asset.

  4. Try the Blender MCP Server for setup tasks. Even if you prefer manual modeling, AI assistance for UV setup, export configuration, batch operations, and geometry node graphs saves significant time on non-creative tasks.

  5. Use the Procedural Placement Tool once you have asset variations. Five rock variations and three tree types are enough to populate a convincing environment. You don't need hundreds of unique assets — you need a few good ones placed well.

The Blender-to-Unreal pipeline has rough edges, but once you establish your export/import settings and material workflow, it becomes second nature. The key is getting the foundational settings right on your first few assets and then applying them consistently. Every hour spent setting up a clean pipeline saves ten hours of debugging weird import issues later.

