For most of photogrammetry's history, indie use was a fantasy. The capture rigs cost too much, the software was expensive and crashed on consumer hardware, the resulting assets had unwieldy poly counts, and the pipeline into a game engine was a multi-day-per-asset slog. By April 2026, four things have changed simultaneously: RealityCapture went free for non-commercial use after the Epic acquisition (and indie commercial pricing is now reasonable), Polycam and Luma democratized phone-based capture, Gaussian Splatting matured for real-time rendering, and UE 5.7's Nanite makes "ridiculous poly count" basically free.
This post is the practical 2026 guide for indie devs adding photogrammetry to their open-world pipeline: what each tool actually does, what hardware you actually need, where the workflow breaks, and how to integrate captured assets into a UE 5.7 or Unity 7 project without a six-month learning ramp. For broader context see our environment art pipeline post and Gaussian Splatting UE5 capture post.
The Three Tools That Matter
RealityCapture (Epic-owned, 2026)
Epic acquired Capturing Reality in 2021, integrated RealityCapture more deeply into UE through 2024–2025, and as of 2026 the pricing model is:
- Free for individuals earning under $1M/year (matching the Unreal Engine royalty threshold). This change in 2024 was the watershed moment for indie photogrammetry.
- Pay-per-input model above the threshold, scaling with input image count.
- First-class UE 5.7 integration — capture inside RC, click "send to UE," asset arrives with proper materials.
What it's good at:
- Highest-fidelity reconstructions. Truly production quality.
- Handles thousands of input images.
- Mesh quality and texture quality both excellent.
- Direct UE pipeline.
Weaknesses:
- Steep learning curve.
- Requires a real Windows workstation with a GPU.
- Slower than Polycam / Luma for fast in-the-field captures.
Polycam
Polycam took the "phone-based photogrammetry" approach further than anyone. By 2026 it offers:
- iPhone Pro / iPad Pro LiDAR-assisted capture
- Photo-only mode for any phone
- Cloud processing in 2–10 minutes
- Direct USDZ / OBJ / GLB / FBX export
- Web viewer for sharing
- Free tier with watermarks; Pro at $19/month
What it's good at:
- Fast captures in the field.
- Surprisingly good quality for objects and small scenes.
- Smooth pipeline to Blender for cleanup.
Weaknesses:
- Not the right tool for large outdoor scenes.
- Cloud processing means uploading captures (privacy / IP consideration).
Luma AI
Luma started as a NeRF tool, evolved into Gaussian Splatting, and by 2026 ships:
- Phone-based capture with cloud processing
- Gaussian Splatting output (the new hotness)
- Direct UE 5.7 plugin for splat playback
- Photogrammetric mesh export as fallback
- Free tier with watermarks; Pro at $30/month
What it's good at:
- Capturing things photogrammetry struggles with — glass, foliage, glossy surfaces, complex lighting.
- Real-time-ish rendering of captured environments.
- Great for environmental backdrops and "set dressing" scenes.
Weaknesses:
- Splats can't be edited the way meshes can; fixing a bad capture usually means recapturing.
- File sizes are large. Streaming-friendly but not lightweight.
- Technology still maturing — UE plugin had stability issues until late 2025.
The Three Capture Use Cases for Indies
Hero Props and Statues
Single objects you want hero-quality versions of. Old stone statues. Architectural elements. Decorative props.
Best tool: Polycam for the capture, Blender for cleanup, RealityCapture only if you need extreme detail.
Workflow:
- Phone capture, ~50–100 photos, 5 minutes
- Cloud processing in Polycam, ~5 minutes
- Download .obj or .glb
- Open in Blender, retopologize to under 50k tris
- Bake high-poly to low-poly normal map
- Generate ORM via Substance or ComfyUI workflow
- Import into UE 5.7
Time per hero asset: 1–2 hours after capture.
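The "~50–100 photos" figure above falls out of basic capture geometry: each frame must overlap its neighbor enough for the solver to match features. A minimal sketch of the per-orbit math (illustrative numbers, not from any specific tool's documentation):

```python
import math

def photos_per_ring(radius_m: float, hfov_deg: float, overlap: float) -> int:
    """Photos needed for one 360-degree orbit around an object.

    radius_m: camera-to-object distance in meters
    hfov_deg: camera horizontal field of view in degrees
    overlap:  fraction of each frame shared with the next (0.6-0.8 typical)
    """
    # Width of the scene covered by one frame at this distance.
    footprint = 2 * radius_m * math.tan(math.radians(hfov_deg) / 2)
    # The camera advances by the non-overlapping part of each frame.
    step = footprint * (1 - overlap)
    circumference = 2 * math.pi * radius_m
    # Note: radius cancels out -- the count depends only on FOV and overlap.
    return math.ceil(circumference / step)

# Example: ~70 degree phone lens, 70% overlap -> 15 photos per orbit.
n = photos_per_ring(1.5, 70.0, 0.7)
```

Three orbits at different heights at these settings lands around 45 photos, which is consistent with the 50–100 guideline once you add top-down and close-up detail shots.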
Environment Set Pieces
Larger scenes or scene fragments. A ruined wall. A patch of forest floor. A market stall.
Best tool: RealityCapture if you have time and a workstation. Polycam Pro if you need speed.
Workflow:
- Capture with a DSLR or mirrorless camera for best quality, ~200–400 photos
- Process in RealityCapture (15–60 min on a real GPU)
- Decimation pass — Nanite tolerates very high poly counts, so keep far more detail than older pipelines allowed, but don't ship the raw multi-million-triangle scan
- Texture cleanup in Substance Painter
- Import into UE 5.7 with Nanite enabled
Time per asset: half a day to a full day.
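In practice the decimation pass happens in RealityCapture or Blender's Decimate modifier, but it helps to know what such tools do under the hood. A toy vertex-clustering simplifier (pure Python, illustrative only — real decimators use smarter error metrics like quadric collapse):

```python
def cluster_decimate(vertices, triangles, cell_size):
    """Simplify a triangle mesh by snapping vertices to a voxel grid
    and merging every vertex that lands in the same cell.

    vertices:  list of (x, y, z) tuples
    triangles: list of (i, j, k) index triples
    cell_size: grid spacing; larger = more aggressive simplification
    """
    cell_to_new = {}   # grid cell -> new vertex index
    old_to_new = []    # old vertex index -> new vertex index
    new_vertices = []
    for x, y, z in vertices:
        cell = (int(x // cell_size), int(y // cell_size), int(z // cell_size))
        if cell not in cell_to_new:
            cell_to_new[cell] = len(new_vertices)
            new_vertices.append((x, y, z))  # first vertex represents the cell
        old_to_new.append(cell_to_new[cell])
    # Remap triangles, dropping any that collapsed to a line or point.
    new_triangles = []
    for i, j, k in triangles:
        a, b, c = old_to_new[i], old_to_new[j], old_to_new[k]
        if a != b and b != c and a != c:
            new_triangles.append((a, b, c))
    return new_vertices, new_triangles
```

The `cell_size` knob is the same trade-off you make in the real tools: bigger cells mean fewer triangles and softer silhouettes.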
Background Environments (the new use case)
Distant landscapes and skybox-replacement vistas — backdrops that sell the feeling that "this place exists."
Best tool: Luma AI Gaussian Splatting.
Workflow:
- Phone-walk the location, ~3 minutes of video
- Upload to Luma
- Cloud splatting, ~10–30 minutes
- Download splat
- Place in UE 5.7 via the Luma plugin as a background element
- Composite with traditional skybox / lighting
Time per backdrop: ~1 hour.
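The "file sizes are large" caveat from the Luma section is easy to quantify. The standard uncompressed 3D Gaussian Splatting PLY stores 62 float32 attributes per splat (position, normal, 48 spherical-harmonic color coefficients, opacity, scale, rotation), about 248 bytes each. A back-of-envelope estimate:

```python
# INRIA 3DGS PLY layout: pos(3) + normal(3) + SH color(48) + opacity(1)
#                        + scale(3) + rotation(4) = 62 float32 attributes
FLOATS_PER_SPLAT = 62
BYTES_PER_SPLAT = FLOATS_PER_SPLAT * 4  # float32 = 4 bytes

def splat_file_mb(num_splats: int) -> float:
    """Approximate uncompressed .ply size in megabytes."""
    return num_splats * BYTES_PER_SPLAT / 1e6

# An outdoor backdrop capture often lands in the 1-5 million splat range.
size = splat_file_mb(2_000_000)  # ~496 MB uncompressed
```

Compressed splat formats can cut this by roughly an order of magnitude, which is why "streaming-friendly but not lightweight" is the right way to think about splat backdrops.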
This third use case is the new one for 2026 — Gaussian Splatting was not viable in indie pipelines before late 2025.
The Hardware You Actually Need
For capture:
- Smartphone camera (any 2022+ flagship): fine for Polycam and Luma. iPhone Pro's LiDAR helps.
- DSLR or mirrorless: meaningful quality bump for RealityCapture inputs. A used Sony A6000 ($300) is enough.
- Polarizing filter and overcast day: free quality multiplier. Reduces specular highlights that confuse photogrammetry.
For processing:
- Cloud-based (Polycam / Luma): zero local hardware required.
- RealityCapture: needs a Windows machine with a CUDA GPU. RTX 3060 12 GB is plenty.
For integration:
- UE 5.7 with Nanite: handles arbitrarily high poly counts. The historical "decimate aggressively" step is largely unnecessary.
- Unity 7 with high-poly streaming meshes: similar story.
What Photogrammetry Is Still Bad At
Honest weaknesses in 2026:
- Moving objects — water, foliage in wind, animals. Everything must hold still for the duration of the capture; shoot when the scene is static or mask the motion out.
- Glossy and glass surfaces — Gaussian Splatting handles these better than mesh photogrammetry, which is part of why Luma is the right tool for backdrops.
- Featureless surfaces — flat white walls, smooth metal. Photogrammetry needs surface texture to triangulate.
- Underwater scenes — possible, but only with specialized rigs.
- Indoor scenes with strong light variation — windows blowing out highlights breaks reconstructions.
- Tiny details below capture resolution — coins, jewelry, fine engravings. Use a different technique.
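The "featureless surfaces" failure is predictable before you ever upload a capture: feature matchers need local intensity variation to triangulate. A rough pre-flight check is to measure the standard deviation of a grayscale patch — a sketch of the idea (the threshold is an illustrative guess, not a calibrated value from any matcher):

```python
import statistics

def patch_has_texture(pixels, threshold=8.0):
    """Crude predictor of whether an image patch will yield matchable features.

    pixels:    flat list of grayscale values (0-255) from one patch
    threshold: minimum standard deviation; below this, expect
               reconstruction holes (illustrative value, not calibrated)
    """
    return statistics.pstdev(pixels) >= threshold

flat_wall = [200, 201, 200, 199, 200, 201]   # smooth white wall: near-zero variance
stone     = [90, 160, 40, 210, 120, 75]      # weathered stone: high variance
```

The same intuition explains the old field trick of taping newspaper or spraying chalk powder on blank surfaces before capturing them.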
The Legal Side
Things to know:
Site permissions. Capturing public land in most countries is fine. Capturing private property needs permission. Capturing inside museums almost always violates terms of entry. Capturing copyrighted works (statues, monuments under copyright in some countries) is jurisdiction-dependent.
Likeness. Avoid capturing identifiable people without consent.
Cloud processing. Polycam and Luma upload your captures to their servers. Most terms are fine for game use, but some grant the vendor a non-exclusive license to your captures for model training. Read carefully, or process locally with RealityCapture.
Slotting Into a UE 5.7 Pipeline
A practical 2026 indie environment pipeline incorporating photogrammetry:
1. Megascans / Quixel for baseline assets (free with UE)
2. Photogrammetry for hero props specific to your game's setting
3. Luma splats for distant backdrops
4. PCG framework for placement at scale (see our PCG post)
5. Nanite for rendering everything at high quality
6. Lumen for dynamic lighting
This pipeline ships a believable open world for an indie team that does not have an AAA art-direction budget.
Bottom Line
Photogrammetry in April 2026 is finally indie-accessible. RealityCapture is free for indies under the $1M threshold, Polycam costs $19 a month, Luma adds Gaussian Splatting backdrops at $30 a month, and UE 5.7's Nanite makes the historical "poly count anxiety" mostly obsolete. The pipeline that took weeks per asset in 2018 takes hours per asset in 2026.
The three use cases — hero props, environment set pieces, distant backdrops — fit naturally into an indie open-world workflow. Pair them with Megascans for baseline coverage and the PCG framework for placement, and a two-person indie team can produce environment art that visually competes with mid-tier 2020-era AAA. That is a real shift.
Start with one location this week. Capture, process, import. The first one is the longest. The fifth one will be an hour.