Initialization Sparsity in 3D Gaussian Splatting

Experimental analysis of initialization sparsity in 3D Gaussian Splatting (3DGS): we measure how densification and reconstruction quality (PSNR/SSIM) vary with the number of initial SfM points.

Most 3D Gaussian Splatting (3DGS) pipelines are initialized from a point cloud, typically produced by COLMAP/SfM. When that cloud is sparse, it is not obvious how much the initial sparsity will limit the final result. In this post, we quantify the effect: we train the same scene from initializations of varying size and track densification behavior and final reconstruction quality (PSNR/SSIM).

Goal of this post: explore how the number of initial Gaussians (i.e., SfM points) affects 3DGS quality (PSNR/SSIM), and learn when to densify sparse point clouds using state-of-the-art methods.

If you want a minimal PyTorch implementation to hack on, this tutorial is a good starting point: “3D Gaussian Splatting from scratch in 100 lines of PyTorch (no CUDA, no C++)”.

Experimental Setup: Studying Sparsity at Initialization

We run the same 3DGS training on the same scene, but vary the number of Gaussians at initialization (roughly: 20k, 41k, 51k, 103k, 206k). We track (1) how the number of Gaussians grows over training iterations, and (2) final reconstruction quality measured by PSNR and SSIM.
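A simple way to emulate the different initialization sizes from a single reconstruction is to randomly subsample the full SfM point cloud. A minimal numpy sketch (loading the actual COLMAP points is assumed; `subsample_points` is an illustrative helper, not part of any 3DGS codebase):

```python
import numpy as np

def subsample_points(points: np.ndarray, target: int, seed: int = 0) -> np.ndarray:
    """Randomly subsample an (N, 3) SfM point cloud down to `target` points.

    A uniform random subset is a simple way to emulate sparser
    initializations while keeping the same underlying scene geometry.
    """
    rng = np.random.default_rng(seed)
    if target >= len(points):
        return points
    idx = rng.choice(len(points), size=target, replace=False)
    return points[idx]

# Example: derive the initialization sizes used in the experiment
full = np.random.rand(206_000, 3)  # stand-in for a real COLMAP point cloud
for n in (20_000, 41_000, 51_000, 103_000, 206_000):
    subset = subsample_points(full, n)
    print(n, subset.shape)
```

Random subsampling keeps the spatial distribution of the original cloud; alternatives such as voxel or farthest-point subsampling would change the coverage pattern as well as the count.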

1) Good News: Even Very Sparse Starts Grow to >1M Gaussians

The first result is surprisingly positive: even when starting from very few points, the built-in densification in 3DGS is robust. All runs eventually expand to roughly the same final count of about one million Gaussians. This is exactly what you want to see if you are bootstrapping from a sparse initialization.

I’ll soon publish a deep-dive on 3DGS’s built-in densification (with clean PyTorch code) — subscribe to my newsletter to get notified when it’s released.

Growth of Gaussians during training for different initialization sizes

Practical implication: if your device captures sparse points, 3DGS will still densify and produce a rich representation. You are not stuck with the initial sparsity.

2) Yet: Sparse Initialization Still Lowers Final Quality (PSNR/SSIM)

Here is the catch: densification can equalize the count, but it does not fully equalize the quality. The chart below plots PSNR and SSIM at convergence as a function of initialization size.

PSNR and SSIM vs number of Gaussians at initialization

The trend is clear: starting with more Gaussians yields better reconstruction fidelity. In our test scene, initializing with ~200k Gaussians instead of only ~25k raised final PSNR from about 26.5 dB to 28.2 dB, and SSIM from about 0.88 to 0.908. So while training can expand the cloud, final quality does not fully recover if the initialization is too sparse.
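To put the PSNR gap in perspective: PSNR is just log-scaled MSE, so 1.7 dB corresponds to roughly 1.5x lower mean squared error. A minimal sketch (SSIM is omitted for brevity; libraries such as scikit-image provide `structural_similarity`):

```python
import numpy as np

def psnr(img: np.ndarray, ref: np.ndarray, max_val: float = 1.0) -> float:
    """Peak signal-to-noise ratio in dB for images scaled to [0, max_val]."""
    diff = np.asarray(img, dtype=np.float64) - np.asarray(ref, dtype=np.float64)
    mse = np.mean(diff ** 2)
    if mse == 0.0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / mse)

# A 1.7 dB PSNR gap (26.5 -> 28.2 dB) maps to a 10**(1.7/10) reduction
# in mean squared error:
print(round(10 ** (1.7 / 10), 2))  # -> 1.48
```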

Interpretation: 3DGS densification is not magic — it tends to refine and replicate what’s already present. If the initialization under-covers surfaces or misses thin structures, later densification may fill space, but it can struggle to place Gaussians in the “right” places early enough to achieve the same optimum.
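To make "refine and replicate what's already present" concrete, here is a heavily simplified sketch of the clone/split heuristic from the original 3DGS paper: Gaussians with a large accumulated view-space positional gradient are duplicated, with small ones cloned and large ones split. All names, thresholds, and the offset sampling below are illustrative, not the reference implementation:

```python
import numpy as np

def densify(positions, scales, grad_norms,
            grad_thresh=0.0002, scale_thresh=0.01, rng=None):
    """Clone-or-split heuristic in the spirit of 3DGS densification.

    Gaussians whose accumulated positional gradient exceeds `grad_thresh`
    are duplicated: small ones are cloned in place (under-reconstruction),
    large ones are split into two smaller copies (over-reconstruction).
    Note that every new Gaussian is placed near an existing one, which is
    why densification cannot invent geometry the init never covered.
    """
    rng = rng or np.random.default_rng(0)
    hot = grad_norms > grad_thresh
    small = hot & (scales.max(axis=1) <= scale_thresh)   # clone these
    large = hot & (scales.max(axis=1) > scale_thresh)    # split these

    new_pos = [positions]
    new_scale = [scales]
    # Clone: duplicate the Gaussian as-is.
    new_pos.append(positions[small])
    new_scale.append(scales[small])
    # Split: sample a nearby position, shrink both copies.
    offset = rng.normal(scale=scales[large])
    new_pos.append(positions[large] + offset)
    new_scale.append(scales[large] / 1.6)
    scales[large] /= 1.6
    return np.concatenate(new_pos), np.concatenate(new_scale)
```

The key observation for this post is structural: both branches spawn Gaussians at or near existing ones, so regions the initialization never covered are only reached indirectly, over many iterations.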

When to Densify a Sparse Point Cloud

If your capture device produces a very sparse point cloud (few viewpoints, weak SfM, sparse LiDAR, noisy depth fusion), you will likely need to densify before or during training. Below are practical options, ranging from methods close to classic 3DGS to fully initialization-free ones.

A) Point Cloud Densification Before Training (Initialization-Aware)

Point Cloud Densification for 3D Gaussian Splatting from Sparse Input Views proposes a densification method that generates high-quality point clouds for improved initialization. Their key point: depth-map regularization methods can be sensitive to depth accuracy, so instead they construct a better point cloud to start training. (Chan et al., 2024)

B) Feed-Forward Gaussian Prediction (Bypass SfM)

MVSplat predicts a clean Gaussian scene in a single forward pass from sparse multi-view images by building a cost volume via plane sweeping. This is a strong option when SfM is unreliable or too sparse. (Chen et al., 2024)
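To give a feel for plane sweeping, here is a deliberately simplified two-view, rectified-stereo version, where fronto-parallel depth planes reduce to horizontal disparity shifts. MVSplat operates on learned features across full multi-view geometry; this sketch only illustrates the cost-volume idea:

```python
import numpy as np

def sweep_cost_volume(ref: np.ndarray, src: np.ndarray, max_disp: int) -> np.ndarray:
    """Photometric cost volume for rectified stereo via a horizontal sweep.

    For each disparity hypothesis d, shift `src` by d pixels and record the
    absolute photometric difference against `ref`. Pixels with no valid
    match at a given disparity keep an infinite cost.
    """
    h, w = ref.shape
    volume = np.full((max_disp, h, w), np.inf)
    for d in range(max_disp):
        volume[d, :, d:] = np.abs(ref[:, d:] - src[:, : w - d])
    return volume

# Winner-take-all depth (disparity) estimate per pixel:
# disparity_map = volume.argmin(axis=0)
```

In MVSplat the equivalent volume is built from feature correlations over depth planes and then decoded into per-pixel depths and Gaussian parameters; the argmin above is the crudest possible stand-in for that decoding step.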

C) Initialization-Free / Robust-to-Bad-Init Methods

AttentionGS targets initialization-free 3DGS by using structural attention: geometric attention helps recover global structure early, then texture attention refines details. This is particularly relevant when your point cloud is missing large regions or SfM is unstable. (Liu et al., 2025)

D) Smarter Densification During Training (Better Placement)

ConeGS proposes error-guided densification: it inserts Gaussians along pixel viewing cones at depth estimates from a fast proxy, rather than only cloning along existing geometry. This can improve reconstruction quality under tight Gaussian budgets (useful when memory is constrained). (Baranowski et al., 2025)
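The geometric core of such error-guided insertion is back-projecting a pixel along its viewing ray to an estimated depth. A minimal camera-space sketch (the intrinsics `K` below are illustrative; ConeGS's actual error selection and proxy-depth logic are more involved):

```python
import numpy as np

def backproject(pixels: np.ndarray, depths: np.ndarray, K: np.ndarray) -> np.ndarray:
    """Back-project pixels (N, 2) with depths (N,) to camera-space 3D points.

    p_cam = depth * K^{-1} [u, v, 1]^T. In a ConeGS-style scheme, new
    Gaussians would be inserted at such points for the pixels with the
    highest rendering error; this sketch shows only the geometry.
    """
    ones = np.ones((len(pixels), 1))
    homog = np.hstack([pixels, ones])      # (N, 3) homogeneous pixel coords
    rays = homog @ np.linalg.inv(K).T      # apply K^-1 to each row
    return rays * depths[:, None]          # scale each ray by its depth

K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
pts = backproject(np.array([[320.0, 240.0]]), np.array([2.0]), K)
# The principal point back-projects onto the optical axis: [0, 0, 2]
```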

E) Efficient Initialization via Pruning (Same Quality, Fewer Gaussians)

If your issue is not only sparsity but also redundancy (too many unhelpful points), SDI-GS uses segmentation-driven initialization to keep structurally significant regions. They report up to ~50% Gaussian reduction while maintaining comparable PSNR/SSIM. (Li et al., 2025)

Rule of thumb: if your SfM point cloud under-covers surfaces (holes / missing thin structures), densify. If your point cloud is dense but messy (background / redundant regions), prune. If SfM is unreliable, use feed-forward or initialization-free methods.
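The rule of thumb can be written down as a tiny helper; the flags are coarse, subjective judgments about your capture, and the mapping simply mirrors the text above rather than any published algorithm:

```python
def pick_init_strategy(sfm_reliable: bool, coverage_ok: bool, redundant: bool) -> str:
    """Map coarse judgments about the SfM output to an initialization strategy.

    Purely illustrative: encodes the rule of thumb from this post.
    """
    if not sfm_reliable:
        # SfM unreliable: skip it entirely.
        return "feed-forward or initialization-free (e.g. MVSplat, AttentionGS)"
    if not coverage_ok:
        # Holes / missing thin structures: add points.
        return "densify before or during training (e.g. Chan et al., ConeGS)"
    if redundant:
        # Dense but messy: remove unhelpful points.
        return "prune via segmentation-driven initialization (e.g. SDI-GS)"
    return "standard 3DGS initialization"
```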

Conclusion

Dense, reliable initialization matters in 3D Gaussian Splatting, and our experiment shows a nuanced picture: densification reliably recovers the Gaussian count (all runs reach about one million), but it does not fully recover quality, since sparser starts converge to lower PSNR/SSIM.

If your pipeline starts from sparse SfM, the safest path is to densify the point cloud (or predict Gaussians feed-forward) before training.

📘 Learn 3DGS Step-by-Step (PyTorch Only)

Want to truly understand 3D Gaussian Splatting—not just run a repo? My 3D Gaussian Splatting Course teaches the full pipeline from first principles in PyTorch only (no C++, no CUDA). You’ll learn initialization, densification, rendering, and how to experiment with recent papers.

Explore the Course →

References

  1. Kerbl et al., 3D Gaussian Splatting for Real-Time Radiance Field Rendering (2023): arXiv · project
  2. Chan et al., Point Cloud Densification for 3D Gaussian Splatting from Sparse Input Views (2024): OpenReview
  3. Chen et al., MVSplat: Efficient 3D Gaussian Splatting from Sparse Multi-View Images (2024): arXiv · project · code
  4. Liu et al., AttentionGS: Towards Initialization-Free 3D Gaussian Splatting via Structural Attention (2025): arXiv
  5. Baranowski et al., ConeGS: Error-Guided Densification Using Pixel Cones for Improved Reconstruction with Fewer Primitives (2025): arXiv · project
  6. Li et al., Segmentation-Driven Initialization for Sparse-view 3D Gaussian Splatting (2025): arXiv
  7. Papers in 100 Lines of Code (2025): 3DGS tutorial in 100 lines of PyTorch