Arbitrary Optics for Gaussian Splatting Using Space Warping
Due to recent advances in 3D reconstruction from RGB images, it is now possible to create photorealistic representations of real-world scenes that only require minutes to be reconstructed and can be rendered in real time. In particular, 3D Gaussian splatting shows promising results, outperforming preceding reconstruction methods while simultaneously reducing the overall computational requirements. The main success of 3D Gaussian splatting relies on the efficient use of a differentiable rasterizer to render the Gaussian scene representation. One major drawback of this method is its underlying pinhole camera model. In this paper, we propose an extension of the existing method that removes this constraint and enables scene reconstructions using arbitrary camera optics such as highly distorting fisheye lenses. Our method achieves this by applying a differentiable warping function to the Gaussian scene representation. Additionally, we reduce overfitting in outdoor scenes by utilizing a learnable skybox, reducing the presence of floating artifacts within the reconstructed scene. Based on synthetic and real-world image datasets, we show that our method is capable of creating an accurate scene reconstruction from highly distorted images and rendering photorealistic images from such reconstructions.
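The core idea in the abstract — warping 3D space so that a standard pinhole rasterizer reproduces a distorted lens projection — can be illustrated with a rough sketch. This is not the authors' code: the equidistant fisheye model and the restriction to warping only Gaussian centers are simplifying assumptions; the paper's warp is differentiable and would also transform each Gaussian's covariance via the warp's Jacobian.

```python
import numpy as np

def equidistant_fisheye_warp(points):
    """Warp 3D points so that a unit-focal pinhole projection of the
    warped points matches an equidistant fisheye projection (r = theta)
    of the originals. Illustrative sketch only."""
    pts = np.asarray(points, dtype=float)
    x, y, z = pts[:, 0], pts[:, 1], pts[:, 2]
    rho = np.hypot(x, y)            # distance from the optical axis
    theta = np.arctan2(rho, z)      # angle between viewing ray and axis
    # A pinhole camera maps the point to image radius tan(theta) per unit
    # focal length; rescale x and y so the radius becomes theta instead.
    safe_rho = np.where(rho > 1e-12, rho, 1.0)
    scale = np.where(rho > 1e-12, z * theta / safe_rho, 1.0)
    return np.stack([x * scale, y * scale, z], axis=1)
```

Feeding the warped centers to an unmodified pinhole rasterizer then yields a fisheye-like rendering, which is the space-warping principle the paper builds on.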
Main Authors: | Jakob Nazarenus, Simin Kou, Fang-Lue Zhang, Reinhard Koch |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2024-12-01 |
Series: | Journal of Imaging |
Subjects: | 3D reconstruction; novel view synthesis; 3D Gaussian Splatting; camera models |
Online Access: | https://www.mdpi.com/2313-433X/10/12/330 |
_version_ | 1846104216660082688 |
---|---|
author | Jakob Nazarenus; Simin Kou; Fang-Lue Zhang; Reinhard Koch |
author_facet | Jakob Nazarenus; Simin Kou; Fang-Lue Zhang; Reinhard Koch |
author_sort | Jakob Nazarenus |
collection | DOAJ |
description | Due to recent advances in 3D reconstruction from RGB images, it is now possible to create photorealistic representations of real-world scenes that only require minutes to be reconstructed and can be rendered in real time. In particular, 3D Gaussian splatting shows promising results, outperforming preceding reconstruction methods while simultaneously reducing the overall computational requirements. The main success of 3D Gaussian splatting relies on the efficient use of a differentiable rasterizer to render the Gaussian scene representation. One major drawback of this method is its underlying pinhole camera model. In this paper, we propose an extension of the existing method that removes this constraint and enables scene reconstructions using arbitrary camera optics such as highly distorting fisheye lenses. Our method achieves this by applying a differentiable warping function to the Gaussian scene representation. Additionally, we reduce overfitting in outdoor scenes by utilizing a learnable skybox, reducing the presence of floating artifacts within the reconstructed scene. Based on synthetic and real-world image datasets, we show that our method is capable of creating an accurate scene reconstruction from highly distorted images and rendering photorealistic images from such reconstructions. |
format | Article |
id | doaj-art-72c70c4fc0554e5e8541d098e30473e3 |
institution | Kabale University |
issn | 2313-433X |
language | English |
publishDate | 2024-12-01 |
publisher | MDPI AG |
record_format | Article |
series | Journal of Imaging |
spelling | doaj-art-72c70c4fc0554e5e8541d098e30473e3; record updated 2024-12-27T14:32:36Z; eng; MDPI AG; Journal of Imaging; 2313-433X; 2024-12-01; vol. 10, iss. 12, art. 330; doi:10.3390/jimaging10120330; Arbitrary Optics for Gaussian Splatting Using Space Warping; Jakob Nazarenus (Department of Computer Science, Kiel University, 24118 Kiel, Germany); Simin Kou (School of Engineering and Computer Science, Victoria University of Wellington, Wellington 6012, New Zealand); Fang-Lue Zhang (School of Engineering and Computer Science, Victoria University of Wellington, Wellington 6012, New Zealand); Reinhard Koch (Department of Computer Science, Kiel University, 24118 Kiel, Germany); https://www.mdpi.com/2313-433X/10/12/330; 3D reconstruction; novel view synthesis; 3D Gaussian Splatting; camera models |
spellingShingle | Jakob Nazarenus; Simin Kou; Fang-Lue Zhang; Reinhard Koch; Arbitrary Optics for Gaussian Splatting Using Space Warping; Journal of Imaging; 3D reconstruction; novel view synthesis; 3D Gaussian Splatting; camera models |
title | Arbitrary Optics for Gaussian Splatting Using Space Warping |
title_full | Arbitrary Optics for Gaussian Splatting Using Space Warping |
title_fullStr | Arbitrary Optics for Gaussian Splatting Using Space Warping |
title_full_unstemmed | Arbitrary Optics for Gaussian Splatting Using Space Warping |
title_short | Arbitrary Optics for Gaussian Splatting Using Space Warping |
title_sort | arbitrary optics for gaussian splatting using space warping |
topic | 3D reconstruction; novel view synthesis; 3D Gaussian Splatting; camera models |
url | https://www.mdpi.com/2313-433X/10/12/330 |
work_keys_str_mv | AT jakobnazarenus arbitraryopticsforgaussiansplattingusingspacewarping AT siminkou arbitraryopticsforgaussiansplattingusingspacewarping AT fangluezhang arbitraryopticsforgaussiansplattingusingspacewarping AT reinhardkoch arbitraryopticsforgaussiansplattingusingspacewarping |