Add EVER and 3D Gaussian Splatting with Deferred Reflection papers #321

Merged · 3 commits · Feb 13, 2025
Binary file added assets/thumbnails/mai2024ever.jpg
Binary file added assets/thumbnails/ye20243d.jpg
64 changes: 64 additions & 0 deletions awesome_3dgs_papers.yaml
@@ -4644,6 +4644,38 @@
  thumbnail: assets/thumbnails/xie2024supergs.jpg
  publication_date: '2024-10-03T15:18:28+00:00'
  date_source: arxiv
- id: mai2024ever
  title: 'EVER: Exact Volumetric Ellipsoid Rendering for Real-time View Synthesis'
  authors: Alexander Mai, Peter Hedman, George Kopanas, Dor Verbin, David Futschik,
    Qiangeng Xu, Falko Kuester, Jonathan T. Barron, Yinda Zhang
  year: '2024'
  abstract: 'We present Exact Volumetric Ellipsoid Rendering (EVER), a method for
    real-time differentiable emission-only volume rendering. Unlike recent rasterization
    based approach by 3D Gaussian Splatting (3DGS), our primitive based representation
    allows for exact volume rendering, rather than alpha compositing 3D Gaussian billboards.
    As such, unlike 3DGS our formulation does not suffer from popping artifacts and
    view dependent density, but still achieves frame rates of $\sim\!30$ FPS at 720p
    on an NVIDIA RTX4090. Since our approach is built upon ray tracing it enables
    effects such as defocus blur and camera distortion (e.g. such as from fisheye
    cameras), which are difficult to achieve by rasterization. We show that our method
    is more accurate with fewer blending issues than 3DGS and follow-up work on view-consistent
    rendering, especially on the challenging large-scale scenes from the Zip-NeRF
    dataset where it achieves sharpest results among real-time techniques.

    '
  project_page: https://half-potato.gitlab.io/posts/ever/
  paper: https://arxiv.org/pdf/2410.01804.pdf
  code: https://github.com/half-potato/ever_training
  video: https://youtu.be/dqLi2-v38LE
  tags:
  - Code
  - Project
  - Ray Tracing
  - Rendering
  - Video
  thumbnail: assets/thumbnails/mai2024ever.jpg
  publication_date: '2024-10-02T17:59:09+00:00'
  date_source: arxiv
- id: cao20243dgsdet
  title: '3DGS-DET: Empower 3D Gaussian Splatting with Boundary Guidance and Box-Focused
    Sampling for 3D Object Detection'
@@ -6720,6 +6752,38 @@
  - SLAM
  thumbnail: assets/thumbnails/peng2024rtgslam.jpg
  publication_date: '2024-04-30T16:54:59+00:00'
- id: ye20243d
  title: 3D Gaussian Splatting with Deferred Reflection
  authors: Keyang Ye, Qiming Hou, Kun Zhou
  year: '2024'
  abstract: 'The advent of neural and Gaussian-based radiance field methods have achieved
    great success in the field of novel view synthesis. However, specular reflection
    remains non-trivial, as the high frequency radiance field is notoriously difficult
    to fit stably and accurately. We present a deferred shading method to effectively
    render specular reflection with Gaussian splatting. The key challenge comes from
    the environment map reflection model, which requires accurate surface normal while
    simultaneously bottlenecks normal estimation with discontinuous gradients. We
    leverage the per-pixel reflection gradients generated by deferred shading to bridge
    the optimization process of neighboring Gaussians, allowing nearly correct normal
    estimations to gradually propagate and eventually spread over all reflective objects.
    Our method significantly outperforms state-of-the-art techniques and concurrent
    work in synthesizing high-quality specular reflection effects, demonstrating a
    consistent improvement of peak signal-to-noise ratio (PSNR) for both synthetic
    and real-world scenes, while running at a frame rate almost identical to vanilla
    Gaussian splatting.

    '
  project_page: https://gapszju.github.io/3DGS-DR/
  paper: https://arxiv.org/pdf/2404.18454.pdf
  code: https://github.com/gapszju/3DGS-DR
  video: https://youtu.be/3SsQZNXQBs8
  tags:
  - Code
  - Project
  - Video
  thumbnail: assets/thumbnails/ye20243d.jpg
  publication_date: '2024-04-29T06:24:32+00:00'
  date_source: arxiv
- id: ni2024phyrecon
  title: 'PhyRecon: Physically Plausible Neural Scene Reconstruction'
  authors: Junfeng Ni, Yixin Chen, Bohan Jing, Nan Jiang, Bin Wang, Bo Dai, Puhao
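Both entries in this PR follow the list schema used throughout `awesome_3dgs_papers.yaml` (`id`, `title`, `authors`, `year`, `abstract`, link fields, `tags`, `thumbnail`, `publication_date`, `date_source`). A minimal sketch of how a consumer of the dataset might validate and filter entries by tag — the two dicts below mirror the entries added here; the `validate` and `by_tag` helpers are hypothetical, and in practice you would load the full list with `yaml.safe_load()` rather than embedding it:

```python
# Sketch: validating and filtering entries shaped like those in this PR.
entries = [
    {
        "id": "mai2024ever",
        "title": "EVER: Exact Volumetric Ellipsoid Rendering for Real-time View Synthesis",
        "year": "2024",
        "tags": ["Code", "Project", "Ray Tracing", "Rendering", "Video"],
    },
    {
        "id": "ye20243d",
        "title": "3D Gaussian Splatting with Deferred Reflection",
        "year": "2024",
        "tags": ["Code", "Project", "Video"],
    },
]

# Subset of the fields every entry in this diff carries.
REQUIRED_KEYS = {"id", "title", "year", "tags"}

def validate(entry):
    """True if the entry has all required keys."""
    return REQUIRED_KEYS <= entry.keys()

def by_tag(entries, tag):
    """Ids of all entries whose tag list contains `tag`."""
    return [e["id"] for e in entries if tag in e["tags"]]

assert all(validate(e) for e in entries)
print(by_tag(entries, "Ray Tracing"))  # -> ['mai2024ever']
```

The same filter works unchanged on the full file once the YAML list is loaded in place of the inline `entries`.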