|
4644 | 4644 | thumbnail: assets/thumbnails/xie2024supergs.jpg
4645 | 4645 | publication_date: '2024-10-03T15:18:28+00:00'
4646 | 4646 | date_source: arxiv
| 4647 | +- id: mai2024ever |
| 4648 | + title: 'EVER: Exact Volumetric Ellipsoid Rendering for Real-time View Synthesis' |
| 4649 | + authors: Alexander Mai, Peter Hedman, George Kopanas, Dor Verbin, David Futschik, |
| 4650 | + Qiangeng Xu, Falko Kuester, Jonathan T. Barron, Yinda Zhang |
| 4651 | + year: '2024' |
| 4652 | + abstract: 'We present Exact Volumetric Ellipsoid Rendering (EVER), a method for |
| 4653 | + real-time differentiable emission-only volume rendering. Unlike the recent |
| 4654 | + rasterization-based approach of 3D Gaussian Splatting (3DGS), our primitive-based |
| 4655 | + representation allows for exact volume rendering rather than alpha compositing of 3D |
| 4656 | + Gaussian billboards. As such, unlike 3DGS, our formulation does not suffer from popping |
| 4657 | + artifacts or view-dependent density, but still achieves frame rates of $\sim\!30$ FPS |
| 4658 | + at 720p on an NVIDIA RTX 4090. Since our approach is built upon ray tracing, it enables |
| 4659 | + effects such as defocus blur and camera distortion (e.g., from fisheye cameras), |
| 4660 | + which are difficult to achieve with rasterization. We show that our method is more |
| 4661 | + accurate, with fewer blending issues, than 3DGS and follow-up work on view-consistent |
| 4662 | + rendering, especially on the challenging large-scale scenes of the Zip-NeRF dataset, |
| 4663 | + where it achieves the sharpest results among real-time techniques. |
| 4664 | + |
| 4665 | + ' |
| 4666 | + project_page: https://half-potato.gitlab.io/posts/ever/ |
| 4667 | + paper: https://arxiv.org/pdf/2410.01804.pdf |
| 4668 | + code: https://github.com/half-potato/ever_training |
| 4669 | + video: https://youtu.be/dqLi2-v38LE |
| 4670 | + tags: |
| 4671 | + - Code |
| 4672 | + - Project |
| 4673 | + - Ray Tracing |
| 4674 | + - Rendering |
| 4675 | + - Video |
| 4676 | + thumbnail: assets/thumbnails/mai2024ever.jpg |
| 4677 | + publication_date: '2024-10-02T17:59:09+00:00' |
| 4678 | + date_source: arxiv |
4647 | 4679 | - id: cao20243dgsdet
4648 | 4680 | title: '3DGS-DET: Empower 3D Gaussian Splatting with Boundary Guidance and Box-Focused
4649 | 4681 | Sampling for 3D Object Detection'
|
6720 | 6752 | - SLAM
6721 | 6753 | thumbnail: assets/thumbnails/peng2024rtgslam.jpg
6722 | 6754 | publication_date: '2024-04-30T16:54:59+00:00'
| 6755 | +- id: ye20243d |
| 6756 | + title: 3D Gaussian Splatting with Deferred Reflection |
| 6757 | + authors: Keyang Ye, Qiming Hou, Kun Zhou |
| 6758 | + year: '2024' |
| 6759 | + abstract: 'The advent of neural and Gaussian-based radiance field methods has achieved |
| 6760 | + great success in the field of novel view synthesis. However, specular reflection |
| 6761 | + remains non-trivial, as the high-frequency radiance field is notoriously difficult |
| 6762 | + to fit stably and accurately. We present a deferred shading method to effectively |
| 6763 | + render specular reflection with Gaussian splatting. The key challenge comes from |
| 6764 | + the environment map reflection model, which requires accurate surface normals while |
| 6765 | + simultaneously bottlenecking normal estimation with discontinuous gradients. We |
| 6766 | + leverage the per-pixel reflection gradients generated by deferred shading to bridge |
| 6767 | + the optimization process of neighboring Gaussians, allowing nearly correct normal |
| 6768 | + estimates to gradually propagate and eventually spread over all reflective objects. |
| 6769 | + Our method significantly outperforms state-of-the-art techniques and concurrent |
| 6770 | + work in synthesizing high-quality specular reflection effects, demonstrating a |
| 6771 | + consistent improvement in peak signal-to-noise ratio (PSNR) for both synthetic |
| 6772 | + and real-world scenes, while running at a frame rate almost identical to vanilla |
| 6773 | + Gaussian splatting. |
| 6774 | + |
| 6775 | + ' |
| 6776 | + project_page: https://gapszju.github.io/3DGS-DR/ |
| 6777 | + paper: https://arxiv.org/pdf/2404.18454.pdf |
| 6778 | + code: https://github.com/gapszju/3DGS-DR |
| 6779 | + video: https://youtu.be/3SsQZNXQBs8 |
| 6780 | + tags: |
| 6781 | + - Code |
| 6782 | + - Project |
| 6783 | + - Video |
| 6784 | + thumbnail: assets/thumbnails/ye20243d.jpg |
| 6785 | + publication_date: '2024-04-29T06:24:32+00:00' |
| 6786 | + date_source: arxiv |
6723 | 6787 | - id: ni2024phyrecon
6724 | 6788 | title: 'PhyRecon: Physically Plausible Neural Scene Reconstruction'
6725 | 6789 | authors: Junfeng Ni, Yixin Chen, Bohan Jing, Nan Jiang, Bin Wang, Bo Dai, Puhao
|