HDR-Plenoxels: Self-Calibrating
High Dynamic Range Radiance Fields

POSTECH
European Conference on Computer Vision (ECCV) 2022

1) HDR-Plenoxels synthesize an HDR image from the HDR radiance field by ray marching, and 2) a differentiable tone-mapping function maps the HDR image to LDR, making the whole pipeline trainable end to end.
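As a rough sketch of this two-stage pipeline (the function and variable names below are ours for illustration and do not come from the released code; NumPy is used for brevity, whereas the actual module is built from differentiable operations), HDR radiance is first accumulated along each ray by the standard volume-rendering quadrature, and the result is then passed through a per-view tone-mapping function, modeled here as a white-balance/exposure gain followed by a piecewise-linear camera response curve:

    import numpy as np

    def render_hdr_ray(sigmas, hdr_rgbs, deltas):
        # Volume-rendering quadrature: accumulate HDR radiance along one ray
        # from per-sample densities (sigmas), HDR colors (hdr_rgbs), and
        # sample spacings (deltas).
        alphas = 1.0 - np.exp(-sigmas * deltas)
        trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))
        weights = trans * alphas
        return (weights[:, None] * hdr_rgbs).sum(axis=0)  # HDR pixel value

    def tone_map(hdr_rgb, wb_gain, crf_knots):
        # Tone mapping: per-channel white-balance/exposure gain followed by a
        # monotone camera response function (CRF), modeled here as a
        # piecewise-linear curve interpolated from learnable knots.
        x = np.clip(hdr_rgb * wb_gain, 0.0, 1.0)
        xs = np.linspace(0.0, 1.0, len(crf_knots))
        return np.interp(x, xs, crf_knots)  # LDR pixel value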



The static and varying camera conditions refer to exposure, white balance, and the camera response function (CRF). Under the static condition, the capture environment is controlled, i.e., all views of the scene share the same exposure, white balance, and CRF. Under the varying condition, the settings change between captures, i.e., each view of the scene may have different components.

Abstract

We propose high dynamic range (HDR) radiance fields, HDR-Plenoxels, that learn a plenoptic function of 3D HDR radiance fields, geometry information, and varying camera settings inherent in 2D low dynamic range (LDR) images.

Our voxel-based volume rendering pipeline reconstructs HDR radiance fields from only multi-view LDR images taken under varying camera settings, is trained end to end, and converges quickly. To deal with various cameras in real-world scenarios, we introduce a tone mapping module that models the digital in-camera imaging pipeline (ISP) and disentangles radiometric settings. Our tone mapping module also allows us to control the radiometric settings of each novel view at rendering time.
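Because the tone-mapping parameters are kept per view, the radiometric appearance of a novel view can be controlled by choosing which parameters to apply to the rendered HDR radiance. A minimal usage sketch, reusing the illustrative render_hdr_ray and tone_map functions from above (all inputs are placeholder values, not values from the paper or dataset):

    # Toy inputs: 8 samples along one ray.
    sigmas   = np.full(8, 0.5)
    deltas   = np.full(8, 0.1)
    hdr_rgbs = np.tile(np.array([2.0, 1.5, 1.0]), (8, 1))  # HDR radiance may exceed 1

    hdr_pixel = render_hdr_ray(sigmas, hdr_rgbs, deltas)

    # Same geometry and HDR radiance, two different radiometric settings.
    ldr_dark = tone_map(hdr_pixel, wb_gain=np.array([0.3, 0.3, 0.3]),
                        crf_knots=np.linspace(0.0, 1.0, 8))            # linear CRF
    ldr_warm = tone_map(hdr_pixel, wb_gain=np.array([0.9, 0.7, 0.5]),
                        crf_knots=np.sqrt(np.linspace(0.0, 1.0, 8)))   # gamma-like CRF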

Finally, we build a multi-view dataset with varying camera conditions that fits our problem setting. Our experiments show that HDR-Plenoxels can render detailed, high-quality HDR novel views from only LDR images taken with various cameras.

Video

BibTeX


    @inproceedings{jun2022hdr,
        title = {HDR-Plenoxels: Self-Calibrating High Dynamic Range Radiance Fields},
        author = {Jun-Seong, Kim and Yu-Ji, Kim and Ye-Bin, Moon and Oh, Tae-Hyun},
        booktitle = {ECCV},
        year = {2022},
    }