We propose **Neural Density-Distance Field (NeDDF)**, a
novel 3D representation that reciprocally constrains the distance and
density fields. NeDDF makes it possible to define a distance field for
objects with indefinite boundaries, such as smoke, hairballs, and
glass, without losing density information.

The figure visualizes (left) 2D slices of each field for iron, hair, and glass spheres, and (right) plots of 1D slices of each field.

NeRF provides no distance information, and an
Unsigned Distance Field (UDF) cannot
correctly handle cases such as
**(a) ambiguous density changes** (e.g., a hairball) or
**(b) low densities** (e.g., a glass ball).

The proposed **NeDDF** can represent both
cases properly: it captures ambiguous density changes through the
**gradient value** of the distance field, and low densities through the
**minimal value** of the distance field.

We derive an expression that
**converts the distance and its gradient into density**, using
the fact that the distance field is an integral of a polynomial
in the density.

Using this expression,
**a density field consistent with the distance field**
represented by the neural field can be obtained in
differentiable form.

By applying the distance field obtained by NeDDF, it is possible to
compute a **reprojection error** against pseudo-corresponding
points in addition to the conventional photometric error. Combining
these two errors enables
**highly accurate camera pose estimation even when the initial pose
is poor and the overlap of silhouettes is small**.
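A minimal sketch of such a combined objective, assuming a pinhole camera and simple squared-error terms (the function names, weights, and the source of the pseudo-correspondences are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np

def project(K, R, t, X):
    """Pinhole projection of 3D points X (N, 3) into pixel coordinates (N, 2)."""
    Xc = X @ R.T + t                    # world -> camera frame
    uv = Xc[:, :2] / Xc[:, 2:3]         # perspective division
    return uv @ K[:2, :2].T + K[:2, 2]  # apply focal lengths and principal point

def combined_loss(K, R, t, points3d, pixels, c_pred, c_obs,
                  w_photo=1.0, w_rep=1.0):
    """Photometric error on colors plus reprojection error against
    pseudo-corresponding 3D points (illustrative weighting)."""
    photometric = np.mean((c_pred - c_obs) ** 2)
    reproj = np.mean(np.sum((project(K, R, t, points3d) - pixels) ** 2, axis=1))
    return w_photo * photometric + w_rep * reproj

# Usage: with a perfect pose and matching colors, both terms vanish.
K = np.array([[100.0, 0.0, 50.0], [0.0, 100.0, 50.0], [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)
X = np.array([[0.0, 0.0, 2.0], [0.1, -0.1, 3.0]])
pixels = project(K, R, t, X)
colors = np.array([[0.2, 0.3, 0.4], [0.5, 0.5, 0.5]])
print(combined_loss(K, R, t, X, pixels, colors, colors))  # -> 0.0
```

The reprojection term gives the optimizer a geometric signal even where photometric gradients are weak, which is why the combination helps when the initial pose is far from correct.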

@inproceedings{ueda2022neural,
  title={Neural Density-Distance Fields},
  author={Ueda, Itsuki and Fukuhara, Yoshihiro and Kataoka, Hirokatsu and Aizawa, Hiroaki and Shishido, Hidehiko and Kitahara, Itaru},
  booktitle={Proceedings of the European Conference on Computer Vision},
  year={2022}
}