Jonas Kulhanek
PhD
Czech Technical University in Prague (CTU)
Towards a Unified 3D Scene Representation

3D scene representations, or 3D maps, are an essential component of a wide range of intelligent systems, such as self-driving cars, robots, and virtual reality. A fundamental limitation of current approaches, however, is that they are designed for a specific sensor setup, which makes them difficult to share between applications. The goal of this thesis project is therefore to develop a unified map representation. One possibility is to represent the scene via an implicit neural function, i.e., a function that takes a 3D point as input and outputs a density and a colour value. This type of data structure has been shown to model detailed scene geometry with high fidelity and can be trained from any 3D data as well as from raw sensor measurements such as images. However, unlike current neural radiance field approaches, which are optimised per scene, we aim to estimate or initialise such a data structure quickly for new scenes. This could be enabled by building a database of 3D geometry parts that can be queried efficiently. We hope that our 3D scene representation will bridge the barriers between different modalities and enable large-scale applications of systems that would otherwise require difficult-to-obtain data.
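To illustrate the kind of implicit neural function described above, the sketch below shows a minimal MLP that maps a 3D point to a density and an RGB colour, in the spirit of neural radiance fields. The architecture, layer widths, and positional-encoding frequencies are illustrative assumptions only and are not taken from the thesis project.

```python
# Minimal sketch (assumed architecture): an implicit scene function f(point) -> (density, colour).
import torch
import torch.nn as nn


def positional_encoding(x: torch.Tensor, num_freqs: int = 6) -> torch.Tensor:
    """Encode each coordinate with sine/cosine features at several frequencies."""
    freqs = 2.0 ** torch.arange(num_freqs, device=x.device)          # (F,)
    angles = x[..., None] * freqs                                    # (..., 3, F)
    enc = torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)  # (..., 3, 2F)
    return enc.flatten(start_dim=-2)                                 # (..., 6F)


class ImplicitSceneFunction(nn.Module):
    """Maps a 3D point to a non-negative density and an RGB colour in [0, 1]."""

    def __init__(self, num_freqs: int = 6, hidden: int = 128):
        super().__init__()
        self.num_freqs = num_freqs
        in_dim = 3 * 2 * num_freqs
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 4),  # 1 density channel + 3 colour channels
        )

    def forward(self, points: torch.Tensor) -> tuple[torch.Tensor, torch.Tensor]:
        out = self.mlp(positional_encoding(points, self.num_freqs))
        density = torch.relu(out[..., :1])   # non-negative density
        rgb = torch.sigmoid(out[..., 1:])    # colour in [0, 1]
        return density, rgb


# Query the representation at a batch of arbitrary 3D points.
model = ImplicitSceneFunction()
density, rgb = model(torch.rand(1024, 3))  # shapes: (1024, 1), (1024, 3)
```

In per-scene neural radiance fields, the weights of such a network are optimised from scratch for every scene; the project instead aims to initialise or estimate them quickly for new scenes, e.g., by retrieving matching 3D geometry parts from a database.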

Track:
Academic Track
PhD Duration:
September 1st, 2021 - August 31st, 2025
First Exchange:
January 1st, 2024 - June 30th, 2024