Labeling is intrinsically important for exploring and understanding complex environments and models in a variety of domains. We present a method for interactive labeling of crowded 3D scenes containing a large number of object instances spanning multiple scales in size. In contrast to previous labeling methods, we target cases where many instances of dozens of types are present and where the hierarchical structure of the objects in the scene presents an opportunity to choose the most suitable level for each placed label. Our solution builds on and goes beyond labeling techniques in medical 3D visualization, cartography, and biological illustrations from books and prints. In contrast to these techniques, the main characteristics of our new technique are: 1) a novel way of labeling objects as part of a larger structure when appropriate, and 2) reduction of visual clutter by labeling only representative instances of each object type, together with a strategy for selecting them. The appropriate label level is chosen by analyzing the scene's depth buffer and the scene objects' hierarchy tree. We address the topic of communicating the parent-child relationship between labels by employing visual hierarchy concepts adapted from graphic design. The selection of representative instances considers several criteria tailored to the character of the data and is combined with a greedy optimization approach. We demonstrate the usage of our method with models from mesoscale biology, where these two characteristics, multi-scale and multi-instance, are abundant, along with the fact that these scenes are extraordinarily dense.
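The greedy representative-selection idea mentioned above can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual algorithm: the scoring criteria (instance visibility and screen-space spread from already-placed labels), the weights, and all names are assumptions introduced here for clarity.

```python
# Hypothetical sketch of greedy representative-instance selection.
# Criteria and weights are illustrative assumptions, not the
# method published in the paper.

from dataclasses import dataclass
from math import dist


@dataclass
class Instance:
    type_name: str
    position: tuple      # projected 2D screen position (x, y)
    visibility: float    # unoccluded fraction of the instance, in [0, 1]


def pick_representatives(instances):
    """Greedily pick one representative per object type, scoring each
    candidate by its visibility and its distance to representatives
    already chosen (to spread labels out and reduce clutter)."""
    chosen = {}
    for t in sorted({i.type_name for i in instances}):
        candidates = [i for i in instances if i.type_name == t]

        def score(i):
            # Distance to the nearest already-chosen representative;
            # 1.0 when nothing has been chosen yet.
            spread = min((dist(i.position, c.position)
                          for c in chosen.values()), default=1.0)
            # Weighted combination; 100 px is an assumed spread scale.
            return 0.7 * i.visibility + 0.3 * min(spread / 100.0, 1.0)

        chosen[t] = max(candidates, key=score)
    return chosen
```

Processing types one at a time makes each pick depend on the labels placed so far, which is what spreads the representatives across the screen instead of clustering them.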
Original language: English (US)
Number of pages: 10
Journal: IEEE Transactions on Visualization and Computer Graphics
State: Published - Jan 2019
Bibliographical note (Funding Information):
This work has been funded through the ILLVISATION grant by WWTF (VRG11-010) and the ILLUSTRARE grant by FWF (I 2953-N31). Additionally, the authors have been supported by the Ministry of Education, Youth and Sports of Czechia within the activity MOBILITY (MSMT-539/2017-1) under the identification code 7AMB17AT021, together with the Centre for International Cooperation & Mobility of OeAD-GmbH under the project identification number CZ 11/2017. Further funding was provided by the Internal Masaryk University grant (MU/0822/2015) and by the European Union's Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie grant agreement No. 747985. The work was partly written in collaboration with the VRVis Competence Center. VRVis is funded by BMVIT, BMWFW, Styria, SFG, and the Vienna Business Agency in the scope of COMET - Competence Centers for Excellent Technologies (854174), which is managed by FFG. The Scripps Research Institute researchers acknowledge support from the National Institutes of Health under the grant R01GM120604. We would like to acknowledge support for GTJ from GM115370. We thank the Allen Institute for Cell Science founder, Paul G. Allen, for his vision, encouragement, and support.
© 2018 IEEE.
Keywords
- multi-instance data
- multi-scale data
ASJC Scopus subject areas
- Signal Processing
- Computer Vision and Pattern Recognition
- Computer Graphics and Computer-Aided Design