Publication | IEEE International Conference on Computer Vision 2021
Building-GAN
Graph-Conditioned Architectural Volumetric Design Generation
This paper extends traditional 2D layout generation to 3D volumetric design.
Associated Autodesk Researchers
Jieliang (Rodger) Luo
Sr. Principal AI Research Scientist
Abstract
Kai-Hung Chang, Chin-Yi Cheng, Jieliang Luo, Shingo Murata, Mehdi Nourbakhsh, Yoshito Tsuji
IEEE International Conference on Computer Vision (ICCV) 2021
Volumetric design is the first and critical step for professional building design, where architects not only depict the rough 3D geometry of the building but also specify the programs to form a 2D layout on each floor. Though 2D layout generation for a single story has been widely studied, there is no developed method for multi-story buildings. This paper focuses on volumetric design generation conditioned on an input program graph. Instead of outputting dense 3D voxels, we propose a new 3D representation named voxel graph that is both compact and expressive for building geometries. Our generator is a cross-modal graph neural network that uses a pointer mechanism to connect the input program graph and the output voxel graph, and the whole pipeline is trained using the adversarial framework. The generated designs are evaluated qualitatively by a user study and quantitatively using three metrics: quality, diversity, and connectivity accuracy. We show that our model generates realistic 3D volumetric designs and outperforms previous methods and baselines.
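The abstract describes two key ideas: a sparse "voxel graph" of variable-sized voxels instead of dense 3D occupancy, and a pointer mechanism that links each output voxel node to a node of the input program graph. The toy sketch below illustrates these ideas only; all names, shapes, and the dot-product scoring are our own illustrative assumptions, not the paper's actual architecture, in which the embeddings come from a trained cross-modal graph neural network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Program graph: one node per program instance (e.g. office, corridor on a floor).
# In the real model these embeddings are learned; here they are random stand-ins.
num_program_nodes = 4
program_emb = rng.normal(size=(num_program_nodes, 8))

# Voxel graph: sparse, non-uniform voxels described by position and size,
# rather than a dense 3D voxel grid (hypothetical fields, for illustration).
voxels = [
    {"position": (0, 0, 0), "size": (4.0, 4.0, 3.0)},
    {"position": (1, 0, 0), "size": (4.0, 4.0, 3.0)},
    {"position": (0, 1, 0), "size": (6.0, 4.0, 3.0)},
]
voxel_emb = rng.normal(size=(len(voxels), 8))  # stand-in for GNN message-passing output

def pointer_assign(voxel_emb, program_emb):
    """Pointer-style step: each voxel node scores all program nodes,
    a row-wise softmax normalizes the scores, and the argmax program
    becomes that voxel's assigned room type."""
    scores = voxel_emb @ program_emb.T                       # (num_voxels, num_programs)
    probs = np.exp(scores - scores.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)                # softmax over program nodes
    return probs.argmax(axis=1), probs

labels, probs = pointer_assign(voxel_emb, program_emb)
print(labels)  # one program index per voxel node
```

A representation like this stays compact (only occupied voxels are stored) while the pointer step makes the program-to-geometry correspondence explicit, which is what lets a connectivity metric be checked against the input graph.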
Related Resources
2023 | Task-Centric Application Switching: How and Why Knowledge Workers Switch Software Applications for a Single Task
2023 | Learned Visual Features to Textual Explanations
2023 | Accelerating Scientific Computing with JAX-LBM
2022 | Neural Implicit Style-Net: Synthesizing Shapes in a Preferred Style Exploiting Self-Supervision