Publication | Parallel Problem Solving From Nature (PPSN) 2022
Evolving Through the Looking Glass
Learning Improved Search Spaces with Variational Autoencoders
We show how generative machine learning can learn a representation corresponding to a valid region of the search space, enabling optimizers to search in the new latent space and always find solutions that satisfy constraints or additional criteria. This work builds on our earlier research and demonstrates the method on more complex constraints and additional criteria.
Download publication
Abstract
Evolving Through the Looking Glass: Learning Improved Search Spaces with Variational Autoencoders.
Bentley, P. J., Lim, S. L., Gaier, A. and Tran, L.
Parallel Problem Solving From Nature (PPSN) 2022
Nature has spent billions of years perfecting our genetic representations, making them evolvable and expressive. Generative machine learning offers a shortcut: learn an evolvable latent space with implicit biases towards better solutions. We present SOLVE: Search space Optimization with Latent Variable Evolution, which creates a dataset of solutions that satisfy extra problem criteria or heuristics, generates a new latent search space, and uses a genetic algorithm to search within this new space to find solutions that meet the overall objective. We investigate SOLVE on five sets of criteria designed to detrimentally affect the search space and explain how this approach can be easily extended as the problems become more complex. We show that, compared to an identical GA using a standard representation, SOLVE with its learned latent representation can meet extra criteria and find solutions with distance to optimal up to two orders of magnitude closer. We demonstrate that SOLVE achieves its results by creating better search spaces that focus on desirable regions, reduce discontinuities, and enable improved search by the genetic algorithm.
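For readers who want a concrete picture of the pipeline, the sketch below illustrates the general latent-variable-evolution idea in Python. It is not the authors' implementation: the toy constraint, objective, network sizes, and GA settings are all illustrative assumptions. A small variational autoencoder is trained on solutions that satisfy an extra criterion, and a simple genetic algorithm then searches the learned latent space, decoding each candidate before evaluation.

# Minimal sketch of latent variable evolution in the spirit of SOLVE (not the
# authors' code). The toy constraint, objective, network sizes, and GA settings
# are illustrative assumptions.
import numpy as np
import torch
import torch.nn as nn

DIM, LATENT = 10, 4

# 1. Dataset of solutions satisfying an extra criterion (here: all genes >= 0.5,
#    an arbitrary stand-in for a problem constraint or heuristic).
data = torch.from_numpy(np.random.uniform(0.5, 1.0, (2000, DIM)).astype(np.float32))

# 2. A small VAE learns a latent space biased towards that feasible region.
class VAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(DIM, 32), nn.ReLU())
        self.mu = nn.Linear(32, LATENT)
        self.logvar = nn.Linear(32, LATENT)
        self.dec = nn.Sequential(nn.Linear(LATENT, 32), nn.ReLU(), nn.Linear(32, DIM))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparameterization
        return self.dec(z), mu, logvar

vae = VAE()
opt = torch.optim.Adam(vae.parameters(), lr=1e-3)
for _ in range(500):
    recon, mu, logvar = vae(data)
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    loss = nn.functional.mse_loss(recon, data) + 0.1 * kl
    opt.zero_grad(); loss.backward(); opt.step()

# 3. A plain GA searches the latent space; each candidate is decoded before
#    evaluation, so evaluated solutions tend to satisfy the criterion by construction.
def objective(x):                                  # illustrative sphere objective
    return np.sum((x - 0.75) ** 2, axis=-1)

pop = np.random.randn(50, LATENT).astype(np.float32)
for _ in range(100):
    decoded = vae.dec(torch.from_numpy(pop)).detach().numpy()
    order = np.argsort(objective(decoded))
    parents = pop[order[:25]]                      # truncation selection
    children = parents + 0.1 * np.random.randn(25, LATENT).astype(np.float32)
    pop = np.concatenate([parents, children])

best = vae.dec(torch.from_numpy(pop[:1])).detach().numpy()
print("best decoded solution:", best.round(2))

Because the GA manipulates only latent vectors, feasibility is handled implicitly by the learned decoder rather than by repair operators or penalty terms.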
Associated Researchers
Soo Ling Lim
University College London
Linh Tran
Autodesk AI Lab
Related Resources
2024
XLB: A Differentiable Massively Parallel Lattice Boltzmann Library in Python
This research introduces the XLB library, a scalable Python-based…
2023
Learned Visual Features to Textual Explanations
A novel method that leverages the capabilities of large language…
2022
Neural Implicit Style-Net: synthesizing shapes in a preferred style exploiting self supervision
We introduce a novel approach to disentangle style from content in the…
2022
SkexGen: Autoregressive Generation of CAD Construction Sequences with Disentangled Codebooks
We present SkexGen, a novel autoregressive generative model for…