It's Never Too Late: Noise Optimization for Collapse Recovery in Trained Diffusion Models
Authors
Anne Harrington, A. Sophia Koepke, Shyamgopal Karthik, Trevor Darrell, Alexei A. Efros
Abstract
Contemporary text-to-image models exhibit a surprising degree of mode collapse, as can be seen when sampling several images given the same text prompt. While previous work has attempted to address this issue by steering the model using guidance mechanisms, or by generating a large pool of candidates and refining them, in this work we take a different direction and aim for diversity in generations via noise optimization. Specifically, we show that a simple noise optimization objective can mitigate mode collapse while preserving the fidelity of the base model. We also analyze the frequency characteristics of the noise and show that alternative noise initializations with different frequency profiles can improve both optimization and search. Our experiments demonstrate that noise optimization yields superior results in terms of generation quality and variety.
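To make the core idea concrete, below is a minimal sketch of optimizing a batch of initial noise vectors so that the images generated from them are mutually dissimilar, while keeping the noise close to the Gaussian prior. This is an illustration under stated assumptions, not the paper's actual objective: the `decode` stub stands in for a frozen, differentiable text-to-image sampler, and the pairwise L2 distance stands in for whatever diversity measure the method uses (in practice one would likely compare images in a perceptual feature space).

```python
# Hypothetical sketch of noise optimization for sample diversity.
# Assumptions: `decode` replaces a frozen diffusion sampler, and the
# pairwise-distance loss replaces the paper's diversity objective.
import torch


def decode(noise: torch.Tensor) -> torch.Tensor:
    # Stand-in for a frozen, differentiable generator (noise -> images).
    # A fixed random projection plays the role of the trained model's weights.
    gen = torch.Generator().manual_seed(0)
    w = torch.randn(noise.shape[-1], 3 * 8 * 8, generator=gen)
    return torch.tanh(noise @ w).view(noise.shape[0], 3, 8, 8)


def pairwise_diversity(images: torch.Tensor) -> torch.Tensor:
    # Mean pairwise L2 distance between images in the batch (to be maximized).
    flat = images.flatten(1)
    dists = torch.cdist(flat, flat)
    n = flat.shape[0]
    return dists.sum() / (n * (n - 1))


batch_size, noise_dim = 4, 64
noise = torch.randn(batch_size, noise_dim, requires_grad=True)  # initial latents
optimizer = torch.optim.Adam([noise], lr=0.05)

for step in range(200):
    optimizer.zero_grad()
    images = decode(noise)
    diversity = pairwise_diversity(images)
    # Soft penalty keeping the optimized noise near unit variance, so the
    # frozen model still receives inputs resembling its training-time noise.
    prior_penalty = (noise.pow(2).mean() - 1.0).pow(2)
    loss = -diversity + 0.1 * prior_penalty
    loss.backward()
    optimizer.step()
```

The design point the sketch illustrates is that only the starting noise is a free variable: the generator stays frozen, so fidelity is preserved by construction while the objective pushes the batch toward distinct outputs.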
Paper Summary
Problem
Text-to-image diffusion models often produce near-identical images when sampled repeatedly with the same prompt, a form of mode collapse. Existing remedies steer the model with guidance mechanisms or generate and refine large candidate pools.
Key Innovation
Rather than modifying the model or its guidance, the initial noise is optimized with a simple objective that encourages diverse generations while preserving the base model's fidelity. The paper also analyzes the frequency characteristics of the noise and shows that initializations with different frequency profiles improve both optimization and search.
Practical Impact
Diversity can be recovered from an already-trained model without retraining or architectural changes; the experiments report generations that are both higher quality and more varied than those of the baselines.
Analogy / Intuitive Explanation
The trained model is left untouched; only its starting point changes. Instead of repeatedly sampling from the same region of noise and hoping for different outcomes, the random seed of each image is adjusted until the resulting set of images stops looking alike.
Paper Information
arXiv: 2601.00090v1