Generative Modeling via Drifting — One-Step Image Generation

About this audio content

Researchers from MIT and Harvard propose Drifting Models, a new paradigm for generative modeling that achieves state-of-the-art image generation in a single forward pass. Instead of iterating at inference time like diffusion models, Drifting Models evolve the generated distribution during training using an elegant attraction-repulsion mechanism: generated samples are drawn toward the data distribution and pushed apart from one another. The result: one-step image generation with an FID of 1.54 on ImageNet 256×256, beating even multi-step diffusion models. From the lab of Kaiming He, the creator of ResNet.
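The attraction-repulsion idea can be illustrated with a toy particle simulation: each generated sample drifts toward nearby real data points and away from other generated samples. This is a loose analogy for intuition only, not the paper's actual training objective; every function name, kernel choice, and constant below is invented for the sketch.

```python
import numpy as np

def drift_step(generated, data, step=0.1, bandwidth=2.0):
    """One toy drift update (illustrative, not the paper's method):
    each generated point is attracted toward real data points and
    repelled from the other generated points, with Gaussian-kernel
    weights so that nearby points dominate."""
    def kernel_force(x, points, sign):
        diff = points - x                                   # vectors from x to each point
        w = np.exp(-np.sum(diff**2, axis=1) / (2 * bandwidth**2))
        return sign * (w[:, None] * diff).sum(axis=0) / len(points)

    new = np.empty_like(generated)
    for i, x in enumerate(generated):
        attract = kernel_force(x, data, +1.0)               # pull toward the data
        others = np.delete(generated, i, axis=0)
        repel = kernel_force(x, others, -1.0)               # push generated points apart
        new[i] = x + step * (attract + repel)
    return new

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=0.5, size=(200, 2))        # target distribution
gen = rng.normal(loc=0.0, scale=0.5, size=(64, 2))          # initial generated samples
for _ in range(200):
    gen = drift_step(gen, data)
print(gen.mean(axis=0))  # the cloud has drifted toward the data mean near (2, 2)
```

Because the pairwise repulsion forces cancel in aggregate, only the attraction term moves the cloud's mean, while repulsion keeps the samples spread out instead of collapsing onto a single mode.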