Dream 7B is a diffusion large language model for text generation that outperforms existing open diffusion language models, offering superior planning abilities and flexible inference capabilities.
Dream 7B exhibits enhanced planning capabilities and outperforms similar-sized autoregressive models, particularly on tasks with complex constraints or objectives.
Enables more diverse and controlled text generation by synthesizing outputs in arbitrary orders, with adjustable quality-speed trade-offs.
Enhances global coherence in text generation by integrating information from both directions, unlike sequential generation in AR models.
Supports generating text in non-sequential orders, so flexible inference can serve a wider range of user queries, such as infilling or completing text given later context.
Adjusts the number of tokens decoded per diffusion step to balance speed against output quality, offering a trade-off that complements inference-time techniques such as chain-of-thought reasoning (see the sketch below).
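The points about arbitrary-order decoding and the quality-speed trade-off can be made concrete with a small sketch. The following is an illustrative toy in Python, not Dream 7B's actual API: `toy_model`, `diffusion_decode`, and the tiny vocabulary are hypothetical stand-ins; the only idea carried over from the text is that masked positions are filled in confidence order over an adjustable number of denoising steps.

```python
import random

MASK = "<mask>"
VOCAB = ["the", "cat", "sat", "on", "a", "mat", "."]

def toy_model(tokens):
    # Stand-in for the denoiser: propose a (token, confidence) pair for
    # every masked position, as a diffusion LM would do in parallel from
    # bidirectional context. (Hypothetical placeholder, not Dream 7B.)
    return {i: (random.choice(VOCAB), random.random())
            for i, t in enumerate(tokens) if t == MASK}

def diffusion_decode(length, steps):
    # Fill `length` masked positions over roughly `steps` denoising steps.
    # Fewer steps -> more tokens committed per step (faster, coarser);
    # more steps -> fewer tokens per step (slower, higher quality).
    tokens = [MASK] * length
    per_step = max(1, length // steps)
    while MASK in tokens:
        guesses = toy_model(tokens)
        # Commit the most confident predictions first: positions are chosen
        # by confidence, not left to right, i.e. arbitrary-order generation.
        best = sorted(guesses.items(), key=lambda kv: kv[1][1], reverse=True)
        for pos, (tok, _) in best[:per_step]:
            tokens[pos] = tok
    return tokens

print(diffusion_decode(length=8, steps=8))  # one token per step: slow, careful
print(diffusion_decode(length=8, steps=2))  # four tokens per step: fast, coarse
```

With `steps` equal to the sequence length the decoder commits one token at a time (the slow, high-quality end), while `steps=1` commits everything in a single pass (the fast, coarse end), which is the speed-quality dial described above.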