Black Forest Labs releases Flux 2 in three variants
Black Forest Labs published Flux 2, a visual intelligence model family focused on photorealism, editing and compositional understanding.
Model lineup and capabilities
The release includes three variants: Pro, Flex, and an open-weight Dev model, each with different resource and control trade-offs.
Flux 2 aims to reduce the plastic look of skin and environments, supports generation at up to 4 MP, and improves small-text rendering and interactions between objects.
Usage modes and control
Pro targets maximum speed and quality in the cloud, while Flex offers finer parameter control for users requiring adjustable generation settings.
Text-guided editing works from a single reference image or from multiple references, which helps create consistent characters and multi-component objects.
Dev release and resource demands
The open-weight Dev model on Hugging Face contains 32B parameters and derives from the Flux 2 base, demanding substantial compute and memory resources.
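A back-of-the-envelope calculation illustrates why the memory demands are so high: the raw weight footprint is simply parameter count times bytes per parameter. This is a rough sketch using only the 32B figure above and standard dtype widths, not numbers from the release notes, and it excludes activations, the text encoder, and runtime overhead.

```python
# Rough weight-memory estimate for a 32B-parameter model.
# Dtype widths: bf16/fp16 = 2 bytes per parameter, fp8 = 1 byte.
def weight_footprint_gb(params: float, bytes_per_param: int) -> float:
    """Raw checkpoint size in GB (1 GB = 1e9 bytes), weights only."""
    return params * bytes_per_param / 1e9

PARAMS = 32e9  # 32B parameters

print(weight_footprint_gb(PARAMS, 2))  # bf16/fp16: 64.0 GB
print(weight_footprint_gb(PARAMS, 1))  # fp8: 32.0 GB
```

This also explains why the full-precision checkpoint lands around 64 GB and why fp8 or quantized GGUF builds are attractive on consumer GPUs.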
Early reports of running the model on an RTX 4090 with 128 GB of system RAM showed most of that memory being consumed; subsequent updates reduced full RAM staging and improved iteration speed.
Reported performance reached about 3.8 sec/it in fp8 and 5.5 sec/it in fp16 when generating 1248×832 images over 20 steps, which works out to roughly one to two minutes per image.
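The per-image times follow directly from multiplying the reported seconds-per-iteration figures by the step count. A quick check, using only the numbers quoted above:

```python
# Convert reported sec/it figures into total time per image at 20 steps.
STEPS = 20

def seconds_per_image(sec_per_it: float, steps: int = STEPS) -> float:
    """Total sampling time in seconds for one image."""
    return sec_per_it * steps

print(seconds_per_image(3.8))  # fp8:  ~76 s  (about 1.3 min)
print(seconds_per_image(5.5))  # fp16: ~110 s (about 1.8 min)
```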
Quality, optimizations and licensing
Community-compressed builds such as GGUF and other optimizations are appearing, though they may affect final image fidelity compared with full checkpoints.
Even full bf16 runs of the roughly 64 GB checkpoint at 4 MP and 50 steps still produce many flawed outputs, including distorted logos and misplaced text, at a reported rate of about three minutes per image.
The Dev distribution is provided under a non-commercial license, while competing models mentioned in community comparisons allow commercial use of generated outputs.
Ecosystem and follow-up
The developers plan a lighter distilled variant called Flux 2 Klein under an Apache 2.0 license, though expectations for that smaller fork are more modest given the current baseline issues.
Several tools have already added workflows and template support, and training toolkits now include Flux 2 compatibility along with guidance for fine-tuning with custom LoRA modules.