Please leave us a star ⭐ if you find this work helpful.
- [2025/9] 🔥🔥 Lumina-DiMOO, OmniGen2, Infinity, X-Omni, OneCAT, Echo-4o, and MMaDA are added to all 🏅Leaderboards.
- [2025/9] 🔥🔥 Seedream-4.0 is added to all 🏅Leaderboards.
- [2025/9] 🔥🔥 We release UniGenBench 🏅Leaderboard (English Long) and 🏅Leaderboard (Chinese Long). We will continue to update them regularly.
- [2025/9] 🔥🔥 GPT-4o, Imagen-4-Ultra, Nano Banana, Seedream-3.0, Qwen-Image, and FLUX-Kontext-[Max/Pro] are added to the UniGenBench 🏅Leaderboard (English) and 🏅Leaderboard (Chinese).
- [2025/8] 🔥🔥 We release Pref-GRPO, UniGenBench, and the 🏅Leaderboard (English).
- Clone this repository and navigate to the folder:
```bash
git clone https://github.com/CodeGoat24/UnifiedReward.git
cd UnifiedReward/Pref-GRPO
```
- Install the training package:
```bash
conda create -n PrefGRPO python=3.12
conda activate PrefGRPO
bash env_setup.sh fastvideo
cd open_clip
pip install -e .
cd ..
```
- Download Models:
```bash
huggingface-cli download CodeGoat24/UnifiedReward-qwen-7b
huggingface-cli download CodeGoat24/UnifiedReward-Think-qwen-7b
wget https://huggingface.co/apple/DFN5B-CLIP-ViT-H-14-378/resolve/main/open_clip_pytorch_model.bin
```
- Install vLLM:
```bash
pip install vllm==0.9.0.1 transformers==4.52.4
```
- Start server:
```bash
bash vllm_utils/vllm_server_UnifiedReward_Think.sh
```
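For reference, here is a minimal sketch of querying the reward server for a pairwise preference, assuming the script above launches a vLLM OpenAI-compatible server on `localhost:8000`. The port, served model name, and judging prompt wording are illustrative assumptions, not the repo's exact client code:

```python
import base64
from openai import OpenAI

# Assumption: the server script above exposes vLLM's OpenAI-compatible API here.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

def image_part(path: str) -> dict:
    # Encode a local image as a base64 data URL for the chat-completions API.
    with open(path, "rb") as f:
        b64 = base64.b64encode(f.read()).decode()
    return {"type": "image_url", "image_url": {"url": f"data:image/png;base64,{b64}"}}

def pairwise_preference(prompt: str, image_a: str, image_b: str) -> str:
    # The exact judging prompt used by Pref-GRPO may differ; this wording is a placeholder.
    response = client.chat.completions.create(
        model="CodeGoat24/UnifiedReward-Think-qwen-7b",
        messages=[{
            "role": "user",
            "content": [
                image_part(image_a),
                image_part(image_b),
                {"type": "text",
                 "text": f"Given the prompt '{prompt}', which image is better, "
                         "Image 1 or Image 2? Think step by step, then answer."},
            ],
        }],
    )
    return response.choices[0].message.content

print(pairwise_preference("a red cube on a blue sphere", "img_a.png", "img_b.png"))
```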
- Training: we use the training prompts in UniGenBench, as shown in `./data/unigenbench_train_data.txt`. Preprocess the FLUX RL embeddings, then launch Pref-GRPO training:
```bash
bash fastvideo/data_preprocess/preprocess_flux_rl_embeddings.sh
bash finetune_prefgrpo_flux.sh
```
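To make the training signal concrete, the sketch below illustrates the pairwise preference reward that gives Pref-GRPO its name: each image in a group sampled for the same prompt is compared against every other image in the group, its reward is its within-group win rate, and GRPO normalizes the group's rewards into advantages. The `judge` callable is a hypothetical stand-in for a call to the preference reward model (e.g. via the client above); see the paper for the exact formulation.

```python
from itertools import combinations
from typing import Callable, List

def win_rate_rewards(images: List[str], judge: Callable[[str, str], int]) -> List[float]:
    """Reward each image by its win rate over all pairwise comparisons in the group.

    judge(a, b) returns 1 if image a is preferred over image b, else 0.
    """
    g = len(images)
    wins = [0.0] * g
    for i, j in combinations(range(g), 2):
        if judge(images[i], images[j]) == 1:
            wins[i] += 1.0
        else:
            wins[j] += 1.0
    # Each image takes part in (g - 1) comparisons, so win rates lie in [0, 1].
    return [w / (g - 1) for w in wins]

def grpo_advantages(rewards: List[float]) -> List[float]:
    # Standard GRPO group normalization: subtract the group mean, divide by the std.
    mean = sum(rewards) / len(rewards)
    std = (sum((r - mean) ** 2 for r in rewards) / len(rewards)) ** 0.5
    std = std if std > 0 else 1.0  # guard against a zero-variance group
    return [(r - mean) / std for r in rewards]
```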
- Inference: we use the test prompts in UniGenBench, as shown in `./data/unigenbench_test_data.csv`:
```bash
bash inference/flux_dist_infer.sh
```
Then, evaluate the outputs following UniGenBench.
If you have any comments or questions, please open a new issue or feel free to contact Yibin Wang.
Our training code is based on DanceGRPO, Flow-GRPO, and FastVideo.
We also use UniGenBench for T2I model semantic consistency evaluation.
Thanks to all the contributors!
```bibtex
@article{Pref-GRPO-UniGenBench,
  title={Pref-GRPO: Pairwise Preference Reward-based GRPO for Stable Text-to-Image Reinforcement Learning},
  author={Wang, Yibin and Li, Zhimin and Zang, Yuhang and Zhou, Yujie and Bu, Jiazi and Wang, Chunyu and Lu, Qinglin and Jin, Cheng and Wang, Jiaqi},
  journal={arXiv preprint arXiv:2508.20751},
  year={2025}
}
```
