 
ChronoMagic-Bench Leaderboard
Welcome to the ChronoMagic-Bench leaderboard! (NeurIPS 2024 D&B Spotlight)
ChronoMagic-Bench is the first benchmark dedicated to evaluating T2V models' ability to generate time-lapse videos with significant metamorphic amplitude and temporal coherence. The benchmark probes T2V models' physics, biology, and chemistry capabilities under free-form text control.
If you like our project, please give us a star on GitHub for the latest updates.
GitHub | arXiv | Home Page | ChronoMagic-Pro | ChronoMagic-ProH
In the table below, we summarize the performance of all models. We use UMT-FVD, UMTScore, MTScore, CHScore, and GPT4o-MTScore as the primary evaluation metrics.
| Backbone | UMT-FVD | UMTScore | MTScore | CHScore | GPT4o-MTScore |
|---|---|---|---|---|---|
| U-Net | 257.56 | 1.916 | 0.478 | 81.82 | 3.13 |
The table below summarizes the performance of all models on the same metrics.
| Backbone | UMT-FVD | UMTScore | MTScore | CHScore | GPT4o-MTScore |
|---|---|---|---|---|---|
| U-Net | 294.72 | 1.763 | 0.479 | 77.98 | 3.05 |
| DiT | 214.91 | 2.387 | 0.474 | 95.97 | 3.11 |
| DiT | 195.52 | 3.24 | 0.472 | 38.64 | 3.09 |
| U-Net | 239.31 | 2.837 | 0.47 | 70.36 | 2.62 |
| DiT | 241.09 | 2.676 | 0.448 | 75.94 | 2.57 |
| U-Net | 214.06 | 2.763 | 0.437 | 75.9 | 2.87 |
| U-Net | 244.49 | 2.282 | 0.422 | 58.08 | 3.06 |
| U-Net | 230.74 | 2.783 | 0.409 | 61.01 | 3.01 |
| U-Net | 260.61 | 2.232 | 0.403 | 94.67 | 2.29 |
| U-Net | 250.22 | 2.559 | 0.399 | 18.54 | 2.62 |
| DiT | 210.93 | 2.681 | 0.383 | 51.87 | 2.5 |
| DiT | 218.99 | 2.4 | 0.373 | 125.25 | 2.62 |
| DiT | 202.32 | 2.517 | 0.369 | 74.2 | 2.74 |
| DiT | 232.29 | 2.122 | 0.366 | 72.57 | 2.42 |
| DiT | 202.03 | 2.733 | 0.352 | 88.48 | 2.33 |
| U-Net | 210.39 | 2.714 | 0.35 | 81.32 | 2.5 |
| DiT | 187.36 | 2.758 | 0.348 | 63.88 | 3.19 |
| DiT | 223.05 | 2.317 | 0.347 | 75.98 | 2.48 |
| DiT | 228.7 | 2.459 | 0.331 | 61.5 | 2.21 |
| DiT | 216.9 | 0.972 | 0.324 | 89.76 | 1.08 |
| DiT | - | - | - | 71 | 2.64 |
| DiT | - | - | - | 48.45 | 3.36 |
| DiT | - | - | - | 28.07 | 3.76 |
| DiT | - | - | - | 121.16 | 3.18 |
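For quick offline inspection, the leaderboard tables above can be parsed with a few lines of Python. The sketch below is a minimal, unofficial example: the file name leaderboard.md is a hypothetical local copy of the table, and the column names simply follow the metric order listed in the text, not an official schema.

```python
import pandas as pd

# Minimal sketch: parse a local markdown copy of the leaderboard table.
# "leaderboard.md" is a hypothetical file; column names follow the metric
# order given in the page text (Backbone is from the submission form).
columns = ["Backbone", "UMT-FVD", "UMTScore", "MTScore", "CHScore", "GPT4o-MTScore"]

rows = []
with open("leaderboard.md", encoding="utf-8") as f:
    for line in f:
        line = line.strip()
        if not line.startswith("|") or set(line) <= {"|", "-", " "}:
            continue  # skip non-table lines and the header separator row
        cells = [c.strip() for c in line.strip("|").split("|")]
        if cells[0] in ("DiT", "U-Net"):  # data rows start with the backbone type
            rows.append(cells)

df = pd.DataFrame(rows, columns=columns)
# Missing entries appear as "-" on the leaderboard; coerce them to NaN.
for col in columns[1:]:
    df[col] = pd.to_numeric(df[col], errors="coerce")

# Example: rank by MTScore (metamorphic amplitude), matching the table's sort order.
print(df.sort_values("MTScore", ascending=False).head())
```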
Submission Guidelines
- Fill in 'Model Name' if this is your first submission, or fill in 'Revision Model Name' if you want to update an existing result.
- Select 'Backbone Type' (DiT or U-Net).
- Enter your project home page in 'Model Link'.
- After evaluation, follow the instructions in the GitHub repository to obtain ChronoMagic-Bench-Input.json and upload it here (see the sanity-check sketch after this list).
- Click the 'Submit Eval' button.
- Click 'Refresh' to see the updated leaderboard.
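Before uploading, it can help to confirm that your results file at least parses as non-empty JSON. The snippet below is a minimal sanity-check sketch using only the Python standard library; it does not validate the official schema, which is defined in the GitHub repository.

```python
import json
from pathlib import Path

# Minimal pre-upload sanity check for ChronoMagic-Bench-Input.json.
# This only confirms the file exists, parses as JSON, and is non-empty;
# the authoritative format is defined in the GitHub repository.
path = Path("ChronoMagic-Bench-Input.json")
assert path.is_file(), f"{path} not found - run the evaluation first"

with path.open(encoding="utf-8") as f:
    data = json.load(f)  # raises json.JSONDecodeError if the file is malformed

assert data, "results file parsed but is empty"
print(f"OK: {path} parsed successfully ({len(data)} top-level entries)")
```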