The latest open-source LLM to rise to the top of the leaderboard is from China. The claims in the paper still need to be verified, but they're pretty impressive if true (e.g., it compares favorably to Llama 3.1), especially since the team had less compute to work with:
arxiv.org/abs/2411.022...
Hunyuan-Large: An Open-Source MoE Model with 52 Billion Activated Parameters by Tencent
In this paper, we introduce Hunyuan-Large, which is currently the largest open-source Transformer-based mixture of experts model, with a total of 389 billion parameters and 52 billion activation param...