
Qwen 1.5 MoE

Highly efficient mixture-of-experts (MoE) model from Alibaba

Qwen1.5-MoE-A2.7B is a small mixture-of-experts (MoE) model with only 2.7 billion activated parameters, yet it matches the performance of state-of-the-art 7B models such as Mistral 7B and Qwen1.5-7B.
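A minimal sketch of trying the model with Hugging Face transformers, assuming the checkpoint is published under the model ID Qwen/Qwen1.5-MoE-A2.7B-Chat and that a recent transformers release with Qwen MoE support is installed (the model ID, prompt, and generation settings are illustrative assumptions, not taken from the listing):

```python
# Sketch: loading and prompting Qwen1.5-MoE-A2.7B-Chat via Hugging Face transformers.
# Assumes the checkpoint exists as "Qwen/Qwen1.5-MoE-A2.7B-Chat" and that transformers
# (with Qwen MoE support) and a suitable torch backend are installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen1.5-MoE-A2.7B-Chat"  # assumed Hugging Face model ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# Build a chat-formatted prompt using the tokenizer's chat template.
messages = [{"role": "user", "content": "Explain mixture-of-experts in one sentence."}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

Because only a fraction of the experts are activated per token, inference cost is closer to a 2.7B dense model even though total parameter count is larger.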

Top comment

Don't sleep on the work of Alibaba's impressive AGI team, Qwen! Performance, efficiency, and cost-effectiveness — in a nice open-source wrapper!