So you realize what really drives AI applications. Downloads last month on @huggingface:
Mixtral-8x7B-Instruct-v0.1: 843,843
phi-2: 329,824
bert-base-uncased: 32,670,091
roberta-base: 21,673,938
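
For anyone who wants to verify these numbers, a minimal sketch using the `huggingface_hub` client (assuming it is installed and the Hub is reachable; the `downloads` field on a model's metadata reflects roughly the last 30 days, so exact figures will differ from those above):

```python
from huggingface_hub import HfApi

api = HfApi()

# Fetch metadata for each model; ModelInfo.downloads is the
# rolling ~30-day download count shown on each model page.
for model_id in [
    "mistralai/Mixtral-8x7B-Instruct-v0.1",
    "microsoft/phi-2",
    "bert-base-uncased",
    "roberta-base",
]:
    info = api.model_info(model_id)
    print(f"{model_id}: {info.downloads:,}")
```

The counts change daily, which is also why any snapshot like the one above goes stale quickly.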
Clearly, the most useful model right now would be a 1B-parameter BERT with 32k+…