Hugging Face

@huggingface

343,230 Followers
189 Following
431 Media
8,875 Statuses

The AI community building the future. #BlackLivesMatter #stopasianhate

NYC and Paris and 🌏
Joined September 2016
Pinned Tweet
@huggingface
Hugging Face
1 year
🤗 Transformers has been built by, with, and for the community. Reaching 100k ⭐ on GitHub is a testament to ML's reach and the community's will to innovate and contribute. To celebrate, we highlight 100 incredible projects in transformers' vicinity.
Tweet media one
96
269
1K
@huggingface
Hugging Face
1 year
We just released Transformers' boldest feature: Transformers Agents. This removes the barrier of entry to machine learning. Control 100,000+ HF models by talking to Transformers and Diffusers. Fully multimodal agent: text, images, video, audio, docs... 🌎
Tweet media one
74
834
3K
@huggingface
Hugging Face
2 years
🤗🚀
Tweet media one
98
240
2K
@huggingface
Hugging Face
9 months
Llama 2: Now on Hugging Chat 🤗🦙 Try out the 70B Chat model for free with super fast inference, web search, and powered by open-source tools! 👉
38
445
2K
@huggingface
Hugging Face
1 year
THIS IS BIG! 👀 It's now possible to take any of the >30,000 ML apps from Spaces and run them locally (or on your own infrastructure) with the new "Run with @Docker " feature. 🔥🐳 See an app you like? Run it yourself in just 2 clicks🤯
Tweet media one
36
351
2K
@huggingface
Hugging Face
4 years
No labeled data? No problem. The 🤗 Transformers master branch now includes a built-in pipeline for zero-shot text classification, to be included in the next release. Try it out in the notebook here:
21
418
2K
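The zero-shot pipeline announced above can be used roughly like this (a minimal sketch; the example sentence and candidate labels are my own, and the default NLI checkpoint is downloaded on first use):

```python
from transformers import pipeline

# Build a zero-shot classification pipeline; the default NLI checkpoint
# is downloaded from the Hub on first use.
classifier = pipeline("zero-shot-classification")

result = classifier(
    "The new GPU drivers cut our training time in half.",
    candidate_labels=["technology", "cooking", "politics"],
)
# Labels come back sorted by descending score.
print(result["labels"][0], round(result["scores"][0], 3))
```

The model has never seen these labels during training; it scores each label as an entailment hypothesis against the input sentence.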
@huggingface
Hugging Face
3 years
The first part of the Hugging Face Course is finally out! Come learn how the 🤗 Ecosystem works 🥳: Transformers, Tokenizers, Datasets, Accelerate, the Model Hub! Share with your friends who want to learn NLP, it's free! Come join us at
Tweet media one
23
497
2K
@huggingface
Hugging Face
3 years
🚨Transformers is expanding to Speech!🚨 🤗Transformers v4.3.0 is out and we are excited to welcome @facebookai 's Wav2Vec2 as the first Automatic Speech Recognition model to our library! 👉Now, you can transcribe your audio files directly on the hub:
Tweet media one
17
317
1K
@huggingface
Hugging Face
1 year
SAM, the groundbreaking segmentation model from @Meta is now available in 🤗 Transformers! What does this mean? 1. One line of code to load it, one line to run it 2. Efficient batching support to generate multiple masks 3. Pipeline support for easier usage More details: 🧵
Tweet media one
24
250
1K
@huggingface
Hugging Face
3 years
$40M series B! 🙏Thank you open source contributors, pull requesters, issue openers, notebook creators, model architects, tweeting supporters & community members all over the 🌎! We couldn't do what we do & be where we are - in a field dominated by big tech - without you!
55
139
1K
@huggingface
Hugging Face
8 months
Code Llama: Now on Hugging Chat 💻🦙 Try out the 34B Instruct model for free with super fast inference! 👉
24
290
1K
@huggingface
Hugging Face
3 years
Last week, EleutherAI released two checkpoints for GPT Neo, an *Open Source* replication of OpenAI's GPT-3 These checkpoints, of sizes 1.3B and 2.7B are now available in🤗Transformers! The generation capabilities are truly🤯, try it now on the Hub:
Tweet media one
17
314
1K
@huggingface
Hugging Face
2 years
Last week @MetaAI publicly released huge LMs, with up to ☄️30B parameters. Great win for Open-Source🎉 These checkpoints are now in 🤗transformers! But how to use such big checkpoints? Introducing Accelerate and ⚡️BIG MODEL INFERENCE⚡️ Load & USE the 30B model in colab (!)⬇️
Tweet media one
15
236
1K
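The big-model inference feature above boils down to one argument. A minimal sketch (shown with gpt2 for speed rather than a 30B checkpoint; requires the accelerate package to be installed):

```python
from transformers import AutoModelForCausalLM

# device_map="auto" hands weight placement to Accelerate, which shards the
# checkpoint across whatever is available (GPUs, CPU RAM, disk offload).
# The same call scales to multi-billion-parameter checkpoints.
model = AutoModelForCausalLM.from_pretrained("gpt2", device_map="auto")

# The resulting placement is recorded on the model.
print(model.hf_device_map)
```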
@huggingface
Hugging Face
1 year
The first RNN in transformers! 🤯 Announcing the integration of RWKV models in transformers with @BlinkDL_AI and RWKV community! RWKV is an attention free model that combines the best from RNNs and transformers. Learn more about the model in this blogpost:
Tweet media one
19
268
1K
@huggingface
Hugging Face
4 years
Let’s democratize NLP for all languages! 🌎🌎🌎 Today, with v2.9.1, we are releasing 1,008 machine translation models, covering 140 different languages, trained by @jorgtiedemann with @marian , ported by @sam_shleifer . Find your language here: [1/4]
Tweet media one
20
352
1K
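Any of these Opus-MT checkpoints can be loaded through the translation pipeline; a sketch using the English-to-French model as one example (the input sentence is illustrative):

```python
from transformers import pipeline

# One of the released Marian/Opus-MT checkpoints (English -> French),
# downloaded from the Hub on first use.
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-fr")

result = translator("Hugging Face is democratizing NLP for all languages.")
print(result[0]["translation_text"])
```

Swapping the language pair is just a matter of changing the checkpoint name (e.g. opus-mt-de-en).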
@huggingface
Hugging Face
4 years
𝗢𝗨𝗥 𝗗𝗘𝗙𝗜𝗡𝗜𝗧𝗜𝗩𝗘 𝗧𝗨𝗧𝗢𝗥𝗜𝗔𝗟 🔥 How to train a new language model from scratch using Transformers and Tokenizers ➡️
Tweet media one
11
268
1K
@huggingface
Hugging Face
7 months
How to train a Llama 2 chatbot (a step-by-step guide, designed for non-coders)
10
237
1K
@huggingface
Hugging Face
4 years
Long-range sequence modeling meets 🤗 transformers! We are happy to officially release Reformer, a transformer that can process sequences as long as 500,000 tokens from @GoogleAI . Thanks a million, Nikita Kitaev and @lukaszkaiser ! Try it out here:
Tweet media one
7
253
1K
@huggingface
Hugging Face
3 years
We are honored to be awarded the Best Demo Paper for "Transformers: State-of-the-Art Natural Language Processing" at #emnlp2020 😍 Thank you to our wonderful team members and the fantastic community of contributors who make the library possible 🤗🤗🤗
Tweet media one
30
139
997
@huggingface
Hugging Face
4 years
Time to push explainable AI 🔬 exBERT, the visual analysis tool to explore learned representations from @MITIBMLab is now integrated on our model pages for BERT, DistilBERT, RoBERTa, XLM & more! Just click on the tag #exbert on @huggingface ’s models page:
7
276
958
@huggingface
Hugging Face
2 years
Happy news from Japan! Japanese Stable Diffusion, trained on Japanese data by rinna, now has a demo on Hugging Face Spaces!
Tweet media one
2
282
947
@huggingface
Hugging Face
3 years
🤗 Transformers meets VISION 📸🖼️ v4.6.0 is the first CV dedicated release! - CLIP @OpenAI , Image-Text similarity or Zero-Shot Image classification - ViT @GoogleAI , and - DeiT @facebookai , SOTA Image Classification Try ViT/DeiT on the hub (Mobile too!):
10
261
960
@huggingface
Hugging Face
2 years
🧨Diffusion models have been powering impressive ML apps, enabling DALL-E or Imagen Introducing 🤗 diffusers: a modular toolbox for diffusion techniques, with a focus on: 🚄Inference pipelines ⏰Schedulers 🏭Models 📃Training examples
Tweet media one
13
201
940
@huggingface
Hugging Face
4 years
Introducing PruneBERT, fine-*P*runing BERT's encoder to the size of a high-resolution picture (11MB) while keeping 95% of its original perf! Based on our latest work on movement pruning: Code and weights:
17
261
933
@huggingface
Hugging Face
4 years
Now that neural nets have fast implementations, a bottleneck in pipelines is tokenization: strings➡️model inputs. Welcome 🤗Tokenizers: ultra-fast & versatile tokenization led by @moi_anthony : -encode 1GB in 20sec -BPE/byte-level-BPE/WordPiece/SentencePiece... -python/js/rust...
Tweet media one
11
233
894
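The tokenizers library can train a tokenizer entirely in memory; a minimal sketch with a toy two-sentence corpus (the corpus and vocab size are illustrative):

```python
from tokenizers import Tokenizer
from tokenizers.models import BPE
from tokenizers.pre_tokenizers import Whitespace
from tokenizers.trainers import BpeTrainer

# Train a tiny byte-pair-encoding tokenizer from an in-memory corpus.
tokenizer = Tokenizer(BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = Whitespace()
trainer = BpeTrainer(special_tokens=["[UNK]"], vocab_size=200)
corpus = ["hugging face makes tokenizers fast", "fast tokenizers for everyone"]
tokenizer.train_from_iterator(corpus, trainer)

encoding = tokenizer.encode("fast tokenizers")
print(encoding.tokens, encoding.ids)
```

The same API handles byte-level BPE, WordPiece, and Unigram models; the Rust core is what makes the 1GB-in-20-seconds figure possible.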
@huggingface
Hugging Face
10 months
We are looking into an incident where a malicious user took control over the Hub organizations of Meta/Facebook & Intel via reused employee passwords that were compromised in a data breach on another site. We will keep you updated 🤗
34
142
846
@huggingface
Hugging Face
9 months
Hugging Face is now part of the PyTorch Foundation as a premier member 🤝 We have been collaborating with the PyTorch team for the past four years and are committed to supporting the project. We share an objective: to lower the barrier of entry to ML.
Tweet media one
16
139
834
@huggingface
Hugging Face
1 year
An announcement from Hugging Face to Japan! We have started translating the Hugging Face Course into Japanese. Thanks to the Student Ambassadors at Tohoku University, the translation of the first chapter is complete. We will keep translating steadily. Please read the course, learn about Hugging Face Transformers, and give it a try!
2
230
810
@huggingface
Hugging Face
4 years
🤗Transformers v3.0 is out🔥 — [1/4]
Tweet media one
3
220
785
@huggingface
Hugging Face
4 years
Bored at home? Need a new friend? Hang out with BART, the newest model available in transformers (thx @sam_shleifer ), with the hefty 2.6 release (notes: ). Now you can get state-of-the-art summarization with a few lines of code: 👇👇👇
Tweet media one
14
208
759
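Those "few lines of code" look roughly like this today (a sketch; the article text is my own, and the pipeline's default summarization checkpoint is downloaded on first use):

```python
from transformers import pipeline

# The summarization pipeline defaults to a distilled BART checkpoint.
summarizer = pipeline("summarization")

article = (
    "BART arrived in transformers with the 2.6 release. The model reaches "
    "state-of-the-art results on summarization benchmarks such as "
    "CNN/DailyMail and is exposed through a high-level pipeline, so "
    "state-of-the-art summaries really do take only a few lines of code."
)
summary = summarizer(article, max_length=40, min_length=5, do_sample=False)
print(summary[0]["summary_text"])
```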
@huggingface
Hugging Face
3 years
EleutherAI's GPT-J is now in 🤗 Transformers: a 6-billion-parameter autoregressive model with crazy generative capabilities! It shows impressive results in: - 🧮Arithmetics - ⌨️Code writing - 👀NLU - 📜Paper writing - ... Play with it to see how powerful it is:
Tweet media one
13
175
744
@huggingface
Hugging Face
4 years
Transformers v2.2 is out, with *4* new models and seq2seq capabilities! ALBERT is released alongside CamemBERT, implemented by the authors, DistilRoBERTa (twice as fast as RoBERTa-base!) and GPT-2 XL! Encoder-decoder with ⭐Model2Model⭐ Available on
Tweet media one
9
206
728
@huggingface
Hugging Face
1 year
Today we are excited to announce a new partnership with @awscloud ! 🔥 Together, we will accelerate the availability of open-source machine learning 🤝 Read the post 👉
11
161
724
@huggingface
Hugging Face
3 years
Document parsing meets 🤗 Transformers! 📄 #LayoutLMv2 and #LayoutXLM by @MSFTResearch are now available! 🔥 They're capable of parsing document images (like PDFs) by incorporating text, layout, and visual information, as in the @gradio demo below ⬇️
11
189
706
@huggingface
Hugging Face
5 years
🥁🥁🥁 Welcome to "pytorch-transformers", the 👾 library for Natural Language Processing!
Tweet media one
7
219
705
@huggingface
Hugging Face
4 years
The 101 for text generation! 💪💪💪 This is an overview of the main decoding methods and how to use them super easily in Transformers with GPT2, XLNet, Bart, T5,... It includes greedy decoding, beam search, top-k/nucleus sampling,...: by @PatrickPlaten
Tweet media one
12
206
693
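The decoding methods surveyed above map onto generate() arguments; a sketch with gpt2 and an illustrative prompt:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
inputs = tokenizer("The future of NLP is", return_tensors="pt")

# Greedy decoding: always take the single most likely next token.
greedy = model.generate(**inputs, max_new_tokens=15, do_sample=False,
                        pad_token_id=tokenizer.eos_token_id)

# Beam search: keep the 5 best partial hypotheses at each step.
beams = model.generate(**inputs, max_new_tokens=15, num_beams=5,
                       do_sample=False, pad_token_id=tokenizer.eos_token_id)

# Top-k / nucleus (top-p) sampling: sample from a truncated distribution.
torch.manual_seed(0)
sampled = model.generate(**inputs, max_new_tokens=15, do_sample=True,
                         top_k=50, top_p=0.95,
                         pad_token_id=tokenizer.eos_token_id)

for out in (greedy, beams, sampled):
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```

Greedy and beam search are deterministic; the sampling strategies trade some likelihood for diversity.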
@huggingface
Hugging Face
4 years
Transformers 2.4.0 is out 🤗 - Training transformers from scratch is now supported - New models, including *FlauBERT*, Dutch BERT, *UmBERTo* - Revamped documentation - First multi-modal model, MMBT from @facebookai , text & images Bye bye Python 2 🙃
7
168
685
@huggingface
Hugging Face
3 years
Fine-tuning a *3-billion* parameter model on a single GPU? Now possible in transformers, thanks to the DeepSpeed/Fairscale integrations! Thank you @StasBekman for the seamless integration, and thanks to @microsoft and @facebookai teams for their support!
8
176
669
@huggingface
Hugging Face
4 years
┏━━┓┏━━┓┏━━┓┏━━┓ ┗━┓┃┃┏┓┃┗━┓┃┃┏┓┃ ┏━┛┃┃┃┃┃┏━┛┃┃┃┃┃ Solving Natural Language Processing! ┃┏━┛┃┃┃┃┃┏━┛┃┃┃┃ ┃┗━┓┃┗┛┃┃┗━┓┃┗┛┃ ┗━━┛┗━━┛┗━━┛┗━━┛
7
79
659
@huggingface
Hugging Face
4 years
You can now visualize Transformers training performance with a seamless @weights_biases integration. Compare hyperparameters, output metrics, and system stats like GPU utilization across your models! Step-by-step guide: Colab:
8
178
649
@huggingface
Hugging Face
5 years
💃PyTorch-Transformers 1.1.0 is live💃 It includes RoBERTa, the transformer model from @facebookai , current state-of-the-art on the SuperGLUE leaderboard! Thanks to @myleott @julien_c @LysandreJik and all the 100+ contributors!
Tweet media one
6
194
648
@huggingface
Hugging Face
1 year
It's been an exciting year for 🤗Transformers. We tripled the number of weekly active users over 2022, with over 1M users most weeks now and 300k daily pip installs on average🤯
Tweet media one
10
83
640
@huggingface
Hugging Face
4 years
🚨New release alert 🚨BERT, RoBERTa, GPT2, TransformerXL and most of the community models are now an order of magnitude faster thanks to the integration of the tokenizers library! Check it out here:
Tweet media one
3
142
640
@huggingface
Hugging Face
3 years
🔥Fine-Tuning @facebookai 's Wav2Vec2 for Speech Recognition is now possible in Transformers🔥 Not only for English but for 53 Languages🤯 Check out the tutorials: 👉 Train Wav2Vec2 on TIMIT 👉 Train XLSR-Wav2Vec2 on Common Voice
Tweet media one
6
156
641
@huggingface
Hugging Face
2 years
The Technology Behind BLOOM Training🌸 Discover how @BigscienceW used @MSFTResearch DeepSpeed + @nvidia Megatron-LM technologies to train the World's Largest Open Multilingual Language Model (BLOOM):
8
154
635
@huggingface
Hugging Face
4 years
GPT-3 from @OpenAI got you interested in zero-shot and few-shot learning? You're lucky because our own @joeddav has just released a demo of zero-shot topic classification! Test how the model can predict a topic it has NEVER been trained on: 🤯🤯🤯
Tweet media one
10
176
633
@huggingface
Hugging Face
4 years
How big should my language model be? As NLP researchers and practitioners, that question is central. We have built a tool that calculates an optimal model size and training time for your budget so you don't have to. See it in action at ! [1/2]
Tweet media one
8
155
632
@huggingface
Hugging Face
4 years
We spend our time finetuning models on tasks like text classif, NER or question answering. Yet 🤗Transformers had no simple way to let users try these fine-tuned models. Release 2.3.0 brings Pipelines: thin wrappers around tokenizer + model to ingest/output human-readable data.
Tweet media one
4
152
630
@huggingface
Hugging Face
4 years
Want speedy transformers models w/o a GPU?! 🧐 Starting with transformers v3.1.0 your models can now run at the speed of light on commodity CPUs thanks to ONNX Runtime quantization!🚀. Check out our 2nd blog post with ONNX Runtime on the subject! 🔥
4
172
634
@huggingface
Hugging Face
2 months
We're having some infra issues; we're working on it. Please send hugs! 🤗 In the meantime:

import os
os.environ['HF_HUB_OFFLINE'] = '1'
127
57
630
@huggingface
Hugging Face
2 years
💫 Perceiver IO by @DeepMind is now available in 🤗 Transformers! A general purpose deep learning model that works on any modality and combinations thereof 📜text 🖼️ images 🎥 video 🔊 audio ☁️ point clouds ... Read more in our blog post:
Tweet media one
4
113
602
@huggingface
Hugging Face
3 years
🔥We're launching the new and it's incredible 🚀Play live with +10 billion parameters models, deploy them instantly in production with our hosted API, join the 500 organizations using our hub to host/share models & datasets And one more thing... 👇
Tweet media one
15
132
597
@huggingface
Hugging Face
3 years
Release alert: the 🤗datasets library v1.2 is available now! With: - 611 datasets you can download in one line of python - 467 languages covered, 99 with at least 10 datasets - efficient pre-processing to free you from memory constraints Try it out at:
Tweet media one
1
156
602
@huggingface
Hugging Face
2 years
🖌️ Stable Diffusion meets 🧨Diffusers! Releasing diffusers==0.2.2 with full support of @StabilityAI 's Stable Diffusion & schedulers 🔥 Google colab: 👉 Code snippet 👇
Tweet media one
7
122
591
@huggingface
Hugging Face
4 years
The ultimate guide to encoder-decoder models! Today, we're releasing part one explaining how they work and why they have become indispensable for NLG tasks such as summarization and translation. > Subscribe for the full series:
Tweet media one
5
141
594
@huggingface
Hugging Face
4 months
Hugging Face 🫶 @GoogleColab With the latest release of huggingface_hub, you don't need to manually log in anymore. Create a secret once and share it with every notebook you run. 🤗 pip install --upgrade huggingface_hub Check it out!👇
5
112
585
@huggingface
Hugging Face
4 years
Today we're happy to release four new official notebook tutorials available in our documentation and in colab thanks to @MorganFunto to get started with tokenizers and transformer models in just seconds! (1/6)
Tweet media one
10
154
566
@huggingface
Hugging Face
4 years
The 1.5 billion parameter GPT-2 (aka gpt2-xl) is up: ✅ in the transformers repo: ✅ try it out live in Write With Transformer🦄 Coming next: 🔘 Detector model based on RoBERTa Thanks @OpenAI @Miles_Brundage @jackclarkSF and all
9
157
560
@huggingface
Hugging Face
5 years
This is SO meta 🤓 We trained a generative language model on a dataset of ArXiv NLP papers. You can now get a neural net to write your papers for (with?) you 🔥. We heard from a few researchers that they're already using it in submitted papers.
Tweet media one
12
148
555
@huggingface
Hugging Face
2 years
Transformers v4.22 is out, and includes the first VIDEO models! 🎥 💥VideoMAE: masked auto-encoders for video 💥X-CLIP: CLIP for video-language Other nice goodies: 💥Swin Transformer v2 💥Pegasus-X 💥Donut 💥MobileViT ... and MacOS support (device="mps")!
Tweet media one
2
99
547
@huggingface
Hugging Face
9 months
TRL 🤗 Hugging Face Excited to announce that we're doubling down on our efforts to democratize RLHF and reinforcement learning with TRL, a new addition to the @huggingface family, developed and led by team member @lvwerra 🎉🎉 Train your first RLHF model 👉
Tweet media one
7
123
540
@huggingface
Hugging Face
3 years
The new SOTA is in Transformers! DeBERTa-v2 beats the human baseline on SuperGLUE and up to a crazy 91.7% dev accuracy on MNLI task. Beats T5 while 10x smaller! DeBERTa-v2 contributed by @Pengcheng2020 from @MSFTResearch Try it directly on the hub:
Tweet media one
5
113
535
@huggingface
Hugging Face
7 months
Zephyr 7b beats Llama 70b on MT Bench 🤯🤯🤯
Tweet media one
12
86
534
@huggingface
Hugging Face
3 years
🔥JAX meets Transformers🔥 @GoogleAI 's JAX/Flax library can now be used as Transformers' backbone ML library. JAX/Flax makes distributed training on TPU effortless and highly efficient! 👉 Google Colab: 👉 Runtime evaluation:
Tweet media one
3
113
532
@huggingface
Hugging Face
4 years
1/4. Four NLP tutorials are now available on @kaggle ! It's now easier than ever to leverage tokenizers and transformer models like BERT, GPT2, RoBERTa, XLNet, DistilBERT,... for your next competition! 💪💪💪! #NLProc #NLP #DataScience #kaggle
3
151
534
@huggingface
Hugging Face
11 months
📣 Calling all game dev and AI enthusiasts!🎮 Already 400 people signed up for the first Open Source AI Game Jam, where you'll use AI tools to make a game in a weekend🔥 Sign up here 👉 What AI tools? Let's focus today on Audio tools 🔊 ⬇️
Tweet media one
9
126
532
@huggingface
Hugging Face
2 months
The Open Source community is amazing 🤗
21
49
507
@huggingface
Hugging Face
3 years
We've heard your requests! Over the past few months ... we've been working on a Hugging Face Course! The release is imminent. Sign-up for the newsletter to know when it comes out: Sneak peek: Transfer Learning with @GuggerSylvain :
9
125
527
@huggingface
Hugging Face
2 years
Scikit-Learn and 🤗 join forces! With a growing number of tabular classification & regression checkpoints, we believe statistical ML has its place on the HF Hub. We're excited to partner with sklearn, statistical ML champion, and move forward together.
Tweet media one
4
93
516
@huggingface
Hugging Face
3 years
GPT-Neo, the #OpenSource cousin of GPT3, can do practically anything in #NLP from sentiment analysis to writing SQL queries: just tell it what to do, in your own words. 🤯 How does it work? 🧐 Want to try it out? 🎮 👉
8
149
511
@huggingface
Hugging Face
4 years
Thanks to @srush_nlp , we now have an example of a training module for NER leveraging transformers. Under 300 lines of code, and supports GPUs and TPUs thanks to @PyTorchLightnin ! Colab: Example:
Tweet media one
4
135
510
@huggingface
Hugging Face
2 years
Transformers v4.13.0 is out and it is *big*: Vision: - 🖼️ SegFormer - 🖨️ ImageGPT Audio: - 🔡 Language model support for ASR Multimodal: - ⚖️ Vision-Text dual encoders NLP: - 🔣 mLUKE - 🏅 DeBERTa-v3 Trainer: - 1⃣6⃣ The Trainer now supports BF16/TF32! 🌠New doc frontend 🌠
Tweet media one
5
89
509
@huggingface
Hugging Face
4 years
🔥Transformers' first-ever end-2-end multimodal demo was just released, leveraging LXMERT, SOTA model for visual Q&A! Model by @HaoTan5 , @mohitban47 , with an impressive implementation in Transformers by @avalmendoz ( @UNCnlp ) Notebook available here: 🤗
6
129
515
@huggingface
Hugging Face
2 years
Open Source
12
44
500
@huggingface
Hugging Face
2 years
TODAY'S A BIG DAY! Spaces are now publicly available. Build, host, and share your ML apps on @huggingface in just a few minutes. There's no limit to what you can build. Be creative, and share what you make with the community. 🙏 @streamlit and @gradio
Tweet media one
4
137
508
@huggingface
Hugging Face
4 years
👋 To all JS lovers: NLP is more accessible than ever! You can now leverage the power of DistilBERT-cased for Question Answering w/ just 3 lines of code!!! 🤗 You can even run the model remotely w/ the built-in @TensorFlow Serving compatibility 🚀
Tweet media one
10
128
500
@huggingface
Hugging Face
10 months
At Hugging Face, we are working to enable you to easily build and serve your own LLMs 🧑‍💻👨‍💻👩‍💻 In this blog, we talk about the amazing world of open-source LLMs, the challenges, and how the Hugging Face ecosystem can help you 🪐 Read about them here 👉
Tweet media one
7
119
497
@huggingface
Hugging Face
3 years
🥁 We can't wait to share our new inference product with you! 🤩 - it achieves 1ms latency on Transformer models 🏎 - you can deploy it in your own infrastructure ⚡️ - we call it: 🤗 Infinity 🚀 📅 Join us for a live event and demo on 9/28!
Tweet media one
13
64
497
@huggingface
Hugging Face
3 years
We're thrilled to partner with to create some great new content for their NLP Specialization on Coursera! With this update, you can access exciting new material and lectures that cover the state of the art in NLP 🧑‍🏫
5
78
495
@huggingface
Hugging Face
4 years
Happy to officially include DialoGPT from @MSFTResearch to 🤗transformers (see docs: ) DialoGPT is the first conversational response model added to the library. Now you can build a state-of-the-art chatbot in just 10 lines of code 👇👇👇
Tweet media one
5
120
480
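The "chatbot in 10 lines" above can be sketched roughly like this (using the small DialoGPT checkpoint for speed; the prompt is illustrative):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")

# Encode one user turn, terminated by the end-of-sequence token.
input_ids = tokenizer.encode("Hello, how are you?" + tokenizer.eos_token,
                             return_tensors="pt")

# Generate the bot's reply as a continuation of the conversation.
output = model.generate(input_ids, max_length=100,
                        pad_token_id=tokenizer.eos_token_id)
reply = tokenizer.decode(output[0][input_ids.shape[-1]:],
                         skip_special_tokens=True)
print(reply)
```

Multi-turn chat works by concatenating each new user turn (plus EOS) onto the running token history before generating again.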
@huggingface
Hugging Face
5 years
Our Distilbert paper just got accepted at NeurIPS 2019's ECM2 workshop! - 40% smaller 60% faster than BERT - 97% of the performance on GLUE We also distilled GPT2 in an 82M params model💥 All the weights are available in TF2.0 @tensorflow here:
Tweet media one
4
103
481
@huggingface
Hugging Face
4 years
Transformers v3.1.0 is out, first pypi release with 💫 PEGASUS, DPR, mBART 💫 📖 New & simpler docs and tutorials 🎤 Dialogue & zero-shot pipelines ⭐️ New encoder-decoder architectures: Bert2GPT2, Roberta2Roberta, Longformer2Roberta, ... 📕 Named outputs:
Tweet media one
8
124
480
@huggingface
Hugging Face
2 years
Last week, @MetaAI introduced NLLB-200: a massive translation model supporting 200 languages. Models are now available through the Hugging Face Hub, using 🤗Transformers' main branch. Models on the Hub: Learn about NLLB-200:
Tweet media one
6
142
476
@huggingface
Hugging Face
2 years
Machine learning demos are increasingly a vital part of releasing a model. Demos allow anyone, not just ML engineers, to try a model, give feedback on predictions, and build trust That's why we are thrilled to announce @Gradio 3.0: a ground-up redesign of the Gradio library 🥳
Tweet media one
7
92
468
@huggingface
Hugging Face
3 years
🤗Transformers are starting to work with structured databases! We just released 🤗Transformers v4.1.1 with TAPAS, a multi-modal model for question answering on tabular data from @googleAI . Try it out through transformers or our inference API:
Tweet media one
5
92
465
@huggingface
Hugging Face
11 months
🚨Exciting news! Next week, we’ll be launching a brand-new Audio Course! 🤗 Sign up today () and join us for a LIVE course launch event featuring amazing guests like @DynamicWebPaige , Seokhwan Kim, and @functiontelechy ! ⚡️
Tweet media one
4
99
459
@huggingface
Hugging Face
3 years
🤗We are going to invest more in @tensorflow in 2021! If you want to take part in building the fastest growing NLP open-source library, join us:
9
80
447
@huggingface
Hugging Face
4 years
Our API now includes a brand new pipeline: zero-shot text classification This feature lets you classify sequences into the specified class names out-of-the-box w/o any additional training in a few lines of code! 🚀 Try it out (and share screenshots 📷):
Tweet media one
Tweet media two
Tweet media three
Tweet media four
12
107
449
@huggingface
Hugging Face
3 years
20,000+ machine learning models connected to 3,000+ apps? Hugging Face meets Zapier! 🤯🤯🤯 With the Hugging Face API, you can now easily connect models right into apps like Gmail, Slack, Twitter, and more: [1/2]
Tweet media one
7
78
444
@huggingface
Hugging Face
1 year
🧨Diffusers supports Stable Diffusion 2 ! Run @StabilityAI 's Stable Diffusion 2 with zero changes to your code using your familiar diffusers API. Everything is supported: attention optimizations, fp16, img2image, swappable schedulers, and more🤗
Tweet media one
5
85
444
@huggingface
Hugging Face
2 months
We're excited to collaborate with the European Space Agency for the release of MajorTOM, the largest ML-ready Sentinel-2 image dataset! 🚀 It covers 50% of the Earth. 2.5 trillion pixels of open source! 🤗👐🌌🚀🌏
@esa
European Space Agency
2 months
Our @ESA_EO Φ-lab has released, in partnership with @huggingface , the first dataset of 'MajorTOM', or the Terrestrial Observation Metaset, the largest community-oriented and machine-learning-ready collection of @CopernicusEU #Sentinel2 images ever published and covering over 50%…
Tweet media one
4
70
416
6
71
446
@huggingface
Hugging Face
3 years
🚨 NEW MODEL ALERT 🚨 Translate text to, or between 50 languages with mBART-50 from @facebookai ! 🇺🇳 One-to-Many model: translate from English to 49 other languages ↔️ Many-to-Many model: translation btw any pair of 50 languages
Tweet media one
7
75
431
@huggingface
Hugging Face
4 years
Transformers v2.9 is out, with a built-in Trainer and TFTrainer 🔥 This let us reorganize the example scripts completely for a cleaner codebase. - Same user-facing API for PyTorch and TF 2 - Support for GPU, Multi-GPU, and TPU - Easier than ever to share your fine-tuned models
Tweet media one
6
126
433
@huggingface
Hugging Face
4 years
Excited to welcome Longformer, the transformer for long-range document tasks, to transformers 🤗(thanks to @i_beltagy ). Try 1 of the 7 models from the model hub: or check out how to convert a pre-trained BERT to its "long" version: .
Tweet media one
5
118
428
@huggingface
Hugging Face
2 years
🎙️ Speech Translation stepped up its game! @MetaAI just released XLS-R, a model pretrained on 128 spoken languages🌍 ... and it's available on the Hub🤗 You can try out the first "All-to-All" Speech Translation checkpoint ever directly on the Hub 🔥 👉
Tweet media one
3
84
423
@huggingface
Hugging Face
3 years
🤗 Transformers v4.9.0: 🟠 Brand new @TensorFlow Examples 🟠: Examples for many NLP tasks are now available using Keras only, thanks to @carrigmat ! 🚀CANINE: Tokenizer-free, character-based model 🚂 Train a tokenizer from another: same config, different dataset!
Tweet media one
2
88
421
@huggingface
Hugging Face
5 years
You can now try “Write with Transformer 🦄”, powered by @OpenAI ’s LARGE GPT-2 model (774 million parameters!), in realtime. Try it out on 🔥 🔥 @jackclarkSF @Miles_Brundage @AlecRad @julien_c @Thom_Wolf @SanhEstPasMoi @ClementDelangue @LysandreJik
12
133
418
@huggingface
Hugging Face
3 months
We're having some issues on the Hub and looking into them! We should be back soon. Please send hugs🤗
126
28
415
@huggingface
Hugging Face
2 years
Dataset observability is key to better ML. You can now preview datasets' contents DIRECTLY on the Hub 🔥🎉 Thanks to the streaming feature of datasets, half of the community's datasets are supported out of the box. It also supports Image and Audio datasets, not just text!
Tweet media one
2
73
409