Aditya Grover Profile
Aditya Grover

@adityagrover_

8,226
Followers
418
Following
29
Media
799
Statuses

CS Prof @UCLA. AI, ML, Climate. Prev: Postdoc @berkeley_ai, PhD @StanfordAILab, bachelors @IITDelhi.

CA
Joined May 2009
Pinned Tweet
@adityagrover_
Aditya Grover
6 months
Incredibly honored to be in the #ForbesUnder30 list for 2024! Thanks to my incredible support team over the years - students, mentors, colleagues, friends, and family.
Tweet media one
51
14
420
@adityagrover_
Aditya Grover
1 year
📢Introducing ClimateLearn, a new PyTorch library for accessing climate datasets, state-of-the-art ML models, and high quality training and visualization pipelines. Blog: Docs: Quickstart Colab: 🧵 (1/n)
17
277
1K
@adityagrover_
Aditya Grover
4 years
With my thesis formally approved @Stanford , I am excited to share next steps: I'll be joining @CS_UCLA @UCLAengineering as an assistant professor of CS in Fall 2021. For this coming year, I'll be a researcher at @facebookai and @pabbeel 's group at @berkeley_ai .
61
17
1K
@adityagrover_
Aditya Grover
3 years
Thrilled to share that my PhD dissertation won the ACM SIGKDD Dissertation Award for "outstanding work in data science and machine learning". Thanks to everyone involved, especially my advisor @StefanoErmon & @StanfordAILab !
56
45
800
@adityagrover_
Aditya Grover
5 years
Exciting first day co-teaching CS 236: Deep Generative Models with @ermonste at Stanford! Course enrollments more than doubled from our previous iteration. All our course materials will be available at:
Tweet media one
Tweet media two
6
94
468
@adityagrover_
Aditya Grover
4 years
Great time presenting a tutorial on Deep Generative Models at the Deep Learning for Science School () at Berkeley! Video: Slides:
Tweet media one
Tweet media two
6
77
440
@adityagrover_
Aditya Grover
2 years
I am hiring PhD students in ML this cycle! Topics include unsupervised ML, generative models, seq. decision making, and climate change. For those interested, plz apply to UCLA CS. Deadline: Dec 15. Plz share widely! More info: #AI #ML #AcademicTwitter
4
85
424
@adityagrover_
Aditya Grover
4 years
And it’s a wrap to my PhD @StanfordAILab ! Thanks to my wonderful committee — @StefanoErmon , @erichorvitz , @jure , @percyliang , Moses Charikar & Stephen Boyd — for putting up a manageable offense during my defense and everyone else who has helped me along the way 🙂
@StefanoErmon
Stefano Ermon
4 years
Super proud of my student Aditya who successfully defended his #PhD dissertation today! He has done awesome work on unsupervised learning with generative models. Congrats, Dr. @adityagrover_ 👏🎊🎉
Tweet media one
Tweet media two
36
23
605
23
6
419
@adityagrover_
Aditya Grover
2 years
Our paper on Moser Flows got the outstanding paper award at #NeurIPS2021 !! Congratulations and thanks to all my wonderful co-authors: Noam Rozen, @mnick , @lipmanya :) If you are attending #NeurIPS2021 , please attend our oral presentation and live Q&A on Dec 10 at 4 PST.
@lipmanya
Yaron Lipman
2 years
Excited to share that our paper "Moser Flow: Divergence-based Generative Modeling on Manifolds" won an Outstanding Paper Award at NeurIPS 2021!! Noam Rozen @adityagrover_ @mnick See thread for a short summary 👇
6
43
252
11
12
247
@adityagrover_
Aditya Grover
3 years
Every PhD application season, we see many familiar questions pop up repeatedly. @hima_lakkaraju and I hosted a panel last year with many distinguished ML faculty answering many such burning questions from prospective candidates: #AI #ML #AcademicTwitter
3
43
220
@adityagrover_
Aditya Grover
2 years
Every PhD application season, we see many familiar questions pop up repeatedly. @hima_lakkaraju and I hosted a panel where prominent AI faculty share a behind-the-scenes view with prospective candidates: #AI #ML #AcademicTwitter
3
46
182
@adityagrover_
Aditya Grover
4 years
Great summary article in WIRED covering our research in using ML for accelerating battery development! Fun fact: The largest public datasets for battery research today are at least 100x larger in size than the ones from 5 years ago.
@WIRED
WIRED
4 years
Battery storage is key to increasing the amount of renewable energy on the grid, and when it comes to decarbonizing our energy supply, time is of the essence. After decades of plodding progress, AI-driven battery research promises to pick up the pace.
1
31
93
2
19
165
@adityagrover_
Aditya Grover
3 years
The recording for the panel discussion is up on Youtube: Many gem insights here for both prospective and current PhD students!
@hima_lakkaraju
𝙷𝚒𝚖𝚊 𝙻𝚊𝚔𝚔𝚊𝚛𝚊𝚓𝚞
4 years
We are organizing a panel discussion/Q&A on “Demystifying ML PhD Applications to US Universities” with faculty spanning various universities this Saturday at 9am PT/12pm ET. Please register at to attend. #AI #ML #AcademicTwitter [1/n]
Tweet media one
8
121
447
1
15
144
@adityagrover_
Aditya Grover
2 years
📢New paper out: We show that transformers can be repurposed for uncertainty-aware learning and decision making, smashing all previous benchmarks for Neural Processes++. w/ @tungnd_13 -- first paper by my first PhD student! To appear at @icmlconf 2022.
@tungnd_13
Tung Nguyen
2 years
Transformers show excellent capabilities in few-shot/meta learning, but have been mostly evaluated on accuracy-based metrics. How can we represent uncertainty in meta learning with transformers? We address this question in our new work at #ICML2022 !
3
24
162
0
16
126
@adityagrover_
Aditya Grover
1 year
I am looking to hire a postdoc at UCLA in artificial intelligence and climate science. Details below. Please share and rt in your circles!
2
36
118
@adityagrover_
Aditya Grover
3 years
Very excited to share our new paper "Pretrained Transformers as Universal Computation Engines" led by the amazing @_kevinlu ! We show how to transfer a pretrained language transformer to an arbitrary modality (vision, proteins, etc.) without any expensive finetuning. Details 👇
@IMordatch
Igor Mordatch
3 years
What are the limits to the generalization of large pretrained transformer models? We find minimal fine-tuning (~0.1% of params) performs as well as training from scratch on a completely new modality! with @_kevinlu , @adityagrover_ , @pabbeel paper: 1/8
4
71
354
0
14
117
@adityagrover_
Aditya Grover
5 months
Very excited to release Stormer, the new SOTA for ML-based weather forecasting. Personally, this reinforces the bitter lesson - scale and simplicity extend to AI for science. Great collab b/w @UCLA & @argonne led by @tungnd_13 . Look fwd to chatting more at #NeurIPS2023 .
@tungnd_13
Tung Nguyen
5 months
Introducing Stormer, a scalable transformer model for skillful and reliable medium-range weather forecasting. Stormer achieves competitive accuracy for short-range, 1–7 day forecasts, while outperforming Pangu-Weather and Graphcast by a large margin for longer lead times. Paper:
Tweet media one
10
83
475
3
11
114
@adityagrover_
Aditya Grover
10 months
After a successful presentation at the #ICML2023 main conference, delighted to share that ClimaX was also recognized with a Best Paper Award at the ICML Workshop on Scientific and ML modeling! Congrats to all coauthors @tungnd_13 @jo_brandstetter @akapoor_av8r @rejuvyesh !
Tweet media one
@adityagrover_
Aditya Grover
1 year
ClimaX will appear in #ICML2023 ! 🏖️🧑‍💻🌎 Bonus: @tungnd_13 will also be giving a spotlight talk next week at the upcoming #ICLR2023 workshop on Tackling Climate Change with AI.
1
8
42
4
11
114
@adityagrover_
Aditya Grover
2 years
One of our key motivations with Decision Transformers was to bring closer the research communities in sequential decision making, language, and perception. Thank you @huggingface for the integration! 🤗
@ThomasSimonini
Thomas Simonini ᯅ
2 years
We integrated Decision Transformers, an Offline Reinforcement Learning method, into the 🤗 transformers library and the @huggingface Hub 🥳 ➕ 9 pre-trained models for continuous control tasks in Gym 🔥 We wrote a tutorial if you want to try it 👉
Tweet media one
5
61
240
0
11
111
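For readers new to Decision Transformers, the underlying recipe is to cast offline RL as conditional sequence modeling: interleave returns-to-go, states, and actions as tokens and train a causal transformer to predict the next action. Below is a minimal PyTorch sketch of that data layout; it is illustrative only, not the Hugging Face integration or the official implementation, and the module/tensor names are assumptions.

```python
import torch
import torch.nn as nn

class TinyDecisionTransformer(nn.Module):
    """Minimal return-conditioned sequence model (illustrative sketch only)."""
    def __init__(self, state_dim, act_dim, d_model=128, n_layers=2, n_heads=4, max_len=64):
        super().__init__()
        self.embed_rtg = nn.Linear(1, d_model)          # returns-to-go token
        self.embed_state = nn.Linear(state_dim, d_model)
        self.embed_action = nn.Linear(act_dim, d_model)
        self.embed_time = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.predict_action = nn.Linear(d_model, act_dim)

    def forward(self, rtg, states, actions, timesteps):
        # rtg: (B, T, 1), states: (B, T, state_dim), actions: (B, T, act_dim), timesteps: (B, T)
        t_emb = self.embed_time(timesteps)
        # interleave (return-to-go, state, action) tokens per timestep -> (B, 3T, d_model)
        tokens = torch.stack(
            [self.embed_rtg(rtg) + t_emb,
             self.embed_state(states) + t_emb,
             self.embed_action(actions) + t_emb], dim=2
        ).reshape(states.size(0), -1, t_emb.size(-1))
        L = tokens.size(1)
        causal = torch.triu(torch.full((L, L), float("-inf"), device=tokens.device), diagonal=1)
        h = self.encoder(tokens, mask=causal)
        # predict the action from the hidden state at each state token (positions 1, 4, 7, ...)
        return self.predict_action(h[:, 1::3])
```

At evaluation time, conditioning on a high target return and rolling the model out autoregressively is what turns this purely supervised setup into a policy.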
@adityagrover_
Aditya Grover
5 years
Our @iclr2019 paper proposes NeuralSort, a differentiable relaxation to sorting. Bonus: new Gumbel reparameterization trick for distributions over permutations. Check out our poster today at 4:30! Code: w/ @tl0en A. Zweig @ermonste
Tweet media one
2
18
109
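For context, the NeuralSort relaxation replaces the hard permutation matrix that sorts a score vector with a temperature-controlled, row-stochastic surrogate, so sorting-based objectives become differentiable. Here is a minimal sketch following the formula in the paper, P_hat[i, :] = softmax(((n + 1 - 2i) * s - A_s @ 1) / tau) with A_s the matrix of pairwise absolute differences; treat it as illustrative rather than the reference implementation.

```python
import torch
import torch.nn.functional as F

def neuralsort(s: torch.Tensor, tau: float = 1.0) -> torch.Tensor:
    # s: (n,) scores. Returns an (n, n) row-stochastic matrix whose rows
    # approach the one-hot rows of the descending-sort permutation as tau -> 0.
    n = s.shape[0]
    A = (s.unsqueeze(0) - s.unsqueeze(1)).abs()                       # (n, n): |s_j - s_k|
    B = A.sum(dim=1)                                                  # (n,): (A_s @ 1)_j
    scaling = n + 1 - 2 * torch.arange(1, n + 1, dtype=s.dtype, device=s.device)  # n + 1 - 2i
    logits = scaling.unsqueeze(1) * s.unsqueeze(0) - B.unsqueeze(0)   # (n, n)
    return F.softmax(logits / tau, dim=1)

scores = torch.tensor([0.3, 2.0, -1.5, 0.7])
P = neuralsort(scores, tau=0.1)
approx_sorted = P @ scores   # close to scores sorted in decreasing order
```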
@adityagrover_
Aditya Grover
5 months
Great time speaking at the IndoML conference at IIT Bombay about the latest research from my group on in-context learning and my thoughts on the future of generative AI. Thank you to the organizers for inviting me, and bringing together bright young minds from all over India.
@indoml_sym
IndoML Symposium, 2023
5 months
Day 1: Session 1 Foundational and Generative Models Talk 2: Aditya Grover Professor, UCLA His three-pronged talk on TNP, ExPT and GPO for few-shot surrogate modelling, experimental design and group preference optimisation was really thought-provoking!
Tweet media one
Tweet media two
Tweet media three
0
5
22
2
8
103
@adityagrover_
Aditya Grover
2 years
When using models such as CLIP for zero-shot predictions, how often do language and image representations agree? Our latest work ⬇️ reformulates the contrastive loss for multimodal data with significantly improved consistency and zero-shot robustness
@_akhaliq
AK
2 years
CYCLIP: Cyclic Contrastive Language-Image Pretraining abs:
Tweet media one
1
21
155
2
12
101
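For readers curious about the mechanics: as I read the paper, CyCLIP augments the standard CLIP contrastive loss with two symmetry regularizers, a cross-modal term penalizing mismatches between <I_i, T_j> and <I_j, T_i>, and an in-modal term penalizing mismatches between <I_i, I_j> and <T_i, T_j>. The sketch below is an illustrative rendering of that idea; the exact weights, temperature, and normalization are assumptions, so consult the paper for the actual objective.

```python
import torch
import torch.nn.functional as F

def cyclic_consistency_losses(img_emb: torch.Tensor, txt_emb: torch.Tensor):
    """Illustrative CyCLIP-style regularizers on a batch of paired embeddings.
    img_emb, txt_emb: (B, d). Added (with small weights) to the usual CLIP loss."""
    img_emb = F.normalize(img_emb, dim=-1)
    txt_emb = F.normalize(txt_emb, dim=-1)
    sim_it = img_emb @ txt_emb.t()   # (B, B): <I_i, T_j>
    sim_ii = img_emb @ img_emb.t()   # (B, B): <I_i, I_j>
    sim_tt = txt_emb @ txt_emb.t()   # (B, B): <T_i, T_j>
    # cross-modal consistency: <I_i, T_j> should agree with <I_j, T_i>
    l_cross = ((sim_it - sim_it.t()) ** 2).mean()
    # in-modal consistency: image-image similarities should agree with text-text ones
    l_in = ((sim_ii - sim_tt) ** 2).mean()
    return l_cross, l_in
```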
@adityagrover_
Aditya Grover
1 year
What makes weather and climate any different from language or vision? A lot if you ask a physicist, but less so under the lens of today's AI. Very excited to release ClimaX! Our vision is to develop a new scientific basis for Earth systems driven by ML foundation models.
@tungnd_13
Tung Nguyen
1 year
Introducing ClimaX, the first foundation model for weather and climate. A fast and accurate one-stop AI solution for a range of atmospheric science tasks. Paper: Blog: Thread🧵 #ML #Climate #Weather #FoundationModel
Tweet media one
35
179
851
4
10
100
@adityagrover_
Aditya Grover
2 years
To appear in #NeurIPS2022 ! Congrats to my students @_shashankgoel_ and @hbXNov . For those training CLIP models, check out how a simple symmetrization of the contrastive loss can boost zero-shot robustness.
@_akhaliq
AK
2 years
CYCLIP: Cyclic Contrastive Language-Image Pretraining abs:
Tweet media one
1
21
155
0
8
95
@adityagrover_
Aditya Grover
3 years
Incorporating rotation invariance in neural nets is key for predicting molecular properties such as energies. Led by Larry Zitnick, our latest work achieves this via convolutions w.r.t. a local coordinate frame. State of the art on the OC20 dataset for catalyst discovery!
@Arxiv_Daily
arXiv Daily
3 years
Rotation Invariant Graph Neural Networks using Spin Convolutions by @mshuaibii et al. #DegreeofFreedom #MachineLearning
0
6
26
0
8
91
@adityagrover_
Aditya Grover
5 months
Except that they already did...at least a year back! Quite strange how this paper ignores all prior work in diffusion models for offline RL. Diffuser (ICML-22), Diffusion Q-Learning (ICLR-23), Decision Diffuser (ICML-23), Decision Stacks (NeurIPS-23; )
@j_foerst
Jakob Foerster
5 months
Diffusion models have revolutionised a number of areas in ML, now they are coming for offline RL. In our paper we guide the samples to be closer to our current policy, reducing the off-policy-ness of the generated data. This will unlock novel world applications of off-policy RL.
4
19
134
2
10
89
@adityagrover_
Aditya Grover
4 years
Prospective students: I will be recruiting PhD students to join my group in Fall 2021. If interested, please apply to the CS department at UCLA (deadline: Dec 1, 2020) and list my name in your application.
3
9
84
@adityagrover_
Aditya Grover
3 years
Data for many scientific domains, such as climate, materials, and biology, lies on manifolds. How do we learn generative models here? Moser's classic results (1965) + Flows to our rescue! SOTA on earth science benchmarks and (first!) scaling to curved surfaces. Details 👇
@lipmanya
Yaron Lipman
3 years
New paper: Introducing Moser Flows (MFs), a new class of continuous normalizing flows (CNFs) on manifolds based on divergences of neural nets. First generative modeling results on general curved surfaces! with Noam Rozen @adityagrover_ @mnick 1/7
Tweet media one
3
61
297
1
11
80
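As a pointer for readers, the construction behind Moser Flow (as described in the paper thread) parameterizes the model density directly as the prior density minus the divergence of a learned vector field, which is what makes training on manifolds tractable; the positivity constraint and the sampling ODE are handled in the paper and are omitted here.

```latex
% Moser Flow model density (sketch of the core identity, notation assumed):
% \mu is the prior density on the manifold, u_\theta a learned vector field.
\nu_\theta(x) \;=\; \mu(x) \;-\; \operatorname{div}\big(u_\theta\big)(x)
```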
@adityagrover_
Aditya Grover
4 years
Excited to co-host this panel discussion and Q&A! Details in the thread below.
@hima_lakkaraju
𝙷𝚒𝚖𝚊 𝙻𝚊𝚔𝚔𝚊𝚛𝚊𝚓𝚞
4 years
We are organizing a panel discussion/Q&A on “Demystifying ML PhD Applications to US Universities” with faculty spanning various universities this Saturday at 9am PT/12pm ET. Please register at to attend. #AI #ML #AcademicTwitter [1/n]
Tweet media one
8
121
447
0
9
75
@adityagrover_
Aditya Grover
4 years
Improvements in battery technology are critical for building a green and sustainable planet. Co-leading this project (w/ Peter Attia) for the last 4 years of my PhD has been an absolutely amazing "learning" experience in inter-disciplinary collaboration and patience @nature
Tweet media one
@StefanoErmon
Stefano Ermon
4 years
Want to recharge your electric vehicle in 10 minutes? Check out our @nature paper on optimizing battery charging protocols with machine learning👉 🔋 battery testing times slashed by nearly 15-fold via @Stanford
2
52
272
1
10
72
@adityagrover_
Aditya Grover
6 years
Website for our #NIPS2018 Workshop on Relational Representation Learning (R2L) is up! Workshop date: 8th Dec. Papers due on 19th Oct. @paroma_varma @fredsala #stevenholtzen @ProfJenNeville @ermonste @HazyResearch
1
34
67
@adityagrover_
Aditya Grover
5 years
List of accepted papers and pdfs for the Relational Representation Learning Workshop at #NeurIPS2018 is up! Extremely thankful to our reviewers! #R2L @paroma_varma @fredsala @zengola @ProfJenNeville @ermonste @HazyResearch
1
24
67
@adityagrover_
Aditya Grover
5 years
In our #NeurIPS2018 paper, we propose a streamlined variational inference approach for branching during search and demonstrate its utility on some of the hardest constraint satisfaction problems. Code: w/ Tudor Achim & @ermonste
Tweet media one
Tweet media two
1
9
59
@adityagrover_
Aditya Grover
4 years
Never in my wildest dreams did I imagine that one day our Flow-GAN architecture () would generate a #rickroll 😂
If you think the GPT demos with just text are impressive, check this out. I built a model that generates music videos based on text input by replacing the final layers of GPT-3 with a Flow-GAN architecture. [Sound on]
38
304
1K
3
5
56
@adityagrover_
Aditya Grover
3 years
@yudapearl Thanks so much, Judea! Your work is an inspiration to so many young scientists such as myself :)
1
0
53
@adityagrover_
Aditya Grover
4 months
. @johnamqdang has 2/2 papers accepted at #ICLR2024 exploring alignment of LLMs with human preferences. He is also looking for PhD positions, and would be a great fit at the top departments!
@johnamqdang
John Dang
4 months
🎉Happy to share 2 papers accepted to #ICLR2024 w/ @hbXNov @siyan_zhao @adityagrover_ ! 1) Peering Through Preferences: Unraveling Feedback Acquisition for Aligning LLMs: 2) Group Preference Optimization: Few-Shot Alignment of LLMs
Tweet media one
Tweet media two
3
8
78
1
5
50
@adityagrover_
Aditya Grover
6 years
Amidst all the interest in compressed sensing using deep generative models, our new paper shows that sparsity assumptions can co-exist & aid generative models. Long oral at @icmlconf next week! w/ Manik Dhar & @ermonste Paper: Code:
Tweet media one
Tweet media two
0
14
50
@adityagrover_
Aditya Grover
6 years
Check out our new @icmlconf paper on algorithms and applications for unsupervised representation learning of policies in multiagent systems: w/ @alshedivat @rejuvyesh #YuraBurda #HarrisonEdwards @OpenAI Accepted as a long talk at #ICML2018 !
0
13
45
@adityagrover_
Aditya Grover
1 year
Thank you for the kind words. Great time at @UCSanDiego interacting with the scientific ML community on ClimaX, ClimateLearn, and promising paths ahead. Thanks @yuqirose for the invitation and organizing this event!
@astrokarpoor
Preethi Karpoor
1 year
For someone who has ZERO clue on Climate or Climate Modeling, @adityagrover_ is the teacher I need! It was such a pleasure to hear about his work. Also, I’m a bit biased towards Vision Transformers having worked on it previously, so all in all what a fantastic talk!
Tweet media one
2
1
15
0
1
46
@adityagrover_
Aditya Grover
2 years
So much exciting work in RL foundation models pretrained on large offline datasets. Beyond pretraining, how well can these models be finetuned to explore? Check out our Long Oral Presentation on Online Decision Transformers this Thu (7/21) at #ICML2022 .
@qqyuzu
Qinqing Zheng
2 years
Can sequence models balance exploration and exploitation? Excited to share our latest work proposing online decision transformers for online+offline reinforcement learning via sequence modeling! w/ @yayitsamyzhang @adityagrover_
Tweet media one
2
23
139
1
10
46
@adityagrover_
Aditya Grover
4 years
Super grateful to my advisor @StefanoErmon , collaborators, friends, and family for their support during my entire PhD, especially while hunting for a job in the middle of a pandemic. Will miss the wonderful vibe at @StanfordAILab !
1
0
44
@adityagrover_
Aditya Grover
2 years
Agreed! Now more than ever, different fields of AI can talk/see/hear each other (pun intended). If you haven't read this yet, here is our paper on Transformers as Universal Computation Engines:
@karpathy
Andrej Karpathy
2 years
👌👌👏 I wrote earlier about the ongoing consolidation in AI towards transformer architecture from a mostly practical viewpoint, but there are also major implications on paths towards AGI exactly along the lines of this post (& Transformers as Universal Computation Engines paper)
24
78
732
1
1
42
@adityagrover_
Aditya Grover
6 months
Very excited to share this work! Using foundation models for scientific experimentation is a vast opportunity to connect their capabilities with the physical world and accelerate scientific discoveries.
@tungnd_13
Tung Nguyen
6 months
Introducing ExPT, a general-purpose model for few-shot experimental design (ED) that combines unsupervised pretraining and in-context learning. ExPT solves challenging ED problems with only a handful of samples. To appear at #NeurIPS2023 ! Paper:
Tweet media one
3
19
66
0
4
43
@adityagrover_
Aditya Grover
3 years
Excited to help organize this new challenge at #NeurIPS2021 ! The challenge brings together a range of research directions in graph neural nets, sequential modeling, and distribution shifts. More details:
@AIatMeta
AI at Meta
3 years
We are launching the Open Catalyst Challenge, an open AI research competition to build new machine learning models that will help scientists discover new catalysts for efficient, economical green energy storage. Learn more:
Tweet media one
4
89
259
1
8
42
@adityagrover_
Aditya Grover
1 year
ClimaX will appear in #ICML2023 ! 🏖️🧑‍💻🌎 Bonus: @tungnd_13 will also be giving a spotlight talk next week at the upcoming #ICLR2023 workshop on Tackling Climate Change with AI.
@adityagrover_
Aditya Grover
1 year
What makes weather and climate any different from language or vision? A lot if you ask a physicist, but less so under the lens of today's AI. Very excited to release ClimaX! Our vision is to develop a new scientific basis for Earth systems driven by ML foundation models.
4
10
100
1
8
42
@adityagrover_
Aditya Grover
3 years
Excited to share our latest work rethinking RL from a sequence modeling perspective!
@IMordatch
Igor Mordatch
3 years
Can RL algorithms be replaced with transformer-based language models? We’ve looked at this question with our work on Decision Transformer: Website: Code: 1/8
18
257
1K
0
1
41
@adityagrover_
Aditya Grover
4 years
Special JSTAT issue on selected conference papers interfacing ML and physics: Includes our extended #NeurIPS2018 paper on solving constraint satisfaction problems (e.g., SAT) using variational inference (w/ @tachim @ermonste ) and many other cool works!
Tweet media one
Tweet media two
Tweet media three
0
4
40
@adityagrover_
Aditya Grover
2 years
The restriction to offline settings was a key limitation of prior works in RL via autoregressive generative models (eg, GPT). Our latest work sheds light on enabling online+offline RL while retaining much of the simplicity and effectiveness of sequence modeling.
@qqyuzu
Qinqing Zheng
2 years
Can sequence models balance exploration and exploitation? Excited to share our latest work proposing online decision transformers for online+offline reinforcement learning via sequence modeling! w/ @yayitsamyzhang @adityagrover_
Tweet media one
2
23
139
0
1
37
@adityagrover_
Aditya Grover
9 years
Think beyond state-of-the-art deep learning in NLP; a great paper by a DL pioneer, but first a linguist @chrmanning
0
21
34
@adityagrover_
Aditya Grover
2 years
Causal reasoning begins with a causal model. Our #NeurIPS2021 paper proposes Bayesian Causal Discovery (BCD) Nets - a VAE style approach for learning a distribution over causal models using only observational data. Simple and scalable, works extremely well in practice! Details 👇
@ChrisCundy
Chris Cundy
2 years
How can you scalably form a posterior over possible causal mechanisms generating some observational data? Find out in our NeurIPS poster tomorrow (Tuesday), 8.30-10am pacific time! Joint work with @StefanoErmon , @adityagrover_ (1/13)
2
5
47
1
5
35
@adityagrover_
Aditya Grover
3 years
Looking forward to hosting the research roundtable on "Graph-based and combinatorial machine learning" (Table #23 )! w/ @ProfJenNeville #Neurips2020
@WiMLworkshop
WiML
3 years
We would like to express our sincere gratitude to the inspiring cohort of #WiML2020 MENTORS! We have over 120 mentors from academia & industry! Mentorship roundtables are divided into 3 areas Research (Tables 1–28), Career & Life Advice (Tables 29–50) and Sponsor (Tables 51–63).
3
6
95
0
2
33
@adityagrover_
Aditya Grover
5 years
Our @icmlconf paper on Graphite introduces a GNN decoder for latent variable modeling of graphs based on iterative refinement of low-rank approximations. Talk on Thurs at 10 AM in Hall A and poster at 6:30 PM. w/ Aaron Zweig, @ermonste
Tweet media one
0
10
33
@adityagrover_
Aditya Grover
11 months
Introducing Decision Stacks: A new foundation model for RL ✅Modular design for flexible use of generative models (transformer, diffusers, etc) ✅ AR chaining of modules for maximal expressivity ✅Simple and v. effective! Beats all previous transformer, diffusion policies 🧵👇
@siyan_zhao
Siyan Zhao
11 months
Modularity is critical for design of successful software and AI systems🧩. How can we apply this principle to foundation models for Reinforcement Learning? Our latest work, Decision Stacks, improves flexibility of RL through modular generative models. w/ @adityagrover_ 🧵👇(1/8)
3
25
118
0
4
33
@adityagrover_
Aditya Grover
2 years
Excited to be co-organizing the first AI for Climate Science Bridge Program at #AAAI23 ! Submit your papers by Nov 18. Further details here: w/ @rejuvyesh @akapoor_av8r @manmeet3591 @niyogidev cc: @RealAAAI @ClimateChangeAI
Tweet media one
@RealAAAI
AAAI
2 years
AAAI seeks proposals for bridges to be held during #AAAI23 . Each bridge combines elements of education, collaboration and outreach. Bridge submissions are due to organizers on Friday, November 18. Details and deadlines:
0
1
4
0
10
32
@adityagrover_
Aditya Grover
3 years
My thesis was titled "Learning to Represent and Reason under Limited Supervision". Link: I will also be presenting my award talk at KDD on August 15 from 4-5 PM PDT.
3
3
31
@adityagrover_
Aditya Grover
6 months
Great work coming from Google DeepMind on medium-range weather forecasting! Thank you @Melissahei @techreview for quoting my thoughts on AI-powered forecasts steadily becoming the new normal👇
@techreview
MIT Technology Review
6 months
Google DeepMind’s weather AI can forecast extreme weather faster and more accurately
2
14
48
2
2
28
@adityagrover_
Aditya Grover
2 years
📢 New #ICML2022 paper: Divergences between probability distributions are well understood. How do we compare a sequence of time-indexed prob dists (e.g., in continuous normalizing flows)? We use mass conservation laws to derive new "probability path divergence" objectives 🧵👇
@helibenhamu
Heli Ben-Hamu
2 years
In our new ICML paper, we introduce CNFM: Continuous Normalizing Flow Matching. A new approach for scalable CNFs on manifolds, the first generative model to scale to moderately high dimensional manifolds! (1/n)
1
32
122
0
0
27
@adityagrover_
Aditya Grover
7 months
Tired of prompt engineering to make your language models work? Check out GPO, our new few-shot alignment algorithm to customize language models to preferences of different groups. w/ @siyan_zhao and @johnamqdang 🧵👇
Tweet media one
@siyan_zhao
Siyan Zhao
7 months
How can we align LLM outputs to better represent group preferences of diverse groups🌎? We introduce Group Preference Optimization (GPO): a transformer module for few-shot alignment of LLMs to group preferences. It can adapt to unseen group preferences in a gradient-free and
2
11
80
0
4
24
@adityagrover_
Aditya Grover
1 year
Many promising results in using ML for weather & climate. But, problems abound: outdated models and non-standardized datasets and evaluation protocols. ClimateLearn reimagines the entire ML stack for easy, rigorous, and reproducible data-driven climate science. (2/n)
Tweet media one
1
3
24
@adityagrover_
Aditya Grover
1 year
We also previewed an early version of our library at a spotlight (!) tutorial at the NeurIPS @ClimateChangeAI Workshop. Slides and recording here:
1
4
23
@adityagrover_
Aditya Grover
1 year
Many more features: Integrated wandb support for monitoring your ML models across a range of custom metrics. Easily produce visualizations to gain insights into model performance.
Tweet media one
1
2
23
@adityagrover_
Aditya Grover
1 year
Our code is open-sourced at: We welcome community contributions and feedback of any kind: new datasets, models, feature requests, bug reports/fixes.
2
2
23
@adityagrover_
Aditya Grover
5 years
Thanks for the invitation @guyvdb and @zengola ! Great time interacting with the ML community at @UCLA about our latest #NeurIPS2019 work:
@guyvdb
Guy Van den Broeck
5 years
We have @adityagrover_ visiting #UCLA today, giving a talk about "Mitigating Bias in Generative Modeling".
Tweet media one
0
1
31
0
1
23
@adityagrover_
Aditya Grover
5 years
@sam_sinha_ @ermonste Yes, lecture videos will be posted online in due course!
12
1
21
@adityagrover_
Aditya Grover
5 years
Turning a bug into a feature via the continuous Bernoulli distribution!
@StatMLPapers
Stat.ML Papers
5 years
The continuous Bernoulli: fixing a pervasive error in variational autoencoders. (arXiv:1907.06845v1 [])
0
2
11
0
1
21
@adityagrover_
Aditya Grover
2 years
How does exploration vs exploitation affect reward estimation? Excited to share our #AISTATS2022 paper that constructs optimal reward estimators by leveraging the demonstrator's behavior en route to optimality. 🧵:
Tweet media one
1
1
20
@adityagrover_
Aditya Grover
1 year
Dataset: Load your favorite weather and climate dataset in just a few lines of code, slice it as you want. Our dataset library is growing fast, suggestions welcome!
Tweet media one
Tweet media two
1
2
20
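Since the code itself isn't embedded in the thread, here is a generic sketch of the kind of workflow ClimateLearn streamlines: loading a gridded reanalysis file with xarray and exposing (input, target) pairs for a forecasting task through a PyTorch Dataset. The file name, variable, and lead-time convention are hypothetical, and this is not the ClimateLearn API; see the library docs for the real interface.

```python
import torch
import xarray as xr
from torch.utils.data import Dataset

class SimpleForecastDataset(Dataset):
    """Hypothetical sketch: pair each time step with the field `lead` steps ahead."""
    def __init__(self, path: str = "era5_t2m.nc", var: str = "t2m", lead: int = 6):
        ds = xr.open_dataset(path)                 # gridded data with (time, lat, lon) dims
        self.data = torch.tensor(ds[var].values, dtype=torch.float32)
        self.lead = lead                           # forecast lead time in steps

    def __len__(self):
        return self.data.shape[0] - self.lead

    def __getitem__(self, i):
        x = self.data[i].unsqueeze(0)              # (1, lat, lon) input field
        y = self.data[i + self.lead].unsqueeze(0)  # (1, lat, lon) target field
        return x, y
```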
@adityagrover_
Aditya Grover
1 year
Models: Train state-of-the-art deep learning models for weather forecasting, climate downscaling, climate projections. Have more models/tasks in mind? Let us know!
Tweet media one
1
2
19
@adityagrover_
Aditya Grover
6 years
For those attending @IJCAI_ECAI_18 / @icmlconf , @ermonste and I will be giving a tutorial on deep generative models from 2-6pm today (Sat) at Hall T2. From the basics of Bayes nets to the latest Glow-ing research, everything is on the menu. Outline & Slides:
0
6
19
@adityagrover_
Aditya Grover
1 year
Excited to share 2 new works to be presented at tomorrow's @ClimateChangeAI workshop at #NeurIPS2022 : 1. Tutorial on ML for predicting climate extremes. Oral Presentation! 3:30-4pm PST w/ my students @hbXNov @_shashankgoel_ @tungnd_13 @jasonjewik (1/2)
@ClimateChangeAI
Climate Change AI
1 year
Our workshop, Tackling Climate Change with ML, is happening at #NeurIPS2022 next Friday, Dec 9! Keynotes about work at the intersection of #ML and #ClimateChange by Gustau Camps-Valls ( @isp_uv_es ), @inesliaz and @yuqirose . Find out more at:
Tweet media one
0
15
50
1
4
19
@adityagrover_
Aditya Grover
6 years
How do we accelerate scientific research and discovery when experiments are time-intensive? Check out our #AISTATS2018 paper on experimental design with delayed feedback w/ @ermonste and many other amazing collaborators. Paper: Code:
1
7
19
@adityagrover_
Aditya Grover
6 months
Fully agree with @QuanquanGu . There is nothing preventing autoregressive (AR) models from planning prior to generating (action/word) tokens. In fact, any generative model can do it - AR, diffusion, etc. Our #NeurIPS2023 paper drives this message home quite clearly
@QuanquanGu
Quanquan Gu
6 months
I agree with Yann's view that current auto-regressive LLMs may not constitute the ultimate solution for achieving AGI. However, it's important to acknowledge that: Can LLMs do reasoning? Yes. LLM can implement first-order logic, see e.g., by Abulhair
17
50
411
0
3
17
@adityagrover_
Aditya Grover
6 years
For those of you attending #AAAI2018 , happy to chat more about two papers I'll be presenting: - Boosted Generative Models w. @ermonste (Sun, 2-3:30) - Flow-GAN: Combining Maximum Likelihood and Adversarial Learning in Generative Models w. Manik Dhar and @ermonste (Mon, 2-3:30)
0
3
17
@adityagrover_
Aditya Grover
2 years
@dustinvtran @ylecun +1 on probabilistic reasoning significantly advancing research in generative models. Autoregressive models, though, feel oblivious to much of PGM research from 2000s which focused on conditional independences and inference beyond density estimation (eg, marginalization).
0
0
16
@adityagrover_
Aditya Grover
1 year
Corrected blog link:
1
2
16
@adityagrover_
Aditya Grover
1 year
24 hours later: To be or not to be. "Real-time" editing of LLMs is hard!
Tweet media one
@katecrawford
Kate Crawford
1 year
Umm, anyone a little concerned that Bard is saying its training dataset includes... Gmail? I'm assuming that's flat out wrong, otherwise Google is crossing some serious legal boundaries.
Tweet media one
151
450
3K
2
0
16
@adityagrover_
Aditya Grover
10 months
Now with expanded support for new models, tasks, datasets, pretrained checkpoints, and more! Also check out our fresh preprint with benchmarking and analysis on robustness and transfer perf:
@tungnd_13
Tung Nguyen
10 months
ClimateLearn, our PyTorch-based ML library for accessing climate datasets, state-of-the-art models, diverse evaluation metrics, and high-quality visualizations, just reached version 1.0.0! arXiv: Quickstart:
1
32
136
0
1
15
@adityagrover_
Aditya Grover
3 years
Check out the @berkeley_ai blog post on our recent preprint!
@_kevinlu
Kevin Lu
3 years
Complementary blog post to our paper, Pretrained Transformers as Universal Computation Engines, has been released!
0
18
58
0
0
15
@adityagrover_
Aditya Grover
6 years
A truly thoughtful and remarkable researcher+advisor! Congratulations @ermonste !
@IJCAIconf
IJCAIconf
6 years
A great pleasure to announce. Congratulations Stefano Ermon, @ermonste , Stanford University, the winner of the 2018 IJCAI Computers and Thought Award.
Tweet media one
5
18
69
0
2
15
@adityagrover_
Aditya Grover
1 year
A key challenge with sequential decision making is that task specifications lack a universal API. Our latest work at #NeurIPS2022 looks at masking as a highly effective pretraining paradigm that enables flexible downstream RL for goal reaching, skill prompting, offline RL.
@haoliuhl
Hao Liu
1 year
In our #NeurIPS2022 work, we explore the generality of masked token prediction for generalizable and flexible reinforcement learning. A 🧵 on the paper
Tweet media one
3
21
83
1
2
14
@adityagrover_
Aditya Grover
1 year
Congrats @tungnd_13 ! Very well deserved 🎉
@tungnd_13
Tung Nguyen
1 year
Very honored to be selected as one of the Amazon fellows at UCLA! And also thanks to my PhD advisor @adityagrover_ for his constant support. #AmazonScience #MINT
6
2
39
1
0
14
@adityagrover_
Aditya Grover
1 year
Developed with an amazing team of students: @tungnd_13 @hbXNov @_shashankgoel_ @jasonjewik @siddnandy Seongbin Park @tang_jingchen
1
2
14
@adityagrover_
Aditya Grover
2 years
@poolio Time to read only the titles
1
0
14
@adityagrover_
Aditya Grover
2 years
Check out our latest work that makes a first step using generative transformers for pretraining black-box optimizers: To appear as an Oral at the NeurIPS 2022 workshop on Foundation Models for Decision Making! w/ S. Krishnamoorthy, S. Mashkaria (2/2)
0
5
12
@adityagrover_
Aditya Grover
6 years
Submission deadline for the R2L Workshop in less than two weeks! We welcome novel research contributions, papers recently published or currently under review, position papers as well as papers introducing datasets and competitions! #NIPS2018
@adityagrover_
Aditya Grover
6 years
Website for our #NIPS2018 Workshop on Relational Representation Learning (R2L) is up! Workshop date: 8th Dec. Papers due on 19th Oct. @paroma_varma @fredsala #stevenholtzen @ProfJenNeville @ermonste @HazyResearch
1
34
67
0
3
12
@adityagrover_
Aditya Grover
6 years
Check out this blog post on our #AAAI2018 paper on Flow-GANs which takes a small step towards a principled quantitative evaluation of maximum likelihood and adversarial learning in generative models. Paper: Code:
@StefanoErmon
Stefano Ermon
6 years
New blog post on our #AAAI2018 FlowGAN paper by @adityagrover_ and Manik Dhar: surprising results comparing GAN vs. mixture of Gaussians! (btw Manik is applying to PhD programs - check out his folder if you are hiring)
0
28
70
2
2
12
@adityagrover_
Aditya Grover
3 years
@AlexGDimakis On its own, the discriminator is meaningless coz it is trained wrt a specific generator. But, in conjunction with the generator (post-training), we have found that it can parameterize an even more powerful generative model.
0
0
12
@adityagrover_
Aditya Grover
2 years
Generating abstract objects, such as task curricula, is challenging due to the plethora of qualitative desiderata and lack of any supervision. Our latest work shows that effective curricula can naturally emerge in appropriate multiagent scenarios. #ICLR2022
@d_yuqing
Yuqing Du
2 years
Humans excel at generating curricula of tasks simply by interacting with other agents. Can RL agents generate similar curricula? We tackle this in our #ICLR2022 paper, Multiagent Selfplay for Automatic Curriculum Generation! w/ @pabbeel , @adityagrover_ 1/8
1
34
172
0
1
12
@adityagrover_
Aditya Grover
6 months
Video-language alignment is critical for controlling agent behaviors in space and time. The large degrees of freedom for misalignment make naive data sourcing and scaling infeasible. What if we can generate and finetune specifically on misaligned scenarios? 🧵👇
@hbXNov
Hritik Bansal
6 months
📢 📽✍️We introduce VideoCon, a video-text dataset for training a SOTA alignment model. It resolves a typical issue in video-text alignment models that struggle with robustness. w/ @YonatanBitton , Idan Szpektor, @kaiwei_chang , @adityagrover_ 🧵 1/
3
25
80
0
0
9
@adityagrover_
Aditya Grover
8 years
In spite of being average, Google Scholar was not really a choice until now. AI2 just made our lives easier!
0
4
9
@adityagrover_
Aditya Grover
5 years
Very cool work on continual learning for adaptive Internet video streaming! Also check out the paper:
@justinesherry
billions of packets
5 years
For those of you who missed the news: my friend and colleague Keith Winstein at Stanford will let you watch US TV for free *legally* in your browser here: (And watching it helps networking research and science!)
5
61
205
1
3
8
@adityagrover_
Aditya Grover
6 years
So well deserved! Excellent example of rigorous empirical research. Congrats @alshedivat et al.!
@OpenAI
OpenAI
6 years
Continuous Adaptation via Meta-Learning in Nonstationary and Competitive Environments (, tl;dr: agents that meta-learn outcompete agents that don't, ). Best paper award, Thursday May 3rd, 10:15-10:30am, Exhibition Hall A
3
18
61
0
2
8
@adityagrover_
Aditya Grover
4 years
@an_open_mind Evidence of the harmful biases of generative models in both language and vision is becoming an unfortunate new normal. Not all hope is lost: our recent work at #ICML2020 mitigates such biases for SOTA models *with NO label annotations*.
1
0
8
@adityagrover_
Aditya Grover
1 year
For representative work, see ClimaX: the first foundation model for weather and climate
@tungnd_13
Tung Nguyen
1 year
Introducing ClimaX, the first foundation model for weather and climate. A fast and accurate one-stop AI solution for a range of atmospheric science tasks. Paper: Blog: Thread🧵 #ML #Climate #Weather #FoundationModel
Tweet media one
35
179
851
1
1
8
@adityagrover_
Aditya Grover
1 year
@SaggitariusA Corrected blog link:
0
1
8
@adityagrover_
Aditya Grover
3 years
1
0
8
@adityagrover_
Aditya Grover
1 year
(2) ClimateLearn: open source PyTorch library for standardizing ML for climate science
@adityagrover_
Aditya Grover
1 year
📢Introducing ClimateLearn, a new PyTorch library for accessing climate datasets, state-of-the-art ML models, and high quality training and visualization pipelines. Blog: Docs: Quickstart Colab: 🧵 (1/n)
17
277
1K
1
1
7
@adityagrover_
Aditya Grover
3 years
@pranavrajpurkar @HarvardDBMI @harvardmed Congratulations, Pranav! Harvard is lucky to have you! :)
0
0
7
@adityagrover_
Aditya Grover
1 year
8 years ago, I developed one of the first (basic!) deep nets for wind forecasting w/ @akapoor_av8r @erichorvitz (KDD'15). With ClimaX, we can forecast/downscale/project any variable in arbitrary space/time, better than or comparable to operational systems!
1
0
7