Leo Chen

@LeoTZ03

1,236 Followers
234 Following
107 Media
211 Statuses

Protein Designer🪄 @DukeU, Visiting @DeboraMarksLab | Sharing daily papers on molecular machine learning

North Carolina, USA
Joined March 2024
@LeoTZ03
Leo Chen
1 month
CRISPR-GPT: An LLM Agent for Automated Design of Gene-Editing Experiments Use Agent "to facilitate the process of selecting CRISPR systems, designing guide RNAs, recommending cellular delivery methods, drafting protocols, and designing validation experiments to confirm editing…
Tweet media one
2
76
208
@LeoTZ03
Leo Chen
20 days
ProteinCLIP: enhancing protein language models with natural language - CLIP-like model aligns embeddings from a protein language model with a language model describing protein functions - Excels at PPI, homology, and mutation identification (a minimal sketch of the contrastive objective follows below)
Tweet media one
3
23
141
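The ProteinCLIP post above describes a CLIP-style objective that aligns per-protein embeddings from a protein language model with embeddings of natural-language function descriptions. Below is a minimal sketch of such a symmetric contrastive (InfoNCE) loss in PyTorch; the projection sizes, temperature, and the assumption that per-protein and per-description embeddings are already mean-pooled are my illustrative choices, not details from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CLIPStyleAligner(nn.Module):
    """Projects protein-LM and text-LM embeddings into a shared space
    and trains them with a symmetric InfoNCE loss (CLIP-style)."""

    def __init__(self, protein_dim=1280, text_dim=768, joint_dim=256, temperature=0.07):
        super().__init__()
        self.protein_proj = nn.Linear(protein_dim, joint_dim)
        self.text_proj = nn.Linear(text_dim, joint_dim)
        self.log_temp = nn.Parameter(torch.log(torch.tensor(1.0 / temperature)))

    def forward(self, protein_emb, text_emb):
        # protein_emb: (B, protein_dim) pooled protein-LM embeddings
        # text_emb:    (B, text_dim) pooled embeddings of function descriptions
        p = F.normalize(self.protein_proj(protein_emb), dim=-1)
        t = F.normalize(self.text_proj(text_emb), dim=-1)
        logits = p @ t.T * self.log_temp.exp()          # (B, B) similarity matrix
        labels = torch.arange(p.size(0), device=p.device)
        # Matched protein/description pairs sit on the diagonal.
        return (F.cross_entropy(logits, labels) + F.cross_entropy(logits.T, labels)) / 2

# Toy usage with random embeddings standing in for protein-LM / text-LM outputs.
aligner = CLIPStyleAligner()
loss = aligner(torch.randn(8, 1280), torch.randn(8, 768))
loss.backward()
```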
@LeoTZ03
Leo Chen
2 months
RNA language models predict mutations that improve RNA function A great work from Jennifer Doudna and Jamie Cate's labs! - Use the Genome Taxonomy Database (GTDB) to build the GARNET (GTDB-Acquired RNA with Environmental Temperatures) database - Train a generative GNN model using a…
Tweet media one
Tweet media two
3
27
137
@LeoTZ03
Leo Chen
2 months
Predictomes: A classifier-curated database of AlphaFold-modeled protein-protein interactions Interesting PPI virtual screening work based on AlphaFold-Multimer. - SPOC (Structure Prediction and Omics informed Classifier) is a random forest-based classifier that accurately… (See the classifier sketch below.)
Tweet media one
Tweet media two
Tweet media three
4
43
139
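SPOC, as summarized above, is a random-forest classifier that separates genuine AlphaFold-Multimer interactions from compelling decoys using structure-prediction and omics-derived features. The sketch below trains a generic classifier of that flavor with scikit-learn; the feature columns (ipTM, pDockQ, coexpression, etc.) and the synthetic data are illustrative assumptions, not the paper's actual feature set.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Illustrative per-pair features: AF-Multimer confidence plus omics signals.
# (Hypothetical columns: iptm, ptm, pdockq, interface_contacts, coexpression, coevolution)
X = rng.random((1000, 6))
y = (0.6 * X[:, 0] + 0.4 * X[:, 4] + 0.1 * rng.standard_normal(1000) > 0.55).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=300, random_state=0)
clf.fit(X_tr, y_tr)

proba = clf.predict_proba(X_te)[:, 1]
print("toy AUROC:", round(roc_auc_score(y_te, proba), 3))
```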
@LeoTZ03
Leo Chen
1 month
Learning to design protein-protein interactions with enhanced generalization ICLR'24 - PPIRef, the largest non-redundant dataset of 3D protein–protein interactions - PPIformer, a new SE(3)-equivariant model generalizing across diverse protein-binder variants - Finetune…
Tweet media one
0
24
127
@LeoTZ03
Leo Chen
14 days
Accurate Conformation Sampling via Protein Structural Diffusion - Diffold, a diffusion-based model for robust sampling of diverse protein conformations using amino acid sequences - Transforms AlphaFold2 into a diffusion model, and applies hierarchical reweighting based on…
Tweet media one
0
26
121
@LeoTZ03
Leo Chen
24 days
PocketGen: Generating Full-Atom Ligand-Binding Protein Pockets - Co-designs the residue sequence and full-atom structure of protein pockets for binding - Uses a bilevel graph transformer to model multi-granularity (atom and residue/ligand level) and multi-aspect (intra-protein…
Tweet media one
1
35
116
@LeoTZ03
Leo Chen
22 days
Multi-Scale Protein Language Model for Unified Molecular Modeling | ICML 2024 - ESM-AA, a multi-scale protein language model that enables unified modeling at both the residue and atom scales - Pre-training on multi-scale code-switch protein sequences that randomly unzip residues…
Tweet media one
Tweet media two
1
22
104
@LeoTZ03
Leo Chen
2 months
A 5′ UTR language model for decoding untranslated regions of mRNA and function predictions A new 5′ untranslated region (UTR) language model! - Pretrain the model with mask prediction, 5′ UTR secondary structure prediction, and minimum free energy prediction - Finetune [CLS]…
Tweet media one
1
19
101
@LeoTZ03
Leo Chen
2 months
InstructPLM: Aligning Protein Language Models to Follow Protein Structure Instructions - Use an adapter to connect the Structure Encoder (ProteinMPNN and others) with pLM (ProGen2) - Outperforms ProteinMPNN in terms of Perplexity and Recovery Rate - Validate the model by…
Tweet media one
Tweet media two
1
18
96
@LeoTZ03
Leo Chen
2 months
Designing molecular RNA switches with Restricted Boltzmann machines - Use Restricted Boltzmann machines to design artificial SAM-I riboswitches, focusing on their aptamer domain - The designed sequences were validated through chemical probing, with approximately 30% demonstrating… (See the RBM sketch below.)
Tweet media one
0
21
90
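The riboswitch-design post above relies on a Restricted Boltzmann Machine trained on aligned aptamer sequences. Below is a bare-bones binary RBM with one step of contrastive divergence (CD-1) over one-hot-encoded sequences; the alphabet size, hidden-unit count, and learning rate are placeholder choices, not the study's settings.

```python
import torch

A, L, H = 4, 108, 100          # alphabet size, alignment length, hidden units (placeholders)
V = A * L                       # flattened one-hot visible layer

W = 0.01 * torch.randn(V, H)   # weights
b = torch.zeros(V)             # visible biases
c = torch.zeros(H)             # hidden biases
lr = 0.01

def cd1_step(v0):
    """One contrastive-divergence (CD-1) update from a batch of one-hot sequences v0: (B, V)."""
    ph0 = torch.sigmoid(v0 @ W + c)            # P(h=1 | v0), positive phase
    h0 = torch.bernoulli(ph0)
    pv1 = torch.sigmoid(h0 @ W.T + b)          # reconstruction P(v=1 | h0)
    v1 = torch.bernoulli(pv1)
    ph1 = torch.sigmoid(v1 @ W + c)            # negative phase
    dW = (v0.T @ ph0 - v1.T @ ph1) / v0.size(0)
    return dW, (v0 - v1).mean(0), (ph0 - ph1).mean(0)

# Toy training batch of random one-hot sequences.
idx = torch.randint(0, A, (32, L))
v0 = torch.zeros(32, V)
v0.scatter_(1, idx + A * torch.arange(L), 1.0)

dW, db, dc = cd1_step(v0)
W += lr * dW
b += lr * db
c += lr * dc
```

After training, new aptamer candidates would be generated by running the same Gibbs chain (v -> h -> v) for many steps and decoding the one-hot visible layer back into nucleotides.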
@LeoTZ03
Leo Chen
30 days
Accurate structure prediction of biomolecular interactions with AlphaFold 3 | Nature They use Diffusion with Transformer😲. Replaced invariant point attention with a relatively standard non-equivariant point-cloud diffusion model over all atoms
Tweet media one
Tweet media two
Tweet media three
1
18
89
@LeoTZ03
Leo Chen
2 months
Preference optimization of protein language models as a multi-objective binder design paradigm Use DPO (Direct Preference Optimization) for peptide binder design with ProtGPT2 (a minimal DPO-loss sketch follows below)
Tweet media one
1
17
87
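The post above applies DPO to a protein language model for binder design. The snippet below is a minimal, generic DPO loss in PyTorch that takes summed sequence log-probabilities from a policy model and a frozen reference model; the beta value and the way those log-probabilities are computed upstream are assumptions on my part, not the paper's configuration.

```python
import torch
import torch.nn.functional as F

def dpo_loss(policy_logp_chosen, policy_logp_rejected,
             ref_logp_chosen, ref_logp_rejected, beta=0.1):
    """Direct Preference Optimization loss.

    Each argument is a (B,) tensor of summed per-token log-probabilities of a
    full sequence under the trainable policy or the frozen reference model.
    'chosen' sequences are the preferred binders, 'rejected' the dispreferred ones.
    """
    chosen_ratio = policy_logp_chosen - ref_logp_chosen
    rejected_ratio = policy_logp_rejected - ref_logp_rejected
    return -F.logsigmoid(beta * (chosen_ratio - rejected_ratio)).mean()

# Toy call with random log-probabilities standing in for model outputs.
args = torch.randn(4), torch.randn(4), torch.randn(4), torch.randn(4)
print(dpo_loss(*args))
```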
@LeoTZ03
Leo Chen
28 days
FAPM: Functional Annotation of Proteins using Multi-Modal Models Beyond Structural Modeling - BLIP-2 for Protein Annotation - Combine ESM2 and Mistral-7B - Outperforms DeepGo series
Tweet media one
1
30
87
@LeoTZ03
Leo Chen
13 days
Mamba for Protein Design
@damiano_sga
Damiano Sgarbossa
13 days
🚀 Excited to introduce ProtMamba! Our novel protein language model designed to facilitate protein design.🔬📄 🔧 1/n
Tweet media one
9
53
238
0
24
77
@LeoTZ03
Leo Chen
1 month
De novo generation of multi-target compounds using deep generative chemistry @NatureComms POLYGON, a VAE-based model with reinforcement learning for programmatic generation of new polypharmacology compounds that inhibit multiple protein targets
Tweet media one
Tweet media two
0
12
75
@LeoTZ03
Leo Chen
15 days
Aligning protein generative models with experimental fitness via Direct Preference Optimization @talaldotpdb - ProteinDPO, aligns ESM-IF1 with experimental stability fitness using Direct Preference Optimization - Capable of improved binding affinity prediction and stabilized…
Tweet media one
0
9
75
@LeoTZ03
Leo Chen
1 month
Evolution-Inspired Loss Functions for Protein Representation Learning Evolutionary Ranking (EvoRank) incorporates evolutionary dynamics from MSA-based Soft Labels to learn more diverse protein representations
Tweet media one
1
19
82
@LeoTZ03
Leo Chen
2 months
Latent-based Directed Evolution accelerated by Gradient Ascent for Protein Sequence Design Combining directed evolution with protein language models via gradient ascent (see the latent-ascent sketch below) One small question: it seems that Figure 2 directly uses an image from another source without proper citation.…
Tweet media one
Tweet media two
1
12
73
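The latent-based directed-evolution post above couples a protein language model's latent space with gradient ascent against a fitness predictor. A stripped-down version of that loop is sketched below; the latent dimensionality, the surrogate fitness network, and the step size are placeholders rather than the paper's components.

```python
import torch
import torch.nn as nn

latent_dim = 64

# Placeholder surrogate fitness predictor operating on pLM latent vectors.
fitness_head = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, 1))

# Start from the latent encoding of a parent sequence (random here).
z = torch.randn(1, latent_dim, requires_grad=True)
optimizer = torch.optim.Adam([z], lr=0.05)

for step in range(100):
    optimizer.zero_grad()
    fitness = fitness_head(z)
    (-fitness).backward()        # gradient *ascent* on predicted fitness
    optimizer.step()

# z would then be decoded back into a sequence by the generative pLM/decoder.
print("final predicted fitness:", fitness_head(z).item())
```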
@LeoTZ03
Leo Chen
1 month
PROflow: An iterative refinement model for PROTAC-induced structure prediction - PROFLOW, a flow-matching based approach for PROTAC-induced structure prediction - Created a new pseudo-ternary dataset from binary protein-protein complexes and PROTAC linker graphs…
Tweet media one
0
11
71
@LeoTZ03
Leo Chen
2 months
DiscDiff: Latent Diffusion Model for DNA Sequence Generation VAE-based Latent Diffusion for DNA
Tweet media one
1
10
73
@LeoTZ03
Leo Chen
1 month
Large scale paired antibody language models IgBert and IgT5, trained on two billion unpaired sequences and two million paired sequences of light and heavy chains
Tweet media one
1
17
71
@LeoTZ03
Leo Chen
13 days
Accurate and robust protein sequence design with CarbonDesign @NatMachIntell - Inverseformer to learn from backbone structures, a Markov random field for sequence decoding, and network recycling and multitask learning for side chain design
Tweet media one
0
15
68
@LeoTZ03
Leo Chen
26 days
Predicting equilibrium distributions for molecular systems with deep learning @NatMachIntell "we introduce Distributional Graphormer (DiG), a new deep learning framework for predicting protein structures according to their equilibrium distribution."
Tweet media one
1
18
68
@LeoTZ03
Leo Chen
10 days
PILOT: Equivariant diffusion for pocket conditioned de novo ligand generation with multi-objective guidance via importance sampling - PILOT, Pocket-induced Ligand Optimization Tool - Unconditional pre-training, pocket-conditioned fine-tuning, and property-guided inference…
Tweet media one
1
17
65
@LeoTZ03
Leo Chen
11 days
A generative foundation model for antibody sequence understanding - FAbCon, a causal language model for understanding and designing antibody sequences (144M, 297M, and 2.4B) Preprint: HuggingFace Model:
Tweet media one
0
15
64
@LeoTZ03
Leo Chen
21 days
Cramming Protein Language Model Training in 24 GPU Hours - Cramming challenge: train a transformer-based pLM from scratch in ≤24 hours on 1 GPU using the UniRef50 dataset, and evaluate on FLIP and PPI downstream tasks - Remove query/key/value biases and intermediate linear layer… (A minimal masked-LM training step is sketched below.)
Tweet media one
0
14
59
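The cramming post above is about training a compact masked protein language model from scratch under a tight compute budget. The sketch below shows a single BERT-style masked-LM training step on toy amino-acid tokens; the tiny model size, 15% masking rate, and vocabulary are illustrative defaults, not the paper's recipe.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB = 25          # 20 amino acids + special tokens (placeholder vocabulary)
MASK_ID = 24
d_model = 128

class TinyPLM(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.lm_head = nn.Linear(d_model, VOCAB)

    def forward(self, tokens):
        return self.lm_head(self.encoder(self.embed(tokens)))

model = TinyPLM()
opt = torch.optim.AdamW(model.parameters(), lr=1e-3)

tokens = torch.randint(0, 20, (8, 64))                      # toy batch of sequences
mask = torch.rand_like(tokens, dtype=torch.float) < 0.15     # mask ~15% of positions
inputs = tokens.masked_fill(mask, MASK_ID)

logits = model(inputs)
# Loss only on masked positions, as in standard masked-LM pretraining.
loss = F.cross_entropy(logits[mask], tokens[mask])
loss.backward()
opt.step()
print("masked-LM loss:", loss.item())
```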
@LeoTZ03
Leo Chen
2 months
HelixFold-Multimer: Elevating Protein Complex Structure Prediction to New Heights HelixFold extends to the multimer case, with better reported performance than AlphaFold and RoseTTAFold
Tweet media one
Tweet media two
Tweet media three
1
11
58
@LeoTZ03
Leo Chen
22 days
Protein Embeddings Predict Binding Residues in Disordered Regions - IDBindT5, ProtT5 for IDR binding-site prediction (I love the style of Figure 1😂)
Tweet media one
2
8
57
@LeoTZ03
Leo Chen
26 days
The Continuous Language of Protein Structure - Generative Invariant Angle Transformer (GIAT), an autoregressive language model that generates protein backbone structures by sampling dihedral angles one residue at a time - Neural von Mises Mixture (NVMM) layer parameterizes the…
Tweet media one
Tweet media two
0
8
57
@LeoTZ03
Leo Chen
1 month
ICLR '24 GEM (Generative and Experimental Perspectives for Biomolecular Design) Workshop acceptances are now open to the public. You can find many interesting works here.
Tweet media one
0
16
54
@LeoTZ03
Leo Chen
2 months
This is really exciting!! - The most extensive dataset of CRISPR operons curated to date - Finetune ProGen2 to generate diverse CRISPR-Cas proteins I still remember the first protein design project assigned to me (back in 2022) was to design Cas proteins with protein language…
Tweet media one
@jeffruffolo
Jeff Ruffolo
2 months
One of the first projects we took on at @ProfluentBio was designing novel gene editing proteins with language models. This grew into an initiative called OpenCRISPR, and today I’m excited to share that work (including the sequences we designed)!
2
54
242
1
13
52
@LeoTZ03
Leo Chen
14 days
MethylBERT: A Transformer-based model for read-level DNA methylation pattern identification and tumour deconvolution Language Model on Epigenomics? - Uses BERT for encoding methylation patterns, classifies reads into tumor or normal cells, and estimates tumor purity…
Tweet media one
0
7
50
@LeoTZ03
Leo Chen
15 days
ProtT3: Protein-to-Text Generation for Text-based Protein Understanding - Use Q-Former to connect ESM2 and Galactica for protein text QA, annotation, and retrieval - Two-stage training: retrieval and then generation - Collect and set up a new benchmark for protein text evaluation…
Tweet media one
0
9
49
@LeoTZ03
Leo Chen
1 month
Topology-Driven Negative Sampling Enhances Generalizability in Protein-Protein Interaction Prediction - Topological PPNI (Protein-Protein Non-Interaction) to sample high-quality hard negatives using PPI network topology - Contrastive-L3 (CL3) hypothesis to capture… (A simple topology-based sampling sketch follows below.)
Tweet media one
1
9
48
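Topological PPNI, as described above, mines negative pairs from the PPI graph itself rather than sampling random non-edges. The snippet below shows one simple way to use graph topology for that with networkx: keep only non-interacting pairs whose shortest-path distance exceeds a cutoff. The distance threshold and toy graph are my assumptions about what topology-driven sampling could look like, not the paper's exact criterion.

```python
import random
import networkx as nx

def topology_negatives(G, min_distance=3, n_samples=100, seed=0):
    """Sample non-edges whose endpoints are at least `min_distance` apart in the PPI graph."""
    rng = random.Random(seed)
    nodes = list(G.nodes)
    negatives = []
    while len(negatives) < n_samples:
        u, v = rng.sample(nodes, 2)
        if G.has_edge(u, v):
            continue
        # Distant (or disconnected) pairs are less likely to be unobserved true interactions.
        if not nx.has_path(G, u, v) or nx.shortest_path_length(G, u, v) >= min_distance:
            negatives.append((u, v))
    return negatives

# Toy PPI network.
G = nx.erdos_renyi_graph(200, 0.02, seed=1)
negs = topology_negatives(G, min_distance=3, n_samples=50)
print(len(negs), "topology-filtered negative pairs sampled")
```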
@LeoTZ03
Leo Chen
11 days
Genesis: A Modular Protein Language Modelling Approach to Immunogenicity Prediction - Genesis for predicting the immunogenicity of neoantigens - pMHC Encoding Module, TCR Encoding Module, and Context-specific Immunogenicity Prediction Heads Preprint:
Tweet media one
Tweet media two
0
4
47
@LeoTZ03
Leo Chen
25 days
Multi-purpose RNA language modelling with motif-aware pretraining and type-guided fine-tuning @NatMachIntell - Motif-aware pretraining, use base, subsequence and motif-level masking - Type-guided fine-tuning that predicts RNA types and guide feature embedding for downstream…
Tweet media one
Tweet media two
1
9
43
@LeoTZ03
Leo Chen
2 months
RiboDiffusion: Tertiary Structure-based RNA Inverse Folding with Generative Diffusion Models
Tweet media one
Tweet media two
1
9
41
@LeoTZ03
Leo Chen
1 month
Generative Active Learning for the Search of Small-Molecule Protein Binders - LambdaZero, a generative active learning approach to efficiently search for synthesizable small molecules with desired properties - Applied with molecular docking to design novel small molecules that…
Tweet media one
0
7
39
@LeoTZ03
Leo Chen
2 months
gRNAde: Geometric Deep Learning for 3D RNA inverse design A geometric RNA design pipeline that converts conformational ensembles into RNA sequences by @chaitjo . It can be regarded as the RNA-design counterpart of ProteinMPNN.
Tweet media one
1
11
37
@LeoTZ03
Leo Chen
9 days
A protein sequence-based deep transfer learning framework for identifying human proteome-wide deubiquitinase-substrate interactions @NatureComms - TransDSI (Transfer Learning for Deubiquitinase-Substrate Interactions), protein sequence information + variational graph autoencoder…
Tweet media one
0
6
47
@LeoTZ03
Leo Chen
17 days
UniCorn: A Unified Contrastive Learning Approach for Multi-view Molecular Representation Learning Multi-view representations: - 2D graph masking - 2D-3D contrastive learning - 3D denoising
Tweet media one
1
6
35
@LeoTZ03
Leo Chen
9 days
Functional Protein Design with Local Domain Alignment - PAAG (Protein-Annotation Alignment Generation), a framework for protein design using textual annotations - ProtBERT for proteins and SciBERT for annotation - Align embeddings at both the domain and protein levels using…
Tweet media one
0
5
36
@LeoTZ03
Leo Chen
14 days
Are Genomic Language Models All You Need? Exploring Genomic Language Models on Protein Downstream Tasks - Benchmark gLMs against pLMs on protein tasks like melting point prediction and stability prediction - Investigate the impact of 6-mer and 3-mer tokenization schemes on performance - Highlight…
Tweet media one
0
8
32
@LeoTZ03
Leo Chen
28 days
Scaffold-Lab: Critical Evaluation and Ranking of Protein Backbone Generation Methods in A Unified Framework Evaluated Genie, FrameDiff, RFdiffusion, TDS, FrameFlow, GPDL, and Chroma
Tweet media one
0
5
28
@LeoTZ03
Leo Chen
1 month
Integrating Large-Scale Protein Structure Prediction into Human Genetics Research An interesting Annual Review article
Tweet media one
1
5
27
@LeoTZ03
Leo Chen
9 days
Decoupled Sequence and Structure Generation for Realistic Antibody Design - Two-stage approach: sequence design followed by structure prediction - Composition-based objective with REINFORCE trick to address token repetition in non-autoregressive models. Use ESM-2 (+LoRA) and…
Tweet media one
0
6
27
@LeoTZ03
Leo Chen
1 month
Large-scale chemoproteomics expedites ligand discovery and predicts ligand behavior in cells - Chemoproteomics revealed 47,658 interactions between 407 fragments and 2600 proteins, mostly without prior ligands - Ligands developed for specific proteins, showing translational…
Tweet media one
0
3
14
@LeoTZ03
Leo Chen
9 days
AlphaFold3, a secret sauce for predicting mutational effects on protein-protein interactions - Benchmark AlphaFold3 server over SKEMPI Preprint:
Tweet media one
2
7
52
@LeoTZ03
Leo Chen
10 days
A lot of exciting protein ML papers came out today
2
0
12
@LeoTZ03
Leo Chen
9 days
AptaGPT: Advancing aptamer design with a generative pre-trained language model - GPT-2 like model, 86 million parameters, BPE. - Uses SELEX data (35 NT length) from the 3rd, 6th, and 20th rounds of enrichment, focusing on B-cell maturation antigen (BCMA) - Train on 3rd round…
Tweet media one
0
3
12
@LeoTZ03
Leo Chen
9 days
Learning the Language of Antibody Hypervariability - AbMAP (Antibody Mutagenesis-Augmented Processing), a transfer learning framework + pLMs for better antibody modeling - in silico Mutagenesis and multi-tasking learning of structural and functional similarity Preprint:…
Tweet media one
0
12
38
@LeoTZ03
Leo Chen
3 months
[Transformer Model Generated Bacteriophage Genomes are Compositionally Distinct from Natural Sequences] A Generative Model for Viral Genome based on MegaByte Model from @AIatMeta
@jeremydratcliff
Jeremy Ratcliff
3 months
For my first solo author venture, I performed an in-depth stress test of the outputs of the megaDNA transformer model. Through applying fun math I offer some guidance for the intersection of language models and genomics.
2
9
56
1
2
6
@LeoTZ03
Leo Chen
17 days
Metabuli: sensitive and specific metagenomic classification via joint analysis of amino acid and DNA @naturemethods - Leverages both DNA and AA information by metamers, novel k-mers that jointly encode DNA and AA information - Finds exact AA matches and uses DNA information for…
Tweet media one
0
0
7
@LeoTZ03
Leo Chen
3 months
Very interesting work. Definitely worth further reading!
@anthonygitter
Anthony Gitter
3 months
Our manuscript "Biophysics-based protein language models for protein engineering" with @romerolab1 is now on bioRxiv. We present Mutational Effect Transfer Learning (METL), a protein language model trained on biophysical simulations, and showcase it for protein engineering. 1/
Tweet media one
6
70
357
0
2
6
@LeoTZ03
Leo Chen
2 months
Diffusion on language model embeddings for protein sequence generation They trained a diffusion model on the continuous latent space of ESM-2-8M, and it outperforms other sequence-based generative models. It is very similar to what we have done for AMP-Diffusion (diffusion on… (A minimal diffusion-on-embeddings training step is sketched below.)
Tweet media one
2
2
5
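The post above (like AMP-Diffusion) trains a diffusion model directly in the continuous embedding space of ESM-2. Below is a minimal DDPM-style training step on fixed-length embedding tensors: forward noising with a linear beta schedule and an epsilon-prediction MSE loss. The denoiser architecture, schedule, and dimensions are generic stand-ins, not either paper's setup.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

T = 1000                                    # diffusion steps
betas = torch.linspace(1e-4, 0.02, T)       # linear noise schedule
alphas_bar = torch.cumprod(1.0 - betas, dim=0)

embed_dim = 320                             # e.g. hidden size of a small ESM-2 variant

class Denoiser(nn.Module):
    """Placeholder denoiser: predicts the added noise from the noisy embedding and timestep."""
    def __init__(self):
        super().__init__()
        self.time_embed = nn.Embedding(T, embed_dim)
        self.net = nn.Sequential(nn.Linear(embed_dim, 512), nn.GELU(), nn.Linear(512, embed_dim))

    def forward(self, x_t, t):
        return self.net(x_t + self.time_embed(t)[:, None, :])

model = Denoiser()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)

x0 = torch.randn(8, 50, embed_dim)          # toy "clean" pLM embeddings (B, L, D)
t = torch.randint(0, T, (8,))
noise = torch.randn_like(x0)

a_bar = alphas_bar[t].view(-1, 1, 1)
x_t = a_bar.sqrt() * x0 + (1 - a_bar).sqrt() * noise   # forward process q(x_t | x_0)

loss = F.mse_loss(model(x_t, t), noise)     # epsilon-prediction objective
loss.backward()
opt.step()
print("diffusion loss:", loss.item())
```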
@LeoTZ03
Leo Chen
3 months
Unified Generative Modeling of 3D Molecules with Bayesian Flow Networks Geometric Bayesian Flow Networks (GeoBFN) extend Bayesian Flow Networks to 3D molecule generation: a new generative model for 3D molecular geometry that operates in the differentiable parameter…
Tweet media one
0
0
5
@LeoTZ03
Leo Chen
3 months
Quiet-STaR: Language Models Can Teach Themselves to Think Before Speaking "We present Quiet-STaR, a generalization of STaR in which LMs learn to generate rationales at each token to explain future text, improving their predictions. We address key…
Tweet media one
0
1
5
@LeoTZ03
Leo Chen
30 days
just took sequences from CASP 16 Target: M1209 and ran AF3. Here is the output. Thoughts 🤔? Did I do anything wrong?
Tweet media one
0
0
4
@LeoTZ03
Leo Chen
3 months
So ManyFolds, So Little Time: Efficient Protein Structure Prediction With pLMs and MSAs A structure prediction model that uses pLMs and MSAs
Tweet media one
0
2
5
@LeoTZ03
Leo Chen
2 months
Rapid and Sensitive Protein Complex Alignment with Foldseek-Multimer Foldseek now supports Multimer Alignment! - Foldseek-Multimer allows fast querying of input complex(es) against millions of targets - It represents each chain-to-chain alignment as a superposition and uses the…
Tweet media one
Tweet media two
1
2
5
@LeoTZ03
Leo Chen
1 month
Totally agree. Disagreement Fuels Progress, Not Disrespect.
@thegautamkamath
Gautam Kamath
1 month
A few of my research principles (& recommend to others): -don't call the earnest contents of peoples' papers "garbage" -laud (don't shame) work for being forthcoming with flaws -accept that the 1st paper on a topic won't be uniformly SotA vs *the most popular topic in ML* 1/2
21
34
504
2
0
5
@LeoTZ03
Leo Chen
2 months
Really cool application! “In this study, we use a dataset of over 524,000 Trastuzumab variants to show that standard computational methods such as BLOSUM, AbLang, ESM, and ProteinMPNN can be used to design highly diverse, binder-enriched antibody libraries from just a single…
@LewisChinery
Lewis Chinery
2 months
Happy to share our recent work on antibody library design, affinity prediction, and optimisation - 'Baselining the Buzz. Trastuzumab-HER2 affinity, and beyond!' Preprint (inc. SI) - Code - Data - 🧵
6
17
93
1
2
5
@LeoTZ03
Leo Chen
2 months
Diffusion on language model embeddings for protein sequence generation [] AMP-Diffusion []
0
1
4
@LeoTZ03
Leo Chen
3 months
@instadeepai team has two interesting preprints for protein design. Contrasting Sequence with Structure: Pre-training Graph Representations with PLMs They use a CLIP-style module to align protein sequence and structure at both the residue and chain level.
Tweet media one
1
0
4
@LeoTZ03
Leo Chen
3 months
NeuroFold: A Multimodal Approach to Generating Novel Protein Variants in silico NeuroFold is a novel multimodal model that integrates sequence, structural, and evolutionary information to design functional enzyme variants, as demonstrated by the…
Tweet media one
0
0
4
@LeoTZ03
Leo Chen
2 months
Nicheformer: a foundation model for single-cell and spatial omics Nicheformer Dropped!
Tweet media one
0
0
4
@LeoTZ03
Leo Chen
3 months
ProLLaMA: A Protein Large Language Model for Multi-Task Protein Language Processing ProLLaMA introduces a novel training framework that transforms general LLMs into multi-task Protein LLMs capable of understanding both protein sequences and natural…
Tweet media one
0
1
4
@LeoTZ03
Leo Chen
1 month
Identification of clinically relevant T cell receptors for personalized T cell therapy using combinatorial algorithms @NatureBiotech - TRTpred, an antigen-agnostic in silico predictor of tumor-reactive T cell receptors (TCRs), using classic machine learning techniques (elastic…
Tweet media one
0
1
4
@LeoTZ03
Leo Chen
3 months
Machine learning is IID; science is OOD
@davidwhogg
David W Hogg
3 months
Simon Batzner (DeepMind), speaking @iaifi_news two days ago; “Machine learning is IID; science is OOD.” I couldn’t agree more.
3
22
149
1
1
4
@LeoTZ03
Leo Chen
2 months
@jure @ChanZuckerberg 1,000 is wild for a lab
0
0
4
@LeoTZ03
Leo Chen
3 months
Branched chemically modified poly(A) tails enhance the translation capacity of mRNA “… designed and synthesized topologically and chemically modified mRNAs with multiple synthetic poly(A) tails. Here we demonstrate that the optimized multitailed mRNA yielded ~4.7–19.5-fold…
Tweet media one
0
2
4
@LeoTZ03
Leo Chen
3 months
FoldToken: Learning Protein Language via Vector Quantization and Beyond They trained a VQ-VAE-like model to learn discrete folding tokens and then a GPT-like model for generation and modeling (a generic vector-quantization layer is sketched below).
Tweet media one
0
1
4
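FoldToken, as summarized above, quantizes continuous structure features into discrete "folding tokens" with a codebook before handing them to a GPT-like model. The snippet below is a generic vector-quantization layer with a straight-through estimator and the standard codebook/commitment losses; the codebook size and dimensions are placeholders, not the paper's.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VectorQuantizer(nn.Module):
    """Nearest-neighbor codebook lookup with a straight-through gradient."""

    def __init__(self, num_codes=256, code_dim=64, beta=0.25):
        super().__init__()
        self.codebook = nn.Embedding(num_codes, code_dim)
        self.beta = beta

    def forward(self, z):                    # z: (B, L, code_dim) continuous structure features
        flat = z.reshape(-1, z.size(-1))
        d = torch.cdist(flat, self.codebook.weight)          # distances to all codes
        idx = d.argmin(dim=-1)                                # discrete token ids
        z_q = self.codebook(idx).view_as(z)
        # Codebook + commitment losses; gradients are copied straight through below.
        loss = F.mse_loss(z_q, z.detach()) + self.beta * F.mse_loss(z, z_q.detach())
        z_q = z + (z_q - z).detach()
        return z_q, idx.view(z.shape[:-1]), loss

vq = VectorQuantizer()
z_q, tokens, vq_loss = vq(torch.randn(2, 128, 64))
print(tokens.shape, vq_loss.item())          # the token ids would feed a GPT-like prior over folds
```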
@LeoTZ03
Leo Chen
2 months
Two papers came out on Nature discussing how FOXO1 can potentially enhance CAR-T therapy. FOXO1 enhances CAR T cell stemness, metabolic fitness and efficacy FOXO1-ADA (a constitutively active variant of FOXO1) overexpression leads to increased polyfunctionality of CAR T cells,…
Tweet media one
Tweet media two
1
1
3
@LeoTZ03
Leo Chen
2 months
Structural basis of Integrator-dependent RNA polymerase II termination
Tweet media one
Tweet media two
Tweet media three
0
1
3
@LeoTZ03
Leo Chen
2 months
Bilingual Language Model for Protein Sequence and Structure Seq2seq co-training on structure (Foldseek tokens) and sequence
Tweet media one
0
1
3
@LeoTZ03
Leo Chen
3 months
Continuous evolution of compact protein degradation tags regulated by selective molecular glues
Tweet media one
0
0
3
@LeoTZ03
Leo Chen
16 days
0
0
3
@LeoTZ03
Leo Chen
2 months
@json_yim So true lol. Already saw two of them
1
0
3
@LeoTZ03
Leo Chen
2 months
Engineering highly active and diverse nuclease enzymes by combining machine learning and ultra-high-throughput screening "...we combined ultra-high-throughput functional screening with ML and compared to parallel in-vitro directed evolution (DE) and in-silico hit recombination…
Tweet media one
0
1
2
@LeoTZ03
Leo Chen
3 months
New Preprint from Baker’s Lab! “Here we demonstrate that a fine-tuned RFdiffusion network is capable of designing de novo antibody variable heavy chains (VHH's) that bind user-specified epitopes.”
0
1
3
@LeoTZ03
Leo Chen
12 days
@damiano_sga How did you make the mamba figure btw? It looks so cool :)
1
0
3
@LeoTZ03
Leo Chen
2 months
@amelie_iska @huggingface 🔥💯💯 let’s go! Maybe I should write some blog posts as well lol
1
0
2
@LeoTZ03
Leo Chen
2 months
@SimMat20 @anthonygitter @countablyfinite @LucyColwell37 It’s super cool, but I don’t think it’s his work tho?
0
0
0
@LeoTZ03
Leo Chen
3 months
PPIscreenML: Structure-based screening for protein-protein interactions using AlphaFold It uses AF2 confidence measures and energetic terms from the Rosetta scoring function to classify interacting proteins and compelling decoy pairs.
Tweet media one
0
0
1
@LeoTZ03
Leo Chen
2 months
PLMSearch: Protein language model powers accurate and fast sequence search for remote homology "PLMSearch (Protein Language Model), a homologous protein search method with only sequences as input. PLMSearch uses deep representations from a pre-trained protein language model and… (A minimal embedding-similarity search sketch follows below.)
Tweet media one
0
1
2
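PLMSearch, quoted above, retrieves remote homologs by comparing protein-language-model representations rather than raw sequences. The sketch below does the simplest version of that idea: cosine similarity over precomputed per-protein embeddings. It assumes embeddings (e.g., mean-pooled pLM vectors) are already available and skips PLMSearch's learned similarity and downstream alignment steps.

```python
import torch
import torch.nn.functional as F

def search(query_emb, db_embs, db_ids, top_k=5):
    """Rank database proteins by cosine similarity to the query embedding."""
    q = F.normalize(query_emb, dim=-1)
    db = F.normalize(db_embs, dim=-1)
    scores = db @ q                          # (N,) cosine similarities
    vals, idx = scores.topk(top_k)
    return [(db_ids[i], float(v)) for i, v in zip(idx.tolist(), vals.tolist())]

# Toy database of precomputed per-protein embeddings (stand-ins for pooled pLM states).
db_ids = [f"protein_{i}" for i in range(1000)]
db_embs = torch.randn(1000, 1280)
query = torch.randn(1280)

for pid, score in search(query, db_embs, db_ids):
    print(pid, round(score, 3))
```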
@LeoTZ03
Leo Chen
3 months
Programmable protein expression using a genetically encoded m6A sensor The study introduces GEMS, a novel genetically encoded sensor for detecting m6A RNA methylation, by producing fluorescence in its presence. This technique leverages a fusion protein…
0
0
2
@LeoTZ03
Leo Chen
2 months
Model Stock: All we need is just a few fine-tuned models This paper introduces an efficient fine-tuning method for large pre-trained models, offering strong in-distribution (ID) and out-of-distribution (OOD) performance (a weight-averaging baseline is sketched below)
Tweet media one
1
1
2
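Model Stock, summarized above, gets strong ID/OOD performance from only a few fine-tuned checkpoints by interpolating their weights together with the pretrained anchor. The snippet below shows only the simpler uniform weight-averaging baseline (model-soup style) that this line of work builds on, not the paper's geometry-based interpolation; the model class and "fine-tuned" checkpoints are placeholders.

```python
import torch
import torch.nn as nn

def average_state_dicts(state_dicts):
    """Uniformly average parameters across fine-tuned checkpoints (soup-style baseline)."""
    avg = {}
    for key in state_dicts[0]:
        avg[key] = torch.stack([sd[key].float() for sd in state_dicts]).mean(dim=0)
    return avg

# Placeholder model and two "fine-tuned" copies with different weights.
def make_model():
    return nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))

finetuned = []
for seed in (0, 1):
    torch.manual_seed(seed)
    finetuned.append(make_model().state_dict())

merged = make_model()
merged.load_state_dict(average_state_dicts(finetuned))
print("merged first-layer weight norm:", merged[0].weight.norm().item())
```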
@LeoTZ03
Leo Chen
3 months
Improving AlphaFold2 Performance with a Global Metagenomic & Biological Data Supply Chain The study introduces BaseFold, an advancement of AlphaFold2 that incorporates a comprehensive global metagenomic and biological data supply chain. This innovative…
0
0
2