Forwarded from Data Mining & Machine learning (Hirah Tang)
Medium
Understand Classifier Guidance and Classifier-free Guidance in diffusion models via Python pseudo-code
We introduce conditional control in diffusion models for generative AI, covering both classifier guidance and classifier-free guidance.
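The gist: classifier-free guidance mixes a conditional and an unconditional noise prediction from one model, while classifier guidance steers the sampler with the gradient of a separately trained classifier on noisy inputs. Below is a minimal sketch of both update rules, assuming hypothetical PyTorch callables eps_model(x_t, t, cond) and classifier(x_t, t) (placeholders for illustration, not APIs from the linked article or any specific library); either result replaces the raw noise prediction in the usual DDPM/DDIM sampling step.

import torch

def cfg_noise(eps_model, x_t, t, cond, null_cond, guidance_scale=7.5):
    # Classifier-free guidance: run the same model with and without the
    # condition, then extrapolate:
    #   eps_hat = eps_uncond + w * (eps_cond - eps_uncond)
    eps_cond = eps_model(x_t, t, cond)         # prediction with the condition (e.g. text embedding)
    eps_uncond = eps_model(x_t, t, null_cond)  # prediction with the "null" condition
    return eps_uncond + guidance_scale * (eps_cond - eps_uncond)

def classifier_guided_noise(eps_model, classifier, x_t, t, y, sigma_t, scale=1.0):
    # Classifier guidance: shift the noise prediction by the gradient of
    # log p(y | x_t); sigma_t is sqrt(1 - alpha_bar_t) in the epsilon parameterization.
    x_t = x_t.detach().requires_grad_(True)
    log_probs = classifier(x_t, t).log_softmax(dim=-1)
    selected = log_probs[torch.arange(y.shape[0]), y].sum()
    grad = torch.autograd.grad(selected, x_t)[0]
    return eps_model(x_t, t, None) - scale * sigma_t * grad

In both cases a larger guidance weight trades sample diversity for stronger adherence to the condition.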
Forwarded from Graph Machine Learning
GraphML News (Oct 5th) - ICLR 2025 Graph and Geometric DL Submissions
📚 Brace yourselves, for your browser is about to endure 50+ new tabs. All accepted NeurIPS 2024 papers are now visible (titles and abstracts), and a new batch of goodies from ICLR’25 has just arrived. I tried to select papers that haven't yet appeared during the ICML/NeurIPS cycles. PDFs will be available on the respective OpenReview pages shortly:
Towards Graph Foundation Models:
GraphProp: Training the Graph Foundation Models using Graph Properties
GFSE: A Foundational Model For Graph Structural Encoding
Towards Neural Scaling Laws for Foundation Models on Temporal Graphs
Graph Generative Models:
Quality Measures for Dynamic Graph Generative Models
Improving Graph Generation with Flow Matching and Optimal Transport
Equivariant Denoisers Cannot Copy Graphs: Align Your Graph Diffusion Models
Topology-aware Graph Diffusion Model with Persistent Homology
Hierarchical Equivariant Graph Generation
Smooth Probabilistic Interpolation Benefits Generative Modeling for Discrete Graphs
GNN Theory:
Towards a Complete Logical Framework for GNN Expressiveness
Rethinking the Expressiveness of GNNs: A Computational Model Perspective
Learning Efficient Positional Encodings with Graph Neural Networks
Equivariant GNNs:
Improving Equivariant Networks with Probabilistic Symmetry Breaking
Does equivariance matter at scale?
Beyond Canonicalization: How Tensorial Messages Improve Equivariant Message Passing
Spacetime E(n) Transformer: Equivariant Attention for Spatio-temporal Graphs
Rethinking Efficient 3D Equivariant Graph Neural Networks
Generative modeling with molecules (hundreds of them actually):
AssembleFlow: Rigid Flow Matching with Inertial Frames for Molecular Assembly
RoFt-Mol: Benchmarking Robust Fine-tuning with Molecular Graph Foundation Models
Multi-Modal Foundation Models Induce Interpretable Molecular Graph Languages
MolSpectra: Pre-training 3D Molecular Representation with Multi-modal Energy Spectra
Reaction Graph: Toward Modeling Chemical Reactions with 3D Molecular Structures
Accelerating 3D Molecule Generation via Jointly Geometric Optimal Transport
Forwarded from Graph Machine Learning
Generative modeling with proteins (hundreds of them as well):
EquiJump: Protein Dynamics Simulation via SO(3)-Equivariant Stochastic Interpolants
Design of Ligand-Binding Proteins with Atomic Flow Matching
RapidDock: Unlocking Proteome-scale Molecular Docking
Deep Learning for Protein-Ligand Docking: Are We There Yet?
ProteinBench: A Holistic Evaluation of Protein Foundation Models
Fast and Accurate Blind Flexible Docking
Solving Inverse Problems in Protein Space Using Diffusion-Based Priors
Crystals and Materials:
Flow Matching for Accelerated Simulation of Atomic Transport in Materials
MOFFlow: Flow Matching for Structure Prediction of Metal-Organic Frameworks
Learning the Hamiltonian of Disordered Materials with Equivariant Graph Networks
Designing Mechanical Meta-Materials by Learning Equivariant Flows
SymmCD: Symmetry-Preserving Crystal Generation with Diffusion Models
Rethinking the role of frames for SE(3)-invariant crystal structure modeling
A Periodic Bayesian Flow for Material Generation
ECD: A Machine Learning Benchmark for Predicting Enhanced-Precision Electronic Charge Density in Crystalline Inorganic Materials
Wyckoff Transformer: Generation of Symmetric Crystals
PDDFormer: Pairwise Distance Distribution Graph Transformer for Crystal Material Property Prediction
Forwarded from Data Mining & Machine learning (Hirah Tang)
Quanta Magazine
How AI Revolutionized Protein Science, but Didn’t End It | Quanta Magazine
Three years ago, Google’s AlphaFold pulled off the biggest artificial intelligence breakthrough in science to date, accelerating molecular research and kindling deep questions about why we do science.
Forwarded from Data Mining & Machine learning (Hirah Tang)
Forwarded from 懒人的梦呓 (virusyu🅥)
Recommending Andrew Ng's new book, "How to Build Your Career in AI". No technical background is needed to read it. The book offers comprehensive advice on building a career in AI, including which foundational AI skills you need, how to handle job interviews, how to put together a portfolio, and how to build a professional network. If you want to move into AI, it is well worth a read.
Download: https://info.deeplearning.ai/how-to-build-a-career-in-ai-book
info.deeplearning.ai
How to Build Your Career in AI eBook - Andrew Ng Collected Insights
Get the How to Build Your Career in AI eBook by Andrew Ng | Free download | An introductory book about starting and building a successful career in AI