[2102.10056] MolCLR: Molecular Contrastive Learning of Representations via Graph Neural Networks

Molecular machine learning bears promise for efficient molecule property prediction and drug discovery. However, due to limited labeled data and the vast chemical space, machine learning models trained via supervised learning generalize poorly. This greatly limits the applications of machine learning methods for molecular design and discovery. In this work, we present MolCLR: Molecular Contrastive Learning of Representations via Graph Neural Networks (GNNs), a self-supervised learning framework for large unlabeled molecule datasets. Specifically, we first build a molecular graph, where each node represents an atom and each edge represents a chemical bond. A GNN is then used to encode the molecule graph. We propose three novel molecule graph augmentations: atom masking, bond deletion, and subgraph removal. A contrastive estimator is utilized to maximize the agreement of different graph augmentations from the same molecule. Experiments show that molecule representations learned by MolCLR can be transferred to multiple downstream molecular property prediction tasks.
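The three augmentations named in the abstract can be sketched on a toy molecular graph. This is a minimal illustration, assuming a simple atom-list/bond-list graph format and a 25% augmentation ratio; it is not the paper's actual implementation, which operates on featurized graphs inside a GNN pipeline.

```python
import random

MASK = "[MASK]"  # illustrative placeholder token for a masked atom

def atom_masking(atoms, bonds, ratio=0.25, rng=random):
    """Randomly replace a fraction of atom labels with a mask token."""
    n_mask = max(1, int(len(atoms) * ratio))
    idx = set(rng.sample(range(len(atoms)), n_mask))
    masked = [MASK if i in idx else a for i, a in enumerate(atoms)]
    return masked, list(bonds)

def bond_deletion(atoms, bonds, ratio=0.25, rng=random):
    """Randomly delete a fraction of bonds (edges)."""
    n_del = max(1, int(len(bonds) * ratio))
    drop = set(rng.sample(range(len(bonds)), n_del))
    return list(atoms), [b for i, b in enumerate(bonds) if i not in drop]

def subgraph_removal(atoms, bonds, ratio=0.25, rng=random):
    """Remove a connected subgraph grown breadth-first from a random
    seed atom: its atoms are masked and its incident bonds deleted."""
    n_remove = max(1, int(len(atoms) * ratio))
    removed, frontier = set(), [rng.randrange(len(atoms))]
    while frontier and len(removed) < n_remove:
        node = frontier.pop(0)
        if node in removed:
            continue
        removed.add(node)
        for u, v in bonds:  # enqueue neighbours of the removed atom
            if u == node and v not in removed:
                frontier.append(v)
            elif v == node and u not in removed:
                frontier.append(u)
    masked = [MASK if i in removed else a for i, a in enumerate(atoms)]
    kept = [(u, v) for u, v in bonds if u not in removed and v not in removed]
    return masked, kept

# Ethanol (CCO) as a toy heavy-atom graph: C-C-O
atoms = ["C", "C", "O"]
bonds = [(0, 1), (1, 2)]
aug_atoms, aug_bonds = atom_masking(atoms, bonds, rng=random.Random(0))
print(aug_atoms, aug_bonds)
```

Two independently augmented views of the same molecule are then fed to the GNN encoder, and the contrastive objective pulls their embeddings together.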

12 mentions: @AmirBaratiF @hillbig @janhjensen @KevinKaichuang @Montreal_AI @AkiraTOSEI @saqibali_ca
Date: 2021/02/23 15:52

Referring Tweets

@hillbig MolCLR is self-supervised representation learning for molecules. As data augmentation, atoms, bonds, or subgraphs are randomly masked or deleted, and contrastive learning then judges whether a pair of augmented graphs comes from the same molecule or from different ones. The model is fine-tuned on downstream tasks, and is especially effective when training data is scarce. t.co/9Pgtp31xqQ
@hillbig MolCLR is self-supervised molecular representation learning by contrastive learning. For data augmentation, they used atom masking, bond deletion, and subgraph removal. Especially effective for tasks with a limited number of training data. t.co/9Pgtp31xqQ
@AkiraTOSEI t.co/WryLhODLbo Proposes MolCLR, a self-supervised learning method for molecules. The model is trained so that the same representation is obtained even when atoms are masked or bonds are removed. With fine-tuning, it achieves SotA performance on various tasks. t.co/uuxykIhNzn
@AkiraTOSEI t.co/WryLhODLbo Proposed MolCLR, a self-supervised learning method for molecules. The method is designed to learn the same representation of a molecule even if atoms are masked or bonds are eliminated. By fine-tuning, SotA performance was achieved on various tasks. t.co/O9VMpebpXs
@KwhRd100 MolCLR: Molecular Contrastive Learning of Representations via Graph Neural Networks. Proposes self-supervised learning of molecular representations (MolCLR) using a GNN and three molecular graph augmentation strategies (atom masking, bond deletion, subgraph removal). The molecular representations learned by MolCLR can be transferred to multiple downstream molecular property prediction tasks. t.co/XRGsRNuYx8
@AmirBaratiF We developed MolCLR: #Molecular #Contrastive #Learning of #Representations via #Graph #Neural #Networks: Outperforming state-of-the-art classification on many benchmark datasets. #molecules #MachineLearning : Check our preprint here: t.co/Oe6pQN3C6M
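The "contrastive estimator" the tweets refer to is typically an NT-Xent (normalized temperature-scaled cross-entropy) objective that maximizes agreement between two augmented views of the same molecule. Below is a minimal dependency-free sketch; the toy embeddings and temperature value are illustrative assumptions, and MolCLR's real encoder is a GNN producing high-dimensional representations.

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def nt_xent(embeddings, temperature=0.5):
    """NT-Xent loss. `embeddings` holds 2N vectors where (2i, 2i+1)
    are the two augmented views of molecule i. For each anchor, the
    paired view is the positive; all other vectors are negatives."""
    n = len(embeddings)
    losses = []
    for i in range(n):
        pos = i + 1 if i % 2 == 0 else i - 1  # index of the paired view
        denom = sum(
            math.exp(cosine(embeddings[i], embeddings[j]) / temperature)
            for j in range(n) if j != i
        )
        pos_sim = math.exp(cosine(embeddings[i], embeddings[pos]) / temperature)
        losses.append(-math.log(pos_sim / denom))
    return sum(losses) / n

# Views of the same molecule aligned -> low loss; mismatched -> high loss
aligned = [[1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0]]
mismatched = [[1.0, 0.0], [0.0, 1.0], [0.0, 1.0], [1.0, 0.0]]
print(nt_xent(aligned), nt_xent(mismatched))
```

Minimizing this loss pulls the two views of each molecule together in embedding space while pushing views of different molecules apart, which is what makes the pretrained representations transferable to property-prediction tasks with little labeled data.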

Related Entries

chainer.training.extensions.Evaluator — Chainer 7.1.0 documentation
0 users, 1 mentions 2020/02/18 06:51
[2002.11296] Sparse Sinkhorn Attention
0 users, 2 mentions 2020/04/08 00:52
[1807.08654] From Bare Metal to Virtual: Lessons Learned when a Supercomputing Institute Deploys its...
0 users, 1 mentions 2020/12/15 09:51
[2012.12124] Non-autoregressive electron flow generation for reaction prediction
0 users, 2 mentions 2021/01/03 15:51
[2010.03409] Learning Mesh-Based Simulation with Graph Networks
0 users, 4 mentions 2021/01/15 00:53