[2009.03393] Generative Language Modeling for Automated Theorem Proving

We explore the application of transformer-based language models to automated theorem proving. This work is motivated by the possibility that a major limitation of automated theorem provers compared to humans -- the generation of original mathematical terms -- might be addressable via generation from language models. We present an automated prover and proof assistant, GPT-f, for the Metamath formalization language, and analyze its performance. GPT-f found new short proofs that were accepted into the main Metamath library, which is, to our knowledge, the first time a deep-learning-based system has contributed proofs that were adopted by a formal mathematics community.

29 mentions: @hillbig@jaguring1@arankomatsuzaki@icoxfog417@ak92501@spolu@NPCollapse@evolvingstuff
Date: 2020/09/09 02:21

Referring Tweets

@jaguring1 OpenAI's AI research (GPT-f): automated theorem proving with a language model. GPT-f discovered new short proofs that were accepted into the Metamath library -- probably the first time deep learning has contributed proofs adopted by a formal mathematics community. Transformers are well suited to formal reasoning too. t.co/z2PMMgyoLJ
@hillbig GPT-f performs automated theorem proving with a Transformer-based language model. It is pretrained on web-collected math-related papers and similar text, then searches using the generation probability of proof steps conditioned on the proof goal. Of the proofs it found, 23 were shorter and clearer than the existing ones and were added to the Metamath proof library. t.co/RiMM5QGQQB
@icoxfog417 Research on automated theorem proving with a language model. To prove A=C, you take steps such as A=B and then B=C; the language model predicts these steps (a proof process can be viewed as a sequence). In the end, the route with the highest proof probability is adopted. They also confirm that pretraining gives a boost. t.co/UonZqGeA2p
@spolu Posted my first paper on arXiv💥🙌 GPT-f is a Transformer-based automated theorem prover. We show that Transformer + Search is suitable to formal reasoning and continuous self-improvement 🦾 t.co/VllDcCV3Kc t.co/5ttVX0MNBC
@arankomatsuzaki Generative Language Modeling for Automated Theorem Proving GPT-f, a Transformer LM pretrained on arXiv and trained through continuous self improvement for theorem proving, finds novel short proofs. t.co/NKIY7WgVRm t.co/iYzqnHAddJ
@NPCollapse Compute is eating AI. @OpenAI trains a state of the art theorem prover using a GPT architecture. I was waiting for this to happen, neat! Though for real...wtf is this proof? Maybe the gains were just that you need an AI to understand this language haha! t.co/VXV4gWZcNK t.co/cNpqNiZmcc
@mattshumer_ The transformer architecture is going to become way more common in the near future. Its flexibility is incredible. Paper on GPT models trained to prove theorems: t.co/ruKJyMEMHm
@evolvingstuff Generative Language Modeling for Automated Theorem Proving "... to our knowledge, the first time a deep-learning based system has contributed proofs that were adopted by a formal mathematics community." t.co/8SwnivR8wz t.co/yi67uEv2gh
@ak92501 Generative Language Modeling for Automated Theorem Proving pdf: t.co/gmD9HK1R80 abs: t.co/qstcD7UYyZ t.co/nKKECgFhpT
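The tweets above describe the core loop: a language model, conditioned on the current proof goal, proposes candidate next steps, and a search adopts the route with the highest cumulative generation probability. A minimal sketch of such best-first search follows; the `score_steps` table is a toy stand-in for the model (in GPT-f, candidates are sampled from a Transformer and scored by their sequence log-probability, not enumerated from a table), and all names here are hypothetical:

```python
import heapq
import math


def score_steps(goal):
    """Hypothetical stand-in for an LM: returns (next_goal, log_prob)
    candidates for a goal. A real system would sample these from the
    model conditioned on the goal."""
    table = {
        "A = C": [("A = B", math.log(0.6)), ("A = D", math.log(0.3))],
        "A = B": [("QED", math.log(0.9))],
        "A = D": [("A = E", math.log(0.5))],
        "A = E": [],
    }
    return table.get(goal, [])


def best_first_proof_search(root, max_expansions=100):
    """Expand goals in order of cumulative log-probability (best first)."""
    # heapq is a min-heap, so store negated log-probabilities.
    frontier = [(0.0, root, [root])]
    for _ in range(max_expansions):
        if not frontier:
            return None
        neg_logp, goal, path = heapq.heappop(frontier)
        if goal == "QED":
            return path  # most probable proof route found
        for next_goal, logp in score_steps(goal):
            heapq.heappush(
                frontier, (neg_logp - logp, next_goal, path + [next_goal])
            )
    return None


print(best_first_proof_search("A = C"))
```

With the toy scores above, the search prefers the A = B branch (probability 0.6 x 0.9) over the dead-end A = D branch and returns the route through A = B.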


Related Entries

[1904.00962] Reducing BERT Pre-Training Time from 3 Days to 76 Minutes
0 users, 12 mentions 2019/04/08 11:15
[1911.05722] Momentum Contrast for Unsupervised Visual Representation Learning
0 users, 11 mentions 2019/11/18 23:21
[2001.07685] FixMatch: Simplifying Semi-Supervised Learning with Consistency and Confidence...
0 users, 11 mentions 2020/01/22 20:20
The advantages of the Matthews correlation coefficient (MCC) over F1 score and accuracy in binary cl...
0 users, 1 mentions 2020/04/27 20:21
Visual Grounding in Video for Unsupervised Word Translation | DeepMind
0 users, 2 mentions 2020/06/19 15:51