Paper Summary: Neural Generation of Regular Expressions from Natural Language with Minimal Domain Knowledge
The paper tackles the task of translating natural language queries into regular expressions without using domain-specific knowledge.
It proposes a methodology for collecting a large corpus of regular-expression/natural-language pairs.
It reports a performance gain of 19.6% over state-of-the-art models.
Model: an LSTM-based sequence-to-sequence neural network with attention
One word-embedding layer
Two encoder layers
Two decoder layers
One dense output layer
Attention over the encoder states
Dropout with probability 0.25
Training: 20 epochs, minibatch size of 32, learning rate of 1.0 with a decay rate of 0.5
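The attention step above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: dot-product scoring is an assumption here (the paper's exact scoring function may differ), and the encoder/decoder states are random toy data.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(decoder_state, encoder_states):
    """Dot-product attention (illustrative): weight each encoder state
    by its similarity to the current decoder state, then return the
    weighted sum (context vector) and the attention weights."""
    scores = encoder_states @ decoder_state   # (T,) one score per time step
    weights = softmax(scores)                 # attention distribution, sums to 1
    context = weights @ encoder_states        # (d,) weighted sum of encoder states
    return context, weights

# Toy usage: 4 encoder time steps, hidden size 3
rng = np.random.default_rng(0)
enc = rng.standard_normal((4, 3))
dec = rng.standard_normal(3)
ctx, w = attend(dec, enc)
```

The context vector is what the decoder consumes at each step, letting it focus on different parts of the input description while emitting the regular expression token by token.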
Created a public dataset with 10K pairs of (regular expression, natural language description)
Two-step generate-and-paraphrase approach
Generate: use a handcrafted grammar to translate regular expressions into rigid natural-language descriptions.
Paraphrase: crowdsource the task of rewriting the rigid descriptions into more natural language.
Evaluation: a functional equality check (called DFA-Equal), since the same regular expression can be written in many ways.
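The generate step above can be sketched as a tiny handcrafted grammar. The templates, the `spec` format, and the `synth` helper below are illustrative inventions, not the paper's actual grammar rules:

```python
# Toy grammar fragments mapping regex pieces to stilted English.
# These templates are hypothetical examples, not the paper's rules.
TEMPLATES = {
    "[A-Z]": "a capital letter",
    "[0-9]": "a number",
    "[aeiou]": "a vowel",
}

def synth(spec):
    """spec: list of (operation, token) pairs, e.g. [('contain', '[0-9]')].
    Returns a rigid description that crowd workers would later paraphrase."""
    phrases = []
    for op, tok in spec:
        desc = TEMPLATES.get(tok, repr(tok))
        if op == "contain":
            phrases.append(f"lines containing {desc}")
        elif op == "start":
            phrases.append(f"lines starting with {desc}")
    return " and ".join(phrases)
```

For example, `synth([("start", "[A-Z]"), ("contain", "[0-9]")])` yields a rigid description that a crowd worker might paraphrase into "lines that begin with an uppercase letter and have a digit somewhere".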
The proposed architecture outperforms both baselines: a nearest-neighbor classifier using bag-of-words features (BoW-NN) and Semantic-Unify.
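The DFA-Equal metric mentioned above compares two regexes by the language they accept rather than by their surface strings. A true DFA-Equal check converts both regexes to minimal DFAs and compares them; the brute-force approximation below (a sketch, not the paper's method) simply compares full-match behaviour on every string up to a bounded length over a small alphabet:

```python
import re
from itertools import product

def approx_equal(r1, r2, alphabet="ab", max_len=6):
    """Rough approximation of DFA-Equal: two regexes are treated as
    equivalent if they accept exactly the same strings up to max_len
    over the given alphabet. Bounded, so it can miss differences that
    only appear on longer strings."""
    p1, p2 = re.compile(r1), re.compile(r2)
    for n in range(max_len + 1):
        for chars in product(alphabet, repeat=n):
            s = "".join(chars)
            if bool(p1.fullmatch(s)) != bool(p2.fullmatch(s)):
                return False
    return True
```

This captures why string equality is too strict: `a+` and `aa*` are written differently but accept the same language, while `a+` and `a*` differ (only the latter accepts the empty string).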