Emad Elwany, CTO at Lexion, on using deep learning to reimagine contract management systems
Deep learning techniques, the role of transformers, and other secrets of the AI-powered contract management startup
We've done interviews with ML practitioners from VC funds and ML startups; today we'd like to offer you a perspective from an implementation standpoint. Emad Elwany of the AI-powered contract management platform Lexion shared exactly how they use machine learning in their business. Share this interview if you find it insightful. No subscription is needed.
You can also leave a comment or ask a question in the comment section below.
Quick bio / Emad Elwany
Tell us a bit about yourself: your background, your current role, and how you got started in machine learning.
Emad Elwany (EE): I'm the CTO and co-founder of Lexion, which I started after working at Microsoft for 8 years. Initially, I was in Search and Paid Search, but then transitioned to Microsoft Research, working on NLP-powered enterprise AI applications. I was the founding engineer for Microsoft's AI-powered calendar scheduling service, and a member of the founding team and a development lead for Microsoft's Conversational AI platform: Azure Bot Service, and the Microsoft Language Understanding Service (LUIS). For my undergrad, I studied Computer Engineering at Alexandria University, Egypt. Then I went to Stanford for grad school to study computer science with an emphasis on ML and Systems Engineering.
ML Work
Lexion is leveraging AI to modernize a legacy field: contract management systems. Can you share which specific deep learning techniques you are applying to this domain?
EE: We leverage transfer learning, contextualized embeddings (e.g., ELMo, BERT), deep neural networks, and weakly supervised learning to build the engine that powers our Lexion Contract Management System. Transfer learning has allowed us to tap into major advancements in NLP by reusing large language models and only fine-tuning them on our domain instead of training them from scratch. Contextualized embeddings have allowed us to build NLP models with a deep semantic understanding of legal agreements. Deep neural networks made it possible to build high-capacity models that are able to capture and distill knowledge from the legal domain. Weak supervision has enabled us to create massive, high-quality training sets for hundreds of different concepts with a small team.
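Weak supervision, in particular, lends itself to a compact illustration. Here is a minimal sketch, assuming hypothetical labeling functions for a termination-clause classifier (not Lexion's actual pipeline): several noisy heuristics vote on each sentence, and the majority vote becomes a training label without any manual annotation.

```python
from collections import Counter

# Hypothetical labeling functions: each is a noisy heuristic that votes
# POSITIVE (1), NEGATIVE (0), or ABSTAIN (None) on whether a sentence
# is a termination clause.
def lf_keyword(sentence):
    return 1 if "terminate" in sentence.lower() else None

def lf_heading(sentence):
    return 1 if sentence.strip().lower().startswith("termination") else None

def lf_too_short(sentence):
    return 0 if len(sentence.split()) < 4 else None

LABELING_FUNCTIONS = [lf_keyword, lf_heading, lf_too_short]

def weak_label(sentence):
    """Majority vote over the non-abstaining labeling functions."""
    votes = [lf(sentence) for lf in LABELING_FUNCTIONS]
    votes = [v for v in votes if v is not None]
    if not votes:
        return None  # no signal: leave unlabeled
    return Counter(votes).most_common(1)[0][0]

sentences = [
    "Either party may terminate this Agreement upon 30 days notice.",
    "Termination. This Section survives expiration.",
    "Exhibit A",
]
print([weak_label(s) for s in sentences])  # -> [1, 1, 0]
```

Production systems (e.g., Snorkel-style frameworks) replace the majority vote with a learned model of each heuristic's accuracy, but the principle is the same: cheap, imperfect rules compound into a large training set.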
In recent years, transformer architectures such as BERT or GPT-3 have revolutionized natural language understanding (NLU) problems such as text summarization, question-answering, reading comprehension, and many others. What is the relevance of transformers to the field of contract management systems?
EE: They are extremely relevant. Transformers have allowed for a step-function improvement in both the precision and recall of NLU models that extract structured information from contracts. Using an out-of-the-box model like BERT with a bit of fine-tuning on high-quality training data gave our models a boost that was hard to replicate without building large data sets and doing a lot of feature engineering. We published a paper at NeurIPS 2019 that describes some of our early work (which has evolved quite a bit since): https://arxiv.org/abs/1911.00473
Lexion works across many industries and sectors. Are you able to train models that reuse knowledge across different domains using techniques such as transfer learning? Or are we still far away from that promise?
EE: Absolutely we can. Some parts of our ML pipeline can be reused across domains, e.g., a fine-tuned OCR engine or a document segmentation model.
As a philosophy, we try to share as much as possible across the pipeline and only deviate at the latest stages (the final task). This allows our pipeline to reuse knowledge from all relevant tasks; it simplifies our operations and reduces training costs. That said, there are always sufficiently unique models for which we need to create separate pipelines (e.g., table parsing), but we're always on the lookout for ways to reuse that knowledge, even in the simplest forms, like using statistics on high-value words.
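The share-early, deviate-late philosophy can be sketched as follows; the stage and task names here are hypothetical stand-ins, not Lexion's real components. The expensive shared stages run once per document, and only the cheap final step differs per task.

```python
# Sketch of a pipeline that shares early stages across tasks and only
# deviates at the final, task-specific step (illustrative names only).
def shared_preprocess(raw_text):
    """Stages reused by every task: cleanup, tokenization, etc."""
    return raw_text.lower().split()

def detect_governing_law(tokens):   # task-specific final stage
    return "governing law" in " ".join(tokens)

def detect_auto_renewal(tokens):    # another task-specific final stage
    return "automatically renew" in " ".join(tokens)

TASKS = {
    "governing_law": detect_governing_law,
    "auto_renewal": detect_auto_renewal,
}

def run_pipeline(raw_text):
    tokens = shared_preprocess(raw_text)  # computed once, reused by all tasks
    return {name: task(tokens) for name, task in TASKS.items()}

result = run_pipeline("This Agreement shall automatically renew each year.")
print(result)  # -> {'governing_law': False, 'auto_renewal': True}
```

In a real system the shared stage would be OCR, segmentation, and a fine-tuned encoder rather than `str.split`, but the structure, one shared trunk feeding many small heads, is the point.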
In addition, we try to combine knowledge not only from different models but even from different types of models; e.g., most of our NLU models are actually a combination of an NLU task and a computer vision task, where we combine both the visual cues and the text to get enhanced accuracy. You can learn more about our work in that area from a recent paper we published at KDD: https://arxiv.org/abs/2107.08128
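As a toy illustration of fusing the two modalities (the features below are invented for the example, not taken from the paper), each OCR token can contribute both textual and layout cues to a single feature vector:

```python
# Toy fusion of text and visual (layout) signals: each token carries its
# text plus an OCR bounding box; the fused feature vector concatenates
# cues from both modalities (all features here are hypothetical).
def text_features(token):
    word = token["text"]
    return [
        float(word.isupper()),               # all-caps, common in headings
        float(word.rstrip(".").isdigit()),   # numeric token
    ]

def visual_features(token, page_height):
    x0, y0, x1, y1 = token["bbox"]
    return [
        y0 / page_height,  # vertical position on the page
        y1 - y0,           # text height, a rough proxy for font size
    ]

def fused_features(token, page_height=1000):
    return text_features(token) + visual_features(token, page_height)

token = {"text": "TERMINATION", "bbox": (50, 100, 300, 130)}
print(fused_features(token))  # -> [1.0, 0.0, 0.1, 30]
```

A downstream classifier over such fused vectors can learn interactions a text-only model misses, e.g., that a capitalized word near the top of a page in a large font is likely a section heading.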
Most advancements in NLU are still constrained to high-resource languages such as English, Spanish or French. How do you think about the challenge of making NLU models that work across many languages?
EE: We are currently focused on processing documents only in English. However, we see a path to expanding to other languages. Recent advancements in machine translation are one path to explore. In addition, our investment in NLU tooling for weak supervision, model training, and deployment would also make it viable for us to build training sets and develop models targeted at other languages at a low cost.
What milestones should research reach in the next few years to have an even more profound impact on contract management systems?
EE: There is a set of problems that continue to be challenging and are of high value for contract management systems; these include OCR for mixed digital/handwritten text, parsing structured information out of deeply nested and complicated tables, and linking related documents and correctly merging the extracted information across a hierarchy of documents. In addition, building higher-capacity models with deeper representations would allow for an even deeper understanding of what's in contracts and for deciphering nuanced legal language, which could surpass the performance of a human attorney.
Let's connect
Follow us on Twitter. We share lots of useful information, events, educational materials, and informative threads that help you strengthen your knowledge.
Miscellaneous - a set of rapid-fire questions
Favorite math paradox?
EE: The Monty Hall problem. I like this problem for multiple reasons. It's an excellent problem for demonstrating that some seemingly trivial problems are actually more nuanced and require deeper thought and the correct mathematical tools (in this case, probability theory) to derive the right solution. I'm also particularly inspired by the Monty Hall problem because its history demonstrates flaws in the way people make judgments, as seen in the infamous Marilyn vos Savant incident, where thousands of readers, some with advanced training in mathematics, challenged her correct solution for an extended period before finally accepting it.
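The switching advantage is easy to verify empirically; a quick Monte Carlo sketch:

```python
import random

def monty_hall(switch, trials=100_000, seed=42):
    """Simulate the Monty Hall game; return the contestant's win rate."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)   # door hiding the car
        pick = rng.randrange(3)  # contestant's first pick
        # The host opens a door that is neither the pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Switch to the remaining unopened door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(monty_hall(switch=False))  # close to 1/3
print(monty_hall(switch=True))   # close to 2/3
```

The intuition falls out of the code: switching only loses when the first pick was the car (probability 1/3), so it wins 2/3 of the time.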
Many problems in NLU have the same characteristics: they seem simple at face value, but once you peel back some layers, you face many challenges that require deeper analysis and more complicated solutions. Even something as simple as extracting the effective date of a contract, something people might expect to be solvable with a regex, turns out to be much more complicated in practice.
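A toy version of that point: a naive regex handles the textbook phrasing but misses common real-world variants (the pattern below is purely illustrative).

```python
import re

# A naive pattern for "effective as of <Month DD, YYYY>".
PATTERN = re.compile(r"effective as of (\w+ \d{1,2}, \d{4})", re.IGNORECASE)

def extract_effective_date(text):
    m = PATTERN.search(text)
    return m.group(1) if m else None

# The easy case works:
print(extract_effective_date(
    "This Agreement is effective as of January 1, 2021."))
# -> 'January 1, 2021'

# But common real-world phrasings slip through:
print(extract_effective_date(
    "This Agreement shall take effect on the 1st day of January, 2021."))
# -> None
print(extract_effective_date(
    "Effective upon the later of signature by both parties."))
# -> None (the effective date isn't even a literal date in the text)
```

Covering ordinal phrasings, dates defined relative to signatures, and amendments that restate the date quickly turns a one-line regex into a genuine NLU problem.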
Is the Turing Test still relevant? Any clever alternatives?
EE: It definitely has its uses, but it is not the most helpful tool for people working in applied ML. With the current state of technology, ML models can certainly solve some problems or some parts of problems with near human parity. That said, these tools remain at best assistive tools to help human experts solve a more complex problem. While I believe it's important to track how far ML models can go compared to humans, it's not the most critical question when working on ML-powered enterprise applications. Instead, the more important question is: does ML provide sufficient value (e.g., time savings, accuracy enhancement, etc.) when employed by a human? Ultimately, business users don't care much if the AI is as sophisticated as a human (it isn't today), they just want to get their jobs done efficiently and correctly.
Any book you would recommend to aspiring data scientists?
EE: There are some obvious choices that are fun and go deep into NLP theory, like Jacob Eisenstein's Introduction to NLP.
But I would also like to suggest books that are more about general machine learning in a practical and pragmatic way, like Deep Learning for Coders by Jeremy Howard and Sylvain Gugger.
And I recommend that engineers who want to get into ML not only focus on ML/NLP but also study the critical peripheral components around ML, such as data processing at scale and working with data in general. Designing Data-Intensive Applications by Martin Kleppmann and Computing with Data by Guy Lebanon and Mohamed El Geish are excellent in that regard.
Does P equal NP?
EE: If I knew the answer to that question, Lexion would probably already be a multi-billion dollar company (we're on track to get there, but it'll take a little more time). Since computational theory isn't really my field, I'm not qualified to give the most intelligent answer. Still, as a computer scientist and technology enthusiast, I'm very interested in this problem and continually monitor recent advancements. I certainly hope we find the answer during my lifetime. We have no shortage of brilliant folks who can get us there. Some of these brilliant folks work for Lexion, solving other similarly interesting/challenging problems.