Together Duke: Advancing Excellence Through Community
Natural Language Processing Winter School
January 5-7, 2020
Together Duke is pleased to announce the Natural Language Processing Winter School (NLP-WS), a three-day class offered in January 2020 that provides lectures on the fundamentals of machine learning and natural language processing.
Summary
Machine learning is a field characterized by the development of algorithms that are implemented in software and run on a machine (e.g., a computer or mobile device). Each such algorithm is defined by a set of parameters, and particular parameter settings determine how the algorithm behaves. The algorithms have the capacity to learn from observed data; by “learn,” we mean that the algorithm can rigorously quantify which parameter settings are best matched to the data of interest.
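To make the notion of “learning” concrete, the brief sketch below (purely illustrative, not course material; the data and the true parameter values are made up) fits the two parameters of a straight line to noisy observed data by gradient descent, a simple instance of matching parameter settings to data.

```python
import numpy as np

# Hypothetical observed data: a noisy straight line generated with w=3.0, b=0.5.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=200)
y = 3.0 * x + 0.5 + 0.1 * rng.standard_normal(200)

w, b = 0.0, 0.0   # the algorithm's parameters, initially unmatched to the data
lr = 0.1          # learning rate (step size)
for _ in range(500):
    err = (w * x + b) - y             # prediction error on the observed data
    w -= lr * 2.0 * np.mean(err * x)  # gradient of the mean squared error w.r.t. w
    b -= lr * 2.0 * np.mean(err)      # gradient of the mean squared error w.r.t. b

print(f"learned parameters: w={w:.2f}, b={b:.2f}")  # close to the values that generated the data
```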
Recently, with increasing access to massive datasets and major advances in computing resources, the performance of machine learning has improved markedly. Further, over the last five years, significant advances have been made in a subfield of machine learning called “deep learning.”
The Natural Language Processing (NLP) Winter School (WS) will place a focus on an area of machine learning that is impacting many areas of life: the capacity of machines to “read,” analyze, and synthesize natural text. The NLP-WS will introduce participants to the deep-learning technology that, within the last several years, has revolutionized the capacity of machines to perform language translation, to answer questions about a given text, and to generate (synthesize) text of near-human quality.
Professors Lawrence Carin, David Carlson, and Ricardo Henao will co-lead the NLP-WS.
Who Should Attend
The NLP-WS is targeted at individuals interested in learning about machine learning, with a specific focus on an area in which it is expected to have a major impact on people’s lives: natural language processing. The NLP-WS will introduce the mathematics and statistics at the foundation of applying deep learning to state-of-the-art NLP. Additionally, the NLP-WS will present detailed case studies demonstrating how this technology is used in practice.
The NLP-WS is meant to provide value to students at multiple levels of mathematical sophistication, including those with limited background in the area. On each day of the NLP-WS, the initial emphasis will be on presenting the concepts as intuitively as possible, with minimal math. As the concepts are developed further, more math will be introduced, but only the minimum necessary to explain the concepts. Finally, case studies will show how the technology is used in practice, and these discussions should be accessible to most students (concepts are emphasized over detailed math). Strength in mathematics and statistics is a significant plus and will make all NLP-WS material accessible; however, it is not required to benefit from much of the program. The class will also introduce participants to the coding software used to make such technology work in practice.
Program Format
During each day of the NLP-WS, the morning sessions will be devoted to introducing the fundamentals of that day’s area of focus. The first afternoon session will be devoted to case studies of the methods applied to specific application areas. Finally, in the last session of each day, software implementation will be discussed, based on modern TensorFlow software.
Each day of the NLP-WS will be arranged as follows:
- 9:00-10:15am Lecture 1: Mathematically light introduction to the focus of the day
- 10:45am-noon Lecture 2: Mathematically rigorous discussion of the focus of the day
- 1:30-2:30pm Case Study: Example of the machine learning concept in practice
- 3:00-4:15pm Discussion of how the concept of the day is implemented in software
Curriculum
The broad areas of emphasis for the three-day class are as follows:
January 5:
Basic concepts in machine learning
Introduction to model building
Scaling to “big data” with stochastic gradient descent
Backpropagation as an efficient computation method (a brief code sketch follows this list)
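As a rough illustration of the January 5 themes above, the hedged sketch below (assuming TensorFlow 2.x, in line with the TensorFlow-based software sessions; the synthetic data and simple model are hypothetical) trains a model by minibatch stochastic gradient descent, with the gradients obtained by backpropagation via tf.GradientTape.

```python
import numpy as np
import tensorflow as tf

# Synthetic stand-in for a large dataset: 10,000 noisy linear observations.
rng = np.random.default_rng(0)
X = rng.standard_normal((10_000, 5)).astype("float32")
true_w = np.array([[1.0], [-2.0], [0.5], [3.0], [0.0]], dtype="float32")
y = X @ true_w + 0.1 * rng.standard_normal((10_000, 1)).astype("float32")

model = tf.keras.Sequential([tf.keras.layers.Dense(1)])          # a simple linear model
optimizer = tf.keras.optimizers.SGD(learning_rate=0.05)

dataset = tf.data.Dataset.from_tensor_slices((X, y)).shuffle(10_000).batch(64)
for epoch in range(3):
    for xb, yb in dataset:                                        # one small minibatch at a time
        with tf.GradientTape() as tape:
            loss = tf.reduce_mean((model(xb) - yb) ** 2)          # mean squared error on the minibatch
        grads = tape.gradient(loss, model.trainable_variables)    # gradients via backpropagation
        optimizer.apply_gradients(zip(grads, model.trainable_variables))
    print(f"epoch {epoch}: last minibatch loss {float(loss):.4f}")
```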
January 6:
Neural networks applied to natural language processing
Word embeddings
Recurrent neural networks (long short-term memory, or LSTM, models)
Sentiment analysis (illustrated in the sketch below)
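As a rough illustration of the January 6 themes above, the hedged sketch below (again assuming TensorFlow/Keras; the vocabulary size, dimensions, and dataset named in the comments are illustrative choices, not course specifications) wires word embeddings into an LSTM for binary sentiment classification.

```python
import tensorflow as tf

vocab_size = 10_000   # illustrative vocabulary of the most frequent words
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 64),       # word embeddings: word index -> 64-dim vector
    tf.keras.layers.LSTM(64),                         # recurrent (LSTM) encoding of the word sequence
    tf.keras.layers.Dense(1, activation="sigmoid"),   # probability that the sentiment is positive
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Training would use sentences encoded as integer word indices with 0/1 sentiment
# labels, e.g. the IMDB review data bundled with Keras (tf.keras.datasets.imdb).
```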
January 7:
Attention-based models in neural network analysis and synthesis of text
The Transformer neural network
Application of the Transformer network for language translation and language synthesis (see the sketch below)
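As a small illustration of the January 7 themes above, the hedged sketch below (assuming TensorFlow 2.x; all shapes and dimensions are arbitrary) applies the built-in multi-head self-attention layer at the core of the Transformer, in which every position in a sequence attends to every other position.

```python
import tensorflow as tf

batch, seq_len, d_model = 2, 8, 32
tokens = tf.random.normal((batch, seq_len, d_model))   # stand-in for embedded input tokens

attention = tf.keras.layers.MultiHeadAttention(num_heads=4, key_dim=8)
# Self-attention: queries, keys, and values all come from the same sequence,
# so every position can attend to every other position.
context, weights = attention(query=tokens, value=tokens, key=tokens,
                             return_attention_scores=True)

print(context.shape)   # (2, 8, 32): a contextualized representation of each token
print(weights.shape)   # (2, 4, 8, 8): one attention map per head
# A full Transformer for translation or text synthesis stacks such attention
# layers with position-wise feed-forward layers, as covered on January 7.
```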
Program Details: Location, Registration and Cost
The NLP-WS will be held in the HCA Auditorium, located in Breeden Hall at the Fuqua School of Business on Duke’s West Campus. Visitor parking is available in the nearby Bryan Center Parking Garage.
Students (with a valid ID from Duke or another university) will pay a course fee of $100; the fee for non-students is $250, payable through the registration site. All fees are non-refundable.
The NLP-WS can accommodate up to 200 participants. We will maintain a waitlist beyond the maximum registration and will contact those on the waitlist as spots become available.
Lecturers
Lecturers in the NLP-WS include Professors Lawrence Carin, David Carlson, and Ricardo Henao.
Registration for the January 2020 Natural Language Processing Winter School closes on January 3, 2020 at 11:59pm. For help or for more information, contact Carolyn Mackman at carolyn.mackman@duke.edu.