Laura Lyman
Instructor of Mathematics, Statistics, and Computer Science (MSCS)
Macalester College
Laura Lyman is an instructor of mathematics, statistics, and computer science at Macalester College, a top-tier liberal arts school located in Saint Paul, Minnesota. She is an applied mathematician who researches how uncertainty propagates through models in science and engineering using a class of tools called spectral methods.
Prior to her current role, Laura did her PhD work at Stanford University in the Institute for Computational and Mathematical Engineering (ICME). She is a recipient of the Stanford Centennial Teaching Award and the ICME Instructor Award for 2021-2022.
Workshop: Introduction to Linear Regression
December 7, 2022; 8:00-9:00 am PST
Linear regression is a fundamental tool in statistics and data science for modeling the relationship between variables. By fitting a predictive model between a response variable and a collection of explanatory variables based on an observed data set, it can be used for prediction, forecasting, and error reduction. Through linear regression analysis, we can quantify the strength of the linear relationship between the response and the different explanatory variables, and we can identify explanatory variables that may contain redundant information.
This workshop introduces the basics of simple and multiple linear regression. We will present both mathematical theory and applications in the context of real data sets, ranging from survey results collected by the US National Center for Health Statistics (the NHANES data set) to real estate listings in Sacramento, CA. After the talk, the R code used will be provided, so attendees can revisit examples of how to apply this foundational modeling method.
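As a small preview of the method (a minimal Python/NumPy sketch on synthetic data, not the workshop's own R code or the NHANES and Sacramento examples), fitting a multiple linear regression by least squares can look like this:

    import numpy as np

    # Synthetic data standing in for an observed data set: a response y that
    # depends linearly on two explanatory variables x1 and x2, plus noise.
    rng = np.random.default_rng(0)
    n = 200
    x1 = rng.normal(size=n)
    x2 = rng.normal(size=n)
    y = 1.5 + 2.0 * x1 - 0.5 * x2 + rng.normal(scale=0.3, size=n)

    # Least-squares fit of y ~ 1 + x1 + x2 via the design matrix X.
    X = np.column_stack([np.ones(n), x1, x2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("estimated coefficients:", beta)   # close to the true [1.5, 2.0, -0.5]

    # R^2 measures the strength of the fitted linear relationship.
    residuals = y - X @ beta
    print("R^2:", 1 - residuals.var() / y.var())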
Workshop: What would we do without Linear Algebra, part III: Singular Value Decomposition and Principal Component Analysis
September 29, 2021; 8:00-9:15 am PST
In this third workshop on linear algebra, we will investigate the link between Principal Component Analysis (PCA) and the Singular Value Decomposition (SVD).
Along the way, we will introduce several linear algebra concepts, including linear regression, eigenvalues and eigenvectors, and the conditioning of a system. We will use shared Python scripts and several examples to demonstrate the ideas discussed.
This workshop builds on the previous two workshops in linear algebra (Part I and Part II), and we will assume that the linear algebra concepts introduced in those workshops are familiar to the audience. These include: vector algebra (including inner products and the angle between vectors), matrix-vector multiplication, matrix-matrix multiplication, matrix-vector solves, singularity, and singular values.
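As a rough illustration of the PCA/SVD link (a minimal NumPy sketch on made-up data, not the shared scripts from the workshop): the principal components of a centered data matrix are its right singular vectors, and the explained variances come from the squared singular values.

    import numpy as np

    # Synthetic 2-D data with correlated columns, for illustration only.
    rng = np.random.default_rng(0)
    n = 500
    data = rng.normal(size=(n, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])

    # Center the data, then take its SVD.
    centered = data - data.mean(axis=0)
    U, s, Vt = np.linalg.svd(centered, full_matrices=False)

    # Rows of Vt are the principal directions; s**2 / (n - 1) are the
    # eigenvalues of the sample covariance matrix (the explained variances).
    print("principal directions:\n", Vt)
    print("explained variances:", s**2 / (n - 1))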
Workshop: What would we do without Linear Algebra, part II: Diving Deeper into the Singular Value Decomposition
May 26, 2021; 8:00-9:15 am PST
Prerequisite: We will assume that you are familiar with the vector and matrix algebra discussed in Part I (see below).
This is the second of our workshops devoted to linear algebra, which forms the foundation of many algorithms in data science. In Part I of the series we introduced vector and matrix algebra and briefly looked at the intriguing and ever-so-useful Singular Value Decomposition (SVD). In this workshop, we will take a deeper dive into the SVD. We will explain how it is derived, how it can be computed, and how it is used.
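As one small example of how the SVD is used (a minimal NumPy sketch, not part of the workshop materials): truncating the SVD after the k largest singular values gives the best rank-k approximation of a matrix.

    import numpy as np

    # Compute the SVD of a small random matrix: A = U @ diag(s) @ Vt.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(8, 5))
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    assert np.allclose(A, U @ np.diag(s) @ Vt)   # the factors reconstruct A

    # Keep only the k leading singular triplets: the best rank-k approximation.
    k = 2
    A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

    # The spectral-norm error of the truncation equals the first discarded
    # singular value, s[k].
    print(np.linalg.norm(A - A_k, ord=2), s[k])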