Project MOSAIC Kick-Off Workshop

Project MOSAIC’s first event was a Kick-Off Workshop, sponsored by the Institute for Mathematics and its Applications. The workshop, held Tuesday through Friday, June 29 to July 2, 2010, provided an opportunity to present some curricular innovations that have already been developed, to discuss the overall organization of a curriculum that would be effective at unifying the MSCC subjects, and to establish strategic plans for Project MOSAIC.

Wednesday June 30

Fadil Santosa : Introduction to the IMA

Daniel Kaplan: Introduction to MOSAIC

MOSAIC is intended to improve university-level mathematics education for science, technology, and engineering students by highlighting the connections among modeling, statistics, computation, and calculus. I review the motivation for MOSAIC, stemming largely from the MAA CRAFTY reports of the last decade and the famous Bio2010 report, analyze why the impact of these reports has been slight, and outline the structure of the MOSAIC project.

Links to slides and video

The workshop participants introduce themselves: video


Project MOSAIC initiatives come in all sizes, from individual examples, demonstrations, and assigned activities, to revisions of entire courses, to the integration of MOSAIC ideas throughout multiple courses in a curriculum. In these sessions, Eric Marland (Appalachian State), Chris Golé (Smith College), Jeff Knisley (East Tennessee), and Saber Elaydi (Trinity University) will discuss initiatives on their campuses at the larger end of the MOSAIC spectrum.

Eric Marland

Links to slides and video

Chris Golé

Links to slides and video

Jeff Knisley

At East Tennessee State University, the mathematics department works closely with the Department of Biological Sciences, the College of Public Health, and other life-sciences departments, which has resulted in several curricular experiments and modifications. For example, we have designed and taught a 3-semester introductory lab science sequence (supported by a Howard Hughes Medical Institute grant) that integrates biology, statistics, calculus, computation, and topics from other mathematics courses like differential equations, graph theory, and linear algebra. To date, this lab science sequence has been taught to relatively small cohorts of students, but beginning in Fall 2011, it will be required for all biology and pre-med majors. This will impact approximately 1000 students a year.

In the mathematics department itself, we have also transformed several courses and modified our major. We now have a “quantitative sciences” version of the mathematics major – with an emphasis on quantitative biology and featuring several different types of modeling courses – statistical modeling, discrete models, and predictive modeling, among others.

All this began approximately 15 years ago with the requirement that every student at ETSU must take either an introductory statistics course or a sciences/majors calculus course. About 10 years ago, the general education introductory statistics course, with funding from the National Science Foundation, was moved into a computer lab known affectionately as the “Stat Cave.” We now have approximately 2500 students per year whose general education mathematics experience is a data-driven, technology-based introductory statistics course. Likewise, several other courses – multivariable calculus, mathematical computing, and differential equations, to name a few – are now taught in a computer lab and require significant use of technology.

Links to slides and video

Saber Elaydi: Discrete mathematical models in life sciences.

Link to video

Nathan Tintle : Randomization-Based Inference

Hope College has adopted a new approach to teaching introductory statistics that is based completely on randomization. Study design is discussed from the beginning of the course and used throughout in the context of its impact on inference. A textbook for the course is being developed: see An Active Approach to Statistical Inference.
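As a small illustration of the randomization approach (a generic sketch with hypothetical data, not taken from the Hope College course materials), a randomization test for a difference in group means fits in a few lines of Python:

```python
import random

# Hypothetical data: outcomes for a treatment group and a control group.
treatment = [6.2, 5.9, 7.1, 6.8, 7.4]
control = [5.1, 5.6, 4.9, 6.0, 5.3]

observed = sum(treatment) / len(treatment) - sum(control) / len(control)

# Under the null hypothesis the group labels are arbitrary, so reshuffle
# them many times and see how often a difference this large arises by chance.
pooled = treatment + control
n_treat = len(treatment)
random.seed(1)
trials = 10000
count = 0
for _ in range(trials):
    random.shuffle(pooled)
    diff = (sum(pooled[:n_treat]) / n_treat
            - sum(pooled[n_treat:]) / (len(pooled) - n_treat))
    if diff >= observed:
        count += 1

p_value = count / trials  # one-sided randomization p-value
print(f"observed difference = {observed:.2f}, p = {p_value:.4f}")
```

The same shuffling logic works for any test statistic, which is one reason the randomization approach unifies the many separate procedures of a conventional course.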

Links to slides and video

Daniel Kaplan: Identifying Modeling Concepts

Modeling is about making the connection between real-world phenomena and the mathematical structures that can be analyzed and manipulated. As such, it’s commonplace to support teaching modeling. But the set of skills and concepts needed for modeling is rarely, if ever, defined. That’s a gap that badly needs to be filled. I’ll look at some of the calls for increased modeling, some of the ways that modeling shows up in current textbooks, and then make a start on developing a list of modeling concepts and skills and the ways they can be related to the content of calculus, statistics, and computing courses.

Links to slides and video


Leading an M-CAST does not need to be a stressful endeavor. We are looking for innovative ideas and active discussion. In this session we will show some examples of M-CASTs, discuss characteristics of a good M-CAST, and discuss possibilities for new ones that we would like to present or like to see. — Eric Marland, moderator.

Nicholas Horton : Being Warren Buffett

This M-CAST presents a classroom activity that combines a simulation (using dice) and computer-simulated data. Students have a hard time making the connection between variance and risk. To convey the connection, Foster and Stine (Being Warren Buffett: A classroom simulation of risk and wealth when investing in the stock market, The American Statistician, 2006, 60:53-60) developed a classroom simulation. In the simulation, groups of students roll three colored dice that determine the success of three “investments”. The simulated investments behave quite differently. The value of one remains almost constant, another drifts slowly upward, and the third climbs to extremes or plummets. As the simulation proceeds, some groups have great success with this last investment – they become the “Warren Buffetts” of the class. For most groups, however, this last investment leads to ruin because of variance in its returns. The marked difference in outcomes shows students how hard it is to separate luck from skill. The simulation also demonstrates how portfolios, weighted combinations of investments, reduce the variance. In the simulation, a mixture of two poor investments is surprisingly good. A straightforward computer simulation complements the classroom activity, and can be easily extended.
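The classroom activity translates readily into code. The Python sketch below uses hypothetical per-roll return multipliers (Foster and Stine’s actual values differ) just to show the mechanics: each die roll multiplies wealth by a return factor, and a portfolio splits wealth across dice and rebalances every round.

```python
import random

random.seed(42)

# Hypothetical per-round return multipliers keyed by die face (indices 0-5).
# These are illustrative only; Foster and Stine's published values differ.
returns = {
    "stable": [1.0, 1.0, 1.0, 1.0, 1.0, 1.1],     # nearly constant
    "drift":  [0.95, 1.0, 1.0, 1.05, 1.05, 1.1],  # slow upward drift
    "wild":   [0.1, 0.5, 1.0, 1.0, 3.0, 6.0],     # high mean, huge variance
}

def simulate(weights, rounds=100):
    """Grow $1 for `rounds` rounds, rebalancing wealth across the dice
    according to `weights` (a dict of investment name -> portfolio share)."""
    wealth = 1.0
    for _ in range(rounds):
        factor = 0.0
        for name, share in weights.items():
            roll = random.randrange(6)          # independent roll per die
            factor += share * returns[name][roll]
        wealth *= factor
    return wealth

wild_only = simulate({"wild": 1.0})             # usually ruinous
mixed = simulate({"wild": 0.5, "stable": 0.5})  # variance reduced by mixing
```

Running `simulate` many times for each strategy and comparing the distributions of final wealth reproduces the classroom lesson that rebalanced mixtures tame the variance of the wild investment.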

Links to slides and video

Daniel Kaplan : A Day at the Lake

This M-CAST transports us (in spirit) from the IMA to one of Minnesota’s famously under-counted “10,000” lakes. Calculus courses often feature problems to calculate areas and volumes. The Day at the Lake will consider the techniques one might use to find the volume of an actual lake from realistic data, and how those techniques can illuminate and extend the conventional content of calculus courses.
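The core calculation is the discrete analogue of a double integral: given depth soundings on a regular grid, each sounding represents a grid cell, and the volume estimate is a Riemann sum. The depth values and grid spacing in this Python sketch are hypothetical, chosen only to show the idea:

```python
# Hypothetical depth soundings (meters) on a regular grid over a lake;
# 0.0 marks grid points on shore or outside the lake.
depths = [
    [0.0, 0.0, 1.2, 0.8, 0.0],
    [0.0, 2.1, 4.5, 3.0, 0.5],
    [0.7, 3.8, 6.2, 4.1, 1.0],
    [0.0, 1.5, 2.9, 1.8, 0.0],
]
dx = dy = 50.0  # grid spacing in meters (assumed)

# Riemann-sum estimate: each sounding stands for a dx-by-dy column of water.
volume = sum(d for row in depths for d in row) * dx * dy
print(f"Estimated volume: {volume:.0f} cubic meters")
```

Refining the estimate (denser soundings, interpolation between them, error bounds) connects the exercise back to the limit definition of the integral.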

Link to slides and video

Discussion: Brainstorming ideas for M-CASTS

Link to video

Jeff Knisley : Investigation and Case-Based Learning

Learning by investigations: a context for integrating statistics, mathematics, and computation. Not only are science, statistics, and mathematics becoming more integrated, but each is also becoming more computational. Correspondingly, we will increasingly be challenged by the question of how to pedagogically integrate mathematics and statistics both with each other and with computational approaches and modern scientific applications. One possible answer to this question is an extension of case-based and problem-based learning that we call an investigation. This presentation introduces the concept of a mathematical investigation and illustrates how it can be used as a common context for mathematics, statistics, and computation. We will also discuss how we design, implement, and assess an investigation. Primarily, however, we will introduce and explore several examples of investigations, including investigations that combine discrete dynamical systems with statistical inference, correlation with trigonometry, fractal geometry with regression, and resampling methods with differentiation, to name a few (time permitting). Moreover, each of these investigations not only integrates mathematics and statistics but also requires computation via one or more of Maple, Sage, R, Python, or NetLogo.

Links to slides and video

Rich Neidinger : Automatic Differentiation

An introduction to both automatic differentiation and object-oriented programming, and a discussion of how they can enrich a calculus course. Automatic differentiation consists of exact algorithms on floating-point arguments. The implementation overloads standard elementary operators and functions in MATLAB to compute a derivative value in addition to the function value; for example, sin u will also compute (cos u) * u′, where u and u′ are numerical values. These methods are mostly one-line programs that operate on a class of value-and-derivative objects, providing a simple example of object-oriented programming in MATLAB using the new (as of release 2008a) class definition structure. The resulting powerful tool computes derivative values and multivariable gradients, and is applied to Newton’s method for root-finding in both single-variable and multivariable settings. To compute higher-order derivatives of a function of a single variable, another class of series objects keeps Taylor polynomial coefficients up to some order. Overloading multiplication on series objects amounts to a discrete convolution of coefficients. This idea leads to algorithms for other operations and functions on series objects.
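The value-and-derivative idea carries over directly from MATLAB to other languages. Here is a minimal Python sketch (an illustration, not Neidinger’s code) of such a class: arithmetic operators are overloaded to propagate derivatives by the sum and product rules, and sin implements exactly the rule quoted above.

```python
import math

class ValDer:
    """A value-and-derivative pair; arithmetic propagates derivatives exactly."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, ValDer) else ValDer(other)
        return ValDer(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, ValDer) else ValDer(other)
        # Product rule: (uv)' = u'v + uv'
        return ValDer(self.val * other.val,
                      self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def sin(u):
    # The rule from the abstract: sin u also computes (cos u) * u'
    return ValDer(math.sin(u.val), math.cos(u.val) * u.der)

x = ValDer(2.0, 1.0)   # seed the derivative: dx/dx = 1
y = sin(x * x)         # y.val = sin(x^2), y.der = 2x cos(x^2), evaluated at x = 2
```

Each overloaded operation is a one-liner, which is what makes the technique so accessible in a calculus course: the code is a transcription of the differentiation rules themselves.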

Links to slides and video

Thursday July 1

Daniel Kaplan : Introduction to Concept Inventories

Following up on the previous session on “Identifying Modeling Concepts,” this session is intended to lead us toward defining modeling more precisely and — to be very specific — starting to create questions that can be used to probe whether students have developed a working knowledge of modeling concepts. The form I will suggest is that of the “Concept Inventory,” designed to involve little or no calculation and contrast a proper understanding with commonly held misconceptions. After a short review of the history of concept inventories, I’ll give examples of concept inventory questions in physics, calculus, and statistics, with an eye toward the difference between good questions and bad. Working in small groups, we’ll identify some modeling misconceptions and draft corresponding questions.

Links to slides and video


James Caristi : Simulate That!

Powerful, simple-to-use software exists that allows us to build very complex models in only a few minutes. Models can include discrete and continuous components, and the software automatically produces most of the statistics that are of interest. Students can work on an unlimited variety of projects, and the results can often be published at undergraduate research conferences.

Links to Slides and video

Anthony Tongen : Keeping it R.E.A.L.!

Anthony Tongen (James Madison University) will speak on “Keeping it R.E.A.L.” and will unveil the meaning of the acronym! He and Carla D. Martin recently wrote a book containing computational mathematics projects with related directions for undergraduate research. The first set of projects is amenable to students with very limited programming knowledge, yet the projects yield valuable information about the connection between mathematics and computation. He will present a couple of projects from the book that would be amenable to students in a Project MOSAIC-type class.

Links to slides and video

Bob delMas : Model Eliciting Activities

The NSF-funded CATALST (Change Agents for Teaching and Learning Statistics) project is developing a new curriculum for non-calculus based introductory statistics courses that builds on ideas from statistics education, cognitive science, and mathematics education. Instead of standard statistical procedures, students learn randomization and simulation methods to carry out inferential analyses. Key to the new curriculum are Model Eliciting Activities (MEAs), which are open-ended problems that use real and meaningful data to engage students in statistical thinking. This session will present some background information on MEAs and participants will gain first-hand experience by engaging in an MEA developed and pilot tested by the CATALST project.

Links to slides and video

Andrew Zieffler : Principles of Assessment

This presentation will focus on creating assessments of student learning. Recently, the educational measurement community has taken a more contemporary view of assessment, one in which the primary purpose is to inform teachers about the nature of student learning, allowing them to offer more structured feedback to the student, which in turn promotes better learning. To meet this purpose, assessments need to be constructed so that they address how a student is progressing toward a desired curricular or learning goal. Essential steps to strengthen the validity of the assessment process will be highlighted using examples from mathematics and statistics.

Links to slides and video


Randall Pruim : What is Sage? and What Should I Use It For?

Links to slides and video

Christopher Kuster : Algorithms for All!!!

A demonstration of the RAPTOR package, which generates computer code from flowchart diagrams of a program’s logic.

Link to video


Eric Cytrynbaum : The First-Year Course at UBC

The University of British Columbia offers an integrated course for selected first-year science students. Taught jointly by 8 faculty from different science and math disciplines, the course has been offered for 15 years.

Link to video

Daniel Kaplan : Applied Calculus at Macalester

Applied Calculus is the introductory-level calculus course at Macalester College, taken by about one-third of the student body. It is multivariable from the start, oriented strongly toward developing modeling concepts, and makes serious use of the computer. The motivation and content of the course will be reviewed. For many students, Applied Calculus provides the preparation for our calculus-level introductory statistics course: Introduction to Statistical Modeling.

Introduction to Statistical Modeling is a radical departure from the conventional introductory statistics course, focusing on multivariate statistical modeling, adjustment for covariates, and leading to a fairly strong understanding of Analysis of Covariance, topics typically found in mid- and upper-level statistics courses. About one-quarter of all Macalester students take the course.

Links to slides and video

Vittorio Addona : Statistical Modeling for Poets

The “Introduction to Statistical Modeling” course is a calculus-level introduction to statistics. Macalester also teaches a pre-calculus level statistics course. This has been a conventional course emphasizing topics such as sample means and the t-test. At a number of institutions, faculty are revising the pedagogical technique in this course to emphasize permutation tests, but the basic content of the course remains the same. In particular, the same statistical ideas are presented over and over in different settings (p test, z-test, one-sample t-test, two-sample t-test). The conventional curriculum ignores the important statistical ideas of modeling and adjustment, ideas that can be made accessible to pre-calculus students. The talk outlines how, presents the syllabus of the revised course, and shows some examples of how multivariable thinking and modeling illuminate statistical issues.

Links to slides and video

Friday July 2

Nicholas Horton : Literate, Reproducible Computation

Links to slides and video

Daniel Kaplan : Principles of Teaching Computation

This was a discussion session. To start the discussion, Kaplan named the languages widely used in computer-science education (e.g., Java, JavaScript, C, C++, Scheme, Python) and referred to a list of languages featured at the SIGCSE (Special Interest Group on Computer Science Education) conference: Scratch, Embedded Xinu, Scala, Drupal, Alice, Haskell, Botworld. The point of this example is that there is essentially no overlap between the languages used in computer-science education and those used in science education (e.g., Matlab, Mathematica, Maple, R, and the many special-purpose packages such as Stata, SPSS, Minitab, etc.). Several participants pointed out that computation for scientists is offered at their institutions, but outside of computer science. If computation is going to be taught to science students in a useful way, the initiative will have to be taken by those outside of computer science.

Kaplan offered a small set of principles to guide such instruction (see the MOSAIC Wiki site). Some of these are:

  1. It is unrealistic to expect science departments to require a computation course as a prerequisite, so computation needs to be taught in a manner that is much quicker than a course. A week or two is as much as can be expected.
  2. Computation should be integrated into commonly taken courses in an essential way. It shouldn’t be optional or shunted off into labs. Two courses widely taken by science students are calculus and statistics, so these two courses should involve computation.
  3. Computation should be fluent, concise, and LITERATE. Students should be able to read and write computations for the purpose of communicating with PEOPLE.
  4. The computation taught should be comprehensive, so that students can use what they learn for a wide range of purposes, across the curriculum. Teaching students only how to use a symbolic integration operator is not being comprehensive; they need to be able to read in data, etc.
  5. The computer introduces new paradigms for doing what we do in science, statistics, and math. Use those paradigms. Examples: Finding roots graphically. Resampling. Guess and check. (Don’t deprecate guess and check, as the high-school teachers seem to do, but make it “guess and refine.”)
  6. Think like Wayne Gretzky: “skate where the puck’s going, not where it’s been.” Students already have ubiquitous access to computers, so give up on graphing calculators. Don’t get hung up on setting up computer labs or finding the money for site licenses. On-line computational services, “in the cloud,” such as RStudio will become more and more available. Free, professional-level software is now widely available.

The session finished with a demonstration of how fluent computation can be: computer syntax (the example was in R) that makes it unnecessary to teach loops in order to express the idea of repetition.
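The demonstration was in R, but the idea carries over to other languages. For instance, a Python comprehension (shown here as a stand-in for the R syntax used in the session, with made-up data) expresses “resample the data and record the statistic, 1000 times” as a single expression, with none of the counter-and-accumulator bookkeeping of an explicit loop:

```python
import random

random.seed(0)
data = [3, 5, 7, 9, 11]  # hypothetical sample

def mean(xs):
    return sum(xs) / len(xs)

# "Do this 1000 times" as one expression: resample with replacement
# and record the mean of each resample.
trials = [mean(random.choices(data, k=len(data))) for _ in range(1000)]

spread = max(trials) - min(trials)  # rough look at sampling variability
```

The repetition is stated declaratively, so students can focus on the statistical idea (the bootstrap distribution of the mean) rather than on loop mechanics.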

Link to video

Panel Discussion: Institutional Opportunities and Obstacles

Michael Pearson, Olcay Akman, Jeff Knisley, Andy Zieffler, Eric Marland

Link to video