Gas Laws and Deflategate

inflated football & deflated football on green background

In a previous post on exploring new ways to incorporate authentic assessments, I referenced how a wonderful activity centered around the application of gas laws could help investigate the Deflategate controversy that consumed NFL news throughout the 2014-15 season. Though I encourage everyone to read the original JChemEd article1 that inspired me to give it a try, I thought it could be useful to share my own experience with the activity, now that I have tried it two years in a row. If you are looking for ideas to create an authentic opportunity for students to apply their knowledge of gas laws while integrating some of the most important science practices, then this activity may fit your needs.

 

Context of the Activity

In January of 2015, the New England Patriots played the Indianapolis Colts to decide which team would advance to the Super Bowl. Toward the end of the first half of the game, members of the Colts staff notified the officials that the Patriots may have been playing with underinflated footballs. While this concern may seem trivial to many, playing with a slightly underinflated football brings subtle advantages that can quickly make the game unfair. Though it is not crucial for students to understand how a deflated ball impacts the physics of playing football, I felt it was important for them to understand the reasoning behind the original claim of unfair play, and it naturally created an opportunity for me to integrate science with a game that many students are reasonably familiar with.

To that end, we spent 10–15 minutes discussing some basic physics. Consider two of the most relevant actions in football: catching and throwing. A slightly deflated football allows your fingers to depress the leather more easily and cover more surface area, which improves your grip on the ball. The same effect also increases how much spin the quarterback can put on the ball when throwing it, causing the ball to travel farther.2 Because of these advantages, the NFL requires the balls supplied by each team to meet a minimum gauge pressure of 12.5 psi, which is checked by two officials prior to the game.

During half-time, officials checked the pressure of the footballs used by both teams. All 11 of the Patriots’ footballs were underinflated (<12.5 psig), while all four of the Colts’ footballs tested fell within the accepted range of 12.5–13.5 psig. Following NFL rules, the officials refilled the Patriots’ balls to the appropriate pressure, and the game eventually concluded with a 45–7 win by the Patriots. To determine the cause of the underinflated footballs, the NFL eventually hired an independent law firm to lead an investigation.
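One detail worth flagging before any calculation: gas laws work only with absolute pressure and absolute temperature, while the NFL's limits are gauge pressures (psig), measured relative to the atmosphere. A minimal sketch of the conversions, assuming sea-level atmospheric pressure of roughly 14.7 psi (an illustrative value, not a figure from the article):

```python
# Gas laws require absolute pressure and absolute temperature.
# Gauge pressure (psig) is measured above atmospheric pressure, so we
# add atmospheric pressure before applying any gas law, and convert
# temperatures to kelvin.

ATM_PSI = 14.7  # assumed sea-level atmospheric pressure (psi)

def gauge_to_absolute(psig):
    """Convert gauge pressure (psig) to absolute pressure (psia)."""
    return psig + ATM_PSI

def f_to_kelvin(temp_f):
    """Convert degrees Fahrenheit to kelvin."""
    return (temp_f - 32) * 5 / 9 + 273.15

# The NFL minimum of 12.5 psig corresponds to about 27.2 psia:
print(gauge_to_absolute(12.5))
```

Students who plug 12.5 psi straight into a gas law, rather than about 27.2 psia, will overstate the pressure drop caused by cooling, which is worth catching early.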

 

Identifying the Guiding Question and Data Needed

Once the background has been established, several students (mostly boys) unsurprisingly rely on their emotional response, claiming the Patriots clearly cheated and that Tom Brady (the quarterback) is a cheater. Though this response is understandable given the initial data, moments like this provide a good opportunity for students to practice one of the most important habits that allows the process of science to work: separating your own emotions and biases from the data.

From here, we discuss the need for clarity on what exactly we are going to investigate. I remind my students that, as scientists, our job is not to determine if someone cheated; we leave that for the bloody lawyers. However, to build their legal case, lawyers often rely upon independent scientists who apply their scientific knowledge in ways that may or may not provide sufficient evidence for a legal team. So, if we are not determining whether the Patriots cheated, what are we investigating? This has led to some interesting discussions between students, but the consensus ultimately tends to revolve around whether it was possible for the balls to have been deflated unintentionally. With a bit of direction, I help the class build on this premise by constructing a meaningful question to guide their entire investigation.

Could the underinflated footballs be the result of environmental conditions of the day?

 

Part 1—Identifying Important Information

With the guiding question in place, I give groups about 15 minutes to list all the information they would want in order to answer it. I like this part of the investigation a lot because many science teachers, myself included, too often do this step for students, completely bypassing the opportunity for them to develop the skill of identifying the variables relevant to an investigation and deciding how they might be measured. Once their lists have been created, we compile all the information on the board and assess which variables are relevant and whether we have access to them. As a teacher, I like doing this because it resembles the pedagogical techniques used within the Modeling Instruction curriculum and Argument-Driven Inquiry (ADI) for developing an investigation. Additionally, many students begin to see where their recently developed knowledge of gas laws becomes applicable. Though I do not have a picture of a list generated in my own classroom, the sample subset of ideas below from the original JChemEd article1 reflects what I saw.

  • The outside temperature on that day
  • Atmospheric pressure on that day
  • The weather (rain, snow, sunny, etc.)
  • The temperature of the locker room where the balls were filled
  • The initial pressure of the balls
  • How were the balls used during the game (kicked, passed, not used)?
  • How many times were the pressures checked?
  • Were the gauges calibrated?

 

Part 2—What Will You Do with the Data?

With the relatively easy part out of the way, students now must consider how they will analyze the data they deemed relevant. My honors class caught on a bit more quickly than my general class when it came to evaluating which equations they would need and how the results might be interpreted. Though they were not used to it, getting students to think ahead by applying their background knowledge and reasoning skills was an incredibly important step toward maintaining a clear path in their investigation. Unlike many of the labs we do, this activity is useful because of the variety of paths students may take to arrive at a conclusion. As expected, some groups saw a clear path and were on their way without much assistance, while others needed more guidance. This helped me allocate my time more effectively when facilitating the thought process with certain groups.

Though I had my own intuitions as to how I would approach answering the guiding question, it is worth noting that the author, Elizabeth Megonigal, provides a wonderful document in the Supporting Information1 of her article that details three different methods students may use based on how they chose to collect evidence. These methods include the application of Gay-Lussac’s law, using average pressure differences, and using the Ideal Gas Law to analyze moles of gas within the footballs. I found this to be incredibly helpful because some of the methods were not initially obvious to me but once I saw the calculations and reasoning behind them, they helped me facilitate the variety of approaches that different groups took.
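To illustrate, here is a rough sketch of two of those methods in Python. The atmospheric pressure, gas constant, and football volume below are assumptions I supplied for illustration (roughly 14.7 psi, 0.08206 L·atm/(mol·K), and about 4.2 L); none of these numbers come from the article or the Wells Report.

```python
ATM_PSI = 14.7  # assumed sea-level atmospheric pressure (psi)
R = 0.08206     # ideal gas constant, L·atm/(mol·K)

def expected_halftime_psig(p1_psig, t1_k, t2_k):
    """Gay-Lussac's law at constant volume, P1/T1 = P2/T2, applied
    with absolute pressures and kelvin temperatures, then converted
    back to a gauge reading."""
    p1_abs = p1_psig + ATM_PSI
    p2_abs = p1_abs * (t2_k / t1_k)
    return p2_abs - ATM_PSI

def moles_of_air(p_psig, t_k, volume_l=4.2):
    """Ideal gas law, n = PV/RT. The ~4.2 L football volume is an
    assumed value for illustration only."""
    p_atm = (p_psig + ATM_PSI) / 14.7  # convert psia to atm
    return p_atm * volume_l / (R * t_k)
```

With the first function, cooling from a warm locker room to a cold field predicts how far the gauge pressure should fall naturally; with the second, comparing moles of air before the game and at half-time reveals whether air actually left the ball, since a temperature change alone cannot change n.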

When it came to planning, one of the biggest hurdles students initially struggled to overcome was creating a hypothetical “best-case scenario” for either team based on the scientific principles they chose to apply. For example, many students had a strong intuition that they could apply Gay-Lussac’s law (P1/T1 = P2/T2) to help their investigation. When speaking with them, it was clear they understood they could insert the known initial pressure, initial temperature, and final temperature to determine the final pressure. However, it soon became clear to me that many of them did not know precisely what their answer implied. For me, this was a classic demonstration that quantitative correctness does not equal understanding.

What many of them failed to see is that by choosing the theoretical parameters that produce the greatest pressure drop (the highest initial temperature when the balls were filled and the lowest final temperature on the field), they could determine the lowest pressure possible due to natural causes and compare that value to the pressures measured by officials during half-time. If the reported pressures were lower than this calculated minimum, then something was a bit fishy. I found myself having to explain, or at least guide, this line of reasoning with several groups. However, once I did, I could tell a lightbulb went on in their heads and they knew what to do.
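To make that reasoning concrete, here is a sketch of the "best-case scenario" calculation. Every number in it, the two temperatures, the atmospheric pressure, and the half-time reading, is an illustrative placeholder I chose, not a value from the Wells Report:

```python
ATM_PSI = 14.7  # assumed sea-level atmospheric pressure (psi)

def f_to_k(temp_f):
    """Convert degrees Fahrenheit to kelvin."""
    return (temp_f - 32) * 5 / 9 + 273.15

# Best case for natural deflation: the warmest plausible filling
# temperature and the coldest plausible field temperature.
t_fill = f_to_k(74)    # hypothetical locker-room temperature
t_field = f_to_k(48)   # hypothetical on-field temperature

# Start at the legal minimum of 12.5 psig and apply Gay-Lussac's law
# with absolute pressures: P2 = P1 * (T2 / T1).
p1_abs = 12.5 + ATM_PSI
p_lowest_psig = p1_abs * (t_field / t_fill) - ATM_PSI

# Any half-time reading below p_lowest_psig cannot be explained by
# temperature alone. Compare against a hypothetical measured value:
measured_psig = 11.1
suspicious = measured_psig < p_lowest_psig
```

With these placeholder temperatures, p_lowest_psig comes out near 11.2 psig, so a reading of 11.1 psig would fall just below what cooling alone can explain. With different assumed temperatures the cutoff shifts, which is exactly the sensitivity students should wrestle with.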

 

Part 3—Data Collection and Analysis

Students were provided with data from the complete investigative analysis, known as the Wells Report,1 to help answer the guiding question. I gave students one full class period to sift through the Wells Report and begin their analysis. The sheer amount of data in the report reinforced the importance of groups having a clear path ahead of time, a realization I had after the first year of the activity that helped quite a bit the second time around. Even so, the volume of data can still overwhelm students simply because they lack experience identifying useful data within a larger data set. I think the following statement by Megonigal on the importance of letting students experience this challenge was spot on1:

“Too often in education we give students just the limited information they need to answer a question. This unrealistic situation does not require students to think critically about the wide variety of data that often accompany real-world problems, some of which is relevant to the problem and some of which is not.”

Slide 1: One group's claim on Google Slides

Part 4—Presenting Information

Following Megonigal’s approach, my classes wrote their scientific explanations in a Claim, Evidence, and Reasoning (CER) format. The first year I implemented this activity, students wrote their explanations on the big whiteboards I have throughout the classroom. The following day, groups were given time to present their findings, and their classmates, who may have reached a completely different conclusion, were given the opportunity to ask questions and critique lines of reasoning. Had I done a better job preparing my students throughout the year to analyze and evaluate scientific arguments, I believe this would have gone much better. This is not to say it went poorly, but groups often gave presentations that were hard to follow, and few questions arose from the audience.

Slide 2: One group's evidence supporting the Patriots on Google Slides

For the second year of implementation, I decided to focus less on the presentation aspect and more on the construction and overall validity of each group’s scientific explanation. To keep things organized, I had students present their claim, evidence, and reasoning on 3–4 slides using Google Slides. Once they shared a presentation with me, I evaluated it against a scoring rubric and provided feedback as necessary. Though this approach removed the potential for lively discussion between groups while presenting, it saved time and still allowed me to evaluate the overall explanation. I have included slides 1–3 made by one group for their presentation. If you are interested in explanations from other groups, you can find them in the supporting information.

Slide 3: One group's reasoning on Google Slides

 

1. Megonigal, Elizabeth. Nature or Naughty: Bringing “Deflategate” to the High School Chemistry Classroom. J. Chem. Educ. 2016, 93, 311–313. (This is an open access article published under an ACS AuthorChoice License, which permits copying and redistribution of the article or any adaptations for non-commercial purposes.)


Editor's Note: By logging into your ChemEd X account you can access the sample student presentations in the supporting information and comment on the post.

NGSS

Analyzing and Interpreting Data: Analyzing data in 9–12 builds on K–8 and progresses to introducing more detailed statistical analysis, the comparison of data sets for consistency, and the use of models to generate and analyze data. Analyze data using tools, technologies, and/or models (e.g., computational, mathematical) in order to make valid and reliable scientific claims or determine an optimal design solution.

Asking Questions and Defining Problems: Asking questions and defining problems in grades 9–12 builds from grades K–8 experiences and progresses to formulating, refining, and evaluating empirically testable questions and design problems using models and simulations. Ask questions that challenge the premise(s) of an argument, the interpretation of a data set, or the suitability of a design.

Scientific questions arise in a variety of ways. They can be driven by curiosity about the world (e.g., Why is the sky blue?). They can be inspired by a model’s or theory’s predictions or by attempts to extend or refine a model or theory (e.g., How does the particle model of matter explain the incompressibility of liquids?). Or they can result from the need to provide better solutions to a problem. For example, the question of why it is impossible to siphon water above a height of 32 feet led Evangelista Torricelli (17th-century inventor of the barometer) to his discoveries about the atmosphere and the identification of a vacuum.

Questions are also important in engineering. Engineers must be able to ask probing questions in order to define an engineering problem. For example, they may ask: What is the need or desire that underlies the problem? What are the criteria (specifications) for a successful solution? What are the constraints? Other questions arise when generating possible solutions: Will this solution meet the design criteria? Can two or more ideas be combined to produce a better solution?

Constructing Explanations and Designing Solutions: Constructing explanations and designing solutions in 9–12 builds on K–8 experiences and progresses to explanations and designs that are supported by multiple and independent student-generated sources of evidence consistent with scientific ideas, principles, and theories. Construct and revise an explanation based on valid and reliable evidence obtained from a variety of sources (including students’ own investigations, models, theories, simulations, peer review) and the assumption that theories and laws that describe the natural world operate today as they did in the past and will continue to do so in the future.

Engaging in Argument from Evidence: Engaging in argument from evidence in 9–12 builds on K–8 experiences and progresses to using appropriate and sufficient evidence and scientific reasoning to defend and critique claims and explanations about natural and designed worlds. Arguments may also come from current scientific or historical episodes in science. Evaluate the claims, evidence, and reasoning behind currently accepted explanations or solutions to determine the merits of arguments.

Planning and Carrying Out Investigations: Planning and carrying out investigations in 9–12 builds on K–8 experiences and progresses to include investigations that provide evidence for and test conceptual, mathematical, physical, and empirical models. Plan and conduct an investigation individually and collaboratively to produce data to serve as the basis for evidence, and in the design: decide on types, how much, and accuracy of data needed to produce reliable measurements and consider limitations on the precision of the data (e.g., number of trials, cost, risk, time), and refine the design accordingly.

Using Mathematics and Computational Thinking: Mathematical and computational thinking at the 9–12 level builds on K–8 and progresses to using algebraic thinking and analysis, a range of linear and nonlinear functions including trigonometric functions, exponentials and logarithms, and computational tools for statistical analysis to analyze, represent, and model data. Simple computational simulations are created and used based on mathematical models of basic assumptions. Use mathematical representations of phenomena to support claims.