Physicists use models to represent one thing using another thing. What’s the point? The point of a model is to simplify something to make its analysis easier. Yeah, models are the only way to use laziness for good.
We use a ton of different models. We might create a flow chart of the mating habits of carrier pigeons, or a 3D computer model of a Big Mac to figure out the secret of its success. We might even create an actual, hands-on, physical model. But that’s another lesson.
Modeling might be part of physics, but it’s at the heart of human nature. Let’s say we’re explaining something complicated to a small child, like how rainbows happen. How can we help the child understand? Make it simpler. We might use an analogy: we could say that light is like a bag of Skittles. Like the Skittles, it’s made up of parts of many colors, so when light is split apart, we see those colors. They’ll never see the words “taste the rainbow” the same way again. Our bag of Skittles just became a superb model for light. We could explain the phases of the Moon using a physical model, like a soccer ball; we could model the atmosphere using a bowl of water; the list goes on and on. Models make everything easier to understand and analyze.
In physics, the most used model is one made out of math. While writing long, complex math equations might not be most people’s idea of an exciting Saturday night (maybe just ours), they are absolutely vital to physics. Unlike in math class, where everything can be abstract and disconnected from anything real, in physics those equations reveal true characteristics of our very real universe. By putting a bunch of those equations together, we can describe natural phenomena with huge amounts of precision, which is pretty cool.
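To see what a mathematical model looks like in practice, here’s a minimal sketch in Python of one of the simplest models in physics: an object dropped from rest, falling under gravity alone. The formula, the function name, and the ignore-air-resistance assumption are our illustration, not something this lesson derives.

```python
# A toy mathematical model: free fall with constant acceleration.
# Assumption: the object starts at rest and we ignore air resistance,
# so the distance fallen after t seconds is d = (1/2) * g * t^2.

G = 9.8  # acceleration due to gravity near Earth's surface, in m/s^2

def distance_fallen(t):
    """Distance in meters an object falls from rest after t seconds."""
    return 0.5 * G * t ** 2

for t in [1, 2, 3]:
    print(f"after {t} s: {distance_fallen(t):.1f} m")
# after 1 s: 4.9 m
# after 2 s: 19.6 m
# after 3 s: 44.1 m
```

Three lines of math, and we can predict where a dropped ball will be at any moment. That’s the payoff of a model made out of equations.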
Some physicists might enjoy studying physics for the sake of knowledge, but not all of them. People learn about physics and use its models to create the physics engines that control how objects move in video games, to help Hollywood create CGI graphics for the latest Disney movie, or to help mechanics design the fastest cars. Use the powers you gain learning physics for good, and maybe a dollop of lighthearted evil.
Everything we’ve ever learned might be wrong.
Perhaps that’s a surprising thing to admit when our goal is to teach some sweet science, but admitting that everything we’ve ever learned might be wrong is a fundamental part of science. Admitting that is step 1. Step 2 is accepting that nothing we’ve ever discovered is certain – it can only be very, highly, extremely, extravagantly likely. Some things we know are about as close to certain as we can be without being certain, like human blood being the color red. However, we might wake up tomorrow and find out we were really in the Matrix, that human blood is green, and that chocolate is not delicious. All we can do is look at the evidence and roll with the punches.
Before anyone starts blaming us for negativity, let’s just establish: this skepticism is a great thing. We love it. The reason is that an important part of physics is realizing that everything is based on assumptions and approximations. Questioning those assumptions and approximations is how scientists check their work. Physicists should be willing to disprove anything, no matter how well-established, when the evidence points in that direction. Since things can only be pretty gosh-darn certain, we never know anything for sure, which really means there is always something to question, always an experiment to run.
Just look at Albert Einstein. He showed that Newton’s laws were wrong. NEWTON’S LAWS!!! The same laws every student across the world learns in physics class are just approximations. They’re good at predicting how an object will fall, or the forces involved as a train accelerates out of the station, but when that train starts to approach the speed of light, things get weird, and Newton’s laws don’t quite work. Because he questioned everything, Einstein discovered his theories of relativity, and things have never quite been the same.
That being said, we’ll use a lot of approximations and assumptions as we study physics. The trick is just keeping track of each one. Every law of physics, or model, or panicked answer to a question on a pop quiz, is subject to assumptions and approximations: Newton’s equations are true, as long as we’re not going too close to the speed of light. Any object will fall with an acceleration of 9.8 meters per second squared, as long as we ignore air resistance and aren’t living in the mountains or deep under the sea. Even when teachers don’t tell us what the assumptions are, they always exist.
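Just how close to the speed of light is “too close”? Here’s a quick sketch, using the standard Lorentz factor from relativity (our illustrative example, not something this lesson works out): Newton’s laws hold up whenever this factor is essentially 1.

```python
import math

# The Lorentz factor gamma = 1 / sqrt(1 - (v/c)^2) measures how far an
# object's speed pushes it into relativistic territory. When gamma is
# essentially 1, Newton's laws are an excellent approximation.

C = 299_792_458  # speed of light in a vacuum, in m/s

def lorentz_factor(v):
    """Relativistic correction factor for an object moving at speed v (m/s)."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

print(lorentz_factor(30))       # a fast train: so close to 1 that Newton is fine
print(lorentz_factor(0.9 * C))  # 90% of light speed: roughly 2.3, Newton breaks down
```

For anything we’ll meet in everyday life, the correction is vanishingly small, which is exactly why Newton’s approximation survived for centuries before anyone noticed the cracks.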