## Assumptions

Behind every indirect fact (like a calculation) there are a number of assumptions; some of minor import, some vital. Most people overlook this so thoroughly that they don't even realize the assumptions are there. (To paraphrase Ayn Rand, not only do they not "check their premises", they don't even know what those premises are.)

Here's an example: estimating the speed of a moving car. Let's say the car leaves town A at 3pm and reaches town B at 6pm. We can see on a map that the distance between the two towns is 150 miles, therefore we calculate that the car had an average speed of 50 miles per hour (150 miles divided by 3 hours).
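The arithmetic above can be sketched in a few lines. The variable names are mine, not the author's; the numbers are the ones from the example:

```python
# Average speed from the example: leave town A at 3pm, reach town B at 6pm,
# with a map distance of 150 miles between them.
map_distance_miles = 150
departure_hour = 15  # 3pm, in 24-hour time
arrival_hour = 18    # 6pm

elapsed_hours = arrival_hour - departure_hour       # 3 hours
average_speed = map_distance_miles / elapsed_hours  # 50.0 mph
print(f"average speed: {average_speed} mph")
```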

What assumptions have we made, and how do they impact our result? For one thing, it is unlikely that the car moved at a constant speed. It might have slowed down when passing through populated areas and sped up in between; it might even have stopped for refueling. For at least part of the trip, the actual speed was almost certainly not 50 miles per hour. In itself, this assumption is not so problematic for our result, since we made sure to prefix it with the word "average", acknowledging the possibility of a varying actual speed.

There is a second, more problematic assumption: that the map distance between the two towns is the same as the actual road distance. (I do not know how often this happens to be the case in the US; I know it's almost never true in Romania, where the roads mostly grew up organically, from local necessities, instead of being planned by someone.) This assumption, if incorrect, has the potential to affect the final result quite significantly. To use a distinction I'm fond of making, we're starting to leave the academic towers of science and getting into the murky waters of engineering.

What about a second method of calculating the speed of a car, one that is more likely to be accepted at face value, yet carries a much more vital assumption? The speedometer of a car tells us the current speed. It does that by measuring how many times the wheel axle revolves per second and multiplying by the tire circumference to get the distance traveled in that second; another calculation converts this to miles per hour.
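The speedometer's calculation can be sketched like this. The tire radius and revolution rate below are illustrative values I've made up, not figures from the text:

```python
import math

# Speedometer arithmetic: revolutions per second times tire circumference
# gives inches traveled per second; scale that up to miles per hour.
tire_radius_inches = 13.0  # assumed "standard" tire (made-up value)
circumference_inches = 2 * math.pi * tire_radius_inches

revs_per_second = 10.0  # what the sensor measures (made-up value)
inches_per_second = revs_per_second * circumference_inches

INCHES_PER_MILE = 63360  # 5280 feet * 12 inches
mph = inches_per_second * 3600 / INCHES_PER_MILE
print(f"indicated speed: {mph:.1f} mph")
```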

There are at least two hidden assumptions here. The minor one concerns the tire circumference: if you're using a nonstandard tire, e.g. one with a smaller radius, its circumference will be smaller than the "known" value the speedometer relies on, which will skew the final result. Your actual speed will be lower than what the speedometer is showing.
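The size of this error is easy to pin down: the speedometer multiplies revolutions by the *assumed* circumference, so the ratio of actual to indicated speed is exactly the ratio of actual to assumed circumference. The radii here are again made-up illustrative values:

```python
import math

assumed_radius = 13.0  # the "standard" tire the speedometer is calibrated for
actual_radius = 12.0   # a nonstandard, smaller tire

revs_per_second = 10.0
indicated_speed = revs_per_second * 2 * math.pi * assumed_radius
actual_speed = revs_per_second * 2 * math.pi * actual_radius

# The units cancel in the ratio, so no mph conversion is needed:
# the car is really going only 12/13 of what the dial shows.
print(f"actual/indicated ratio: {actual_speed / indicated_speed:.3f}")
```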

The second assumption can completely destroy the result, though: the link between "tire moving" and "car moving". If your car is stuck in snow or sand, or sitting on a test bench in some garage, the tires will spin without moving the car. The speedometer readout will thus be completely useless: a reading of 50 miles per hour with absolutely no significance in reality.

What was the point of all this? Well, just keep it in mind the next time you're reading about Kuiper belts and Oort clouds and dark matter and ...