Often in programming, you want to know the distance between two points on the screen. It could be two objects about to collide, or two characters about to interact. Or maybe the artificial intelligence is waiting for the player to come within a certain distance of an enemy before it attacks. Whatever the situation, it's important to be able to quickly calculate the distance between two points. The easiest way to do that is to use the Pythagorean theorem.
The Pythagorean Theorem
a² + b² = c²
where a and b are the legs of a right triangle and c is the hypotenuse.
The Pythagorean theorem is illustrated in Figure 2.1.
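As a sketch, the theorem translates directly into a distance function: the horizontal and vertical separations between the two points form the legs a and b of a right triangle, and the hypotenuse c is the distance. The function name here is illustrative.

```python
import math

def distance(x1, y1, x2, y2):
    """Distance between two points via the Pythagorean theorem."""
    # Legs of the right triangle: the horizontal and vertical separation.
    a = x2 - x1
    b = y2 - y1
    # The hypotenuse c = sqrt(a² + b²) is the straight-line distance.
    return math.sqrt(a * a + b * b)

# A 3-4-5 right triangle: the distance from (0, 0) to (3, 4) is 5.
print(distance(0, 0, 3, 4))  # 5.0
```

Python's standard library also provides math.hypot(a, b), which computes the same value and handles extreme magnitudes more robustly.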
The Pythagorean theorem works only for right triangles, ...