This is the "map" view, looking from above, with x, y, and z axes. Think of these as coordinates on the planet: x is like longitude (east-west) and y is like latitude (north-south). The way to remember latitude is that it sounds like "ladder," and you climb a ladder up and down. Lat, ladder.
The box is in front of us. The nearest face of the box is 5 feet away, and the back edge of the box is 10 feet away. We are viewing this from zero, zero. Notice the diagonal lines to the points on the left: when we look from our "point of view," we see those points at those angles.
Then the concept of the viewport is introduced. The viewport is our computer monitor. It is one foot in front of us (not exactly, but close enough) and is two feet wide. The monitor is 2,000 pixels across. We are looking into the middle of the monitor, so 1,000 pixels are to the left and 1,000 are to the right. Since the 1,000 pixels to the left of center span one foot of the monitor, we can say that one foot equals 1,000 pixels.
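The viewport setup above can be captured in a few constants. This is just a sketch; the names are mine, not from the original:

```python
# Viewport setup from the description above (units: feet and pixels).
VIEWPORT_DISTANCE_FT = 1.0  # the monitor sits one foot in front of us
VIEWPORT_WIDTH_FT = 2.0     # the monitor is two feet wide
VIEWPORT_WIDTH_PX = 2000    # the monitor is 2,000 pixels across

# We look into the middle, so half the width lies on each side of center.
HALF_WIDTH_FT = VIEWPORT_WIDTH_FT / 2    # 1 foot per side
HALF_WIDTH_PX = VIEWPORT_WIDTH_PX // 2   # 1,000 pixels per side

# One foot on the viewport therefore equals 1,000 pixels.
PIXELS_PER_FOOT = HALF_WIDTH_PX / HALF_WIDTH_FT
print(PIXELS_PER_FOOT)  # 1000.0
```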
Here we do the ratio math to figure out how many pixels to the left of the center of the screen to put our dots. Ratios are:
a is to b as c is to d
In this case, a is 2.5 feet. b is 5 feet. c is unknown (what we are trying to figure out), so I am calling it n, which is typical in computer programming. And then d is one foot.
To figure out n or any ratio, we CROSS MULTIPLY AND DIVIDE. I cannot stress enough how important it is to remember the words CROSS MULTIPLY AND DIVIDE. 2.5 times 1 is 2.5. 5 times n is 5n. Then we divide both sides by 5 to isolate n. That leaves 2.5 divided by 5, which is .5 (2.5 has five .5s in it).
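The cross-multiply-and-divide step can be written as a small helper. This is my own sketch, not code from the original:

```python
def solve_ratio(a, b, d):
    """Solve a/b = n/d for n by cross multiplying and dividing.

    Cross multiply: the two diagonal products are equal, so b * n = a * d.
    Divide both sides by b to isolate n: n = (a * d) / b.
    """
    return (a * d) / b

# a = 2.5 feet, b = 5 feet, d = 1 foot (the distance to the viewport).
n = solve_ratio(2.5, 5, 1)
print(n)  # 0.5
```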
And finally we make the pixel conversion: .5 feet times our conversion factor (1,000 pixels per foot) is 500 pixels.
This shows our screen, the computer monitor. We have figured out the dot that is closest to us and to the left: we need to put it 500 pixels to the left of the center of the screen. Now we need to figure out the y screen coordinate (up and down; in this case, down).
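The steps above, similar-triangles ratio first, then the pixel conversion, can be combined into one function. A minimal sketch, with my own function and parameter names (the original's axis naming may differ):

```python
def screen_x_pixels(offset_ft, distance_ft,
                    viewport_distance_ft=1.0, pixels_per_foot=1000):
    """Project a point's sideways offset onto the screen's horizontal axis.

    Similar triangles: offset_ft / distance_ft = n / viewport_distance_ft,
    so cross multiply and divide to get n (in feet on the viewport),
    then convert feet to pixels from screen center.
    """
    n_ft = (offset_ft * viewport_distance_ft) / distance_ft
    return n_ft * pixels_per_foot

# The near-left corner of the box: 2.5 feet to the left, 5 feet away.
print(screen_x_pixels(2.5, 5))   # 500.0 pixels left of center
```

Note that the same corner offset at the back of the box (10 feet away) lands closer to the center of the screen, which is exactly the perspective effect this whole exercise is building toward.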