The Pale Blue Dot? Chapter Eight: NASA’s Cameras, Lenses and Video Equipment; An Examination

Written By Thomas Perez. July 30, 2017 at 10:02PM. Updated 2020.

Picking up from where we left off – we had mentioned that we were going to discuss NASA’s cameras, lenses and video equipment. So, without further ado, let us begin.

In Reference to Lenses

The ISS employs 11 cameras, 3 multi-functional devices with cameras, 1 HD viewing system, 1 standard-definition viewing CCTV and 1 EHDCA – the last three are installed hardware/experiments – a total of 17 altogether. The still cameras aboard the ISS are DSLRs (Digital Single-Lens Reflex). To understand the basics of a camera, I have decided to give a small tutorial on how cameras work, for those who may be rusty or who may not understand the workings of a camera. (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19).


2. “Sand Dunes in Har Nuur (Black Lake), Western Mongolia : Image of the Day”. 7 September 2006. Retrieved 2017-08-10.

3. “Fires in British Columbia : Natural Hazards”. 20 August 2003. Retrieved 2017-08-10.

4. “New York City and East Coast City Lights : Image of the Day”. 18 January 2003. Retrieved 2017-08-10.

5. “Plume rises from Ulawun : Natural Hazards”. 30 November 2012. Retrieved 2017-08-10.

6. “Sarychev Peak Eruption, Kuril Islands : Natural Hazards”. 12 June 2009. Retrieved 2017-08-10.

7. Nikon (2010-06-14). “The latest Nikon equipment to be used in the Russian segment of the International Space Station: New orders received for Nikon D3S and D3X digital-SLR cameras as well as NIKKOR interchangeable lenses”. Nikon. Retrieved 2016-02-02.

8. “Aurora Australis Observed from the International Space Station : Image of the Day”. 29 May 2010. Retrieved 2017-08-10.

9. “Pavlof Volcano, Alaska Peninsula : Natural Hazards”. 18 May 2013. Retrieved 2017-08-10.

10. NASA. “Gateway to Astronaut Photography of Earth: What are the different choices of cameras?”. NASA. Retrieved 2016-02-02.

11. NASA. “NASA Johnson”. NASA. Retrieved 2015-11-16.

12. “Nikon | News | NASA orders 53 unmodified Nikon D5 digital SLR cameras”. Retrieved 2017-08-25.

13. Sony. “The α7S II successfully captured the first ever commercial level 4K footage in space”. Sony. Retrieved 2017-10-09.

14. Kleinman, Alexis (24 April 2013). “Even NASA Has Switched To Android”. The Huffington Post. Retrieved 2017-08-10.

15. “NASA – Socializing Science With Smartphones in Space”. Retrieved 2017-08-10.

16. “iPad 2 Scheduled for Delivery to International Space Station Tomorrow – The iPad Guide”. Retrieved 10 August 2017.

17. “HDEV”. Retrieved 2017-08-10.

18. “ISS Spacewalkers install new external HD Cameras, retract Thermal Radiator – ISS Expedition 48”. Retrieved 2017-08-10.

19. “ISS Spacewalkers install new external HD Cameras, retract Thermal Radiator”. Retrieved 2017-08-10.

The Workings of a Camera

Let me start off by saying that the human eye is neither wide angle nor telephoto. It’s in the middle, like most compact cameras.

1. Two Types of Cameras

A. Compact cameras are small; the lens is built in and cannot be changed.

B. DSLRs are larger cameras. They usually come with an inexpensive starter lens called a kit lens, and you can buy different lenses for them separately.

2. Lenses

A. Compact cameras have lenses built into them. DSLRs allow you to change lenses.

3. Types of Lenses

A. Prime – A 20mm, a 60mm and a 100mm lens are all examples of prime lenses.

B. Zoom – A lens marked with two numbers, like an 18-55mm, zooms in and out. The smaller number is the wide end of the zoom range (zoomed out), and the larger number is the telephoto end (zoomed in). The two numbers tell you how far in you can go and how far out you can pull back.

C. Macro Lenses – Allow for very close shots. They typically sit between wide angle and telephoto, or are moderately telephoto.

D. Telephoto – Magnifies objects far away, letting distant subjects fill the frame without losing much detail.

E. Mid-Range Zooms – Lenses with focal lengths between roughly 17mm and 60mm are called “mid-range zooms.” Examples include the 17-35mm, 17-70mm and 18-55mm lenses. They are not good for true wide shots, nor for close-up macro shots.

F. Wide Angle – Takes in a wide view – say, an entire room. The effect is similar to what you see on the big rectangular screen in most movie theaters. There are two types: the regular kind, known as rectilinear wide-angle lenses, and fisheye lenses. Photos taken with a fisheye lens look curved at the edges. Fisheyes are wider than regular wide-angle lenses; for example, a 15mm fisheye is wider than a 15mm rectilinear lens.

So if you have a DSLR, you can actually change lenses to get wide-angle, telephoto, macro, and fisheye capabilities. (Screw-on attachments that give a compact camera similar effects are sometimes called converter lenses.)
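The difference between the two wide-angle types can be sketched numerically. A rectilinear lens maps a ray arriving at angle θ from the lens axis to an image height of roughly f·tan(θ), which keeps straight lines straight; a common fisheye design (the equisolid-angle type) instead uses 2f·sin(θ/2), squeezing the edges of the frame – which is why off-center straight lines bow outward. A minimal sketch, with a 15mm focal length chosen only for illustration:

```python
import math

def rectilinear_r(f_mm, theta_deg):
    # Rectilinear projection: r = f * tan(theta). Straight lines stay straight.
    return f_mm * math.tan(math.radians(theta_deg))

def fisheye_r(f_mm, theta_deg):
    # Equisolid-angle fisheye: r = 2f * sin(theta / 2). The edges of the
    # frame are compressed, so off-center straight lines appear curved.
    return 2 * f_mm * math.sin(math.radians(theta_deg) / 2)

# Compare image heights (mm) for rays 10, 40 and 80 degrees off-axis.
for theta in (10, 40, 80):
    print(theta, round(rectilinear_r(15, theta), 1), round(fisheye_r(15, theta), 1))
```

Near the center the two projections nearly agree, but far off-axis the fisheye places the same ray much closer to the middle of the frame – the whole scene fits, at the cost of curved edges.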

4. Focal Length

A. Wide-angle lenses have a short focal length, roughly 10mm – 20mm. The smaller the number, the wider the lens.

B. Telephoto lenses have a longer focal length, e.g., 200mm – 300mm.

C. Compact cameras often carry a zoom lens such as a 35-105mm. Dividing the long end by the short end gives the zoom ratio: 105 ÷ 35 = 3, so a 35-105mm lens is a 3x zoom. Another example is the Canon G10: its 28-140mm lens is a 5x zoom (140 ÷ 28 = 5).

D. Prime Lenses – Have only one focal length. For example, a 60mm prime has a fixed focal length of 60mm.
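The zoom-ratio arithmetic above can be checked in a couple of lines of code; the figures are the same 35-105mm and Canon G10 examples:

```python
def zoom_ratio(short_mm, long_mm):
    # The zoom ratio is simply the long focal length divided by the short one.
    return long_mm / short_mm

print(zoom_ratio(35, 105))   # 3.0 -> a "3x" zoom
print(zoom_ratio(28, 140))   # 5.0 -> a "5x" zoom, like the Canon G10's 28-140mm
```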

5. Focal Length Examples

A. A short focal length of 10mm or 15mm will give you a very wide field of view (FOV). Focal lengths of 300mm or 400mm make for very long telephoto lenses. Ergo, a small number equals a wide angle; a larger number equals a telephoto lens. Two numbers together, e.g., 35mm – 105mm, mean that the lens zooms from one focal length to the other.

6. Compare Lenses Between Cameras

A. As a rule of thumb, the 35mm format is used as the frame of reference for focal lengths. Any particular lens is often quoted by its “35mm equivalent.” In other words, 35mm is the standard of comparison.

B. Focal lengths on a DSLR depend on its sensor size. NASA’s Nikon D3 is a full-frame DSLR, so its crop factor is 1.0 and its focal lengths already match the 35mm standard – no conversion needed. Crop-sensor Nikons (the DX line) have a crop factor of 1.5; to get the 35mm equivalent, you simply multiply the lens’s focal length by that crop number. Other companies use slightly different crop numbers, so no big deal here. After doing so, you can compare lenses across cameras on equal terms.
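The conversion amounts to one multiplication. A sketch, using a crop factor of 1.0 for a full-frame body and 1.5 for a Nikon DX (crop-sensor) body:

```python
def equivalent_35mm(focal_mm, crop_factor):
    # 35mm-equivalent focal length = actual focal length x crop factor.
    return focal_mm * crop_factor

# On a full-frame body (crop factor 1.0), a 50mm lens frames like a 50mm.
print(equivalent_35mm(50, 1.0))   # 50.0
# On a 1.5x crop-sensor body, the same 50mm frames like a 75mm.
print(equivalent_35mm(50, 1.5))   # 75.0
```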

7. Maximum Magnification (Mag’)

A. A 1:1 mag’ means a lens can project a subject at life size – a subject the width of the camera’s sensor exactly fills the frame.

B. A 1:5 mag’ means the subject is projected at one-fifth life size, which is less mag’ than 1:1. The Nikon D3’s full-frame sensor is about 36mm across. So a 1:1 lens can fill the frame with a subject 36mm across, while a 1:5 lens can only fill the frame with a subject at least 180mm (18cm) across.
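The magnification arithmetic can be sketched the same way, taking a full-frame sensor width of roughly 36mm:

```python
def smallest_frame_filling_subject(sensor_width_mm, magnification):
    # At magnification m (1.0 for 1:1, 0.2 for 1:5), the smallest subject
    # that exactly fills the frame is sensor_width / m.
    return sensor_width_mm / magnification

print(smallest_frame_filling_subject(36, 1.0))   # 36.0 mm at 1:1 (life size)
print(smallest_frame_filling_subject(36, 0.2))   # 180.0 mm at 1:5
```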

8. Field of View or Angle of View

A. Field of View (FOV), or Angle of View – the width of a lens’s view. This is expressed in degrees. For example, a 10mm fisheye lens can have a very wide FOV of 180 degrees, whereas a long telephoto lens will have a very narrow FOV – say, 10 degrees.

B. It is stated as a diagonal measurement, similar to our flat-screen HDTVs: you measure from corner to corner.

C. The FOV can be quoted for full-frame cameras or for crop-sensor cameras.
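For rectilinear lenses, the angle of view follows directly from the focal length and the sensor’s diagonal. A sketch using the standard formula and the ~43.3mm diagonal of a full-frame sensor:

```python
import math

def angle_of_view_deg(focal_mm, sensor_dim_mm=43.3):
    # Rectilinear angle of view = 2 * atan(sensor dimension / (2 * focal length)).
    # The default dimension is the full-frame diagonal (~43.3mm), since FOV
    # is usually quoted corner to corner.
    return math.degrees(2 * math.atan(sensor_dim_mm / (2 * focal_mm)))

print(round(angle_of_view_deg(20), 1))    # a 20mm wide angle: ~94.5 degrees
print(round(angle_of_view_deg(300), 1))   # a 300mm telephoto: ~8.3 degrees
```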

9. Lens Speed and Large Aperture

A. Large-aperture lenses, called “fast” lenses, are marked as such. For example, a 50mm f/1.8, a 17-35mm f/2.8, and a 200mm f/2.0 all let in more light, focus faster, and tend to be sharper – and they are physically larger.

B. Small-aperture lenses, called “slow” lenses, are marked as such. For example, a 300mm f/5.6 and an 18-200mm f/4.5-6.3 are slow: they let in less light, are less sharp, autofocus slowly, and force slower shutter speeds.
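The “fast” versus “slow” distinction is quantifiable: the light a lens gathers scales with the inverse square of its f-number. A quick sketch comparing some of the f-numbers used in the examples above:

```python
def light_ratio(f_fast, f_slow):
    # Light gathered scales as 1 / f-number^2, so the ratio between two
    # lenses is (slow / fast) squared.
    return (f_slow / f_fast) ** 2

print(round(light_ratio(1.8, 5.6), 1))   # an f/1.8 lens gathers ~9.7x the light of f/5.6
print(round(light_ratio(2.0, 6.3), 1))   # f/2.0 vs f/6.3: ~9.9x
```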

Now that we have covered the basics concerning cameras, let us take a closer look at the fisheye lenses that flat Earthers are so suspicious about – often attributing the apparent curvature of the Earth to these lenses.

How do fisheye lenses work in comparison to regular photographs and/or videos? To answer that question let us take a look at some of the following photos.

But after viewing these photos, I thought to myself: “Is there a way I can verify this for myself? After all, I don’t have a DSLR camera.” Then it occurred to me to hold a quarter up close against my Android phone, take a picture, and see what I could do with it – that is, if I could do anything at all with just a quarter and an Android phone from Boost Mobile. I took the shot – just a regular shot – and proceeded to play with it. These are my results.

1st Pic: Quarter held closely up against Android phone.

2nd Pic: Added the color blue from the Android color selector. The silver color of the coin actually helped make it look “Earth-like.”

3rd Pic: I brightened the pic in order to show detail, opting also to zoom out a bit to give it a bodily form of sorts.

4th Pic: Then I discovered that my Android had a fisheye filter. So, I fisheye’ed the picture. When I did, it curved at the bottom. It actually looks like a dome. But we all know that quarters are not domes; they are flat and circular.

5th Pic: I clicked the fisheye again to see what would happen. The pic actually curved even more, thus shrinking the pic. I was left with an incomplete globe.

6th Pic: So I clicked the fisheye again.

7th Pic: …And again.

8th Pic: …And again. Until finally completed.

9th Pic: I proceeded to use the filter options in my Android camera. This was the color result of my choices. I soon had what looked like brownish continents.

10th Pic: But I wasn’t finished. I was having fun. So I enhanced the colors. But as I did, I lost the brownish patches seen in pic 9. But that’s ok. I didn’t mind.

11th Pic: I enhanced it again via the contrast option and tinting choices.

12th Pic: My final result. Looks like Earth, doesn’t it? Except for the absence of continents – I don’t have the capability to draw in land masses. But it is, without a shadow of a doubt, a planet positively rippling with the potential for life, with its beautiful oceans and white clouds.

Not really, it’s just a quarter, remember? And remember, I did not take the picture like this. The picture is what you see in pic 1. I did not move, alter, or change the position of the quarter, or my hand, in any form or fashion. But nevertheless, that is the result – a sphere, a globe. Unbelievable, huh? Well, it happened.

After conducting this visual lesson, I thought to myself: “If I can do all that with just a quarter and an Android phone, just think what NASA can do with their technology, and with what’s there already.” But what technology, in reference to cameras and video, do they use anyway? Let us look into that question. As stated earlier, the recording devices aboard the ISS are as follows: the Kodak 760C; the Nikon D1, D2Xs, D200, D3, D3X, D3S, D4 and D800E; the iPhone 4; the HTC Nexus One; the iPad 2; the HDEV – high-definition Earth viewing camera; the CCTV – a 4:3 standard-definition camera; and the EHDCA – a Nikon D4 in special housing with a motor-controlled 28-300mm zoom.

Upon researching all of the cameras and video recording devices aboard the ISS, I found that the still cameras are all DSLRs or DSLR-compatible. Even the Kodak 760C is a DSLR. According to an Amazon review from a satisfied customer: “This lens works great with the Kodak play-sport. When I purchased this I had some concerns based on the picture that they show and on the one review that was negative. First off, I was concerned that the lens would cut off the corners of the movie screen making the picture round rather than filling up the whole screen, and 2, was it truly a fisheye lens? Well, to answer the question, yes! The lens does take up the whole screen and does not cut off the corners, and it is a true fisheye lens that makes a considerable difference in the field of view.” (20).


Simply put, according to the citation above, an add-on fisheye lens on the 760C gives a wide-angle aspect ratio – keeping the whole picture in view while curving whatever is at the edges. The camera’s 1.3x focal length multiplier also turns a 28mm lens into the equivalent of a 36mm; a 1.5x body would push the same lens toward a 42mm equivalent. It also boasts more horizontal and vertical resolution.

Similarly; “Another big difference is the 760’s 1.3x focal length multiplier. For wide angle work there’s certainly some advantage here, a 28mm lens on the 760 would have the equivalent picture angle of a 36mm lens, on the D1x it would be equivalent to 42mm (1.5x focal length multiplier).” (21).


In Reference to NASA’s Video Equipment:

“After being continuously inhabited for more than 13 years, it is finally possible to log into Ustream and watch the Earth spinning on its axis in glorious HD. This video feed (embedded below) comes from four high-definition cameras, delivered by last month’s SpaceX CRS-3 resupply mission, that are attached to the outside of the International Space Station. You can open up the Ustream page at any time, and as long as it isn’t nighttime aboard the ISS, you’ll be treated to a beautiful view of the Earth from around 250 miles (400 km) up.

This rather awesome real-time video stream (which also includes the ISS-to-mission control audio feed) comes by way of the High-Definition Earth Viewing experiment. HDEV is notable because it consists of four, commercial off-the-shelf (COTS) high-definition video cameras that are each enclosed in a pressurized box, but otherwise they are exposed to the rigors of space (most notably cosmic radiation).

HDEV, which consists of just a single enclosure, was delivered to the ISS a couple of weeks ago by SpaceX CRS-3. The box was connected up to the underside of the ISS via EVA/spacewalk, with one camera pointing forward (Hitachi), two cameras facing aft (Sony/Panasonic), and one pointing nadir (Toshiba, down towards Earth). If you watch the stream, you will notice that it hops between the four cameras in sequence, with gray and black color slates in between each switch. If the feed is permanently gray, then HDEV is switched off – or communications have been lost. Also note that the ISS has an orbital period of just 93 minutes – for a considerable part of that time the station is in the Earth’s shadow and can’t see much.” Published May 2, 2014. (22).


A “nadir” is the point on the celestial sphere directly below an observer. “Aft,” in naval terminology, is an adjective or adverb meaning toward the stern of a ship. The stern is the back or “aft”-most part of a ship – the opposite of the bow/front. So, per the citation, one video camera points forward, two point aft toward the back of the ISS, and one points nadir, straight down. The citation also says that HDEV consists of 4 separate COTS cameras. COTS stands for “commercial off-the-shelf” – ordinary commercially available hardware rather than custom space-rated equipment; in this case, commercial video cameras, including wide-angle units.

Moreover, according to NASA, “The High-Definition Earth Viewing HDEV primary objective is to validate the space-based performance of the cameras in a variety of operating modes to exercise and demonstrate the features and longevity of the COTS equipment for future ISS Program usage. This payload is an external earth viewing multiple camera system using a set of Commercial-off-the-shelf (COTS) cameras. The HDEV integrated assembly is composed of a camera system of four COTS cameras, integrated Command and Data Handling (C&DH) avionics (ethernet), and a power data distribution box that allows the integration of the payload’s components interface to the ISS Columbus module.

The HDEV visible HD video cameras are a fixed payload camera system that requires no zoom, no pan or tilt mechanisms. The four fixed cameras are positioned to capture imagery of the Earth’s surface and its limb as seen from the ISS (i.e., one camera forward pointed into the station’s velocity vector (toward the bow), two cameras aft (wake, toward the stern), and the other one camera pointing nadir).” (23). In other words, one camera looks straight down beneath the station.


However, as of 2015, a new camera was installed alongside HDEV. “The ISS has been streaming HD video of the Earth for the last year, but starting today it will be able to capture four times the detail. A new camera module from a company called UrtheCast is now up and running on the ISS, and it’s capable of capturing 4K Ultra High-Definition (UHD) video. It’s the next best thing to actually being in space / The old HDEV is set to wide-angle, so it captures much more of the surface. However, the UrtheCast 4K camera is set to operate as a telephoto camera that zooms in on the surface. This is high enough resolution that you can actually see cars driving down the road in real time.” (24).


So what does all this mean? Simply put, it means that when we are shown images via camera, or from NASA’s live video streaming feeds, we are seeing the Earth through a wide-angle lens. Is this intentional – to show a curvature? It could be. Or it could simply be an innocent attempt to show us the Earth, in all her glory, within a wide-angled FOV – and, in doing so, unintentionally causing the Earth to appear curved.

Back in the day, before the widescreen format became the norm for television sets, home-video releases always used the “pan and scan” aspect ratio. This was necessary because the old, bulky analog television sets were square – a boxed screen. What we saw on the wide rectangular screens of movie theaters was lost in the transfer from Panavision to pan and scan. For example, the shot in The Graduate of Mr. and Mrs. Braddock, Benjamin’s mother and father, looking down at him while he sunbathes in the pool filled the wide theatrical frame, but was lost in the pan-and-scan video release – we were left seeing only two of the actors in that frame. We knew the other two were there only because of the script, and because the camera constantly had to pan and scan to bring the actors cut off at the edges back into the frame, and thus into the scene.
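How much of the picture pan and scan threw away is simple arithmetic: cropping a wide frame to a narrower screen of the same height keeps only the ratio of the two aspect ratios. A sketch, using the common 2.35:1 Panavision ratio and the 4:3 television ratio:

```python
def width_kept(source_ratio, target_ratio):
    # Fraction of the original frame width that survives a pan-and-scan
    # crop to a narrower screen of the same height.
    return target_ratio / source_ratio

print(round(width_kept(2.35, 4 / 3), 2))    # ~0.57: a 4:3 TV shows only ~57% of the frame
print(round(width_kept(2.35, 16 / 9), 2))   # ~0.76: a 16:9 HDTV shows ~76%
```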

Wide-Angle Aspect Ratio:

Pan-and-Scan Aspect Ratio:

Another example is the opening scene of Star Wars Episode IV (1977). On the big screen, we can see the Princess’s Rebel Blockade Runner, the Tantive IV, flying away from a massive Imperial Star Destroyer in hot pursuit above Tatooine – with a moon on the far left. But in the old pan-and-scan video release, many of the stars were lost, and with them the moon. Everything also seemed so close – too close, really. Hence, no perspective.

Wide-Angle Aspect Ratio:

Pan-and-Scan Aspect Ratio:

Other Examples:

From ‘Saving Private Ryan.’

Today we don’t have to worry about those things anymore, thanks to widescreen DVD and Blu-ray releases with their black bars at the top and bottom of the frame. Moreover, rectangular HDTVs have become the norm. Many of them even offer choices for how one might want to view a movie: pan and scan, various aspect ratios, or letterbox (traditional widescreen – the format most films are made in). There’s even an option for extreme close-ups via the zoom-in and zoom-out capacity. All of this is meant to enhance the viewing pleasure of the audience while preserving the framing the director chose. All of this is intentional.

But I suppose it doesn’t matter to some whether the curvature – or non-curvature – of the Earth is shown intentionally or unintentionally, because according to the mainstream, the Earth is curved regardless. Or is it?

Let us remember my visual object lesson. The quarter, though flat but circular, did appear curved – because it is, but only at its edges. And as I fisheye(ed) it, it curved from underneath, as if I had a camera at the “nadir” position. In actuality, my Android camera was at the stern, because it is located on the back of my phone. But we know quarters are not balls (a globe). Could it be that the images we are seeing are due to such a similarity? It could be. But some might be quick to point out: “Fisheye lenses did not exist when they took this picture.” On April Fools’ Day in 1960, the TIROS-1 satellite photographed the Earth while in motion at low Earth orbit – LEO, or as we shall call it, “low Earth circular” – LEC. It was the first photograph of the Earth as seen from a 413-mile altitude (the thermosphere). The following picture is a photograph taken well before the Apollo missions. The 2nd picture is a photograph taken at a much later date.
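As a geometric aside – taking, purely for the sake of argument, the mainstream spherical model with its mean radius of about 3,959 miles – the full visible disk seen from TIROS-1’s stated 413-mile altitude would subtend roughly 130 degrees, far beyond what any normal or telephoto lens can frame, so only a very wide-angle lens could capture the whole disk at once:

```python
import math

def disk_angular_size_deg(radius, altitude):
    # Angle subtended by a sphere's visible disk from a given altitude,
    # under the spherical model: 2 * asin(R / (R + h)).
    return math.degrees(2 * math.asin(radius / (radius + altitude)))

# Mainstream figures (assumed): Earth radius ~3,959 miles; TIROS-1 at ~413 miles.
print(round(disk_angular_size_deg(3959, 413), 1))   # ~129.8 degrees
```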

The 2nd photograph looks like my quarter, doesn’t it? Well, it is. But this time, all I did was rotate the pic, crop it, and turn it into a B&W picture. I also placed a frame around it to match the 1960 photograph to the best of my ability – or rather, I should say, my Android’s ability. But with or without a fisheye lens, the 1960 photograph doesn’t prove a spherical Earth at all. All it shows is a curvature – just like my flat, circular, curved quarter. “Oh, but what about the math?” One thing at a time, please; I will cover that in an upcoming article. But with reference to fisheye lenses: the term was coined in 1906 by American physicist and inventor Robert W. Wood, with practical uses emerging in the 1920s for meteorology. Fisheye lenses were then mass-produced in the early 1960s and often used in 35mm photography. I find it rather odd that the TIROS satellite photograph dates to 1960. So whom are we to trust – the human eye, or the camera?
From the perspective of the human eye – which is neither wide angle nor telephoto, but somewhere in the middle, like most compact cameras – quarters are flat and circular. But through the eyes of a fish, the view may distort: close and shrink in on itself to form a ball, as the quarter appeared to do. In the mind of a fish (that is, if they have much of a brain at all), they are probably thinking the reverse about us: to them, we are the ones not seeing everything the way it really is. For us, the quarter becomes a circular ball. But observations like this often lead into other fields of notable and esteemed academia – namely, the argument from illusion, realism, the philosophy of perception, and visual space. These topics will be discussed in chapter twelve.