
Brian Wallace

9 Years Ago


Iconic Space Photos Are Actually B&W: Here’s How NASA Colorizes Hubble Shots

Did you know that the Hubble Space Telescope can only capture black-and-white photos? To record the maximum amount of information, NASA shoots multiple black-and-white images through different filters in the camera. These images are then combined in post to create the iconic color photographs you see published by the space agency.

The video below shows how NASA goes about colorizing the photos by compositing the individual shots.
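As a rough sketch of the compositing described above, filtered monochrome exposures can be stacked as the channels of one color image. The arrays and values here are made up for illustration; NASA's actual pipeline involves much more processing:

```python
import numpy as np

# Hypothetical example: three grayscale exposures taken through
# red, green, and blue filters, simulated as 2x2 arrays of
# intensities in [0, 1].
red_exposure   = np.array([[1.0, 0.0], [0.5, 0.2]])
green_exposure = np.array([[0.0, 1.0], [0.5, 0.2]])
blue_exposure  = np.array([[0.0, 0.0], [0.5, 0.8]])

# Stack the monochrome frames as the R, G, B channels of one image.
color_image = np.stack([red_exposure, green_exposure, blue_exposure], axis=-1)

print(color_image.shape)  # (2, 2, 3): height, width, RGB
```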


Mike Savad

9 Years Ago

I'm more interested in the layer modes. I didn't see anything really special in what he used, though I don't think I'd be able to get color from a black and white. I'm guessing that's because the channels would all be the same density of color.

---Mike Savad
MikeSavad.com

 

Mike Savad

9 Years Ago



This sort of explains it: using layers set to screen, each one is adjusted that way. I wonder if there is a way to do it without the different filtered images they have. And now I wonder whether these are accurate to what they actually look like, or just scientific guessing and Photoshop magic.
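For reference, the "screen" layer mode mentioned here can be sketched as follows. This is a simplified model with made-up pixel values; real Photoshop compositing involves more steps:

```python
import numpy as np

# Sketch of the "screen" blend mode: each layer can only lighten the
# result, so stacking differently tinted layers in screen mode builds
# up a colour composite. Pixel values are floats in [0, 1].
def screen(bottom, top):
    return 1.0 - (1.0 - bottom) * (1.0 - top)

base = np.zeros((2, 2, 3))                        # start from black
red_layer = np.zeros((2, 2, 3)); red_layer[..., 0] = 0.8
blue_layer = np.zeros((2, 2, 3)); blue_layer[..., 2] = 0.6

composite = screen(screen(base, red_layer), blue_layer)
print(composite[0, 0])  # red and blue channels lit, green still black
```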


---Mike Savad
MikeSavad.com

 

Paul Cowan

9 Years Ago

The earliest colour photos were made by taking separate B&W negs with red, green and blue filters and using them to print different colours over each other. The Tsar's photographer is the earliest I recall seeing; he was making colour photos in about 1900.
I want to have a go at that sometime. Shooting with filters wouldn't be a problem but I don't know how to assign the colours to different files and then superimpose them in Photoshop. It can't be hard to do but getting the alignment right might be awkward, especially if starting with film that needed scanning.
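The channel-assignment and alignment steps described here can be sketched in code. This is an assumed numpy stand-in for the Photoshop workflow, using toy one-row "scans", with `np.roll` standing in for whatever registration a real scanned negative would need:

```python
import numpy as np

# Three scanned B&W frames assigned to RGB channels. The blue frame is
# deliberately misaligned by one pixel, as a scanned neg might be.
r = np.array([[0.0, 0.9, 0.0]])
g = np.array([[0.0, 0.9, 0.0]])
b = np.array([[0.9, 0.0, 0.0]])  # scanned one pixel to the left

b_aligned = np.roll(b, shift=1, axis=1)  # nudge right by one pixel
rgb = np.stack([r, g, b_aligned], axis=-1)
print(rgb[0, 1])  # all three channels now line up on the bright pixel
```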

 

Paul Cowan

9 Years Ago

You would have different densities in the differently filtered negatives, but I think you would need to shoot them all at the same exposure (and presumably underexpose by a couple of stops to allow for later addition of the three layers). I'm not sure if a digital camera would react the same way, though I don't see why it shouldn't.
Filter density would be an issue, of course, as some might be more dense than others, but once you've got the three channels superimposed you can use the colour channel sliders to increase or decrease the density of different channels.

It's actually pretty much the same as the way colour separations are prepared and printed in newspapers, except that uses CMYK.

 

Mike Savad

9 Years Ago

I think there are even older color ones than that using that process, but they were shifty looking. I think the Photochrom method used something like that. They made color postcards with it.


---Mike Savad
MikeSavad.com

 

Gregory Scott

9 Years Ago

I'll have to beg to differ. They ARE color photos. A digital camera makes 3 simultaneous exposures for sensors of Red, Green, Blue. The fact that they are not making the color separation in a single exposure doesn't mean it isn't color. They shoot with a red filter, they have a red color image; with a green filter, a green color image; with a blue a blue.

This is the way the first color photos were made, before the invention of color film. Everything old is new again.
The process is arguably higher resolution than the conventional 3-color sensor of an ordinary color digital camera, provided the subject doesn't move (not a problem) and the images are precisely aligned (also not a problem).

It is NOT colorized in the sense of painted-in color, such as you commonly see with Scanning Electron Microscopy, which is ALWAYS black and white, since those photos are taken using electrons, with a much shorter wavelength than visible light. Any color you see in such an image is always false color.

It's interesting to note that in astrophotography and other scientific photography, you often see false color images where instead of using RGB filters, filters of other wavelengths are used, for example infrared, or gamma rays. The result is a color photograph, but the colors are shifted (color-shift photography) to wavelengths that we can see.
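The colour-shift idea can be sketched the same way as an RGB merge, just with the channel assignments moved down the spectrum. The filter names and values here are illustrative, not a real astrophotography pipeline:

```python
import numpy as np

# False-colour mapping: frames taken through filters outside the
# visible range are assigned to visible RGB channels, shifting each
# wavelength band down into something we can see.
infrared_frame = np.array([[0.9, 0.1]])   # longest wavelength -> shown as red
red_frame      = np.array([[0.2, 0.2]])   # shifted down -> shown as green
green_frame    = np.array([[0.1, 0.7]])   # shifted down -> shown as blue

false_color = np.stack([infrared_frame, red_frame, green_frame], axis=-1)
print(false_color[0, 0])  # IR-bright pixel renders as strongly red
```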

I've used color infrared slide film, which is an example of single-exposure color shift photography. Here is one example:

 

Paul Cowan

9 Years Ago

Agreed, Gregory.
That IR shot - did you use a red or IR filter when taking it? It doesn't look like I would expect an IR shot to look, but I don't know if I've seen one shot on film before.

 

Gregory Scott

9 Years Ago

Frankly, I don't recall if I used a red filter or not. I don't think I did.
Here is a link to a fact sheet on a similar film, or maybe the same one I used way back then. I don't know whether it's still in production.
Reading the sheet, I don't think it's necessary to use a filter with the film in the link below, but tests would be needed, of course, to be sure:
http://www.kodak.com/global/en/professional/support/techPubs/ti2323/ti2323.pdf

 

Paul Cowan

9 Years Ago

Ah, well, I can cross IR film off my list of things to do if it can't be kept for more than a week at room temperature. It would certainly be spoiled by the time it got to the processor, and probably by the time it reached me in the post.

 

This discussion is closed.