Well, call me crazy, but I just don't see how this proves a Mars hoax by NASA. Pictures 1P137165130ESF2019P2357L4M1.JPG and 1P137165242ESF2019P2357L5M1.JPG indeed appear to be taken from the same location at different times, and the dunetops appear to have shifted somewhat. Isn't that what dunes do? Wind blows, and sand gets moved around. Isn't that exactly what we should expect to see? Doesn't it prove that the scenery isn't just a painting?
Please explain what you are seeing here, and why you think it's evidence of a NASA hoax.
"For shame, gentlemen, pack your evidence a little better against another time."
-- John Dryden, "The Vindication of The Duke of Guise" 1684
As far as I can tell, it looks as if the center of the picture didn't get processed right in later images. It looks as if the data stream just kept cropping the center portion in such a way that it looks odd like this.
Remember, never attribute to malice what you can attribute to mistake.
Is it not just overexposed, so that places the sun hits are saturating their CCD buckets, and getting depicted as 100-percent white?
The shadow on the pancam calibration target is pretty long both before and after that landscape was imaged, from 4:02 to 4:24 local time. Could the foreground have simply been in shadow, and when that majority of the image was properly exposed, then the smaller sunlit portions and bright sky were overexposed?
I guess I'd rather have that than the sliver of sunlit ground properly exposed and all the shadow-drenched foreground black.
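The saturation idea above can be sketched in a couple of lines. This is a toy illustration only (the full-well ceiling and counts are made-up numbers, not actual pancam values): any count above the sensor's capacity simply clips to the maximum output, which is why sunlit patches come out 100-percent white.

```python
# Toy sketch of CCD saturation (illustrative numbers, not actual pancam
# values): any photon count above the full-well capacity clips to the
# maximum digital number, so sunlit patches all come out pure white.

FULL_WELL = 255  # 8-bit ceiling, for illustration

def digitize(photon_counts):
    """Clamp each well's count to the sensor's maximum output value."""
    return [min(c, FULL_WELL) for c in photon_counts]

print(digitize([80, 240, 300, 1000]))  # [80, 240, 255, 255]
```

The last two wells differ by a factor of three in actual light, but both read back as identical pure white.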
I have a feeling that the ground the rover is standing on is at an angle, so that the hill-like feature to the left is actually more horizontal... The rover does seem to be on the outer rim of the crater... this would put the two cameras at different heights, and make the differences harder to understand by just applying the horizontal separation of the cameras...
Originally Posted by ToSeek
Besides, I really don't think NASA is going to conveniently place nearly identical doctored photos next to each other as a convenience to the conspiracy theorists.
And what is the conspiracy here again? That the rovers are actually in an Area 51 hanger or something?
My best guess is that we are seeing some kind of threshold cut-off effect inherent in the software. Certainly if you look closely at some of those photos near the top center, where there is a bit of a bump, there are light areas that our brain fills in as ground but that are exactly the same colour as the sky. So these can't be taken completely at face value, in any case. Also, the edges where the hill areas have mysteriously disappeared are jaggy, suggesting something obviously unnatural has taken place.
I invite anyone with an interest in this image series to check out J. Maki et al. for their explanation of the autoexposure algorithms:
This is a great example of the delicate balance they're trying to strike when picking how to frame an image... this image was, I'm betting, misframed. In short:
When they use the autoexposure function, the exposure continues until a critical % (pixel fraction) of CCD bins reaches a specific pixel value (DN threshold) or higher. The problem in the above image, as opposed to other similar images, is the very low % of the image that has sky in it. If the pixel fraction were 25%, and the sky covers less than 10% of the image, then the 10% sky would max out long before the pixels of ground reach the autoexposure DN threshold. This leaves the image exposing long after the CCD bins for the sky portion have filled and begun to bleed over into the nearby bins. Most other images of the horizon line have been taken with at least ~20% of the image given to the sky, so that the sky brightness is the trigger that ends the exposure.
You can see in the earlier L2, L3, and L4 frames that there are bright (in each filter's color) features in the foreground. They would contribute to the pixel fraction and help cut off the exposure before it goes too far. The later L5, L6, and L7 frames (green-blue) do not have bright features in the foreground, so the entire pixel fraction has to come from the bright horizon (and those bins which are bled into).
It's also good evidence for the nature of the light coming off the ground and sky. The filters which have the most bleed are those at the blue end of the spectrum, lending more credence to Mars' consistent redness. There isn't enough blue in the scene to cut off the exposure until long after the inherent brightness of the sky has overwhelmed the nearby bins.
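The cutoff logic described above can be sketched roughly like this. The DN threshold and pixel fraction here are made-up illustrative values, not the actual MER pancam settings:

```python
# Rough sketch of the autoexposure cutoff described above. The DN
# threshold and pixel fraction are made-up illustrative values, not the
# actual MER pancam settings.

def exposure_done(pixels, dn_threshold=200, pixel_fraction=0.25):
    """True once enough pixels have reached the DN threshold."""
    bright = sum(1 for p in pixels if p >= dn_threshold)
    return bright / len(pixels) >= pixel_fraction

# ~10% saturated sky, 90% dim ground: the cutoff never triggers, so the
# exposure keeps running while the sky bins bleed into their neighbors.
print(exposure_done([255] * 10 + [50] * 90))  # False

# ~25% bright sky: the sky alone satisfies the pixel fraction and the
# exposure ends in time.
print(exposure_done([255] * 25 + [50] * 75))  # True
```

With mostly dim ground and only ~10% bright sky, the cutoff never fires and the exposure runs long, exactly the failure mode described.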
Or... the little rover is moving up the hill, so you are seeing the same view from different angles.
Originally Posted by aporetic_r
Image - time
1P137165054ESF2019P2357L2M1.JPG - Sol 101 - Time 15:48:54
1P137165084ESF2019P2357L3M1.JPG - Sol 101 - Time 15:49:23
1P137165130ESF2019P2357L4M1.JPG - Sol 101 - Time 15:50:08
1P137165242ESF2019P2357L5M1.JPG - Sol 101 - Time 15:51:57
1P137165419ESF2019P2357L6M1.JPG - Sol 101 - Time 15:54:49
1P137165596ESF2019P2357L7M1.JPG - Sol 101 - Time 15:57:41
I already found other fake images by NASA, like this one:
These are the corresponding images obtained by me using the raw images:
And this is taken from this site
A couple of details:
From RAW images:
What I don't understand is why NASA "doesn't like" the natural Martian sky!
Are there Green Men greeting Opportunity in the Martian sky?
Is there some unhappy employee at NASA who would like to discredit NASA?
Did somebody at NASA let a coffee cup fall on the PC, so the image data were corrupted and he had to manually repaint the image?...
Was there a searchlight visible in those photos, as the picture was taken at Universal Studios?...
Guys, I really don't know what to think, but those images are clearly retouched, this is a fact. About the reason, I don't know, I'm not an international detective...
At least two of the photos we're looking at were taken at exactly the same time. I think slinted has the best explanation (involving auto-exposure).
Originally Posted by carolyn
Everything I need to know I learned through Googling.
I don't get it. When the caption to the above picture clearly refers to it as "This enhanced false-color mosaic image from the Mars Exploration Rover Spirit panoramic camera", why do you call it a fake? Were you faked out by the enhanced colors?
Originally Posted by jumpjack
Of course the images are "retouched". It's what distinguishes "raw" from "processed" images.
What's your complaint, exactly? The raw images are there, alongside the interpretations made by the NASA scientists. If you don't like their interpretations of the data, you're free to come up with your own (as you have).
I don't see how your expertise in photointerpretation trumps theirs, though.
Well, that's a really poor example to use for what the true colors should be, when the caption there says: "Since the raw images that are currently available were taken with varied exposure time ratios between the separate filters (even within a single sol) and an unknown level of contrast stretching and gamma level, the images that appear here are not true color because of this distortion (bright blue rocks, purple sky, etc)."
Originally Posted by jumpjack
Do you claim the images you synthesized are true color? Why?
No, the colors are another, long-discussed problem; but they completely deleted the sky and changed it into a nice, very Martian flat-red sky! None of the raw images has a flat sky (gray, or white, or black, or whatever); there is always noise. It has magically disappeared in the color version. Congratulations to the creator of the program NASA uses to enhance the images... =D>
Originally Posted by 01101001
This new "true-color" image is simply offensive:
Do you like that nice, reddish Martian sky? :^o
And what about the reddish terrain? They just replaced the grayscale of the RAW images with a nice, Martianish red scale... unfortunately, in doing so, even the rover now looks red!!! #-o Ooopss...
The soil, in Endurance, is NOT red, it's brown, blue, orange, and yellow....
In this image, we only have red. [-X Anyway, if you think we could never know the true colors of Mars, enjoy that nice red gradient for the Endurance sky! It looks like it was painted by a kid.
Won't "lossy" compression algorithms introduce artifacts like this?
Whatever the source of the artifact in the processed images: what exactly is it that you are accusing NASA of?
Spot the difference between "noise" and "detail"...
Originally Posted by jumpjack
The main goal in processing any photo is to make it look as good as possible; whatever filters and adjustments it takes is irrelevant.
Why do you think you got the raw images handy for comparison?
When NASA first landed on Mars, the first released Viking images showed the sky as being blue, like it is on Earth. Later on, however, they realized the error and corrected it. A full account, complete with explanations, can be found here.
You're saying NASA does not want to admit that, for some reason or another, the sky in the images should be blue. Well, for a while they thought it was blue, but when they found they were wrong they admitted the error and corrected it.
I also can't help but wonder: if NASA was really trying to hide something or other about the Martian sky really being blue, wouldn't they have been consistent about it? Why switch over later, when everyone thinks the sky on Mars is blue? It doesn't make much sense to me.
Nor to me.
Originally Posted by Andromeda321
I didn't say they paint the sky with flat red to cover the blue, though; they change it, but I don't know why. :-?
A while back there was a topic about Martian color.
Amongst it was a very technical geek explanation of why the color was supposed to be reddish (for me, like many others, not at all accessible to read; call me stupid).
I just don't buy it: the Apollo missions had good color photos over 30 years ago, Hubble has shown us magnificent images, we have accurate satellite images from Earth orbit, etc... but when Mars images hit our screens there seems to be an endless debate about the colors, and indeed comparing some images does raise relevant questions about coloring. Note that most rover images are nowadays black & white, which probably addresses these issues.
In speculation as to why NASA doesn't seem to have an interest in presenting 'true-color' images (note: speculation!):
1) To address the idea that the planet was characterized as 'red' throughout history.
2) To avoid detection of possible moss, algae, thus life, by rendering images red.
Some related topics:
Of interest are specific scientific studies about possible plant life, linked at:
The recent discoveries of methane, ammonia and formaldehyde might strengthen that.
2a) The (religious) implications of life elsewhere are not something you would consider a standard press release; remember, not too long ago we thought Earth & humanity were the center of the Universe, and that God created mankind in his likeness..., along with plants....
Just my $0.02
Detail = part of the image
Originally Posted by lek
Noise = several "dots" spread around the image, due to JPEG compression, or actual color differences, or dust in the atmosphere, or what else you prefer...
Because if you build an image from low-res or noisy images, I suppose you'll obtain a low-res or noisy image! But in the NASA color image the noise magically disappears, and the sky is perfectly flat!
Anyway, the best thing would have been to place a commercial $300 digital camera on the rover and let it take some color snapshots: it wouldn't be very scientific, but it would have solved the "true-color" problem! When I take snapshots of my room with my camera, they LOOK yellow, due to artificial light... but they ARE ACTUALLY yellow; my room IS yellow, as the artificial light IS yellow!
But, if I want, I can activate automatic white balance just by pressing a button while pointing the camera at a white sheet: the sheet "turns" white, and the room loses its yellow component...
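That one-button white balance amounts to scaling each color channel so the reference sheet comes out white. Here's a minimal sketch, assuming a simple per-channel scaling, which is only a rough model of what a real camera does:

```python
# Minimal sketch of one-button white balance (a simple per-channel
# scaling, only a rough model of what a real camera does): scale each
# RGB channel so the chosen reference patch maps to pure white.

def white_balance(pixel, reference):
    """Scale channels so `reference` becomes (255, 255, 255)."""
    return tuple(min(255, round(p * 255 / r)) for p, r in zip(pixel, reference))

# The "white" sheet looks yellow under the room's artificial light:
sheet = (255, 240, 180)
print(white_balance(sheet, sheet))  # (255, 255, 255): the yellow cast is gone
```

Every other pixel in the scene gets the same per-channel boost, which is exactly why the calibrated picture no longer shows how the room looked under that light.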
So, where's the fakery by NASA again?
So for some reason they removed uninteresting stuff from the pic... What's the big deal? The caption doesn't go "Ooh, look at that smooth sky", does it?
Originally Posted by jumpjack
As for your digital camera... It makes tons of adjustments automagically to produce what it "thinks" counts as a picture with good color balance and lighting, and typically the result isn't reality, and definitely not how you saw it with plain eyes.
People want to see the colors in those pics as they look over there, not how they would look in white light. So "calibration" like showing the camera "this here is white" wouldn't tell us anything about how the scenery looks over there.
Ok, no fake, it's true, you are right.
Originally Posted by LTC8K6
Two quick points:
For your version of the color image, the link you provided (http://www.lyle.org/mars/bysol/2-089.html) does not appear to have the proper pictures to make an RGB composite image. Someone could probably help me out here, but to make an RGB composite image, I think you would have to combine exposures taken with the L4, L5, and L6 filters. The site you reference only has the L5 exposures.
Speaking of color, I'm not sure even that would give you a "true-color" representation. It would probably be close, but color is dependent on many factors.
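For what it's worth, the stacking step itself is trivial. Here's a rough sketch that zips three single-filter grayscale frames into RGB pixels, using tiny made-up 2x2 frames and the L4-red, L5-green, L6-blue assignment suggested above; a real composite would also need the exposure equalization discussed elsewhere in this thread:

```python
# Rough sketch of an RGB composite from three single-filter grayscale
# frames (tiny made-up 2x2 "images"), assigning L4 -> red, L5 -> green,
# L6 -> blue as suggested above. A real composite would also need to
# equalize the exposures between frames first.

def composite(l4, l5, l6):
    """Zip three grayscale frames into one frame of (R, G, B) pixels."""
    return [[(r, g, b) for r, g, b in zip(row4, row5, row6)]
            for row4, row5, row6 in zip(l4, l5, l6)]

l4 = [[200, 180], [160, 140]]  # red-filter frame
l5 = [[100, 90], [80, 70]]     # green-filter frame
l6 = [[50, 45], [40, 35]]      # blue-filter frame
print(composite(l4, l5, l6)[0][0])  # (200, 100, 50): a reddish pixel
```

This is also why having only the L5 exposures is a dead end: with one filter you can fill only one of the three channels.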
As for the painted sky, I think you would have to compare the raw image files (as in RAW-format files). Comparing JPEGs is a little iffy, since JPEG is a lossy compression format. Perhaps you could compare them if you could get the original JPEG files from the rover website.
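As a toy illustration of why JPEG comparisons are iffy: lossy codecs quantize their coefficients, so nearby values collapse together and the decoded data only approximates the original. The quantization step here is a made-up number, not anything from the actual JPEG tables:

```python
# Toy illustration of lossy compression (the quantization step Q is made
# up): coefficients are rounded to multiples of Q, so slightly different
# values collapse together. That collapse is where blockiness and other
# artifacts come from, and why comparing JPEGs pixel-by-pixel is iffy.

Q = 16  # illustrative quantization step

def lossy_roundtrip(values):
    """Quantize and dequantize, as a lossy codec's coefficient stage does."""
    return [round(v / Q) * Q for v in values]

print(lossy_roundtrip([3, 100, 101, 200]))  # [0, 96, 96, 192]
# 100 and 101 become identical, and none of the values survive exactly.
```

So two renditions of the very same scene can legitimately differ pixel by pixel after separate trips through a lossy encoder.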
JumpJack:
Originally Posted by jumpjack
I am responsible for the images posted at http://www.lyle.org/~markoff/ and helped build the color formulas used at http://www.lyle.org. You are quoting, twice now in this thread, images from my site as proof of NASA's vast color conspiracy, even as someone has already pointed you to the warnings on the front page of my site stating, very simply, that my images ARE NOT REAL COLOR.
I, as a complete amateur in this field, would not claim for a moment that some specific color found in my images is 'correct' in any way, shape, or form over an image produced by JPL's MIPL group. They have access to the exposure data and know full well what contrast scaling the individual frames have in comparison to each other, which I and anyone else constructing images from the raw JPGs do not.
So, why does the JPL image have such a consistent sky color and not show artifacts from compression or pixel noise? Because they did their jobs and they did them well.
1. The JPGs being distributed are not the full-resolution images produced by the rovers, which use ICER compression. They are choosing to distribute the images as reasonably sized JPGs, which creates some artifacts in and of itself over the full-resolution images. They have previously stated that, over the course of the mission, they have had literally billions of web hits. I think it's perfectly reasonable, given their budget limitations, that the images distributed would be ever so slightly downsampled in order to mitigate unreasonably high bandwidth costs.
2. Creating TRUE color using the raw JPG images requires compensating for wildly varying amounts of contrast stretching. I state clearly on the front page of my site that this is a step that I do not do, though it is completely necessary in order to create true color. I do not do this step because it requires information regarding the exposure of every individual image, which is not yet available but which will become public with the PDS release (first one comes in August).
Removing that stretching actually reduces noise in the images, as it lessens the variation between pixels. The greatest 'culprit' of all the filters, in this regard, is the L7 filter, which was used in both of the color frames you quoted in this thread. If they used 'a commercial $300 digital camera on the rover' as you stated, then each individual RGB channel would receive near-equal exposure, but in the case of the rovers, taking each filter frame individually, this is not the case. The exposure data of several specific images has been released through the Maestro datasets, and it shows that while an average frame through the L2 filter is exposed for less than a second, frames taken through the L7 filter are often exposed 5-10 times longer, in order to pick up as much detail as possible. This means that features in the blue end of the spectrum are in fact much LESS bright than they appear to be in the L7 frames taken as-is. It also means you have to scale the brightness of these images down in order to equate them with other, less bright frames to produce a single color image. I cannot quote exact numbers here (if I could, I would be doing it myself), but the scaling factor is probably 2x-3x, meaning: if you take an L7 image whose stretched contrast values, in 8-bit grayscale, run from 0 to 255 and equalize its brightness relative to the other frames, the L7 image would have values from 0 to 128, possibly as low as 0 to 75. Downscaling the images in this fashion would reduce the pixel-to-pixel variation greatly, leaving a more consistent image, especially for bright features such as the sky... as was posted by JPL.
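The downscaling step described here is just a multiplication by the exposure ratio; a minimal sketch, with made-up exposure times (the real MER values are not public until the PDS release):

```python
# Minimal sketch of the brightness equalization described above, with
# made-up exposure times: an L7 frame exposed twice as long as the
# reference gets its values halved, collapsing a 0-255 stretch toward
# 0-128 and shrinking pixel-to-pixel variation along with it.

def equalize(frame, exposure_s, reference_exposure_s):
    """Scale pixel values by the ratio of exposure times."""
    scale = reference_exposure_s / exposure_s
    return [round(p * scale) for p in frame]

print(equalize([0, 128, 255], exposure_s=4.0, reference_exposure_s=2.0))
# [0, 64, 128]
```

Note how any pixel-to-pixel noise in the original frame shrinks by the same factor, which is one reason a properly equalized composite looks smoother than one built straight from the stretched raws.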
3. Between the raw images and the PDS release of calibrated images lie many steps and stages of calibration. Some of this calibration is done on the rover, though most is done on the ground in the months leading up to the PDS release. I believe the raw images that are being released are just that, raw. At JPL, I'm sure MIPL is using the most calibrated images available at the time to make their color products. This means they could have already compensated for dark current, flat-fielding, etc. after the raws were released, and used those images for color processing to create the beautiful pictures they have been posting through the press images section of the website. Every step of that calibration is likely going to reduce the noise even further beyond what is seen in the raw images, leading to a more consistent, and less noisy, product.
4. The original "Heading for the Hills" panorama you selected, http://marsrovers.jpl.nasa.gov/galle...20040408a.html, states clearly in its description that it is FALSE-COLOR and ENHANCED. One of the many good tools for creating a visually distinctive image with easily discernible features is a low-level blur function. This evens out discrepancies between individual pixels, leading to a more consistent image. I would not be surprised if that was done in this case as well, leading to a more consistent, and frankly more realistic, image of the sky.
I have been posting my color images from the raws since the mission began, to fill in the blanks where NASA's amazing imaging staff haven't been able to post color images. I do so knowingly without the requisite information to make true-color images, which isn't yet available. I look forward to working with calibrated images in the future, but I am happy to make do with what is available for now. I like to see the variation that does come out in color shots, which you don't see in the black-and-white individual frames, but I never intended to usurp, compete with, or even compare to the validity of NASA's releases. Frankly, I have no intention of having my images used to make claims against NASA, especially when I have tried hard to explain why they are probably right and it is my images that are in the wrong. I would hope that you would refrain from referring to my images as anything other than what they are: an amateur's attempt to begin to get a feel for Mars and its environment in an unscientific but visual way.
How the hell should I say it?!?
Originally Posted by slinted
Let's try in this way:
I KNOW THEY ARE NOT TRUE COLOR!
This is NOT the point.
The point is the damned sky: NASA's sky is FALSE; it could be green, yellow, or magenta, but it is clearly FALSE, as it has no trace of compression artifacts, which means the original uncompressed image (if any...) had PLAIN COLOR replacing the original sky.
Compression artifacts are EVERYWHERE in that damned photo, except in the sky. Is the Martian sky magic, so that when we build color images of it, it magically loses all noise, compression artifacts, and whatever else?!?