The C++ source code used to run these tests has been added to the Downloads section of the blog under the name Aqua Satellite Project (BETA).
While I said words to this effect at the end of the post, I wanted to make sure it's clear: these findings are preliminary. Although nearly two months of research has gone into them, more needs to be done.
In the meantime, folks wanting to see the data for themselves can download it from NASA by following the steps here: UAH Satellite Data, and download the viewers by following the steps here: Looking At The Aqua Satellite Data.
Killing two birds with one stone, I decided to give my AMSU C++ code a bit of a workout by doing a Quality Control spot check on the January UAH raw data. I downloaded all the January Level 1B data for the AMSU from NASA. There are 7410 files for the month of January. Here's what I found:
No Files Passed NASA's QA. None.
Of these 7410 files, 7386 had their automaticQualityFlag marked "suspect" and the other 24 had the flag marked "failed". Add that up and all 7410 files for January were either suspect or failed. Not a single file was marked "passed".
Of the 7386 suspect files, every one had a "Good Data" percentage of 93.33334 percent, meaning that's the percentage of data in each file considered reliable. There's no way such a number would actually be identical out to five decimal places 7386 times out of 7386 tries, so I figured my code was reading the wrong data for the good/bad/missing/special values.
Nope, my code was good. Manually checking a random selection of about 20 files by opening them in an editor revealed they all had identical numbers in these fields. In other words, the good/bad/missing/special data flags appear to have hard-coded values, at least for "suspect" data.
Channel 4 Never Works (Which Means Channel 5 Never Works Either)
Each of the files contains 45 readings from each channel. Of these 45 readings, channel 4 failed all 45, in every file. The only exception was the files that had been marked "failed". So there was never a good reading from channel 4 for the entire month of January.
This is significant because the statistics engine used to correct the satellite readings for channel 5, the channel UAH uses to get its anomaly information, requires data from channels 4 and 6.
So this means they're either using a non-published algorithm to calculate the anomalies, or the anomalies are wrong.
Screenshot Of QA Spot Check Results
Here are a few things I think of that may affect the validity of this spot check:
- I'm not a scientist, I'm a software engineer.
- Prior to working on the Aqua satellite as documented in this blog, I had no experience working with satellite data.
- Despite having checked my code, it is beta code, and I may have missed errors.
- It's possible that data files are not given "passed" status (as opposed to "suspect" and "failed") until some QA procedure that takes place after Level 1B occurs.
- It's possible the order of channel information in the data file does not represent the actual channel number. In other words, the fourth channel in the data file may not be channel 4.
- UAH uses other satellites in addition to the Aqua AMSU.
Still, further research needs to be done to triple-check these things, and that's what I'm working on now. If I don't come across anything that changes my views, I'll forward these results to Dr. Spencer and Dr. Christy for their input.
At this point, while admitting the possibility of an error on my part, I'd have to say it seems to me the January AMSU data has no validity at all. Nor does anything derived from it. The hard-coded quality values and the complete failure of channel 4 both look like show-stoppers to me.
Previous Posts In this Series
So, About That January UAH Anomaly
A Note On UAH's High January Temperature
The Limb Adjustment of AMSU-A Observations: Methodology and Validation