Showing posts with label Raw AMSU Data. Show all posts

Tuesday, May 4, 2010

Appealing NASA's FOIA Decision

I've decided to appeal the FOIA decision from NASA, JPL. To do this I have to write out a hard copy letter and snail mail it to the appeals office. Here's the copy of that letter:

Dear Sirs,

I'm writing this letter to appeal a FOIA decision involving NASA JPL. The decision involved 3 issues, the first of which has been resolved. For the second and third issues, I've included relevant background information as well as the reasons I'm appealing.

Issue 1: Radiative Transfer Algorithm Used By AIRS.
Status: Resolved.

Issue 2: Atmospheric Scan Depth Of Aqua Satellite AMSU Channel 5 Footprints.

Background
The Aqua satellite has an AMSU instrument that scans the atmosphere at 15 different channels. Each of these channels scans 30 different locations in the atmosphere. These locations are sometimes referred to as "footprints". Each footprint scans at a different height in the atmosphere.

NASA Discussion On How This Data Is Used
Where reliable sensor data is available, it is applied directly to the appropriate portion of the atmosphere, taking into account the angle of the observation.
- Steven Friedman, NASA JPL, Personal Correspondence (Included)

My FOIA Request
(2) Atmospheric scan depth for each footprint on channel 5 of the AQUA AMSU.

FOIA Response
This is to advise that NASA has no responsive Government records at JPL for parts (2) and (3) of your request.

Why I'm Appealing This Decision
NASA JPL has indicated that they apply each scan to the appropriate portion of the atmosphere. This is impossible to do if it is unknown at what depth of the atmosphere a given footprint is scanning.

Issue 3: The 230000 Scan Readings And/Or The Values Of Vectors Ai And Theta Bar i Used To Synthesize Aqua AMSU Channel 4 Data

Background
Aqua's AMSU channel 4 failed in late 2007, as described in AMSU-A Channel 4 NeDT Update: 20 December 2007 at the document archive. The data for channel 4 is now artificially generated. The generation process uses two vectors, Ai and Theta Bar i. The values for these vectors are not available to the public. The data for the vectors was itself generated from 230000 scans taken from the AMSU. This data is also not available to the public.

NASA Discussion On How This Data Is Used
See AIRS/AMSU/HSB Version 5 Modification of Algorithm to Account for Increased NeDT in AMSU Channel 4 at http://disc.sci.gsfc.nasa.gov/AIRS/documentation/v5_docs/AIRS_V5_Release_User_Docs/V5-Modification-for-AMSU-Ch-4-NeDT.pdf for a detailed discussion on this data. In particular, see equation (1) for the use of Ai and Theta Bar i.

My FOIA Request
(3) In references to the creation of synthetic readings for the AQUA AMSU channel 4, the 230000 cases used to create the values for the vectors Ai and Theta Bar i, or the values of vectors Ai and Theta Bar i themselves if the 230000 readings are no longer available. These values and readings are referenced but not actually provided in the document AIRS/AMSU/HSB Version 5 Modification of Algorithm to Account for Increased NeDT in AMSU Channel 4 available online at http://disc.sci.gsfc.nasa.gov/AIRS/documentation/v5_docs/AIRS_V5_Release_User_Docs/V5-Modification-for-AMSU-Ch-4-NeDT.pdf..."

FOIA Response
This is to advise that NASA has no responsive Government records at JPL for parts (2) and (3) of your request.

Why I'm Appealing This Decision
NASA JPL has indicated that this data is used to generate the synthesized channel 4 data. This would be impossible to do if the values for vectors Ai and Theta Bar i were not known to them, as these values are two of the four values used to generate the synthesized data.

Note that NASA JPL also indicated that NASA GSFC may have responsive records for issue 3. I am in contact with them to see if this is the case, but as my ability to appeal this decision is limited to 30 days, I am simultaneously appealing.

Thank you for your time and help in this matter.

...and the e-mail sent to NASA, GSFC for the data used to generate synthetic channel 4 values:

I was informed by Dennis B. Mahon of NASA, JPL that NASA GSFC may have information regarding data used to generate synthetic channel 4 data for Aqua's AMSU channel 4. Specifically, I am looking for the 230000 cases used to create the values for the vectors Ai and Theta Bar i, or the values of vectors Ai and Theta Bar i themselves if the 230000 readings are no longer available. These values and readings are referenced but not actually provided in the document AIRS/AMSU/HSB Version 5 Modification of Algorithm to Account for Increased NeDT in AMSU Channel 4 available online at http://disc.sci.gsfc.nasa.gov/AIRS/documentation/v5_docs/AIRS_V5_Release_User_Docs/V5-Modification-for-AMSU-Ch-4-NeDT.pdf

See equation (1) of the referenced PDF for specific details.

Please note that this request is associated with a FOIA involving NASA, JPL that is currently under appeal.

Previous Posts In This Series
NASA Responds To FOIA Request
Quick FOIA Update
The Government Way
FOIA Request Filed With NASA

Friday, April 16, 2010

NASA Responds To FOIA Request

NASA has responded to my FOIA request.

In a nutshell, they provided me with a link to the AMSU-A Radiative Transfer Algorithm documentation and said they had no information on the scan depths for the footprints of channel 5 on the AMSU and didn't have the vector data they use to synthesize AMSU channel 4.

Both of the claims of "no information" seem impossible to me, as they're needed to process AMSU data the way NASA JPL does.

While I mull over my next move, here's their response.

Your Freedom of Information Act (FOIA) request for release of information from the files of the National Aeronautics and Space Administration (NASA) was received in FOIA processing at the NASA Management Office-Jet Propulsion Laboratory on April 8, 2010. You requested the following:

"...(1) Documentation on how the AQUA AMSU-A Radiative Transfer Algorithm works. (2) Atmospheric scan depth for each footprint on channel 5 of the AQUA AMSU. (3) In references to the creation of synthetic readings for the AQUA AMSU channel 4, the 230000 cases used to create the values for the vectors Ai and Theta Bar i, or the values of vectors Ai and Theta Bar i themselves if the 230000 readings are no longer available. These values and readings are referenced but not actually provided in the document AIRS/AMSU/HSB Version 5 Modification of Algorithm to Account for Increased NeDT in AMSU Channel 4 available online at http://disc.sci.gsfc.nasa.gov/AIRS/documentation/v5_docs/AIRS_V5_Release_User_Docs/V5-Modification-for-AMSU-Ch-4-NeDT.pdf..."

This is to advise you that responsive records to part (1) of your request may be found at the following web site:

http://eospso.gsfc.nasa.gov/eos_homepage/for_scientists/atbd/docs/AIRS/atbd-airs-L2.pdf

This is to advise that NASA has no responsive Government records at JPL for parts (2) and (3) of your request. Additionally, pertaining to part (3) of your request, we have no responsive records because the specific research and development was performed by the NASA Goddard Space Flight Center (GSFC). Therefore, it is possible that GSFC may have responsive records.

You have appeal rights concerning these actions.

And I do want to thank them for not charging me for this request. :)

Previous Posts In This Series
Quick FOIA Update
The Government Way
FOIA Request Filed With NASA

Thursday, April 8, 2010

Quick FOIA Update

Just dropping a note to update the status of my FOIA. In the previous post I noted that NASA JPL wanted me to state up front how much I was willing to pay for the information.

Having no idea what these fees are for, I figured the next step is to ask why they charge them. That way I can give some sort of reasonable answer on what I'd pay. Here's my response:

Dear Mr. Mahon,

As I mentioned in my previous e-mail, this is my first FOIA request. So, if I may, I'd like to ask the purpose of the fees. Are they to cover mailing costs? Will I be actually paying the salary of the person looking up the information? Are they arbitrary fees designed to prevent "fishing expeditions"? The reason I ask is so that I can provide you with an answer that is sensible for the purpose of the fees. If possible, providing an "average fee" for obtaining one document would be useful.

Previous Posts In This Series
The Government Way
FOIA Request Filed With NASA

Wednesday, April 7, 2010

The Government Way

I got a response back from NASA JPL regarding my FOIA request. They want to know up front just how much I'm willing to pay for the information. Gotta love how our government operates. :)

Here's the text (emphasis mine):

Greetings:

Please respond with your complete postal mailing address, phone and fax numbers. Also, you must state your willingness to pay fees that may be assessed in processing your request. Please reply stating your willingness to pay fees and the amount you are willing to pay so we may proceed with your request. Thank you.

So now I'll have to mull over just how much I'm willing to pay for information my taxes already paid for.

Tuesday, April 6, 2010

FOIA Request Filed With NASA

About three weeks ago, as I noted in this post, I sent off a request to NASA JPL for information needed to reproduce their synthetic creation of AMSU channel 4 data and for the atmospheric scan depths of each footprint for channel 5 on the AMSU.

I've not received a response back, and 3 weeks is one week longer than my usual waiting period of two weeks. So today I filed a FOIA with NASA JPL for the information. A copy of the request is shown below.

Dear Mr. Mahon,

This is my first FOIA request, so allow me to apologize in advance for any procedural mistakes I may make.

About three weeks ago (on or about March 13th, 2010) I requested several pieces of information from NASA JPL via the "Ask AIRS" web interface and have received no response. So I am now requesting that information via the FOIA. The requested information is:

● Documentation on how the AQUA AMSU-A Radiative Transfer Algorithm works.

● Atmospheric scan depth for each footprint on channel 5 of the AQUA AMSU.

● In references to the creation of synthetic readings for the AQUA AMSU channel 4, the 230000 cases used to create the values for the vectors Ai and Theta Bar i, or the values of vectors Ai and Theta Bar i themselves if the 230000 readings are no longer available. These values and readings are referenced but not actually provided in the document AIRS/AMSU/HSB Version 5 Modification of Algorithm to Account for Increased NeDT in AMSU Channel 4 available online at
http://disc.sci.gsfc.nasa.gov/AIRS/documentation/v5_docs/AIRS_V5_Release_User_Docs/V5-Modification-for-AMSU-Ch-4-NeDT.pdf

Thank you for your time.

P.S.

Just in case you're wondering where my normal two-week waiting period came from, two weeks is the amount of time it took Stephen Hawking to respond to a question I had regarding black holes and quantum physics. Given Dr. Hawking's well-known medical issues, this should be a reasonable time frame for any reply from anyone else.

Saturday, March 20, 2010

One Month Of AMSU Channel 5 Data


This video shows 1 month of Aqua AMSU Channel 5 data as it was displayed during one of my debugging sessions. The video is kinda blurry due to low resolution, but don't worry, you wouldn't be able to read all the data scrolling by even at HD resolution. Enjoy :)

Update:
Here's a screen shot of the end of the video in better resolution so you can see the data is nothing but the numeric readings from the channel.

Tuesday, March 16, 2010

What's Wrong With This Picture?

Crazy Data From NASA Or Bugs In My Programs?

While I'm waiting to hear from NASA, I've started doing a QA check both of my code and the Aqua Satellite AMSU data. The above picture shows an example of some of the things I've come across. The picture shows NASA's AMSU channel 5 data for every day in January, 2008, and January, 2009 through January, 2010 for all 30 footprints (14 months total). The view along the X axis is by footprint, with an average at the end. The data was generated by my AMSUSummary program and displayed in Apple Numbers.

You can see the obvious problems with the data. Footprints 17 and 18 are just whacked. Footprints 25 through 30 go in the wrong direction, and footprints 1 through 6 have too high a value. Compare this to the more standard picture of what such a footprint snapshot should look like, shown below.

What 30 Footprints Should Look Like.

Below is the same data again, this time the X axis represents time.


And here's daily data from December 31st, 2009. This data has been through my QA process and none of the scans for footprints 24 or 25 passed QA. The QA checks to make sure none of the data is more than +/- 50% of the expected readings as defined in the literature.
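That +/- 50% bound is just a range check, which can be sketched as below. The 250 K expected value in the example is a placeholder, not one of the literature values my QA actually uses:

```python
def passes_qa(reading_k, expected_k, tolerance=0.5):
    """Return True when a reading is within +/- tolerance (default 50%)
    of the expected reading taken from the literature."""
    low = expected_k * (1 - tolerance)
    high = expected_k * (1 + tolerance)
    return low <= reading_k <= high

# With a placeholder expected value of 250 K, the QA window is 125-375 K:
print(passes_qa(248.7, 250.0))  # True
print(passes_qa(110.0, 250.0))  # False
```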


So the question is: is this strange data due to bugs in my code, or is this the way the data actually looks? I've already done QA checks on my code, but for this next week, I'll be going over it again to see where the problem lies.

So for the rest of the week expect QA posts from me. In addition to checking out the code and data, these posts will give good instructions on how to use the programs I've written so far.

Monday, March 15, 2010

Request Sent To NASA For Unpublished Data And Algorithms Related To Creation Of Synthetic Channel 4 Data

As discussed in a previous post, NASA now synthesizes channel 4 data of the Aqua AMSU. Going through the steps needed to recreate this synthesis, I noticed several sets of required data are not available to the public. So I've sent off a request for this data as well as the associated (and undocumented, as far as I can tell) AMSU-A Radiative Transfer Algorithm. I've also requested information regarding the depth of the atmosphere each footprint on channel 5 scans so that I can get back to work on recreating the UAH temperatures.

The request was sent a few days ago. Hopefully they'll respond. They've already sent one piece of missing data, the at-launch noise for each channel (that's the NEDTi in the formula at the top of this post). So my thanks to NASA JPL for that.

The requested data is:

● The 230000 cases used to create the values for the vectors Ai and Theta Bar i, or the values of vectors Ai and Theta Bar i themselves if the 230000 readings are no longer available.
● Documentation on how the AMSU-A Radiative Transfer Algorithm actually works.
● Atmospheric scan depth for each footprint on channel 5.


References:
NASA Responds
AIRS/AMSU/HSB Version 5 Modification of Algorithm to Account for Increased NeDT in AMSU Channel 4

Wednesday, March 10, 2010

Note On NASA's Lack Of Limb Adjustment

I talked about limb adjustment in this post. In part of NASA's reply to my questions about their data, they noted they don't do limb adjustments to the data. Specifically, they said:
The AIRS retrieval code (statistics engine) does not incorporate a limb adjustment as you have described above. Where reliable sensor data is available, it is applied directly to the appropriate portion of the atmosphere, taking into account the angle of the observation.
So unlike NOAA, which adjusts the readings so they all represent the same altitude, NASA uses the different readings at different footprints for exactly what they are: measurements at different levels of the atmosphere.

This means that I have to make sure the footprints I'm using are the footprints used by UAH. It may turn out that footprint 15 of the channel 5 data, which I used in this previous post, is not used by UAH. I may need to look further out to get the data UAH uses. So when I finally get to part two of the UAH anomaly post, I'll make sure I'm using the correct footprints.

References:
AMSU Limb Adjustment
NASA Responds
Trying To Find The UAH January Anomaly In The Raw Data, Part 1 Of 2

Tuesday, March 9, 2010

NASA Responds

UPDATE:
Re-reading their response, I noticed they didn't actually admit in the body of the e-mail to making up all of channel 4's readings. But that's what they do. You can read about it at one of the links they provided here: AIRS/AMSU/HSB Version 5 Modification of Algorithm to Account for Increased NeDT in AMSU Channel 4
===

I just got a response from NASA about the various questions I raised in this post regarding the AMSU on the Aqua satellite.

Basically, their response is that channel 4 failed sometime in late 2007 and now they invent the readings from channel 4 from whole cloth and feed those invented readings into the calculations.

Despite my post on noise, this invention of an entire channel's readings isn't as far-fetched as it first sounds. The recurring patterns that exist in all the footprints for all the channels make this far easier to do than if we were dealing with something like land-based thermometer readings. For example, given a single footprint reading on any channel, you can make a reasonable guess as to what all the other footprint readings will be for that channel. The between-channel values also have a very consistent relationship to each other. An example of that consistent relationship is shown in the graphic at the beginning of this post.
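The "reasonable guess" idea can be illustrated with a toy offset profile. The offsets below are invented purely for illustration; a real profile would be derived from the recurring cross-footprint pattern in historical scans:

```python
# Invented example offsets (in K) of a few footprints relative to
# footprint 15. These numbers are made up for illustration only.
OFFSETS_FROM_FP15 = {14: -0.3, 15: 0.0, 16: -0.3, 17: -0.9}

def estimate_other_footprints(fp15_reading_k):
    """Guess other footprint readings on a channel from a single
    footprint 15 reading, using the fixed offset profile above."""
    return {fp: fp15_reading_k + off
            for fp, off in OFFSETS_FROM_FP15.items()}

estimates = estimate_other_footprints(248.5)
print(estimates[17])  # roughly 247.6
```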

They also said that having no automaticQualityFlags for a month marked as "passed" is not a big deal, and that the values for badData, etc., are not the same in every file. On that last one, I'm not so sure. I dug into the files with a text editor and looked at the badData values myself, rather than just trusting my code. They were all the same. But, for now, I'll take their word on it and assume there's something there that I'm not yet understanding.

Anyway, here's their response:


Thank you for your interest in Aqua AMSU-A data. Before your specific questions are answered, understanding the following background information about Aqua AMSU-A will be helpful. All documents referenced here can be found at our public archive at the Goddard Earth Sciences Data and Information Services Center (GES DISC) at http://disc.sci.gsfc.nasa.gov/AIRS/documentation.

Aqua AMSU-A is a microwave sounder that is very similar to AMSU-A instruments flown on many NOAA satellites as well as on the European MetOp-A satellite. Aqua AMSU-A senses the atmosphere in 15 distinct data channels. One of these channels, Channel 4, failed in late 2007, as described in AMSU-A Channel 4 NeDT Update: 20 December 2007 at the document archive.

Aqua AMSU-A products include a variety of flags to indicate data quality. However, not all data quality flags are particularly useful for indicating actual data quality. In particular, ‘AutomaticQualityFlag’ is not an effective flag to check because any datasets less than 100% complete will be marked “suspect.” You may wish to review the AIRS Version 5 Released Files Description document to find more suitable QA flags. We suggest you look at “NeDT,” “state1” and “state2” as being better data quality indicators.

Since the degradation of AMSU-04, the AIRS Level 2 retrieval code does not utilize AMSU channel 4 brightness temperatures. You may want to review AIRS/AMSU/HSB Version 5 Modification of Algorithm to Account for Increased NeDT in AMSU Channel 4 to understand how we are addressing the loss of Channel 4 within our retrieval software.

With that background, let us address your 3 specific questions:

Is it considered normal to have zero Level 1B AMSU data files for a month pass QA?
It is a mistake to characterize a granule in which AutomaticQualityFlag is set to "Suspect" as not passing QA. In fact, this is a “normal” condition since late 2007, and it merely reminds us that the dataset is not 100% complete.
Is it normal for all Level 1B AMSU data files for a month to have the exact same numbers for bad data, missing data, special data, and total data?
All Level-1B data files for the month of January 2010 do not have identical values for NumBadData, etc. For example, data collected over a spacecraft maneuver will reflect the state of the instrument at that time. This situation is typical, and most data for any given month should be in the current nominal state. The typical month will contain 1-3 short intervals of bad data from spacecraft maneuvers.
Doesn't the statistics engine used for AMSU limb adjustment require valid data from channel 4 in order to correctly adjust channel 5 data, the channel which is used to create temperature anomalies provided to the public?
The AIRS retrieval code (statistics engine) does not incorporate a limb adjustment as you have described above. Where reliable sensor data is available, it is applied directly to the appropriate portion of the atmosphere, taking into account the angle of the observation. Of the 2378 infrared and 15 microwave channels available to the AIRS retrieval algorithm, no particular channel is most important in deriving our products. Instead, the unique combination of all these channels of data allows us to develop a very complete and accurate temperature and water vapor profile throughout the entire atmosphere, and that is why our data products are very important to weather forecasting and climate studies.
Again, the AIRS Project believes that many of your technical questions can be answered reviewing the documentation at: http://disc.sci.gsfc.nasa.gov/AIRS/documentation. If you have further questions, we request that you contact us via our “Ask AIRS” portal at http://airs.jpl.nasa.gov/AskAirs/. You may also want to register as an AIRS Data User at http://airs.jpl.nasa.gov/DataRegistration/data/index.cfm. In that way you will be notified whenever a significant announcement regarding the AIRS Project or the AIRS and AMSU-A instruments is issued.
Thank you for your interest in AIRS data,


Thursday, March 4, 2010

AMSU Limb Adjustment

This post looks at the Limb Adjustment done for the AMSU. We talk about what the Limb Adjustment is, how it is done, and some problems associated with it.

What Is The Limb Adjustment?
If you stand up, bend over at the waist and swing your arm in an arc under you, you'll notice your arm is closest to the floor when it's directly under you. And you'll notice your arm gets further from the floor as it continues through the arc in either the left or right direction.

The scanning of the AMSU on the Aqua satellite has a similar situation. When it scans directly below itself,  it gets data from closest to the surface, but the scans to the left and right of the satellite don't penetrate as deeply. Because of this, the temperatures read by the scans to the left and right need to be adjusted. This is called a Limb Adjustment. Publishing in the American Meteorological Society, Quanhua Liu and Fuzhong Weng (2006) had this to say about Limb Adjustment:
A remarkable effect of the cross-scan sensor is the variation of the brightness temperatures across the scan line, even though the scene temperature is homogeneous. The variation in the cross-track measurements due to the change of the scanning angle is called limb effect and can be as much as 30 K for the 23.8-GHz water vapor channel and 15 K for troposphere sounding channels (Goldberg et al. 2001). Because the limb effect is often stronger than the real variation of the signatures from scenes, the unadjusted measurements prevent the objective analysis of weather systems and may make the regression retrieval algorithm complicated. More important, averaging satellite brightness temperatures to a given grid map for climate study requires that the data be limb adjusted prior to averaging.
For microwave instruments, like the AMSU, a different form of adjustment needs to be done over land and water, due to surface emissivity. Again from Liu and Weng (2006):
It is a little complicated for the microwave channels. The surface emits either more or less than the atmosphere at the microwave range, with full dependence on the surface emissivity. The water surface may emit much less energy than the atmosphere in the microwave range. The weighting function of the microwave troposphere channel is broader than that of the infrared channel. Asymmetric behavior of AMSU-A channels on the two sides of the nadir is recognizable (Weng et al. 2003). Goldberg et al. (2001) have developed a limb-correction algorithm to overcome the difficulties for AMSU-A. They computed the limb adjustment from multiple-channel observations and the scan position–dependent coefficients. Their algorithm is routinely applied for National Oceanic and Atmospheric Administration (NOAA) operational products.
How Limb Adjustment Is Done
A collection of scans from the month of July, 1998 is used to provide a mean for each latitude for each footprint, within 2˚ latitude. These historical scans are combined with current scans from the channel being examined and its neighboring channels, and a set of  physical and statistical coefficients. Publishing in the American Meteorological Society, Mitchell D. Goldberg, David S. Crosby, and Lihang Zhou (2001) had this to say about Limb Adjustment:
 A global set of coefficients is used for channels 6–14. Separate sea and nonsea coefficients are used for channels affected by the surface—channels 1–5 and 15. The predictors are generally the channel itself plus the adjacent channel whose weighting functions peak below and above. In other words to limb adjust channel 6, we use unadjusted channels 5, 6, and 7 observations as predictors. The exceptions are channel 14 uses channels 12, 13, and 14; channel 3 uses channels 3, 4, and 5; channel 1 and 2 both use channels 1 and 2, and channel 15 uses channels 1 and 15.
So to adjust for, say, channel 5, the historical values for channel 5 at the satellites current location, the current scan values for channels 4, 5, and 6, and a set of physical and statistical coefficients are used.
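The predictor-selection rule quoted from Goldberg et al. (general rule plus the listed exceptions) can be sketched as:

```python
def limb_predictor_channels(ch):
    """AMSU-A channels used as predictors when limb adjusting channel
    `ch`, per the Goldberg et al. (2001) description quoted above."""
    exceptions = {
        1: (1, 2),
        2: (1, 2),
        3: (3, 4, 5),
        14: (12, 13, 14),
        15: (1, 15),
    }
    if ch in exceptions:
        return exceptions[ch]
    # General rule: the channel itself plus the adjacent channels whose
    # weighting functions peak below and above it.
    return (ch - 1, ch, ch + 1)

print(limb_predictor_channels(5))  # (4, 5, 6), as in the text above
print(limb_predictor_channels(6))  # (5, 6, 7)
```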

Potential Problems With Limb Adjustment
From what I can see, there are two potential problems with Limb Adjustment. The first is when current scan values are outside the limits expected based on the historical scans. For example, the scan line shown at the beginning of this post has a value at footprint 1 that is 40% below the expected low limit, which is shown in the diagram at the start of this section.

The second potential problem is when a neighboring scan line used for the adjustments doesn't have any available data. For channel 5, the channel we've been looking at in this series of posts, the neighboring scan lines are 4 and 6. Channel 4 had no data at all in it for the entire month of January. A sample of this is shown in using HDFView in the image provided below.
Screen Shot Showing No Data In Channel 4.

Previous Posts In This Series:
Noise
Proof That Temperature Area Determines Temperature Anomaly
Trying To Find The UAH January Anomaly In The Raw Data, Part 1 Of 2
Overview Of The Aqua Satellite Project, Update 1 Features
Aqua Satellite Project, Update 1 Released
Spot Checking The Spot Check
NASA, UAH Notified Of QA Spot Check Findings
About The Aqua Satellite Project
UAH January Raw Data Spot Check
So, About That January UAH Anomaly
A Note On UAH's High January Temperature

References:
Uses of NOAA-16 and -18 Satellite Measurements for Verifying the Limb-Correction Algorithm
The Limb Adjustment of AMSU-A Observations: Methodology and Validation

Monday, March 1, 2010

Trying To Find The UAH January Anomaly In The Raw Data, Part 1 Of 2

This is a two part series where we look to see if the January UAH anomaly can be found in the Aqua AMSU Level 1B data. The size of the anomaly we're looking for is 0.7˚ C. Each of the two parts of this post will use a slightly different method to try and find the anomaly.

I'll start off right away and tell you the answer for this post: it's not there. The rest of the post will show why.

Calibration With Past Anomalies, Footprint 15.
Both posts will use a method of calibrating the raw temperatures of the AMSU to published UAH anomalies for the same time period. In this post we'll look at footprint 15. We're using footprint 15 because it requires almost no statistical adjustment by the software for limb correction, so we don't have to worry about limb correction, bad data in channel 4, etc.

Calibration is done using 2 periods of published UAH anomalies. In this case, the January, 2008 and January, 2009 data are our calibration periods. We will then use that calibration information to examine the third period, January, 2010. The anomalies for all three periods were reported as:

January, 2008: -0.05
January, 2009: +0.3 (+0.35 from previous year)
January, 2010: +0.72 (+0.42 from previous year)

The raw data for channel 5, footprint 15 for those three months is shown below. All 1,000,350 scans for channel 5, footprint 15 are shown for the three months we're looking at. A trend line is added for each and we zoom in on the beginning and end of the series. January, 2010 is shown in red, January, 2009 is shown in purple, and January, 2008 is shown in green.
January 2008, 2009, 2010 Channel 5, Footprint 15 Raw Data

Let's look at the 2008 and 2009 data first. The beginning and ending values are:

January, 2008, Beginning: 246 K
January, 2008, End: 248.75 K
January, 2009, Beginning: 248.2 K
January, 2009, End: 249 K

We need a way to relate the raw temperature data to the temperature anomalies. Both raw data trends are straight, so we can treat each trend as a right trapezoid (a rectangle with one edge tilted, so the top is not parallel to the bottom) with edges at the left, right, bottom, and trend line. This, in turn, can be divided into a rectangle and a triangle. An example of this is shown below for the January, 2008 trend.


From this we can calculate the area of the first two trends and compare their difference to the difference in anomalies over the same period. This gives us an objective scale between the area of the raw data and the anomaly for the month.

January, 2008 Area: 7668.625
January, 2009 Area: 7706.6
Difference in Area (2009 - 2008): 37.975
Difference in Anomaly (2009 - 2008): +0.35
Anomaly/Area Ratio: 0.0092

So a difference in area of 37.975 increased the anomaly by +0.35 K. Now let's do the same area calculation for 2010 and see what the relationship is.

January, 2010, Beginning: 249 K
January, 2010, End: 248.75 K
January, 2010 Area: 7715.125
Difference in Area (2010 - 2009): 8.525
Difference in Anomaly (2010 - 2009): +0.42
Anomaly/Area Ratio: 0.0492

The change in area between 2010 and 2009 is less than ¼ the change between 2009 and 2008, yet the change in anomaly between 2010 and 2009 is larger than the change in anomaly between 2009 and 2008. The difference in the effect of increasing the temperature area on the anomaly is captured in the Anomaly/Area Ratio. The ratio for the January, 2010 - January, 2009 change is much larger (0.0492) than the ratio for the January, 2009 - January, 2008 change (0.0092).

This clearly can't be right. In these examples, there's no relationship between an expanding temperature area and an anomaly. There should be, because an increase in anomaly represents nothing more than an increase in temperature area.

And since the rules of geometry aren't going to change, it seems there's nothing in this channel 5, footprint 15 data to support the anomaly of January, 2010.

Edit: I used 31 as the width here, representing 31 days. In the future, I'll probably use a width of 1, representing 1 month. This'll simplify calculations.
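The trapezoid arithmetic above is easy to reproduce: the area is just the average of the two endpoint temperatures times the width (31 days, per the edit note). The endpoint values below are the ones listed in this post:

```python
def month_area(begin_k, end_k, width_days=31):
    """Area under a straight monthly trend line: a right trapezoid,
    i.e. a rectangle plus a triangle, or equivalently the average of
    the two endpoints times the width."""
    return width_days * (begin_k + end_k) / 2

area_2008 = month_area(246.0, 248.75)  # ≈ 7668.625
area_2009 = month_area(248.2, 249.0)   # ≈ 7706.6
area_2010 = month_area(249.0, 248.75)  # ≈ 7715.125

print(0.35 / (area_2009 - area_2008))  # ≈ 0.0092
print(0.42 / (area_2010 - area_2009))  # ≈ 0.0492
```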

So Does This Prove The UAH January, 2010 Anomaly Is Wrong?
No. It only shows there's nothing in channel 5, footprint 15 to support the value of the anomaly as valid. Channel 5, footprint 16 will give similar results.

Those two footprints are the ones that really don't need any statistical limb adjustment. The other footprints do. So that means the support for the January, 2010 anomaly either doesn't exist at all, or exists somewhere in those other footprints with their statistical adjustments.

Which is what we'll look at next post. I don't know what we'll find when we look there. I haven't done the analysis yet.

Previous Posts In This Series:
Overview Of The Aqua Satellite Project, Update 1 Features
Aqua Satellite Project, Update 1 Released
Spot Checking The Spot Check
NASA, UAH Notified Of QA Spot Check Findings
About The Aqua Satellite Project
UAH January Raw Data Spot Check
So, About That January UAH Anomaly
A Note On UAH's High January Temperature

Friday, February 26, 2010

Spot Checking The Spot Check

Before I posted the results of the QA spot check described here, I first checked those results to make sure they were accurate. This post describes how I checked them.

Checking Binary Data
The Aqua Satellite AMSU Level 1B data comes in compressed binary form. You can view the data using HDFView, a tool created by the HDF Group. I used this tool to go into the data files exactly as they were downloaded from NASA and look at the data. The screen shot to the left shows the values of Channel 4 QA that I reported. Any value other than zero means the channel failed QA. As you can see, all of Channel 4's readings failed. Channel 4 is labeled as "3" because the numbering in the viewer starts at zero rather than one.

Checking Text Data
HDFView is an HDF viewer, not an HDF-EOS viewer. And the Aqua AMSU Level 1B files are HDF-EOS files. This means HDFView can show you the HDF parts of the file, but not the EOS parts of the file. To look at the EOS data in the file, I converted it to text and opened it up in a text editor. Conversion to text was done by taking the original NASA data file and running it through ncdump with no arguments other than the file name to be converted.

The two screen shots below show samples of EOS data for two different files. You can see the file names in the window titles. Looking at the highlighted data, you can see the name of the data shown and its value. The data is NumBadData and the value for both files is 1350.



These types of checks were done on about 20 randomly selected files to ensure the errors being reported by the QA code were actually due to the data in the files, not due to bugs in the QA code itself.

Previous Articles In This Series:
NASA, UAH Notified Of QA Spot Check Findings
About The Aqua Satellite Project
UAH January Raw Data Spot Check
So, About That January UAH Anomaly
A Note On UAH's High January Temperature

References:
UAH January Raw Data Spot Check
Looking At The Aqua Satellite Data

Thursday, February 25, 2010

NASA, UAH Notified Of QA Spot Check Findings

I've sent off the results of my QA Spot Check to the Aqua team at NASA and to Dr. Christy and Dr. Spencer at UAH. I couldn't find anything wrong with my results, so it's time to let the professionals have their input.

Here's a copy of the text that was sent:
I'm writing with questions regarding the January, 2010 Aqua satellite AMSU Level 1B data. I checked various QA flags in the data and found the following results:

● Of the 7410 files containing January data, 7386 of them had their automaticQualityFlag marked "suspect" and another 24 of them had the flag marked "failed". None passed QA.
● Of the 7386 suspect files, all of them had a "Good Data" percentage of 93.33334 percent. Here, "Good Data" is taken as the result of subtracting bad data, special data, and missing data from total data. "Good Data" is then divided by total data to get the percentage of "Good Data".
● Of the 333,450 Channel 4 readings for January, none of them passed QA. All of them in the files marked "suspect" had been marked as failing QA and, obviously, the ones in the files marked "failed" were in files that failed QA and should not be used.

My questions are:

● Is it considered normal to have zero Level 1B AMSU data files for a month pass QA?
● Is it normal for all Level 1B AMSU data files for a month to have the exact same numbers for bad data, missing data, special data, and total data?
● Doesn't the statistics engine used for AMSU limb adjustment require valid data from channel 4 in order to correctly adjust channel 5 data?


Additionally, I asked Dr. Christy and Dr. Spencer if channel 5 from Aqua's AMSU is used to produce UAH anomalies. In an article on WUWT, Dr. Spencer said it is, but I just wanted to double check to make sure I understood correctly.

Previous Articles In This Series:
About The Aqua Satellite Project
UAH January Raw Data Spot Check
So, About That January UAH Anomaly
A Note On UAH's High January Temperature

References:
How the UAH Global Temperatures Are Produced

Tuesday, February 23, 2010

UAH January Raw Data Spot Check

Update 2:
The C++ source code used to run these tests has been added to the Downloads section of the blog under the name Aqua Satellite Project (BETA).

Update:
While I said words to this effect at the end of the post, I wanted to make sure it's clear: These findings are preliminary. Although nearly two months of research has gone into them, more needs to be done.

In the meantime, folks wanting to see the data for themselves can download it from NASA by following the steps here: UAH Satellite Data, and download the viewers by following the steps here: Looking At The Aqua Satellite Data.
======

Killing two birds with one stone, I decided to give my AMSU C++ code a bit of a workout by doing a Quality Control spot check on the January UAH raw data. I downloaded all the January Level 1B data for the AMSU from NASA. There are 7410 files for the month of January. Here's what I found:

No Files Passed NASA's QA. None.
Of these 7410 files, 7386 had their automaticQualityFlag marked "suspect" and the other 24 had the flag marked "failed". Add that up and all 7410 files for January were either suspect or failed. Not a single file was marked "passed".

Of the 7386 suspect files, all of them had a "Good Data" percentage of 93.33334 percent, meaning that's the percentage of data in each file that's considered reliable. There's no way such a number would actually be identical, out to a hundred-thousandth of a percent, 7386 times out of 7386 tries, so I figured my code was reading the wrong data for the good/bad/missing/special values.

Nope, my code was good, and manually checking a random selection of about 20 files by opening them in an editor revealed they all had identical numbers for these fields. In other words, the good/bad/missing/special data counts seem to be hard-coded, at least for "suspect" data.

Channel 4 Never Works (Which Means Channel 5 Never Works Either)
Each of the files contains 45 readings from each channel. Channel 4 failed all 45 readings in every file; the only exception was the files that had been marked "failed" as a whole. So there was never a good reading from channel 4 for the entire month of January.

This is significant because the statistics engine used to correct the satellite readings for channel 5, the channel UAH uses to get its anomaly information, requires data from channels 4 and 6.

So this means they're either using a non-published algorithm to calculate the anomalies, or the anomalies are wrong.

Screen Shot Of QA Spot Check Results

Disclaimer
Here are a few things I can think of that may affect the validity of this spot check:
  • I'm not a scientist, I'm a software engineer. 
  • Prior to working on the Aqua satellite as documented in this blog, I had no experience working with satellite data. 
  • Despite having checked my code, it is beta code, and I may have missed errors.
  • It's possible that data files are not given "passed" status (as opposed to "suspect" and "failed") until some QA procedure that takes place after Level 1B occurs.
  • It's possible the order of channel information in the data file does not represent the actual channel number. In other words, the fourth channel in the data file may not be channel 4.
  • UAH uses other satellites in addition to the Aqua AMSU.
While I think it's important to point out these possible concerns, I also think it's important to point out that I don't see any of them as likely to reverse what I'm seeing in the data now.

Still, further research needs to be done to triple check these things, and that's what I'm working on now. If I don't come across anything that changes my views, I'll forward these results to Dr. Spencer and Dr. Christy for their input.

Conclusion
At this point, while admitting the possibility of an error on my part, I'd have to say it seems to me the January AMSU data has no validity at all. Nor does anything derived from it. The hard-coded quality values and the complete failure of channel 4 both look like show-stoppers to me.

Previous Posts In this Series
So, About That January UAH Anomaly
A Note On UAH's High January Temperature

References:
The Limb Adjustment of AMSU-A Observations: Methodology and Validation

Monday, February 8, 2010

HDF Reader C++ Code Is Written!

The code to read HDF files is up. The code should be considered a beta release, not so much because of the quality (the code is actually pretty good, if I do say so myself), but because I reserve the right to modify it in the near future.

After having various problems with the code NASA supplies (missing libraries, pre-built binaries that don't link), I decided to just write the reader myself. This post will take a quick look at the code. A follow up post will show some of the results of using the code to read raw UAH data.

The code has been placed in the public domain and you can get it here:
HDFReader.h
HDFReader.cpp

Prepping The Data File
The reader code reads uncompressed ASCII files, and HDF files are compressed binary. So you'll need to convert a file before reading it with the code. To do that, just use the ncdump tool. The only argument you supply to ncdump in this case is the name of the file to be converted. Redirect the output to a file and use that file as input to the code. An example command line is shown below:

ncdump AIRS.2010.01.01.240.L1B.AMSU_Rad.v5.0.0.0.G10005063822.hdf > MyFile.txt

If you don't yet have ncdump on your computer, see this post for instructions on downloading and using it.

The Code
The underlying code is fairly complex, so we'll skip discussing it. The code is well commented, so you can look there for specific details. Here we'll just discuss how to use the code to read data.

The code sample below shows how to create the reader, declare memory to hold data you'll get from the reader, and read the raw antenna temperatures.

// Supply location of HDF file.
HDFReader oHDFReader(string("./MyFile.txt"));
// 45 scan lines, 30 footprints, 15 channels.
float aData[45][30][15];

// Load data.
oHDFReader.getVariableData(string("antenna_temp"), aData);


Yup. That's it. Of course, with this version of the code, you need to know that the name of the antenna temperature data is antenna_temp and that it's an array of floats 45 by 30 by 15. Fear not. You can open the ASCII version of the HDF file and see all the variables listed in the variables section near the beginning of the file. These variables will have dimensions listed as constants and those constants are defined in the dimensions section at the beginning of the file.
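For reference, the dimensions and variables sections near the top of the ncdump output look roughly like this (the dimension names below are illustrative, not necessarily the ones NASA uses; antenna_temp and the 45 x 30 x 15 shape are from the sample above):

```
dimensions:
        Line = 45 ;
        XTrack = 30 ;
        Channel = 15 ;
variables:
        float antenna_temp(Line, XTrack, Channel) ;
```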

There's also some hierarchical data in the HDF file and that data is stored in a node structure in HDFReader. To get the root of the node structure, use the getRootNode() method.

Future versions of the code will build on what's here now. Specifically, I want to add methods to access the variables without forcing the programmer to dig through the data file to see what variables are there. Code to do that should be ready soon.

Also coming soon is a look at some of the raw UAH data that gets extracted from these files.

References:
Some Useful Climate Code