This matters because the raw data we're working with contains temperatures, not anomalies, while the published temperature changes are reported as anomalies, not temperatures. We need to be able to switch between the two and compare them with confidence.

**What Is An Anomaly?**

By definition, an anomaly is a deviation from some standard. A temperature anomaly is a deviation from some standard temperature. The value of the standard temperature doesn't matter much, so long as it's well defined. We'll say the standard temperature has some value we'll call ⣿. Like any other temperature, we can graph ⣿ on a chart. We'll say the temperature ⣿ is graphed by rectangle **a** in the diagram at the beginning of this post. When we say the temperature anomaly is 0˚, that's equivalent to saying the temperature is ⣿.

**What Is The Temperature Area?**

The temperature area is simply the area under the curve on the graph showing the anomaly. So if the anomaly is zero, the temperature area is the area of the bar that maps ⣿.

The X axis of such a graph represents time. We can measure time in different ways. For example, the X axis of a month with 31 days can be measured as 333,450 scan lines, or as 31 days, or as 1 month. It doesn't really matter which unit we choose, any more than it matters whether we call 36 inches 3 feet or a yard.

For convenience, we'll say the X axis is 1, for 1 month. (Note that I used 31 as the width in my previous post. That works just fine, but I'll probably use 1 from here on out to make it easy on myself.) This means that when we want to calculate a temperature area, we need only be concerned with its height. This is because the area of a rectangle equals width times height, and the width is always 1.

So, for example, the temperature area of ⣿ is always ⣿.
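As a quick sketch of that convention (Python; `T0` is a stand-in for ⣿, and its value here is an arbitrary illustration, not real data):

```python
# With the X axis normalized to a width of 1 (one month), a rectangle's
# area equals its height. So the baseline temperature area is just the
# standard temperature itself.
T0 = 250.0  # stand-in for the standard temperature; value is arbitrary

width = 1.0                 # one month, normalized
baseline_area = width * T0  # area = width * height
print(baseline_area)        # 250.0
```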

**Graphing Anomalies**

We can graph anomalies individually, or we can take the best fit to the anomalies. When we take the best fit we get a triangle above or below ⣿. If we instead graph the individual anomalies we get lots of little triangles along the graph, but in the end the total area of the anomalies is the same. Again, for convenience, we'll take the best fit and work with a single triangle.

If the anomaly at the beginning of the graph is zero and at the end of the graph is some positive number, ❢, then a triangle is formed with an area equal to ½❢, and the temperature area now equals ⣿ + ½❢. This is shown in rectangle **b** in the diagram at the beginning of this post.

If the anomaly at the beginning and end of the graph is some positive number ❢, then two triangles are formed, both with an area equal to ½❢, and the temperature area now equals ⣿ + ❢. This is shown in rectangle **c** in the diagram at the beginning of this post.

If the anomaly at the beginning of the graph is zero and at the end of the graph is some negative number, ❧, then a triangle is formed below the baseline with an area equal to ½|❧|, and the temperature area now equals ⣿ − ½|❧|. This is shown in rectangle **d** in the diagram at the beginning of this post.
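All three rectangles describe the area under a straight best-fit line, so one formula covers them: with unit width, the temperature area is the baseline plus the average of the endpoint anomalies. A minimal Python sketch, with `T0` standing in for ⣿ and the endpoint values chosen arbitrarily for illustration:

```python
def temperature_area(T0, anomaly_start, anomaly_end, width=1.0):
    """Area under a straight best-fit anomaly line over one month.

    With width 1, this is the baseline T0 plus the average of the
    endpoint anomalies (the trapezoid rule for a straight line).
    """
    return width * (T0 + (anomaly_start + anomaly_end) / 2.0)

T0 = 250.0  # hypothetical standard temperature (the post's baseline)

# Case b: anomaly rises from 0 to +2  ->  T0 + half of 2
# Case c: anomaly stays at +2         ->  T0 + 2
# Case d: anomaly falls from 0 to -2  ->  T0 - half of 2
print(temperature_area(T0, 0.0, 2.0))   # 251.0
print(temperature_area(T0, 2.0, 2.0))   # 252.0
print(temperature_area(T0, 0.0, -2.0))  # 249.0
```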

**Conclusion**

We can see from these examples that as the anomaly changes, the temperature area also changes in direct proportion. Or, equivalently, we could say that as the temperature area changes the anomaly changes in direct proportion.

So if we ever see a change in temperature area and a change in anomaly not in sync, we know something is wrong.
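That sanity check can be sketched in a few lines of Python. This is an illustration of the idea, not code from the project: with unit width, a month's temperature area is the baseline plus its anomaly, so the month-to-month change in area should equal the change in anomaly, and any mismatch gets flagged.

```python
def consistency_check(anomalies, areas, tol=1e-6):
    """Return indices where the change in temperature area fails to
    track the change in anomaly (with unit width they should match)."""
    problems = []
    for i in range(1, len(anomalies)):
        d_anom = anomalies[i] - anomalies[i - 1]
        d_area = areas[i] - areas[i - 1]
        if abs(d_area - d_anom) > tol:
            problems.append(i)
    return problems

T0 = 250.0  # hypothetical baseline (the post's standard temperature)
anomalies = [0.0, 0.2, 0.5, 0.4]
areas = [T0 + a for a in anomalies]  # consistent data: nothing flagged
areas[2] += 0.3                      # inject a mismatch at index 2
print(consistency_check(anomalies, areas))  # [2, 3]
```

Note that a single bad area value shows up twice, because it corrupts both the change into that month and the change out of it.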

**Previous Posts In This Series:**

Trying To Find The UAH January Anomaly In The Raw Data, Part 1 Of 2

Overview Of The Aqua Satellite Project, Update 1 Features

Aqua Satellite Project, Update 1 Released

Spot Checking The Spot Check

NASA, UAH Notified Of QA Spot Check Findings

About The Aqua Satellite Project

UAH January Raw Data Spot Check

So, About That January UAH Anomaly

A Note On UAH's High January Temperature

"So if we ever see a change in temperature area and a change in anomaly not in sync, we know something is wrong."

Lovely use of language... and logic!!!

Keep up the splendid work...

And - THANK YOU :-)

And thank you Malaga View. It's good to know this was helpful.

Are you sure you understand what the raw data represents? If each scan represents a different physical location then your temperature area would have to be weighted according to the geometry of the earth.

Raven -

There's no doubt adjustments need to be done for the raw data to give valid temperature readings.

But whatever those adjustments are, there should be a certain similarity to them over time. For example, there's not a lot of change in the geometry of the Earth when comparing January, 2009 to January, 2010.

What that means is the ratios between the raw data and the adjusted anomalies for those time frames should be similar. A change in the anomaly should have a similar change in the accompanying temperature area.

If we don't find that, that means the adjustments being made to the data are changing over time. For example, looking at channel 5, footprint 15 for January 2010, 2009, and 2008 shows a big difference in the adjustments between 2008-2009 and 2009-2010.

This isn't proof that the anomaly is wrong, but it does raise the question as to why significant changes in the adjustments occurred.

There are other possibilities, such as introducing noise by limiting the check to a single footprint, and additional valid data from other satellites not being examined. Those possibilities will be examined as well.

Actually the adjustments are not necessarily constant over time.

Consider a satellite that crosses the equator at the same time each day. When it crosses the equator it captures a scan which covers some mix of land and ocean. The next day (which is not 24 hours later from the perspective of the satellite) it captures a new scan, but this time the mix of land and ocean changes.

Over time the adjustments will repeat but there is no guarantee that a month is long enough to capture the full cycle and you could have significant differences between months.

The way to test this would be to calculate the temperature area for longer periods and see if the ratio is periodic or trending towards a constant.
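One way to sketch that test (illustrative Python; the data, names, and window choices are all made up for the example): average the raw temperatures and the anomalies over windows of increasing length, and watch whether the ratio between them oscillates (a periodic sampling artifact) or settles toward a constant.

```python
def window_ratio(raw, anomaly, window):
    """Mean raw temperature divided by mean anomaly, per window."""
    ratios = []
    for start in range(0, len(raw) - window + 1, window):
        mean_raw = sum(raw[start:start + window]) / window
        mean_anom = sum(anomaly[start:start + window]) / window
        ratios.append(mean_raw / mean_anom)
    return ratios

# Synthetic example: a land/ocean sampling cycle of period 3 in the raw
# data. With window=3 the cycle averages out and the ratio stabilizes.
raw = [251.0, 252.0, 253.0] * 4
anomaly = [2.0] * 12
print(window_ratio(raw, anomaly, 1))  # oscillates: 125.5, 126.0, 126.5, ...
print(window_ratio(raw, anomaly, 3))  # constant: 126.0, 126.0, 126.0, 126.0
```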


Also, the raw data may need to be masked with the output of another sensor that detects cloud cover. This would mean a cloudy month would have a completely different ratio than a clear month.

Well, the Aqua satellite performs 28 half orbits per day and examines each latitude on average once per day. So a month's worth of data should be pretty good.

But I absolutely agree with you that there could be other factors that justify the different adjustments to the raw data. But the point of looking at the ratio between temperature area and adjustments isn't to show the adjustments are wrong, it's to see if additional explanation is needed to justify the changes in adjustment. The explanation could very well turn out to be clouds, as you noted. So we can go look at clouds. But we need to come up with some sort of explanation.

But as the January 2010 and January 2009 raw scans for channel 5, footprint 15 were nearly identical, while the January 2010 anomaly was about twice the size of the January 2009 anomaly, there's _a lot_ of explanation that needs to be done.

Just to rephrase what I said in the previous comment in terms of the conclusion of the post...

If the ratio between temperature area change and anomaly change varies, something is wrong. That something may not be the anomaly. Perhaps there's noise in the temperature readings. Perhaps we neglected to look at some other factor, such as clouds.

But somewhere along the line, something is wrong.

28 half orbits a day? That is a pretty fast moving satellite.

Your point is fair enough but I just get really skeptical when bloggers start inventing new ways to analyze data (e.g. chiefio and his dT/dt).

Understood. And thank you for not giving me a free pass. :)

Personally, I think it is very positive when bloggers start inventing new ways to analyse data... the chiefio has discovered the "Bolivia Effect" where temperature stations have marched downhill to enjoy beach holidays.... Surfacestations.org has discovered all sort of issues with temperature stations and given us new ways to analyse the data... the list is long and very impressive...

MagicJava is following the grand scientific tradition of replicating published results based upon the published data. However, when all the scientific methods and techniques are NOT published then you have to be very creative and ask all the awkward questions... long may it live!

Regarding the satellite based "global monthly temperature anomaly" there are many interesting things to discover, understand and explain... especially as a "month" may contain 28, 29, 30 or 31 days worth of data... so the approach of comparing January 2009 with January 2010 seems very sensible to me... although we have absolutely no idea (at the moment) whether the data for these two months relate to exactly the same geographic locations... Additionally, the scans do not cover all of the globe... there are gaps in the coverage... and there are overlaps in the coverage... parts of the globe may be scanned once a day at the equator... while other parts of the globe may be scanned 28 times a day close to the poles... so there is probably a huge polar bias in the averaged data... and this biased polar data is possibly supplemented by an almost random set of data from other non-polar locations... the implications being that 1) the calculated anomaly is NOT a meaningful global anomaly, 2) month by month comparisons are not valid, and 3) year by year comparisons are not valid...

So this voyage of discovery is very important... let's understand the science instead of being blinded by the science... then, and only then, will we know if we have robust science or bogus science...