Thursday, December 31, 2009

Climate Scientist Starter Kit


I've just put together a Climate Scientist Starter Kit. The kit contains a spreadsheet in Apple Numbers and Microsoft Excel formats. The spreadsheet has data on:

* Monthly Global Mean, TDYN, ENSO, and Volcano temperatures from January 1900 through March 2009.
* Monthly UAH satellite temperature data from December 1978 through November 2009.
* Daily and monthly cosmic ray data from January 1951 through November 2006.
* Monthly low-level cloud data from July 1983 through June 2008.
* Ice core CO2 data, plus monthly CO2 data for the years 1958 through 2008.

It's nice because you have all the data in one easy-to-use spreadsheet, rather than having to parse various data formats yourself.

There's also supporting data, including raw cloud data, daily cloud data, C++ code for parsing raw cloud data files, and HTML documents that provide additional information.

You can download the zip file here. NOTE: The comment on the download page will say it's the FreePOOMA Add-On Pack. Don't worry, that's just the name of the project that contains the zip file.

Feedback on improvements for future versions of the kit is welcome.

References:
The Climate Scientist Starter Kit

I'm Unblocked From Wiki!


Update: I forgot to mention I had to double-dutch promise not to post any more links to ClimateGate emails.
==========

Wiki has unblocked me. I want to thank editor Rlevse for taking the time to do this.

There was a mea culpa on my part. It turns out I actually posted twice to the discussion page, once as an unregistered user and once as Magicjava. The first one got deleted, and when I checked a few hours later and didn't see it, I just assumed an unregistered user couldn't post to a deletion discussion page. So I registered as Magicjava and posted it again. My apologies for the mix-up.

But the important thing is that now I'm free to spread all the lies and propaganda that ExxonMobil is paying me for.

BWAH HA HA HA HA!




Just kidding. ;)

Wednesday, December 30, 2009

Another Skeptic Blocked From Wikipedia - Me


Continuing Wikipedia's trend of blocking skeptics from the ClimateGate debate, another skeptic has been banned from Wikipedia. This time it's me.

I have a total of 1 edit to Wikipedia. Yes, just 1. And it's not even to an actual article.

On the ClimateGate deletion discussion page I said that William Connolley should not be allowed to vote on the ClimateGate issue, as he's personally involved in it. I provided two links to ClimateGate emails involving him.

And now I'm indefinitely blocked.


Screenshot showing I've been blocked.


Screen shot showing every edit I've ever made to Wikipedia. It's from the deletion discussion page. (Minus links to emails that someone edited out.)

I've sent off an email to the admin who blocked me, Rlevse, asking why he blocked me.

Late Edit:

Screen shot showing my Wikipedia editing history.

References:
ClimateGate Deletion Discussion Page
William Connolley email #1
William Connolley email #2
Wikipedia Blocks All Skeptics From Editing ClimateGate Page

Monday, December 28, 2009

ClimateGate Page Deleted From Wikipedia. More Attempts To Block Skeptics


UPDATE: Wikipedia has moved the deletion discussion page to here.
========

In the middle of the night on Monday morning, enough believers posted that they wanted the article deleted to tie the count evenly between delete and keep. Then the polling was shut down early and the page was deleted. It's now a redirect to the Climatic Research Unit e-mail hacking incident article.

Editor Rd232 has tried to block the ClimateGate author from posting to Wikipedia. This isn't the first time something like this has happened: Wikipedia previously blocked all known skeptics from posting on the ClimateGate article.

The deletion is under review, as it occurred less than 12 hours after the delete notice was posted and there was no consensus for deletion. We need wiki editors to get over there and get this reversed.

And Rd232 needs to lose his editor privileges due to abuse.


Screen shot of editor Rd232 saying he tried to block the author of the ClimateGate article from posting.

References:
Wikipedia ClimateGate article
Wikipedia Climatic Research Unit e-mail hacking incident article
ClimateGate Deletion discussion page
ClimateGate Deletion Review Page
Wikipedia Blocks Skeptics From Posting

Sunday, December 27, 2009

Wikipedia Now Trying To Delete ClimateGate Article


It looks like the believers lost the battle to control the ClimateGate page that I talked about in this post, because now Wikipedia is trying to delete the ClimateGate page altogether. Deleting a page on Wikipedia means the entire page and all its history are gone forever. The public can never view it and will never even know it existed. The plan seems to be to move the article to Climatic Research Unit e-mail hacking incident.

Only Wikipedia editors can vote on whether or not to delete an article, so if you're an editor please HURRY over to the ClimateGate article and make sure it doesn't get deleted.

Thanks. :)


Screen shot of Wikipedia's attempt to delete the ClimateGate article.

References:
Wikipedia ClimateGate Article
Climatic Research Unit e-mail hacking incident article
ClimateGate deletion discussion page
Wikipedia deletion policy page

Friday, December 25, 2009

Cosmic Rays And Climate. Part V: Cosmoclimatology



This video has been placed in the public domain.

Wednesday, December 23, 2009

Met/CRU Releases Some "Truthiness" Data And Code

Steve McIntyre is reporting the Met Office has decided to release a sub-set of their data and code obtained from CRU. The data is the "value-added" data, not the raw data. It's also not the full data set. It's unknown at this point what exactly is contained in the code.

Head on over to Climate Audit for full details and updates.

References:
Climate Audit

Tuesday, December 22, 2009

Cosmic Rays And Climate. Part IV: CO2 Global Warming



This video has been placed in the public domain.

Monday, December 21, 2009

Cosmic Rays And Climate. Part III: Clouds



This video has been placed in the public domain.

Sunday, December 20, 2009

Wikipedia Blocks All Skeptics From Editing ClimateGate Page



Update:
After removing all the skeptics' comments, they've locked the page down, leaving only the believers' comments.
=======

You may have heard from Watts Up With That that a single editor had been responsible for all articles in Wikipedia related to global warming and had been editing out skeptics' views. U.K. scientist, Green Party activist, and RealClimate.org team member William Connolley created or rewrote 5,428 Wikipedia articles. His handiwork included getting rid of the Little Ice Age and the Medieval Warm Period and slandering scientists he didn't agree with. Connolley had his Wikipedia administrator privileges taken away in September. But that hasn't stopped him: he's edited more than 800 articles in December alone.

But it seems there's more to the story.

According to Wikipedia editor "A Quest For Knowledge", Wikipedia has blocked all known skeptics from editing its ClimateGate page. Reading the ClimateGate page, you can see it's little more than propaganda copied and pasted from the Huffington Post and RealClimate.

The skeptics had to be banned for trying to maximize the damage that ClimateGate would cause, says the editor. He goes on to say that a group of believers is doing the reverse, trying to minimize the damage. But for whatever reason, the believers haven't been banned by Wikipedia, despite the fact that they are no more neutral than the skeptics were, according to the editor.

Please contact the Wikipedia editors and help get the ClimateGate page up to Wikipedia's neutrality standards. See below for links.


Screen grab of administrator's entry saying all skeptics were blocked from ClimateGate page.

References:
Watts Up With That Wikipedia Story
Wikipedia statistics for Connolley
National Post Reports the Wikipedia Story
Statement by Wikipedia Administrator A Quest For Knowledge
Wikipedia contact page
Wikipedia ClimateGate page

COP15 Flops

video

Senator Inhofe said it best. I can only add "Get used to it."

Next up, the EPA.

References:
TreeHugger

Cosmic Rays And Climate. Part II: Plasma Physics



This video has been placed in the public domain.

Saturday, December 19, 2009

Happy Holidays

Obama, Scientists, Take Climate Approval Rating Hit


According to the Washington Post, Obama's approval rating on handling global warming has dropped below 50%. It's not clear how much of this is due to Climategate, as the "approve" trend was heading downward at about the same rate before the story broke. However, the "disapprove" trend does show an increase in growth after Climategate.

The poll was conducted by telephone Thursday the 10th through Sunday the 13th.



The poll showed that most people thought Obama should cap greenhouse gases by a 2-1 margin, so long as that cap didn't cost very much. If the costs to a household were $25 per month, the responses were nearly evenly split between the "shoulds" and "should nots".

According to documents obtained from the Obama administration via the Freedom of Information Act, the average monthly increase in costs per household would be between $73 and $146.


Trust in scientists took a big hit, with 40% saying they don't trust scientists on this issue at all or trust them only a little. 30% said they had moderate trust, while 29% said they trust scientists completely or a lot.


By a 2-1 margin people said they believe scientists disagree with each other over this issue.

References:
Washington Post Article
Washington Post Graph
Obama Administration FOIA Documents

Friday, December 18, 2009

Cosmic Rays And Climate. Part I: Quantum Physics



This video has been placed in the public domain.

Tuesday, December 15, 2009

Pennsylvania Senate Tells PSU "Anything Short of Absolute Science Cannot Be Tolerated"























The Pennsylvania State Senate has written Pennsylvania State University regarding the investigation of Michael Mann, saying their constituents have "requested the Commonwealth further withhold Penn State's funding" until action is taken against Dr. Mann.

The letter goes on to say that funding will not be withheld for now, but that allegations of "intellectual and scientific fraud" would be serious in any case, and are especially so here, as world economies are influenced by his work. It adds that "anything short of the pursuit of absolute science cannot be accepted or tolerated".

Just in case anyone's wondering, the Pennsylvania Senate is controlled by Republicans.

The picture above is the letter in full. The two pictures below are close-ups of the letter.




Reference:
Sen. Piccola Letter on PSU Prof Michael Mann

Updated (Again) Cap And Trade Video



New information keeps pouring in about the seedy connections that make up "Cap and Trade". This one includes info on the head of the IPCC.

This video has been placed in the public domain.

Sunday, December 13, 2009

R.I.P. AGW 1988 - 2009


This video has been placed in the public domain.

Saturday, December 12, 2009

Updated Cap And Trade Video



The new version includes information recently obtained via the Freedom of Information Act.

This video has been placed in the public domain.

Friday, December 11, 2009

ClimateGate And The Mainstream Media



Examines possible reasons the MSM is trying to ignore ClimateGate.

This video has been placed in the public domain.

Thursday, December 10, 2009

Happy Fun Ball



This video has been placed in the public domain.

Tuesday, December 8, 2009

I'm Sorry






The video and images have been placed in the public domain.

References
Goldman Sachs
Deutsche Bank
Société Générale

Monday, December 7, 2009

CRU Code Not Smoking Gun

The graph presented here shows the data manipulation taking place in one example of the computer code from CRU. Many (most? all?) of my fellow climate skeptics are saying this is the "smoking gun" that proves CRU has manipulated its climate data.

It's not.

I've been a professional computer programmer for over 25 years, and several things stood out to me as odd about the CRU code. I've been holding off posting about it until it all came together for me. I think it has now.

*) First, the code is written in a language called IDL. IDL is a popular language in the scientific community, including climate science. However, its execution is slow, so it's not good at handling large data sets. It's also not a well-structured language, so it's not well suited to building large, complex systems. Climate models have large data sets and are complex. Because of this, professional-grade models usually use FORTRAN, not IDL.

*) Second, the code is terrible. Truly terrible. The hacked code is not in any way a professional product; it looks more like it was written by someone who either had no programming experience beyond an introductory course or was writing "throw-away" code for some purpose other than creating a final professional result.

*) Third, the programmer knew they were a terrible programmer. In the comments in the code, the programmer refers to their own lack of skill over and over again. I can't see such a person using that code to create data that would be even glanced at by a climate scientist.

*) Fourth, the data was terrible. According to the programmer who wrote the code, it was a complete mess and had to be heavily altered to get anything close to real-world values. We can all say what we will about Phil Jones, but three of the grants he received at CRU were for creating climate databases. It seems unlikely he'd have received such grants repeatedly if his databases produced data that bore no resemblance to the real world.

So what's it all add up to?

To me, it seems the code in the leaked files is toy code, run against a toy database, used to convert small data samples in a quick-and-dirty way without the need for a supercomputer or the programs that run on it. Nothing more.

But we'll never know for sure until CRU releases its official code to the world, allowing it to be reviewed by outside professionals.

References
CRU Files
NASA FORTRAN Climate Model
Wikipedia IDL page

Sunday, December 6, 2009

And The Magicjava 2009 Peace Prize Goes To...












Announcement

The Magicjava Peace Prize Committee
The Magicjava Peace Prize for 2009

The Magicjava Peace Prize Committee has decided that the Magicjava Peace Prize for 2009 is to be shared by Stephen McIntyre, for his tireless efforts to ensure correct scientific data and analysis regarding the Earth's changing climate are made available to the public, and by the unknown hacker or whistleblower who released years of information from the Climatic Research Unit, University of East Anglia, revealing systematic attempts to hide and manipulate climate data, to keep scientists with contrary views out of the peer-reviewed literature, and to destroy various files in order to prevent data from being released under the Freedom of Information Act. The release also revealed that the computer code used to process climate data contained numerous cases of manipulating the data in order to get the desired results.

On February 12, 2005, Stephen McIntyre and Ross McKitrick published a paper in Geophysical Research Letters that claimed various errors in the methodology of a paper published by Michael E. Mann, Raymond S. Bradley and Malcolm K. Hughes frequently referred to as the MBH98 reconstruction. The paper by Stephen McIntyre and Ross McKitrick claimed that the "Hockey Stick" shape of the Mann et al. MBH98 reconstruction was the result of an invalid principal component method. They claimed that using the same steps as Mann et al., they were able to obtain a hockey stick shape as the first principal component in 99 percent of cases even if trendless red noise was used as input. This paper was nominated as a journal highlight by the American Geophysical Union, which publishes GRL, and attracted international attention for its claims to expose flaws in the reconstructions of past climate.

In 2007, Stephen McIntyre started auditing the various corrections made to temperature records, in particular those relating to the urban heat island effect. He discovered a small discontinuity in some U.S. records in the Goddard Institute for Space Studies (GISS) dataset starting in January 2000. He emailed GISS advising them of the problem and within a couple of days GISS issued a new, corrected set of data and "thank[ed] Stephen McIntyre for bringing to our attention that such an adjustment is necessary to prevent creating an artificial jump in year 2000". The adjustment caused the average temperatures for the continental United States to be reduced about 0.15 °C during the years 2000-2006.

In the wake of the release of the Climatic Research Unit, University of East Anglia data, Professor Phil Jones has stepped aside as Director while an investigation into the matter takes place. Pennsylvania State University has announced it will investigate Michael Mann. Dr. Rajendra Pachauri, chairman of the Intergovernmental Panel on Climate Change (IPCC), promised an investigation into claims that the CRU manipulated data to favour the conclusion that human activity is driving global warming. The Met Office, a UK agency which works with the Climatic Research Unit in providing global-temperature information, has said it will make its data available to the public. And Dr. John P. Holdren, advisor to the President of the United States for Science and Technology, Director of the White House Office of Science and Technology Policy, and Co-Chair of the President’s Council of Advisors on Science and Technology (PCAST), has told the United States Congress he believes all completed results of research not protected by national security concerns should be made available to the public.



References
Climate Audit
CRU Files

Palm Oil: The Green Mass Extinction


A new video is up. This one is about the dozens of animals facing extinction due to the activities of the palm oil industry. This video has been placed in the public domain.

Saturday, December 5, 2009

ClimateGate: Hear No Evil, See No Evil, Speak No Evil






I've created some ClimateGate wallpapers and a banner. These images have been placed in the public domain. Click the images for a larger picture.

Friday, December 4, 2009

Fixing Climate Science II

video
The call for more transparency in science has reached all the way up to Congress. In the video above, we see Congressman Sensenbrenner asking White House science czar John Holdren that the public be given access to all documents prepared with government funding, including documents given to the IPCC. It's not clear to me if by "documents" the Congressman also means data and computer code.

Mr. Holdren's response is that the public should have access to the "results" of research that they pay for, excluding classified information and information that is incomplete. By phrasing his response this way, it's not clear to me whether he believes raw data should be made available, or only the so-called "value-added" data of the sort released by CRU. If only the "value-added" data is made available, third parties cannot reproduce the steps needed to verify the various assumptions and "fixes" scientists apply to the data.

What's really amazing, though, is that we've been having the Global Warming debate since 1988, and Mr. Holdren had to say the public "should have access"; he couldn't say "does have access".

To see the videos in their full context, follow the links below.

References
Video 1
Video 2
Fixing Climate Science

Thursday, December 3, 2009

Two New ClimateGate Related Videos




Both videos have been placed into the public domain.

Monday, November 30, 2009

CRU Funded By British Petroleum

























Click the image for a larger picture.

I came across this while doing the "Who's Who References" video. It seems that in addition to working out deals with Shell Oil, Mick Kelly also receives funding from British Petroleum.

It seems I can't glance sideways at these guys without coming across something questionable.

Reference:
Mick Kelly's Bio At CSERGE

Friday, November 27, 2009

ClimateGate: A Who's Who



Here's a video indicating the sources of each claim in the main video.

LATE EDIT: Both videos are hereby placed in the public domain. They can be used without concern of copyright or IP issues.
=====

I've put together a primer on the ClimateGate scandal. Enjoy. :)

The information presented in the video is supported by the information I've already presented in this blog, with the following additions:

Claim:
Arctic sea ice was under-reported by 193,000 square miles and was actually at the same level as sea ice in 1979.
Source:
http://www.examiner.com/x-1586-Baltimore-Weather-Examiner%7Ey2009m2d22-Arctic-sea-ice-underestimated-by-193000-square-miles

Claim:
CRU mission Statement
Source:
CRU strategic review agenda 1.doc of the hacked files

Claim:
Phil Jones has received more than £2,275,000 in grants since 1990.
Source:
pdj_grant_since1990.xls of the hacked files

Wednesday, November 25, 2009

Setting The Research Agenda

It seems like there's trouble at every turn with the activities taking place at the Climatic Research Unit (CRU), University of East Anglia. Here we see evidence of what seems to be CRU considering taking funding dollars from Shell Oil in return for allowing Shell to partially drive the research agenda.



Source: uea-tyndall-shell-memo.doc

Mick Kelly and Aeree Kim (CRU, ENV) met with Robert Kleiburg (Shell International’s climate change team) on July 4th primarily to discuss access to Shell information as part of Aeree’s PhD study (our initiative) and broader collaboration through postgrad. student project placements (their initiative), but Robert was also interested in plans for the Tyndall Centre (TC). What ensued was necessarily a rather speculative discussion with the following points emerging.

Shell International would give serious consideration to what I referred to in the meeting as a ‘strategic partnership’ with the TC, broadly equivalent to a ‘flagship alliance’ in the TC proposal. A strategic partnership would involve not only the provision of funding but some (limited but genuine) role in setting the research agenda etc.

Shell’s interest is not in basic science. Any work they support must have a clear and immediate relevance to ‘real-world’ activities. They are particularly interested in emissions trading and CDM.[Clean Development Mechanism]


And here we see a letter from Greenpeace to CRU indicating that the IPCC reports (which CRU plays a major role in developing) and climate negotiations are driven by the agenda of the World Trade Organization.

Source: greenpeace.txt

From: "paul horsman"
To: m.kelly@uea.ac.uk (Mike Kelly)

It was good to see you again yesterday - if briefly. One particular thing you said - and we agreed - was about the IPCC reports and the broader climate negotiations were working to the globalisation [sic] agenda driven by organisations like the WTO. So my first question is do you have anything written or published, or know of anything particularly on this subject, which talks about this in more detail?


References
Zip File of Data Taken By Hacker

Tuesday, November 24, 2009

Fixing Climate Science

A lot of the talk in the blogosphere regarding the hacked files from the Climatic Research Unit, University of East Anglia has centered on which newspapers are covering the story and which are glossing it over, as well as who should be fired for what happened.

I think it may be a good idea for folks to come up with ideas on how this process can be fixed. Venting frustration against persons X, Y, and Z, or publications A, B, and C may feel satisfying, but does little to correct the underlying problems in the science.

As I see it, the biggest problems to come to light from all of this are:

* Withholding of information / data

* Modifying data

* Computer models that are coded specifically to produce desired results

* Political pressure on individuals who publish papers that don't please a small group of scientists, and similar pressure on the journals that publish those papers


I’d suggest that in order to solve these problems, the following steps are needed at a minimum:



* All data and computer codes needed to reproduce the conclusion of a paper must be submitted with the paper before it can be published, along with any special steps needed to reproduce the conclusions.
* All such data and computer codes be placed in the public domain and be made available to anyone in the world via the internet for free.
* Pass laws making it illegal to hide any related data or computer codes behind “Intellectual Property” agreements.


I'll be writing various skeptic bloggers and scientists to see if I can get any sort of momentum behind these reforms. I'll update this blog with any progress or lack of progress that I make.

Monday, November 23, 2009

CO2ers Begin To Realize This Is Real

It looks like it may be starting to sink in that the hacked files from Climate Research Unit, University of East Anglia are real. Here's a blog entry from The Guardian's George Monbiot, who has been a very vocal proponent of man-made CO2 Global Warming:

It’s no use pretending that this isn’t a major blow. The emails extracted by a hacker from the climatic research unit at the University of East Anglia could scarcely be more damaging(1). I am now convinced that they are genuine, and I’m dismayed and deeply shaken by them.

Yes, the messages were obtained illegally. Yes, all of us say things in emails that would be excruciating if made public. Yes, some of the comments have been taken out of context. But there are some messages that require no spin to make them look bad. There appears to be evidence here of attempts to prevent scientific data from being released(2,3), and even to destroy material that was subject to a freedom of information request(4).

Worse still, some of the emails suggest efforts to prevent the publication of work by climate sceptics(5,6), or to keep it out of a report by the Intergovernmental Panel on Climate Change(7). I believe that the head of the unit, Phil Jones, should now resign. Some of the data discussed in the emails should be re-analysed.


Personally, I'm very impressed with Mr. Monbiot's admission of the damage these files have caused to the case of CO2 Global Warming. It must have been very difficult for him to come to that conclusion.

Here's another quote from the e-mails, from file 1255550975.txt, written by Kevin Trenberth at UCAR:

How come you do not agree with a statement that says we are no where close to knowing where energy is going or whether clouds are changing to make the planet brighter. We are not close to balancing the energy budget. The fact that we can not account for what is happening in the climate system makes any consideration of geoengineering quite hopeless as we will never be able to tell if it is successful or not! It is a travesty!
Kevin


So when you hear CO2 climate scientists saying in press releases that there's no way clouds can be driving the climate, you now know the truth: they have no idea whether they are or not. Their models can't answer that question.

And it appears that even their climate models have been coded to artificially produce the desired results. From Mann's climate model in the file maps12.pro:

; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.




So it seems the CO2 models can't even reproduce the temperatures we already know, let alone make valid predictions about what we don't know.

References
George Monbiot's Personal Blog
Zip File of Data Taken By Hacker

Saturday, November 21, 2009

Some Pretty Damning Evidence Global Warming Info Has Been Manipulated

You may have heard that the Climatic Research Unit, University of East Anglia, has suffered a break-in and hundreds of electronic files and e-mails were posted to the internet. The information contained in these files seems to show a clear-cut, repeated pattern of data manipulation, withholding data from those not "onboard" with CO2 Global Warming, and coming up with tricks to explain away the cooling that was taking place when warming was predicted.

You can download these files from here. Some interesting excerpts from them are below. The names of the scientists involved are literally a "who's who" in the CO2 Global Warming community.

From Michael E. Mann (withholding of information / data):

Dear Phil and Gabi,
I’ve attached a cleaned-up and commented version of the matlab code that I wrote for doing the Mann and Jones (2003) composites. I did this knowing that Phil and I are likely to have to respond to more crap criticisms from the idiots in the near future, so best to clean up the code and provide to some of my close colleagues in case they want to test it, etc. Please feel free to use this code for your own internal purposes, but don’t pass it along where it may get into the hands of the wrong people.


From Nick McKay (modifying data):
The Korttajarvi record was oriented in the reconstruction in the way that McIntyre said. I took a look at the original reference – the temperature proxy we looked at is x-ray density, which the author interprets to be inversely related to temperature. We had higher values as warmer in the reconstruction, so it looks to me like we got it wrong, unless we decided to reinterpret the record which I don’t remember. Darrell, does this sound right to you?


From Tom Wigley (acknowledging the urban effect):
We probably need to say more about this. Land warming since 1980 has been twice the ocean warming — and skeptics might claim that this proves that urban warming is real and important.


From Phil Jones (modification of data to hide unwanted results):
I’ve just completed Mike’s Nature trick of adding in the real temps to each series for the last 20 years (ie from 1981 onwards) amd from 1961 for Keith’s to hide the decline.


From Kevin Trenberth (failure of computer models):
The fact is that we can’t account for the lack of warming at the moment and it is a travesty that we can’t. The CERES data published in the August BAMS 09 supplement on 2008 shows there should be even more warming: but the data are surely wrong. Our observing system is inadequate.


From Michael Mann (truth doesn't matter):
Perhaps we'll do a simple update to the Yamal post, e.g. linking Keith/s new page--Gavin t? As to the issues of robustness, particularly w.r.t. inclusion of the Yamal series, we actually emphasized that (including the Osborn and Briffa '06 sensitivity test) in our original post! As we all know, this isn't about truth at all, its about plausibly deniable accusations.


From Phil Jones (withholding of data):
The skeptics seem to be building up a head of steam here! ... The IPCC comes in for a lot of stick. Leave it to you to delete as appropriate! Cheers Phil
PS I’m getting hassled by a couple of people to release the CRU station temperature data. Don’t any of you three tell anybody that the UK has a Freedom of Information Act !


From Michael E. Mann (using a website to control the message, hide dissent):
Anyway, I wanted you guys to know that you’re free to use RC [RealClimate.org - a supposedly neutral climate change website] in any way you think would be helpful. Gavin and I are going to be careful about what comments we screen through, and we’ll be very careful to answer any questions that come up to any extent we can. On the other hand, you might want to visit the thread and post replies yourself. We can hold comments up in the queue and contact you about whether or not you think they should be screened through or not, and if so, any comments you’d like us to include.


From Phil Jones (withholding of data):
If FOIA [Freedom Of Information Act] does ever get used by anyone, there is also IPR [Intellectual Property Rights] to consider as well. Data is covered by all the agreements we sign with people, so I will be hiding behind them.


From Phil Jones (destroying of emails / evidence):
Mike, Can you delete any emails you may have had with Keith re AR4? Keith will do likewise. He’s not in at the moment – minor family crisis. Can you also email Gene and get him to do the same? I don’t have his new email address. We will be getting Caspar to do likewise.


From Tom Wigley (data modification):
Phil, Here are some speculations on correcting SSTs to partly explain the 1940s warming blip. If you look at the attached plot you will see that the land also shows the 1940s blip (as I’m sure you know). So, if we could reduce the ocean blip by, say, 0.15 degC, then this would be significant for the global mean — but we’d still have to explain the land blip. I’ve chosen 0.15 here deliberately. This still leaves an ocean blip, and i think one needs to have some form of ocean blip to explain the land blip (via either some common forcing, or ocean forcing land, or vice versa, or all of these). When you look at other blips, the land blips are 1.5 to 2 times (roughly) the ocean blips — higher sensitivity plus thermal inertia effects. My 0.15 adjustment leaves things consistent with this, so you can see where I am coming from. Removing ENSO does not affect this. It would be good to remove at least part of the 1940s blip, but we are still left with “why the blip”. Let me go further. If you look at NH vs SH and the aerosol effect (qualitatively or with MAGICC) then with a reduced ocean blip we get continuous warming in the SH, and a cooling in the NH — just as one would expect with mainly NH aerosols. The other interesting thing is (as Foukal et al. note — from MAGICC) that the 1910-40 warming cannot be solar. The Sun can get at most 10% of this with Wang et al solar, less with Foukal solar. So this may well be NADW, as Sarah and I noted in 1987 (and also Schlesinger later). A reduced SST blip in the 1940s makes the 1910-40 warming larger than the SH (which it currently is not) — but not really enough. So … why was the SH so cold around 1910? Another SST problem? (SH/NH data also attached.) This stuff is in a report I am writing for EPRI, so I’d appreciate any comments you (and Ben) might have. Tom.


From Thomas R Karl (withholding of data):
We should be able to conduct our scientific research without constant fear of an "audit" by Steven McIntyre; without having to weigh every word we write in every email we send to our scientific colleagues. In my opinion, Steven McIntyre is the self-appointed Joe McCarthy of climate science. I am unwilling to submit to this McCarthy-style investigation of my scientific research. As you know, I have refused to send McIntyre the "derived" model data he requests, since all of the primary model data necessary to replicate our results are freely available to him. I will continue to refuse such data requests in the future. Nor will I provide McIntyre with computer programs, email correspondence, etc. I feel very strongly about these issues. We should not be coerced by the scientific equivalent of a playground bully. I will be consulting LLNL's Legal Affairs Office in order to determine how the DOE and LLNL should respond to any FOI requests that we receive from McIntyre.


From Tom Wigley (ousting of a skeptic from a professional organization):
Proving bad behavior here is very difficult. If you think that Saiers is in the greenhouse skeptics camp, then, if we can find documentary evidence of this, we could go through official AGU channels to get him ousted.


From Phil Jones (forging of dates):
Gene/Caspar, Good to see these two out. Wahl/Ammann doesn't appear to be in CC's online first, but comes up if you search. You likely know that McIntyre will check this one to make sure it hasn't changed since the IPCC close-off date July 2006! Hard copies of the WG1 report from CUP have arrived here today. Ammann/Wahl - try and change the Received date! Don't give those skeptics something to amuse themselves with.


From a document titled "jones-foiathoughts.doc" (withholding of data):
Options appear to be:
1. Send them the data
2. Send them a subset removing station data from some of the countries who made us pay in the normals papers of Hulme et al. (1990s) and also any number that David can remember. This should also omit some other countries like (Australia, NZ, Canada, Antarctica). Also could extract some of the sources that Anders added in (31-38 source codes in J&M 2003). Also should remove many of the early stations that we coded up in the 1980s.
3. Send them the raw data as is, by reconstructing it from GHCN. How could this be done? Replace all stations where the WMO ID agrees with what is in GHCN. This would be the raw data, but it would annoy them.


From Mick Kelly (modifying data to hide cooling):
Yeah, it wasn’t so much 1998 and all that that I was concerned about, used to dealing with that, but the possibility that we might be going through a longer – 10 year – period of relatively stable temperatures beyond what you might expect from La Nina etc. Speculation, but if I see this as a possibility then others might also. Anyway, I’ll maybe cut the last few points off the filtered curve before I give the talk again as that’s trending down as a result of the end effects and the recent cold-ish years.


References:
Zip File of Data Taken By Hacker
The Blackboard Blog On This Issue
Philadelphia Examiner Article

Thursday, October 15, 2009

First Release Of Redesigned Particle Code Is Up


The first release of the redesigned Particle code is up and can be downloaded from SourceForge.

What's In This Release
The code has the following features:

* It has been placed in the public domain.
* Particle base class that is independent of FreePOOMA.
* Particle classes for all observed hadrons of the Standard Model, in addition to the elementary particles.
* Particle decay functionality added.
* Template-based classes that allow for any type of number to be used for particle properties. For example, you can use floats, doubles, complex numbers, or quaternions for these properties.
* Math utilities that provide matrix, vector, vertex, and quaternion classes.
* Programmer's Guide in both Microsoft Word and Apple Pages formats.

What's Not In This Release
This is basically a 'To Do' list. These features will be added in the near future pretty much in the order listed below.

* FreePOOMA integration. This will be added in the form of a wrapper class around the new Particle classes.
* Collision detection and reaction code.
* An atomic model, i.e. a way to combine particles into atoms.
* A molecular model, i.e. a way to combine atoms into molecules.
* Support for fields.
* OpenGL integration.

Once again I'd like to thank the folks at The Particle Data Group for helping me when I ran into problems.

The code is available as a zip file here: Particle Zip File.

In the future I'll be making posts that cover the code in a bit more detail.

Friday, September 25, 2009

An Oldie But A Goodie On Global Warming And Cosmic Rays


Here's a chart I put together a few years ago on the connection between cosmic rays and global temperatures. The red line shows changes in galactic cosmic rays hitting the earth, the blue line shows changes in low level cloud cover. The black line shows changes in global temperature averages. We can see that as galactic cosmic rays increase, low level cloud cover increases. And as low level cloud cover increases, global temperature averages go down.

The data shown covers the period of 1984 through 2002.

Wednesday, September 23, 2009

Quantum Physics Safe, For Now

The Particle Data Group has responded to the letter I sent them regarding the strange behavior of the Strange D, which seemed to not conserve charge in its decays. Their response is that the decay in question is really the sum of two different decays. The two actual decays are listed below the summed decay. I checked and using those two decays does indeed work correctly. As far as I know, the Strange D is the only particle that uses this "summed" notation.

Anyway, while I was waiting to hear back from them I found another unusual decay that doesn't seem to conserve charge: the Charmed Lambda. Decay #10 of this particle gives a decay whose total charge is neutral, whereas the Charmed Lambda has a positive charge. And there's no "summed" notation for this particle as far as I can see. So I've sent off another letter, hoping they're nice enough to respond once more.

That said, the extended Particle class is nearly done. Once these remaining few issues are cleared up, it'll be published.

Reference
Charmed Lambda Info: http://pdg.lbl.gov/2009/listings/rpp2009-list-lambdac-plus.pdf

Saturday, September 12, 2009

Apple Open Sources Grand Central Dispatch

Apple has released the source code for Snow Leopard's new Grand Central Dispatch under an Apache-style license. Grand Central Dispatch lets programmers easily take advantage of modern multi-core hardware. The Apache licensing means it can be safely used in projects wishing to keep the rest of their code proprietary. This is great news for folks writing code that needs lots of horsepower and needs to run on multiple platforms.

Links:
Grand Central Dispatch code at Mac OS Forge
MacResearch article discussing the release
Apple's Grand Central Dispatch Technology Briefing
Introducing Blocks and Grand Central Dispatch (Must be a registered Apple Developer to access this article)

Technical Note For Windows Programmers: Grand Central Dispatch uses a technology known as Blocks. Blocks require a language extension to C or C++ known as Lambdas. This language extension has been added to the publicly available GCC compiler and has been submitted for inclusion for the next version of the C programming language. But I don't think it's part of Microsoft's Visual Studio development environment. The proposed extension differs syntactically from Microsoft's Lambda extension. Bottom line, if you want to use Grand Central Dispatch on Windows, you'll want to use the GCC compiler.

Tuesday, September 8, 2009

Error In Quantum Physics?

I've been working on adding additional particles to the particle classes I discussed in a previous post. There'll be a lot more particles and more information about each particle. One of the things included is particle decay, in which one particle transforms into several different particles. It was while working on this that I came across what looks like an error in the measured decays of a particle known as the Strange D.

Particles turning into other particles is neat, but when they do they have to follow certain conservation laws. One of those laws is that overall electric charge must be preserved. This means that when you add up the charges of all the new particles, the total has to be exactly the same as the charge of the particle they decayed from.

The Strange D has an electric charge of 1. According to the Particle Data Group, the Strange D can decay into particles known as Eta, Leptons, and Eta Prime. Specifically, the decay (decay #14, btw) looks like this:

Decay
Particle__________________________Charge
Eta_______________________________0
Non-Neutrino Anti Lepton______________1
Neutrino __________________________0
Eta Prime__________________________0
Non-Neutrino Anti Lepton______________1
Neutrino __________________________0
-----------------------------
Total Decay Charge___________________2

Strange D Charge_____________________1

So the total charge of the particles from the decay is 2, while the original particle had only a charge of 1, which violates the conservation of electric charge.

When I discovered the bad redshift data a while back, I first gave the folks who produced the data a chance to respond. So I'm sending off a letter to the Particle Data Group to see if the data is bad, or if it's just a misunderstanding on my part. We'll see what they say.

References
Strange D data from the Particle Data Group
Conservation laws of physics

Tuesday, September 1, 2009

Using POOMA With Excel, OpenGL, and HippoDraw


In a prior post I covered the basics of how you can extract data from POOMA for display in your programs. I'm going to expand on that now and show how you can use POOMA with Excel or Numbers, OpenGL, and HippoDraw. The code discussed in this post can be downloaded from SourceForge at the following locations:

Download POOMAIO.h
Download POOMAIOTest.cpp

POOMAIO.h contains classes for writing POOMA Array values to CSV files, to TNT files, and to a format usable as translate values in OpenGL programs. POOMAIOTest.cpp is the electron bounce program from the previous post re-written to use these new IO capabilities.

To use these new classes, you instantiate them and call their write() method with the appropriate parameters.

Using POOMA Data In Excel Or Numbers
Both Excel and Numbers are spreadsheets that can read CSV (Comma Separated Values) files. POOMAIO.h contains a class named VectorCSVOutput that will write POOMA array values in CSV format. VectorCSVOutput is a template that takes the number of dimensions of the POOMA array and the data type of the values stored in that array. Once you've instantiated the class, call its write() method, passing in the output stream, the POOMA array to write, and the character delimiter to use. The character delimiter defaults to a comma, and you'll probably want to stick with that. An example that prints the positions of electrons is shown below, along with a screenshot of the resulting values as they appear in Numbers.


VectorCSVOutput<3> oVectorCSVOutput;

for (unsigned int iLoop5 = 0; iLoop5 < iNumberOfParticles; iLoop5++)
{
    CustomParticle<PTraits_t>::PointType_t oThisElectronPosition = oElectron.pos(iLoop5);

    cout << iLoop5 << ","; // Print out the number of each particle.
    oVectorCSVOutput.write(cout, oThisElectronPosition);
    cout << endl;
} // for



Using POOMA Data In HippoDraw
The next output format we're going to look at is the TNT format. The TNT format can be read by HippoDraw, a data analysis tool from SLAC. The class used to write POOMA arrays to the TNT format is VectorTNTOutput, which inherits from VectorCSVOutput. The only additional work you need to do for VectorTNTOutput is to call its writeHeaders() method, which takes an output stream, a title, and a vector of column labels as parameters. An example of using this class is shown below.

VectorTNTOutput<3> oVectorTNTOutput;
vector<string> vLabels;

vLabels.push_back(string("Particle"));
vLabels.push_back(string("X"));
vLabels.push_back(string("Y"));
vLabels.push_back(string("Z"));

oVectorTNTOutput.writeHeaders(cout, string("Electron Bounce"), vLabels);

// .... perform work ....

for (unsigned int iLoop5 = 0; iLoop5 < iNumberOfParticles; iLoop5++)
{
    CustomParticle<PTraits_t>::PointType_t oThisElectronPosition = oElectron.pos(iLoop5);

    cout << iLoop5 << ",";
    oVectorTNTOutput.write(cout, oThisElectronPosition);
    cout << endl;
} // for



Using POOMA Data In OpenGL
OpenGL programs usually need only translation values from POOMA. You'll create the objects you want to display as meshes and translate them to their correct locations based on data provided by POOMA. However, OpenGL programs often want their position values normalized: between -1 and 1 in the X and Y directions, and between 0.1 and some maximum value in the Z direction. There are two classes to help you with this translation: VectorOpenGLOutput, which writes scaled translation values to standard out as a C array of floats, and VectorOpenGLVectorOutput, which stores scaled translation values in a C++ vector. Both of these classes need to be told the maximum value in your POOMA array and how you want to offset the X, Y, and Z values. The maximum values are used to scale down the values in the array to make them usable by OpenGL. The offset then moves the X, Y, and Z positions so they appear on the screen.

With VectorOpenGLOutput you can write values to standard out, and then cut and paste them directly into C or C++ code for later replay. The example below shows the code to do this.

CustomParticle<PTraits_t>::PointType_t oMaxValues;
CustomParticle<PTraits_t>::PointType_t oOffsets;
VectorOpenGLOutput<3> oVectorOpenGLOutput;

oMaxValues(0) = 99;
oMaxValues(1) = 99;
oMaxValues(2) = 99;
oOffsets(0) = 1;
oOffsets(1) = 1;
oOffsets(2) = -0.1;

// iLoop2 is the time-step index from the enclosing loop.
cout << "float aTranslate" << iLoop2 << "[" << iNumberOfParticles << "][3] = {" << endl;
for (unsigned int iLoop5 = 0; iLoop5 < iNumberOfParticles; iLoop5++)
{
    CustomParticle<PTraits_t>::PointType_t oThisElectronPosition = oElectron.pos(iLoop5);

    oVectorOpenGLOutput.write(cout, oThisElectronPosition, oMaxValues, oOffsets);
} // for
cout << "};" << endl;


VectorOpenGLVectorOutput stores the scaled values in a C++ vector so you can use them right away in your program. An example is shown below.

CustomParticle<PTraits_t>::PointType_t oMaxValues;
CustomParticle<PTraits_t>::PointType_t oOffsets;
VectorOpenGLVectorOutput<3> oVectorOpenGLVectorOutput;
vector<CustomParticle<PTraits_t>::AxisType_t> vPositions;

oMaxValues(0) = 99;
oMaxValues(1) = 99;
oMaxValues(2) = 99;
oOffsets(0) = 1;
oOffsets(1) = 1;
oOffsets(2) = -0.1;

for (unsigned int iLoop5 = 0; iLoop5 < iNumberOfParticles; iLoop5++)
{
    CustomParticle<PTraits_t>::PointType_t oThisElectronPosition = oElectron.pos(iLoop5);

    oVectorOpenGLVectorOutput.write(vPositions, oThisElectronPosition, oMaxValues, oOffsets);
    // ... Call OpenGL as needed with the values stored in vPositions.
    vPositions.clear();
} // for


The OpenGL examples assume a bounding box of 0 through 99 in the POOMA program for the X, Y, and Z axes. This means particles cannot travel beyond those bounds; they'll bounce off the invisible walls instead. This is the condition that was set up in the example program.

The end result of the translation to OpenGL values is that the X and Y axes will be scaled to values between -1 and 1, and the Z axis will be scaled to values between 0.1 and 2.1. These scaled values are commonly used by OpenGL programs. Your OpenGL program will use glTranslatef() to translate your 3D representation of the particles using the values provided by VectorOpenGLOutput or VectorOpenGLVectorOutput.

Monday, August 24, 2009

Custom POOMA Particles And The Standard Model


POOMA provides the ability to define custom particles that you can use in your physics simulations. A "particle" in POOMA can be anything: baseballs, planets, imaginary particles, whatever you want. In this post we're going to create definitions for the particles of the standard model of quantum physics. If you want to brush up on the standard model, you can find a good description at Wikipedia's Standard Model page. You can download the code being discussed in this post (just one header file) from The POOMA Add-On Project at SourceForge. A short POOMA program that uses these custom particles is here.

IMPORTANT NOTE: While all the main particles from the standard model are created, they are treated in a "classical" manner in the classes we create. That is, their positions, charges, velocities, and so on, are all precisely known.

Extending POOMA's Particle Template
You create a custom particle in POOMA by extending the Particle Template. The class you create can contain any data you'd like. You'll certainly want to store the information that applies to your custom particle type in this class. However, it also seems to be customary to include Vectors of information for all instances of your class where the information changes from particle to particle. For example, because the positions of the instances of your particles will all be different, it's customary to include a Vector in your particle class to store the positions of each particle.

This post will discuss the header file that defines the particles of the standard model. First we will look at the base class, CustomParticle, and then the derived classes.

The CustomParticle Base Class
The CustomParticle class is the base class for all the other classes we'll define. It has a protected constructor, so it can't be instantiated. You'll need to instantiate one of the derived classes.

A description of the members is provided below:

*) getMass() Returns the mass of the particle, measured in MeV.
*) getCharge() Returns the charge of the particle.
*) getSpin() Returns the spin of the particle.
*) getName() Returns the name of the particle. Example names are Electron, Photon, and Charm Quark.
*) getInteractsWithStrong() Returns true if the particle interacts with the strong force, or false if it doesn't.
*) getInteractsWithElectromagnetic() Returns true if the particle interacts with the electromagnetic force, or false if it doesn't.
*) getInteractsWithWeak() Returns true if the particle interacts with the weak force, or false if it doesn't.
*) getInteractsWithGravity() Returns true if the particle interacts with the gravity, or false if it doesn't.
*) getInteractsWithParticle() Pass this function a particle and it will return true if this particle interacts with it, or false if it doesn't.
*) getLifetime() Returns the lifetime of the particle in seconds. A value of -1 means the particle lasts forever.
*) isFermion() returns true if a particle is a fermion, false if it is not.
*) isBoson() returns true if a particle is a boson, false if it is not.
*) isQuark() returns true if a particle is a quark, false if it is not.
*) isLepton() returns true if a particle is a lepton, false if it is not.
*) DynamicArray< PointType_t, AttributeEngineTag_t > pos; This is an array that will store all the positions of particles of a given type.
*) DynamicArray< PointType_t, AttributeEngineTag_t > vel; This is an array that will store all the velocities of particles of a given type.
*) globalCreateAndSetRandomPositionsAndVelocities() This function will create a given number of objects, all of the same particle type, and set their positions and velocities to random values. Pass this function the number of particles to create, a seed for the random number generator, and true or false depending on whether you want velocities to be randomized or zero.
*) getDistance() This is a static helper function that will return the distance between two vectors. Pass it the position of two particles and it will return their distance.

The Derived Particle Classes
There are derived particle classes for each of the three kinds of particles: Quarks, Leptons, and Bosons. These classes have protected constructors and cannot be directly instantiated. Instead, you'll create objects from classes that derive from these classes. A class inheritance diagram is provided below. Only the classes that inherit from Quark, Lepton, and Boson can be instantiated.


Inheritance Diagram. Click picture for larger image.

There is also an enumeration called ColorCharge, which defines the color charge that quarks have, and a GluonColorCharge enumeration for gluons. The base class for all quarks, Quark, and the Gluon class have getColorCharge() and setColorCharge() methods for their color charge.

The Fermion class serves as the base for the Quark and Lepton classes. Fermion has values for weak isospin, weak hypercharge, and generation. These values are set by the constructor of derived classes and can be obtained using the getWeakIsospin(), getWeakHyperCharge(), and getGeneration() methods.

Additionally, quarks have charm, strange, top, bottom, isospin, and baryon number values. These values are set by the derived quark classes and can be obtained with the methods getCharm(), getStrange(), getTop(), getBottom(), getIsospin(), and getBaryonNumber().

Similarly, leptons have leptonic electronic number, leptonic muonic number, and leptonic tauonic number values. These values are set by the derived lepton classes and can be obtained with the methods getLeptonicElectronicNumber(), getLeptonicMuonicNumber(), and getLeptonicTauonicNumber().

Using Custom Particles
Finally, we want to use our custom particles in an actual POOMA program. If you haven't done so already, you can download the test program from here and the custom particle file from here.

The test program creates a bunch of electrons with zero velocity and places them in an imaginary box. The electrons bounce around the box, driven by their mutual repulsion. Every time step, the positions of the electrons are printed to standard out.

Sunday, August 23, 2009

Getting POOMA Data For Use In Cocoa

POOMA ALife Data Running On An iPhone.

In the previous post we covered how to build the physics engine known as FreePOOMA on a Macintosh and how to write a simple Cocoa application that starts and stops POOMA. In this post we'll look at how to grab the data POOMA has calculated and use it in a Cocoa application.

Using C++ Code in Cocoa Objective C Programs
The first thing we need to do is enable the use of C++ code in a Cocoa program. There are two ways to do this.

The first way is to change the file extension of any Objective C file that needs to also use C++ from ".m" to ".mm". We did this in the example from the previous post.

The second way is to right-click the file in the Groups & Files window and select Get Info from the popup menu. Then click the General tab. Find the section named File Type; you'll see a drop-down list box there. Use the list box to change the file type to sourcecode.cpp.objcpp.

You only need to do one of these methods, not both.

Directly Grabbing FreePOOMA Domains
POOMA stores data in Engines and Engines have a Domain that contains the actual data. So for a given object, you want to grab its Domain in order to gain access to its data. The example below shows how to do this for a POOMA Array.

Domain<1, MyDataType>& oDomain = MyPOOMAArray.totalDomain();

As you can see, Domains are templates. The first template parameter specifies the number of dimensions in the Domain. The second template parameter specifies the data type the Domain stores. Once you have a Domain, you can pull data from it using the bracket operator, [], as shown below.

MyDataType& oMyData = oDomain[0];

From here you can perform any operation supported by the data type being used.

Parsing POOMA Data From Strings Programmatically
Many POOMA objects can send their data to a stream. You can use this functionality to send POOMA data to a string stream, and then parse the resulting string. An example is shown below.

stringstream oStream;

oStream << MyPOOMAArray;

string strData = oStream.str();

parseString(strData); // custom function to parse the string and pull the needed data from it.

In the docs directory of the FreePOOMA install you will find tutorials in HTML format. Tutorial 11 discusses sending the data contained in POOMA objects to streams.

Parsing POOMA Data From Strings Manually
Finally, when debugging your program in Xcode you can print POOMA data to the gdb console by sending it to the cout stream. You can then open the gdb console, copy the data, and manually edit it to fit whatever format you need. Other programs can then use this data without the need to actually run POOMA. A code sample of how to do this is shown below.

cout << MyPOOMAArray; // Put a break point here.

In this case we put a breakpoint on the line of code shown above. Just before the code executes we open the gdb console and clear the log using the "Clear Log" button located at the upper right corner of the gdb console window. Then we step over the line of code shown above. Finally, we open the gdb console again and copy the resulting printout. Paste this information into TextEdit or some other editor and change it into the format you need to read it in your program.

The images below illustrate these steps. These images were taken while running a slightly modified version of the ALife example program provided with the FreePOOMA install.


Image 1. POOMA data in the gdb console. Click image for larger picture.


Image 2. The edited version of the gdb console data. It's now a C array. Click image for larger picture.

The movie at the beginning of this post shows the POOMA ALife data being used by an application running on an iPhone. Because we manually added the data to the program, FreePOOMA does not need to run on the iPhone.