I was recently asked by Wired UK to produce a graphic to accompany an upcoming story about the TED Global event in Oxford, UK. In the process, I’ve learned some interesting things.

First of all, the job titles for TED speakers make excellent jokes (if you’ve got some good punchlines for these, leave them in the comments below):

To do some more serious explorations, I built an app in Processing that allows me to take the speaker list (scraped from the site) and get latitude/longitude values from the MetaCarta API. The data is stored in a Google Spreadsheet, which Processing can read and edit remotely. These lat/lon points are then rendered on a globe, and the trip that each speaker takes to get to Oxford is shown as a paper airplane flight. Here are three renders, each with a different texture used on the globe:
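For anyone curious about the data-handling step: once the geocoded speaker list lives in a spreadsheet, each row just needs to be parsed into a name and a lat/lon pair before rendering. The row format and field names below are my own assumption for illustration – not the actual layout of Jer's sheet:

```java
import java.util.ArrayList;
import java.util.List;

public class SpeakerParser {
    // Hypothetical record holding one speaker's origin point.
    static class Speaker {
        String name;
        double lat, lon;
        Speaker(String name, double lat, double lon) {
            this.name = name;
            this.lat = lat;
            this.lon = lon;
        }
    }

    // Parse rows of the assumed form "name,lat,lon"
    // (e.g. as exported from a spreadsheet as CSV).
    static List<Speaker> parse(List<String> rows) {
        List<Speaker> out = new ArrayList<>();
        for (String row : rows) {
            String[] f = row.split(",");
            out.add(new Speaker(f[0].trim(),
                                Double.parseDouble(f[1].trim()),
                                Double.parseDouble(f[2].trim())));
        }
        return out;
    }
}
```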

Visualizing TED Global – 182,793km to Oxford (Paper) from blprnt on Vimeo.

Visualizing TED Global – 182,793km to Oxford from blprnt on Vimeo.

Visualizing TED Global – 182,793km to Oxford (B&W) from blprnt on Vimeo.

This is a very similar system to the one I used for Just Landed – the only real difference here is that the locations and travel paths are mapped onto a globe rather than onto a flat surface. Indeed, this gives me pretty much everything I need to render a spherical version of Just Landed – when I get a spare hour or two.

A side effect of mapping all of these trips was the chance to find out how much ground (or air) was covered – I estimated that the 62 speakers at TED have travelled a total of ~182,793km to get to Oxford! I’m not even going to ask about carbon credits.
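A distance total like this is typically built up from great-circle distances between each origin and Oxford. A minimal sketch of that calculation using the haversine formula – my own illustration, not necessarily the exact method used for the 182,793km figure:

```java
public class GreatCircle {
    static final double R = 6371.0; // mean Earth radius, in km

    // Haversine great-circle distance between two lat/lon points,
    // both given in degrees; returns kilometres.
    static double distanceKm(double lat1, double lon1,
                             double lat2, double lon2) {
        double phi1 = Math.toRadians(lat1);
        double phi2 = Math.toRadians(lat2);
        double dPhi = Math.toRadians(lat2 - lat1);
        double dLam = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dPhi / 2) * Math.sin(dPhi / 2)
                 + Math.cos(phi1) * Math.cos(phi2)
                 * Math.sin(dLam / 2) * Math.sin(dLam / 2);
        return R * 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
    }
}
```

Summing `distanceKm(speakerLat, speakerLon, oxfordLat, oxfordLon)` over all 62 speakers gives the kind of total quoted above.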

I’m not sure what the final image will look like – you’ll have to buy the issue of Wired UK – but I have been having some fun playing with this system. While a full 3D environment may seem like overkill for a print project, having the system built the way it is means that I can very quickly prototype many compositional variants, and then tweak and adjust the system as needed to get a good output for print.

Great work Jer!

I’ve used the MetaCarta API myself, but haven’t poked around in Processing yet (although ProcessingJS is pretty wonderful) – something I’ve added to my todo list.

I’m really curious as to how you mapped a lat/long to an image of the globe. Did you use the Google Earth API? I know you’ve talked about the Wolfram projections… but can you be a bit more specific?

cheers!

john

@worldlyjohn

Hi John,

Mapping to a globe is a lot easier than mapping to a flat projection. Latitude and longitude are essentially spherical coordinates (http://en.wikipedia.org/wiki/Spherical_coordinate_system). Once you convert them to radians, you can use some really simple math to extract cartesian (x,y,z) coords.

In my case I pass a 2D vector of (lat, lon) to two different functions: first to convert to spherical coords, then to convert to cartesian (with a small degrees-to-radians helper):

// Convert a lat/lon pair (in degrees) into spherical
// coordinates (angle, angle, radius r)
Vec3D latToSphere(Vec2D ll, float r) {
  Vec3D v = new Vec3D();
  v.x = degreesToRadians(ll.x);
  v.y = degreesToRadians(ll.y) - PI/2;
  v.z = r;
  return(v);
}

// Convert spherical coordinates into cartesian (x,y,z)
Vec3D sphereToCart(Vec3D ll) {
  Vec3D v = new Vec3D();
  v.x = ll.z * cos(ll.y) * sin(ll.x);
  v.y = ll.z * sin(ll.y) * sin(ll.x);
  v.z = ll.z * cos(ll.x);
  return(v);
}

float degreesToRadians(float d) {
  return((d / 180) * PI);
}

This gives me a 3D vector with the X,Y,Z coords. Both this vector and the spherical 3D vector are really useful since I can use some of the built-in methods of the Vec3D Class (thanks, toxi!) to do a lot of the math required to make the pretty ribbons, calculate distances, etc.
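One common way to get the curved flight-path ribbons from those cartesian endpoints is spherical linear interpolation (slerp) along the great-circle arc between them. This is not necessarily how Jer built his ribbons – just a minimal plain-Java sketch of the idea, using arrays in place of Vec3D:

```java
public class Slerp {
    // Spherical linear interpolation between two points on a sphere of
    // radius r, both given as cartesian (x,y,z). t runs from 0 to 1.
    // Assumes the two points are neither identical nor antipodal.
    static double[] slerp(double[] p0, double[] p1, double t, double r) {
        // Angle between the two position vectors, via the dot product.
        double dot = (p0[0]*p1[0] + p0[1]*p1[1] + p0[2]*p1[2]) / (r * r);
        double omega = Math.acos(Math.max(-1.0, Math.min(1.0, dot)));
        double s0 = Math.sin((1 - t) * omega) / Math.sin(omega);
        double s1 = Math.sin(t * omega) / Math.sin(omega);
        return new double[] {
            s0 * p0[0] + s1 * p1[0],
            s0 * p0[1] + s1 * p1[1],
            s0 * p0[2] + s1 * p1[2]
        };
    }
}
```

Sampling `slerp` at a series of t values between 0 and 1 gives a polyline that hugs the globe's surface; lifting each sample outward along its own direction would give the airplane-style arcs that rise off the surface mid-flight.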

Toxi’s geomutils are really really handy for this and a whole pile of other things: http://code.google.com/p/toxiclibs/