JF studied spreading centers off the Galapagos. When she did her PhD it was hard to get data, but she asked for and got some, and wrote a paper that ended up on the front page of Science. Now data is widely available, but there's almost too much, so more effort is needed to organize and work with it. Big-science efforts such as the new OOI (see my workshop info) and Ridge2000 are producing terabytes of 3-D imaging data, with only a few scientists studying it.
Newish stuff: an iPhone seismometer; a USGS Twitter project that maps where shaking is reported, to try to locate the center of a quake and its impact on the people there.
CR – geologists have been tentative toward the internet. Geocoding, associating objects and data with locations, is a great addition.
Lots of geospatial data is available, but the format varies – sometimes a static map image, sometimes Google Earth (a KML/KMZ file). For recent USGS earthquake data you can get a visualization, but for older data you get a rather ugly list that you'd have to visualize yourself. Climate Wizard from the Nature Conservancy lets you visualize temperatures. We need to be able to search the literature by location as well as by subject.
Anne – the crowd-sourced stuff is in KML at best, but she needs to be able to use ArcGIS to make all of the calculations she needs; there's a real problem there.
Cameron – what kind of information do you lose going back and forth between GIS and KML?
Me – on the bottom of the ocean: use GeoMapApp from LDEO. We need something like WorldWide Telescope for the ocean floor.
Audience q – do you need a programmer to get to the visualization you need?
A – it's fairly simple to go from the output to a KML input file, but it takes time and money; there are also lots of different formats and different protocols. Finding the data is the really difficult part. Still, it's a different situation now: the format may be bad, but before there was no data available at all.
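The "output to a KML input file" step the answer mentions can be sketched in a few lines. This is a hypothetical illustration, not anything shown at the session: the record fields (name, latitude, longitude, magnitude) and the sample quakes are invented for the example, but the KML structure is the real Google Earth placemark format.

```python
def records_to_kml(records):
    """Render (name, lat, lon, magnitude) records as a KML document string."""
    placemarks = []
    for name, lat, lon, mag in records:
        placemarks.append(
            "  <Placemark>\n"
            f"    <name>{name} (M{mag})</name>\n"
            # Note the KML quirk: coordinates are longitude,latitude,altitude
            f"    <Point><coordinates>{lon},{lat},0</coordinates></Point>\n"
            "  </Placemark>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
        "<Document>\n" + "\n".join(placemarks) + "\n</Document>\n</kml>\n"
    )

# Invented sample data, stand-ins for rows scraped from an "ugly list"
quakes = [("Offshore Oregon", 44.6, -125.1, 4.8),
          ("Galapagos Rift", 0.8, -86.2, 5.1)]
print(records_to_kml(quakes))
```

The conversion itself really is the easy part; as the panel says, the time and money go into finding the data and coping with all the source formats upstream of a function like this.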
Cameron – what pressure do you get in the UK to make data available?
A (Chris) – not really; there's broader-impact language, but it's an aspiration, not insisted upon.