The SPIE Defense, Security, and Sensing event happens next week, and several of its sessions deal with the photonic sensing methods used in the response to the Deepwater Horizon spill, related measurement techniques, and how these technologies and the data they produce can improve forecasting, and thus the response, to large-scale environmental disasters, whether natural or manmade.
Richard Crout's SPIE Newsroom article, "Measurements in support of the Deepwater Horizon incident's response effort," briefly describes some of the measurement techniques used to identify where the oil was, whether on or within the water. If you want to spend some instructive time exploring the data sets, check out the epic collection available through the National Oceanographic Data Center (NODC). Then, if you're not yet sated, wander over to the National Oceanic and Atmospheric Administration's iGulf Web site for more information on the regional collaboration team tasked with monitoring the Gulf of Mexico and nursing its ecosystem back to health.
And if you still haven't had enough, you can read the June 2010 interim mission report of the NOAA Ship Thomas Jefferson, which was dispatched to the spill to determine whether a combination of sonar and in situ measurements could reliably detect submerged oil. The report gives a rather more in-depth description of the instruments used and the results acquired than I can possibly give you here.
The Deepwater Horizon spill provided a ton of data to researchers. In fact, it's still providing data, and that's as it should be, because we still have an awful lot to learn about how our oceans function. We're going to have more oil spills, and they're going to happen in deeper water, but at least now we'll have better tools at hand to understand, mitigate, and stop them than we did prior to April 20, 2010. The challenge now is to make sure that the efforts to develop those tools are adequately funded.