Wednesday, September 22, 2010
Computer Servers Could Help Detect Earthquakes
Computer servers in data centers could do more than respond to requests from millions of internet users.
IBM researchers have patented a technique using vibration sensors inside server hard drives to analyze information about earthquakes and predict tsunamis.
"Almost all hard drives have an accelerometer built into them, and all of that data is network-accessible," says Bob Friedlander, master inventor at IBM. "If we can reach in, grab the data, clean it, network it and analyze it, we can provide very fine-grained pictures of what's happening in an earthquake."
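The workflow Friedlander describes — read the sensor, clean the samples, ship them over the network — might look roughly like the sketch below. The sensor path, sampling rate, batch size and collector URL are all hypothetical placeholders for illustration, not details from IBM's patent; real drives expose vibration data in vendor-specific ways.

```python
import json
import time
import urllib.request

# Hypothetical path where a drive's shock/vibration sensor might be exposed
# (assumption for illustration; real systems vary widely).
SENSOR_PATH = "/sys/class/hwmon/hwmon0/accel_raw"
COLLECTOR_URL = "http://collector.example.com/samples"  # hypothetical endpoint
SAMPLE_INTERVAL_S = 0.1  # 10 Hz polling, an arbitrary choice for this sketch


def read_sample() -> dict:
    """Read one raw three-axis sample and timestamp it."""
    with open(SENSOR_PATH) as f:
        x, y, z = (float(v) for v in f.read().split())
    return {"t": time.time(), "x": x, "y": y, "z": z}


def send_batch(samples: list) -> None:
    """POST a batch of samples to the central collector."""
    req = urllib.request.Request(
        COLLECTOR_URL,
        data=json.dumps(samples).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=5)


if __name__ == "__main__":
    batch = []
    while True:
        batch.append(read_sample())
        if len(batch) >= 50:  # ship in small batches to keep overhead low
            send_batch(batch)
            batch = []
        time.sleep(SAMPLE_INTERVAL_S)
```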
The aim is to predict the location and timing of catastrophic events more accurately and to improve natural-disaster warning systems. The seismographs widely used today, the researchers say, do not provide fine-grained data about where emergency response is needed.
IBM's research is not the first time scientists have tried to use the sensors in computers to detect earthquakes. Seismologists at the University of California at Riverside and Stanford University created the Quake Catcher Network in 2008. The idea was to use the accelerometers in laptops to detect movement.
But wading through mounds of laptop data to pinpoint signals that might indicate seismic activity is not easy. For instance, how do you tell whether the vibrations in a laptop accelerometer come from seismic activity and not from a big-rig truck rolling by?
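One standard seismological first pass (not necessarily what IBM patented) is a short-term-average/long-term-average (STA/LTA) trigger: sustained shaking keeps short-term energy elevated well above the long-term background, while a brief local jolt does not. Single-machine false positives are then usually rejected by requiring many machines to trigger at once, as sketched later. A minimal NumPy version, with illustrative window lengths:

```python
import numpy as np


def sta_lta(signal: np.ndarray, fs: float, sta_s: float = 1.0, lta_s: float = 30.0) -> np.ndarray:
    """Classic short-term-average / long-term-average trigger ratio.

    signal: vibration amplitude (e.g. vector magnitude of the 3-axis sensor).
    fs: sampling rate in Hz. Window lengths are arbitrary illustrative values.
    """
    energy = signal ** 2
    sta_n = max(1, int(sta_s * fs))
    lta_n = max(1, int(lta_s * fs))
    sta = np.convolve(energy, np.ones(sta_n) / sta_n, mode="same")
    lta = np.convolve(energy, np.ones(lta_n) / lta_n, mode="same")
    return sta / np.maximum(lta, 1e-12)  # avoid division by zero


# A ratio that stays above a threshold (say 4.0) for several seconds suggests
# sustained shaking rather than a brief bump from local traffic.
```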
That's why IBM researchers Friedlander and James Kraemer decided to focus on using rack-mounted servers.
"When you are looking at data from a rack that's bolted to the floor, it's not the same as what you get from a laptop," says Kraemer. "Laptops produce too much data and it's liable to have a lot of noise."
Servers in data centers can give researchers more detailed information because each machine's orientation is known, its environmental conditions are much better controlled, and the noise it generates tends to be predictable.
"The servers in data centers are the best place you can have these machines for our software," says Friedlander. "We know their location, they are on 24/7," he says. "You know what floor they are in the building, what their orientation is. In case of an earthquake, you can calculate the shape of the motion, so it tells you about the force the structure is going to be subjected to."
To generate reliable data, the servers have to be spread across an area. And the number of computers participating can be anywhere from 100 to a few thousand.
The servers would have to run a small piece of software that the researchers say is "incredibly light."
The hard-drive sensor data collected from a grid of servers is transmitted via high-speed networking to a data-processing center, which can help classify the events in real time.
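The patent's internals aren't described here, but one plausible way a processing center could classify events in real time is coincidence detection: declare a regional event only when many geographically separate servers trigger within a short window, which is also how single-machine disturbances get filtered out. A rough sketch, with the window length and server count chosen arbitrarily:

```python
from collections import deque

WINDOW_S = 10.0   # how close in time triggers must be (illustrative)
MIN_SERVERS = 20  # how many distinct servers must agree (illustrative)

recent_triggers = deque()  # (timestamp, server_id) pairs, oldest first


def report_trigger(server_id: str, timestamp: float) -> bool:
    """Record a trigger from one server; return True if a regional event is declared."""
    recent_triggers.append((timestamp, server_id))
    # Drop triggers that fall outside the coincidence window.
    cutoff = timestamp - WINDOW_S
    while recent_triggers and recent_triggers[0][0] < cutoff:
        recent_triggers.popleft()
    distinct_servers = {sid for _, sid in recent_triggers}
    return len(distinct_servers) >= MIN_SERVERS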
With the data, researchers say they can tell exactly when an earthquake started, how long it lasted, its intensity, and the frequency and direction of the motion.
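As a rough illustration of how those quantities could be pulled out of a triggered three-axis record, the sketch below uses a simple amplitude threshold for onset and duration, a spectrum for the dominant frequency, and the peak sample for direction. The thresholds and the east/north axis convention are assumptions, not IBM's method.

```python
import numpy as np


def characterize_event(t: np.ndarray, xyz: np.ndarray, quiet_level: float) -> dict:
    """Rough event characterization from a 3-axis acceleration record.

    t: sample timestamps (s); xyz: shape (n, 3) accelerations with the
    quiet-time mean removed; quiet_level: background amplitude used as the
    onset threshold. Illustrative only.
    """
    mag = np.linalg.norm(xyz, axis=1)        # overall shaking amplitude
    active = mag > 3.0 * quiet_level          # arbitrary onset threshold
    if not active.any():
        return {}
    onset, end = t[active][0], t[active][-1]

    # Dominant frequency of motion from the spectrum of the strongest axis.
    fs = 1.0 / np.median(np.diff(t))
    strongest = xyz[:, np.argmax(np.abs(xyz).max(axis=0))]
    spectrum = np.abs(np.fft.rfft(strongest))
    freqs = np.fft.rfftfreq(len(strongest), d=1.0 / fs)
    dominant_hz = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

    # Direction: horizontal azimuth of the peak sample (x=east, y=north assumed).
    peak = xyz[np.argmax(mag)]
    azimuth_deg = np.degrees(np.arctan2(peak[0], peak[1])) % 360.0

    return {
        "onset_time": float(onset),
        "duration_s": float(end - onset),
        "peak_acceleration": float(mag.max()),
        "dominant_frequency_hz": float(dominant_hz),
        "azimuth_deg": float(azimuth_deg),
    }
```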
IBM researchers hope companies with big data centers will participate in the project. "It would give them an advantage," says Friedlander. "It would tell them about their company, their machines, and help their people."
Over the next few months, IBM hopes to start a pilot project using its own data centers and to invite other companies to join in.
See Also:
Satellite Photos of Haiti Before and After the Earthquake
Chile Earthquake Moved Entire City 10 Feet to the West
Gallery: How to Build an Earthquake-Resistant Bridge
5 Most Dangerous U.S. Earthquake Hot Spots Beyond California
Photo: Seismograph records a 2007 earthquake in Japan. (Macten/Flickr)