Say data mining these days and people are likely to think you're talking about the NSA and its various surveillance techniques. However, the concept of analyzing vast quantities of data to find trends and patterns has real utility in many other, lower-profile applications.
As data acquisition equipment gets better and cheaper, and as sensors get smaller, smarter, and more frugal with power, the net result is a great increase in the amount of data collected. Rather than measuring only what they can afford to, people are increasingly adding sensors to measure what they actually want to know.
Let me explain: if cost constraints limit you to measuring, say, temperature in only a couple of places, you'll put those sensors in the most critical parts of your process, the spots where you NEED to know the temperature. If you have an almost unlimited number of temperature sensors, you can scatter them throughout the whole process. The result? A much clearer picture of what's happening with temperature throughout the process. Not just what you need to know, but what you want to know.
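To make that concrete, here's a minimal sketch of what "scattering sensors throughout the process" buys you analytically. The zone names and temperature values below are entirely hypothetical, invented for illustration; the point is that with many readings per zone you can summarize conditions everywhere, not just at the critical spots.

```python
import statistics
from collections import defaultdict

# Hypothetical readings: (process zone, temperature in degrees C) from
# sensors scattered throughout the process, not just at critical points.
readings = [
    ("inlet", 21.5), ("inlet", 22.1), ("inlet", 21.8),
    ("reactor", 88.0), ("reactor", 91.2), ("reactor", 89.5),
    ("outlet", 45.3), ("outlet", 44.8),
]

# Group readings by zone so each zone can be summarized independently.
by_zone = defaultdict(list)
for zone, temp in readings:
    by_zone[zone].append(temp)

# Summarize each zone: mean, max, and sensor count.
summary = {
    zone: (round(statistics.mean(temps), 1), max(temps), len(temps))
    for zone, temps in by_zone.items()
}

for zone, (mean_t, max_t, n) in summary.items():
    print(f"{zone}: mean={mean_t} C, max={max_t} C, sensors={n}")
```

With only two sensors you'd get two numbers; with dense coverage you get a temperature profile of the whole process.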
The downside, of course, is that you now have a metric ton of data from which you must sift the wheat from the chaff. Luckily, there are folks out there figuring out how to handle large data sets and analyze their contents. For an introduction to this increasingly important topic, check out Knowledge Discovery from Sensor Data, which appeared in the March 2006 issue of Sensors.
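One simple flavor of that wheat-from-chaff sifting is outlier detection: flag the handful of readings that stand apart from the bulk of the data. This is only a crude first-pass sketch (the data and the two-standard-deviation threshold are assumptions for illustration, not anything from the article), but it shows the basic idea.

```python
import statistics

def flag_outliers(series, k=2.0):
    """Return readings more than k standard deviations from the mean,
    a crude first pass at separating signal from noise."""
    mean = statistics.mean(series)
    stdev = statistics.stdev(series)
    return [x for x in series if abs(x - mean) > k * stdev]

# A mostly steady hypothetical temperature trace with one suspect spike.
trace = [20.1, 20.3, 19.9, 20.2, 35.0, 20.0]
print(flag_outliers(trace))  # flags the 35.0 reading
```

Real knowledge-discovery work goes far beyond this, of course, but even a filter this simple can cut a flood of raw readings down to the few worth a human's attention.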