
The stormwater industry collects substantial amounts of point source data to comply with NPDES permits. Operators of construction sites and industrial facilities perform regular inspections, each providing a snapshot of conditions at some discrete point in time. MS4s do this for each facility, as well as monitor outfalls and receiving waters, log street sweeping activities, track illicit discharges and more. Typically, each of these reported data points is collected in isolation from the others, and collation only occurs sporadically, by hand.

Beyond permit requirements, the Internet of Things (IoT) dramatically expands the point source data potentially at our disposal. The EPA and regional boards now remotely monitor water quality in many areas, such as the Charles and Mystic River Watersheds, and companies like Hach, Telog and Xylem have growing product lines with falling prices that cater to this sort of monitoring. We can thus confidently expect an uptick in remote monitoring data over the next decade.

IoT also enables citizen involvement in the collection of point source data, which means even more point source data for our consumption. Consider the air quality field as an example. In 2012, the University of California, San Diego, gave out 30 portable sensors to monitor air quality from smartphones. Less than a year later, a Kickstarter campaign launched to commoditize personal air quality sensors. Now, hundreds of Air Quality Eggs have been deployed across the United States, creating a “community-led air quality sensing network that gives people a way to participate in the conversation about air quality.” Programs like the Monterey Bay Sanctuary Citizen Watershed Monitoring Network already use the community to monitor watershed health, but imagine the possibilities once a water quality equivalent of the Air Quality Egg exists.

According to Metz Inc., “Stormwater may well become a ‘big data’ industry… as the sector looks for ways to optimize… understand the efficacy of current programs… [and develop] a cohesive sector-wide message.” Indeed, at every stormwater event I’ve attended this year, data digitization has been an underlying theme; in many cases, however, the efforts have gone only as far as archiving the data, or perhaps churning on it for an annual report. “We’re capturing tens of thousands of data points, but now what?” lamented one presenter at a recent conference. At CloudCompli, we’re looking hard at the “now what”: how we can take this point source data and turn it into a better understanding of the entire system.

Consider the Illicit Discharge Detection and Elimination Minimum Control Measure for Phase II MS4s, which requires permittees to develop a plan to:

  1. locate problem areas;
  2. find the source;
  3. remove/correct illicit connections; and
  4. document actions taken.

Throughout the lifecycle of an illicit discharge investigation, many kinds of point source data may be drawn upon: public complaints, inspection reports, monitoring results, GIS assets, work orders, citations, and more. Traditionally, someone at the city must navigate the many systems that warehouse these different pieces of data. We should go further, taking advantage of the benefits born of digitization. For example, CloudCompli monitors receiving water sampling results and may trigger an illicit discharge investigation if there’s an abrupt shift in a pertinent parameter. Further, once an investigation begins, CloudCompli helps the investigator by assessing outfalls, connections, facilities and activities to determine likely dischargers. By breaking away from looking at point source data in isolation, we’ve found we can drastically reduce both costs and the time to resolution.
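The change detection behind such a trigger does not need to be exotic. Below is a minimal sketch in Python of one common approach, a rolling baseline with a standard-deviation threshold; the parameter, window size and threshold are illustrative assumptions, not a description of CloudCompli’s actual implementation.

```python
from statistics import mean, stdev

def abrupt_shift(samples, window=10, threshold=3.0):
    """Flag the latest result if it deviates sharply from the recent baseline.

    samples   -- chronologically ordered numeric results for one parameter
                 (e.g., conductivity at a receiving water station)
    window    -- number of prior samples that form the baseline
    threshold -- how many standard deviations count as an "abrupt shift"
    """
    if len(samples) < window + 1:
        return False  # not enough history to judge yet
    baseline, latest = samples[-(window + 1):-1], samples[-1]
    spread = stdev(baseline)
    if spread == 0:
        return latest != baseline[0]  # any change from a flat baseline is abrupt
    return abs(latest - mean(baseline)) / spread > threshold

# A conductivity series with a sudden jump in the most recent reading
readings = [410, 415, 402, 408, 420, 411, 405, 417, 409, 412, 980]
if abrupt_shift(readings):
    print("Abrupt shift detected -- open an illicit discharge investigation")
```

In practice the window, threshold and list of watched parameters would be tuned per watershed, but even a crude check like this shows how digitized sampling results can actively kick off a workflow instead of sitting in an archive.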

Illicit discharge investigations are but one of many places where we’ve looked at drawing on point source data to provide a systemic view. Drawing on InformationWeek’s article “Rise of Things: IoT’s Role in Business Processes”, we can generalize CloudCompli’s strategy into a three-part approach:

  1. Start with the process instead of technology or data points.
  2. Look at crisis events and digitize as much of the change response as possible.
  3. Provide analytics that turn vast arrays of data into comprehensible graphs and charts (a minimal sketch follows this list).
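To make the third point concrete, here is a small Python sketch that rolls individual monitoring results up into one value per outfall per month, the kind of series a chart can actually communicate. The field names and figures are hypothetical, and the aggregation is deliberately simplistic.

```python
from collections import defaultdict
from datetime import date

# Hypothetical monitoring results: (sample date, outfall ID, result in mg/L)
results = [
    (date(2014, 1, 12), "OF-03", 1.8),
    (date(2014, 1, 28), "OF-03", 2.1),
    (date(2014, 2, 9),  "OF-03", 4.7),
    (date(2014, 2, 23), "OF-07", 0.9),
]

# Collapse many individual readings into one point per outfall per month --
# a series simple enough to plot and discuss in an annual report.
monthly = defaultdict(list)
for sampled, outfall, value in results:
    monthly[(outfall, sampled.strftime("%Y-%m"))].append(value)

for (outfall, month), values in sorted(monthly.items()):
    print(f"{outfall} {month}: average {sum(values) / len(values):.2f} mg/L")
```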

Once we’ve established a holistic view of our stormwater system, other opportunities present themselves. In “Next Generation Compliance and Big Data”, the EPA presented a bold vision in which big data enables automated feedback loops that abate pollution in real time. These loops may trigger action at the polluting facility or create market-based incentives for others to avoid exceedances at the systemic level. The EPA also points out simpler gains, such as how exposing more data in a consumable fashion enables public accountability that drives better performance.
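What such a feedback loop might look like in code is easy to imagine. The sketch below is purely illustrative: the sensor feed, the benchmark value and the notification hook are all assumptions on my part, not something prescribed by the EPA presentation or implemented verbatim in CloudCompli.

```python
import random
import time

TSS_LIMIT_MG_L = 100.0  # hypothetical benchmark for total suspended solids

def read_sensor():
    """Stand-in for a real telemetry call (e.g., polling a remote monitoring unit).
    It simply simulates a noisy reading so the sketch runs on its own."""
    return random.gauss(60, 40)

def notify_facility(message):
    """Stand-in for an email, SMS or work-order integration."""
    print(message)

def feedback_loop(polls=10, interval_seconds=1):
    """Poll the sensor and close the loop whenever a reading exceeds the limit."""
    for _ in range(polls):
        reading = read_sensor()
        if reading > TSS_LIMIT_MG_L:
            # The exceedance itself triggers action at the facility, rather than
            # waiting for the next scheduled inspection to notice it.
            notify_facility(
                f"TSS reading {reading:.1f} mg/L exceeds the "
                f"{TSS_LIMIT_MG_L:.0f} mg/L benchmark -- deploy corrective "
                "BMPs and document the response."
            )
        time.sleep(interval_seconds)

if __name__ == "__main__":
    feedback_loop()
```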

So, with all these benefits, what’s held us back as an industry? According to a McKinsey & Co. study, there are three major barriers:

  1. To capture value from big data, organizations will have to deploy new technologies (e.g., storage, computing, and analytical software) and techniques (i.e., new types of analyses).
  2. Organizational leaders often lack the understanding of the value in big data as well as how to unlock this value.
  3. To enable transformative opportunities, companies will increasingly need to integrate information from multiple data sources.

At CloudCompli, we’re looking to address these issues. Our goal is to enable stormwater professionals to stop wading through masses of isolated point source data and instead make decisions that have a real and immediate impact on the water quality of the system. Borrowing a phrase from IBM’s Smarter Cities initiative, we should all be seeking to “turn big data into insight.”