ECID Demonstrations


Anomaly Event Detection Scenario (November 2006)


Sensors are not always reliable, and real-time data can be difficult to validate manually. An anomaly detector has been implemented within CyberIntegrator that automatically detects anomalies in real-time sensor data streams and creates events that trigger alerts to other components of the cyberenvironment.
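The kind of detection described above can be illustrated with a minimal sketch. The source does not specify the algorithm used in CyberIntegrator; the rolling z-score approach, window size, threshold, and sensor values below are illustrative assumptions only.

```python
from collections import deque
import math

def detect_anomalies(stream, window=20, threshold=3.0):
    """Flag readings whose deviation from the trailing-window mean
    exceeds `threshold` standard deviations (a rolling z-score test).
    This is a stand-in sketch, not the CyberIntegrator detector."""
    history = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(stream):
        if len(history) == window:
            mean = sum(history) / window
            var = sum((x - mean) ** 2 for x in history) / window
            std = math.sqrt(var)
            deviation = abs(value - mean)
            # A zero-variance window flags any change at all.
            if deviation > threshold * std and deviation > 0:
                anomalies.append((i, value))  # would trigger an alert event
        history.append(value)
    return anomalies

# A steady (hypothetical) salinity signal with one spike
readings = [30.0] * 30 + [45.0] + [30.0] * 10
print(detect_anomalies(readings))  # -> [(30, 45.0)]
```

In a deployed setting, each flagged reading would be wrapped in an event and published to the rest of the cyberenvironment rather than collected in a list.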






Funding for ECID technology development comes from the National Science Foundation and the Office of Naval Research.


Interested in ECID? Join the ecid-support list by sending email to majordomo@ncsa.uiuc.edu with the phrase "subscribe ecid-support" in the body of the email.

In this scenario, an environmental researcher studying Corpus Christi Bay (CCBay) uses ECID to identify sensors relevant to her effort and to configure her desktop dashboard to receive notifications of anomalous readings. When the rate of false positives produced by the algorithm currently running as part of the CCBay observatory makes it unusable, she uses ECID to locate a better algorithm in use at another observatory, download it to her desktop workflow system, and compare it with the current version using live sensor data. She then "publishes" the new algorithm to the CCBay observatory, generating a new derived event stream available (with documentation of its origin in the other observatory) for her work and for that of her colleagues. Within this scenario:


  • The CyberCollaboratory and CyberDashboard serve as a means of discovering observatory resources and mapping them to the needs of individuals and project teams.
  • The Event Management framework provides a uniform infrastructure for system-wide coordination of human and automated activities.
  • The Tupelo metadata/provenance infrastructure provides rich, integrated documentation of observatory and desktop activities.
  • CI-KNOW mines metadata and provenance to provide actionable scientific recommendations based on detailed records of system activities and community interactions.
  • The CyberIntegrator enables desktop exploration of new approaches to analysis of live observatory data and the means to "publish" new capabilities to the observatory infrastructure for continuing execution.
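The Event Management framework's role of coordinating human and automated activities can be sketched as a publish/subscribe hub. This is a minimal illustration; the class, topic names, and event payload below are invented for the example and are not the framework's actual API.

```python
from collections import defaultdict

class EventBus:
    """Minimal publish/subscribe hub standing in for the Event
    Management framework; topic strings here are hypothetical."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
alerts = []
# A dashboard subscribes to anomaly events from a bay sensor stream.
bus.subscribe("ccbay/salinity/anomaly", alerts.append)
# An anomaly detector publishes an event; the dashboard receives it.
bus.publish("ccbay/salinity/anomaly", {"sensor": "S-12", "value": 45.0})
print(alerts)
```

The same event, published once, could also drive automated components (e.g., a provenance logger) simply by adding more subscribers to the topic.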



The diagram below illustrates the architecture supporting the information flow depicted in the graphic above.








Bacterial TMDL for Copano Bay



Researchers at the University of Texas at Austin created a fecal coliform model in ArcGIS ModelBuilder that predicts annual average fecal coliform concentrations in the bay. They converted that model to a macro in an Excel spreadsheet so that it could be run as a Monte Carlo simulation to predict median and 90th percentile values.
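The Monte Carlo step works by drawing many concentration samples from a fitted distribution and reading summary statistics off the sorted draws. The sketch below assumes a lognormal loading model with made-up parameters; the real UT Austin spreadsheet model's structure and values are not given in the source.

```python
import random
import statistics

def monte_carlo_concentration(n=10000, seed=42):
    """Draw fecal coliform concentrations (cfu/100 mL) from a
    hypothetical lognormal model and report median and 90th
    percentile values, as the spreadsheet macro does."""
    rng = random.Random(seed)
    # Illustrative parameters only -- not the UT Austin model's values.
    samples = [rng.lognormvariate(3.0, 0.8) for _ in range(n)]
    samples.sort()
    median = statistics.median(samples)
    p90 = samples[int(0.9 * n)]
    return median, p90

median, p90 = monte_carlo_concentration()
print(f"median = {median:.1f}, 90th percentile = {p90:.1f}")
```

Reporting the 90th percentile alongside the median matters for a TMDL because water quality standards are typically framed around high-end, not average, concentrations.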


A second spreadsheet downloaded USGS data using a Web service and fit distributions to the data, which were then hand-transferred to the Monte Carlo simulation.
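The distribution-fitting step can be sketched as follows; the source does not say which distributions the spreadsheet fits, so this assumes a lognormal fitted by moments of the log-transformed observations, and the flow values are invented stand-ins for USGS data.

```python
import math
import statistics

def fit_lognormal(flows):
    """Estimate lognormal parameters (mu, sigma) from observed
    streamflow by taking the mean and standard deviation of the
    log-transformed values (method of moments in log space)."""
    logs = [math.log(q) for q in flows]
    mu = statistics.mean(logs)
    sigma = statistics.stdev(logs)
    return mu, sigma

# Hypothetical daily flows (cfs) standing in for USGS observations
flows = [120, 95, 210, 340, 80, 150, 60, 500, 110, 90]
mu, sigma = fit_lognormal(flows)
print(f"mu = {mu:.2f}, sigma = {sigma:.2f}")
```

The fitted (mu, sigma) pair is exactly what the Monte Carlo spreadsheet needs as input, which is the hand-transfer step that the CyberIntegrator linkage described below automates.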






In this demonstration, these tools were linked in CyberIntegrator to enable the researchers to automatically pass data from one tool to another at the click of a button. The tools were also linked to a visualization tool called Image2Learn, which allowed simultaneous display of the likelihood of exceeding the water quality standards at each schema node (from the spreadsheet) and an ArcGIS shapefile of the watershed.



We acknowledge Ernest To, Carrie Gibson, and David Maidment from the University of Texas at Austin for sharing their tools and expertise to prepare this demonstration.





Remote Sensing Analysis at UIUC


Researchers at the University of Illinois at Urbana-Champaign are investigating the multi-scale variability of vegetation and its dependence on hydrologic controls (topography, soil properties, vegetation type, and meteorology). Remote sensing images are transformed in a sequence of steps and integrated with eco-regions to derive a regression tree model of the relationship between hydrologic parameters and the Enhanced Vegetation Index (EVI).


These remote sensing tools were registered with the CyberIntegrator to allow more flexible coupling of heterogeneous software tools to form a sequence of steps, easy re-use and modification of processing steps, and guidance about community practices from provenance information.




We acknowledge Praveen Kumar and his students from UIUC for sharing their tools and expertise to prepare this demonstration.