Web-based monitoring

The online and offline Web-Based Monitoring (WBM) system of the CMS experiment consists of a web services framework based on Jakarta/Tomcat and the ROOT data display package. The primary source of data for WBM is the online Oracle database; the WBM tools provide browsing and transformation functions that convert database entries into HTML tables, graphical plots, XML, plain text, and ROOT-based object output.
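As an illustration of the transformation layer described above, the sketch below renders database query results (plain dicts standing in for Oracle rows) into three of the output formats mentioned. All function names and the sample data are hypothetical, not part of the actual WBM code base.

```python
# Hypothetical sketch of a WBM-style transformation layer: database rows
# are rendered as an HTML table, XML, or plain text. Names are illustrative.
from xml.sax.saxutils import escape

def to_html_table(rows):
    """Render query rows as an HTML table."""
    if not rows:
        return "<table></table>"
    cols = list(rows[0])
    head = "".join(f"<th>{escape(str(c))}</th>" for c in cols)
    body = "".join(
        "<tr>" + "".join(f"<td>{escape(str(r[c]))}</td>" for c in cols) + "</tr>"
        for r in rows
    )
    return f"<table><tr>{head}</tr>{body}</table>"

def to_xml(rows, tag="row"):
    """Render query rows as a flat XML document."""
    items = "".join(
        "<%s>%s</%s>" % (
            tag,
            "".join(f"<{c}>{escape(str(v))}</{c}>" for c, v in r.items()),
            tag,
        )
        for r in rows
    )
    return f"<rows>{items}</rows>"

def to_text(rows, sep="\t"):
    """Render query rows as tab-separated text with a header line."""
    if not rows:
        return ""
    cols = list(rows[0])
    lines = [sep.join(cols)] + [sep.join(str(r[c]) for c in cols) for r in rows]
    return "\n".join(lines)

runs = [{"run": 132440, "events": 185300}, {"run": 132442, "events": 90214}]
print(to_text(runs))
```

The same row set feeds every renderer, which is the essence of the browsing/transformation design: one query, many output representations.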

CMS Run Registry: Data Certification Bookkeeping and Publication System

The monitoring and certification of CMS data quality is a multi-step procedure, spanning from online data taking to the offline reprocessing of previously recorded data. The quality assessment is based both on visual inspection of data distributions by monitoring shifters and on algorithmic tests of the distributions against references. The Run Registry (RR), reported here, is the central workflow management and tracking tool used to certify the collected data, to keep track of the certification results, and to expose them to the whole CMS collaboration.

The RR consists of a suite of three applications (online, offline, and user) with a web-based user interface frontend and a dedicated database. It includes facilities for the manual input of data-quality decisions and the automatic collection of detector and beam conditions, as well as for querying the data and exporting selected information to the screen and/or into flat output files of various formats. Among other purposes, the RR is regularly used by the CMS collaboration to create the official 'good-run list' files that serve as input to the downstream selection of data for reprocessing and for physics analyses. Data input and querying can be automated through an API. The web-based service is protected by the CERN Single Sign-On service; locally, the Run Registry uses an SSL-certificate system to authorize users for different roles: read-only, online shift, offline shift, or expert.
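To make the downstream use of a good-run list concrete, the sketch below checks whether a given (run, luminosity section) pair is certified. It assumes a common JSON layout, a map from run number to certified luminosity-section ranges; the sample run numbers and ranges are invented for illustration.

```python
# Illustrative consumer of an RR-produced 'good-run list', assumed here
# to be a JSON map: run number (as string) -> list of [first, last]
# luminosity-section ranges. Sample data is invented.
import json

good_run_list = json.loads("""
{
  "173241": [[1, 120], [125, 300]],
  "173243": [[1, 45]]
}
""")

def is_certified(run, lumisection, grl):
    """Return True if (run, lumisection) falls inside a certified range."""
    ranges = grl.get(str(run), [])
    return any(lo <= lumisection <= hi for lo, hi in ranges)

print(is_certified(173241, 122, good_run_list))  # prints False: 122 lies in the 121-124 gap
```

A physics analysis would apply such a filter event by event, keeping only data from certified luminosity sections.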

CMS CSC Expert System: towards detector control automation

Cathode strip chambers (CSCs) compose the endcap muon system of the CMS experiment at the LHC. Two years of data taking have shown that online systems such as the Detector Control System (DCS), Data Quality Monitoring (DQM), the Trigger, the Data Acquisition (DAQ), and other specialized expert-driven applications perform their tasks well, but the need for better integration between these systems is emerging. Fast, automatic problem identification and resolution, tracking of detector performance trends, and maintenance of known problems and of current and past detector status remain hard to handle and require considerable effort from many experts. Moreover, this valuable expert knowledge is rarely well documented.

The CSC Expert System prototype aims to fill these gaps and provides a solution for online system integration and automation. Its design is based on solid industry standards: Service Bus and Application Integration, Data Warehouse and Online Analytical Processing (OLAP), Complex Event Processing (CEP, i.e. a rule engine), and an ontology-based Knowledge Base. The CSC Expert System receives and accumulates Facts (e.g. detector status, conditions, shifter/expert actions), manages Conclusions (e.g. hot device, masked chamber, weak HV segment, high radiation background), stores the detector inventory as Assets (e.g. hardware, software, links), and outputs Facts, Conclusions, and Assets to other applications and users. The CEP engine allows experts to encode their knowledge in an SQL-like language and to execute it, taking action in real time (e.g. sending emails, SMS messages, commands, and fact requests to other applications, or raising alarms). A year of running the CSC Expert System has proven the correctness of this approach and demonstrates its applicability to detector control automation.
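The Fact-to-Conclusion flow can be sketched as a toy rule engine: facts arrive as plain dicts, each rule pairs a predicate with an action, and matching rules emit conclusions. The real system uses an SQL-like CEP language and a message bus; everything below (class names, device labels, thresholds) is an illustrative stand-in.

```python
# Toy sketch of the Fact -> rule -> Conclusion flow. The actual CSC
# Expert System uses an SQL-like CEP language; this dict/predicate
# version is only an illustrative analogy.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]   # predicate over a single incoming fact
    action: Callable[[dict], str]       # produces a conclusion string

@dataclass
class ExpertSystem:
    rules: list = field(default_factory=list)
    conclusions: list = field(default_factory=list)

    def submit_fact(self, fact: dict):
        """Evaluate every rule against the incoming fact as it arrives."""
        for rule in self.rules:
            if rule.condition(fact):
                self.conclusions.append(rule.action(fact))

es = ExpertSystem(rules=[
    Rule("hot-device",
         lambda f: f.get("temperature_C", 0) > 40,
         lambda f: f"hot device: {f['device']}"),
    Rule("weak-hv-segment",
         lambda f: f.get("hv_current_uA", float("inf")) < 0.1,
         lambda f: f"weak HV segment: {f['device']}"),
])

es.submit_fact({"device": "ME+1/1/12", "temperature_C": 47.5})
es.submit_fact({"device": "ME-2/1/03", "hv_current_uA": 0.02})
print(es.conclusions)  # prints ['hot device: ME+1/1/12', 'weak HV segment: ME-2/1/03']
```

In the real system the action side would also dispatch notifications (emails, SMS messages, alarms) rather than only record a conclusion.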