Data Processing Model for the CDF Experiment.
- Author
- Antos, J., Babik, M., Benjamin, D., Cabrera, S., Chan, A. W., Chen, Y. C., Coca, M., Cooper, B., Farrington, S., Genser, K., Hatakeyama, K., Hou, S., Hsieh, T. L., Jayatilaka, B., Jun, S. Y., Kotwal, A. V., Kraan, A. C., Lysak, R., Mandrichenko, I. V., and Murat, P.
- Subjects
- ELECTRONIC data processing, COMPUTER systems, DATABASES, INFORMATION storage & retrieval systems, COMPUTER files, FILE Transfer Protocol (computer network protocol), DETECTORS, STATISTICAL matching, CALORIMETRY
- Abstract
The data processing model for the CDF experiment is described. Data processing reconstructs events from parallel data streams taken with different combinations of physics event triggers and further splits the events into datasets of specialised physics interest. The design of the processing control system places strict requirements on bookkeeping records, which trace the status of data files and event contents during processing and storage. The computing architecture was updated to meet the mass data flow of the Run II data collection, recently upgraded to a maximum rate of 40 MByte/sec. The data processing facility consists of a large cluster of Linux computers, with data movement managed by the CDF data handling system to a multi-petabyte Enstore tape library. The latest processing cycle has achieved a stable speed of 35 MByte/sec (3 TByte/day). It can be readily scaled by increasing CPU and data-handling capacity as required.
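The quoted rates can be sanity-checked, and the splitting of trigger-tagged events into physics datasets can be illustrated schematically. The sketch below is not taken from the CDF software; the trigger names and the dataset-to-trigger mapping are hypothetical, and only the quoted sustained rate of 35 MByte/sec is assumed.

```python
# Sanity check of the quoted processing rate, plus a toy illustration of
# splitting trigger-tagged events into physics datasets. The trigger names
# and dataset definitions below are hypothetical, not CDF's actual ones.

SUSTAINED_RATE_MB_PER_S = 35            # quoted stable processing speed
SECONDS_PER_DAY = 24 * 60 * 60

daily_tb = SUSTAINED_RATE_MB_PER_S * SECONDS_PER_DAY / 1e6  # MB -> TB (decimal)
print(f"{SUSTAINED_RATE_MB_PER_S} MByte/sec ~ {daily_tb:.1f} TByte/day")  # ~3.0

# Toy event splitting: each reconstructed event carries the triggers it fired
# and is copied into every dataset whose trigger list it matches.
DATASET_TRIGGERS = {                    # hypothetical dataset definitions
    "high_pt_electron": {"ELECTRON_CENTRAL_18"},
    "high_pt_muon": {"MUON_CMUP_18"},
    "jet_sample": {"JET_50", "JET_100"},
}

def split_events(events):
    """Group events into datasets by matching fired triggers."""
    datasets = {name: [] for name in DATASET_TRIGGERS}
    for event in events:
        fired = set(event["triggers"])
        for name, wanted in DATASET_TRIGGERS.items():
            if fired & wanted:
                datasets[name].append(event)
    return datasets

events = [
    {"id": 1, "triggers": ["ELECTRON_CENTRAL_18", "JET_50"]},
    {"id": 2, "triggers": ["MUON_CMUP_18"]},
]
print({name: [e["id"] for e in evs] for name, evs in split_events(events).items()})
```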
- Published
- 2006