ADVANCES IN DISTRIBUTED CONTROL SYSTEMS DATA BASES

D. E. Ventzas ([email protected]), Control Eng. (SMISA)
G. Garani ([email protected]), Software Eng.
C. Karapoulios ([email protected]), AI & Data Base Eng.

Department of Computer Engineering, Informatics & Telecommunications, TEI of Larissa, Larissa, GREECE
ABSTRACT: Modern industrial DCSs integrate their control, monitoring, management and information tasks in distributed architectures that comply with special process requirements. A control database is a structured application with a high degree of organization that leads to flexibility in retrieval, updating, use and control.

KEYWORDS: Database, man-machine interface, distributed, large scale, control.

INTRODUCTION: Modern process control equipment (continuous, batch or total-factory) automates manufacturing, improves product quality, saves energy and raw materials, speeds turnaround in multi-product precision manufacturing environments and enhances safety and reliability. Process control databases optimize instrument data, service manuals, and maintenance and service records. The characteristic functions of a DCS are the operator, system and engineer functions. Control and instrumentation databases organize, manage and manipulate complex data. This paper defines control and instrumentation databases, sets criteria and fields, and extracts data and criteria [1], [2], [3], [4], [5]. Parameters associated with process control systems and SCADA include the nature of the data; entities, attributes, values and relationships; data structures; relational tables; access structures; numeric data, character string data, logic and databases; temporal, textual, exotic and missing data; scales and measurements; data encoding schemes; the relational model and its basic operations; transactions and concurrency control; functional dependencies; metadata; and normalization/denormalization, which should eliminate redundant and inconsistent data.
Fig. 1. Instrument overview panel
Fig. 2. Trend overview panel
Fig. 3. Controllers group panel
Recent Researches in Computer Science
ISBN: 978-1-61804-019-0 516
Fig. 4. Controller tuning panel
Fig. 5. Alarm summary/history log

DATA BASE ANALYSIS and DESIGN: The control system database supports characteristics (see figs. 1-5) such as all operator station functions, communication over the DCS bus, field control station functions, processing and operation of event-driven alarms, on-line maintenance and servicing, system configuration tools, the MMI, system documentation, etc. The main data management functions include trend recording, historical message recording, data logging and user data files. Operating and monitoring panels include alarm summary, overview, control, tuning, graphic, trend group, operator message, system builder, etc. Database design produces a detailed data model that contains all the logical and physical design choices and the physical data storage parameters needed to create the database in a Data Definition Language, with detailed attributes for each entity. In the relational model these are the tables and views; in an object database the entities and relationships map directly to object classes and named relationships, together with the forms and queries used as part of the overall database application within the database management system (DBMS).
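A minimal sketch of such a data-definition step, using SQL through Python's sqlite3 module; the instrument-tag schema, table names and column names below are illustrative assumptions, not the paper's actual data model:

```python
import sqlite3

# Illustrative DDL for a control database: one table per entity with
# detailed attributes, plus a view used by the database application.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE instrument (
    tag       TEXT PRIMARY KEY,   -- e.g. 'FIC-101' (hypothetical tag)
    kind      TEXT NOT NULL,      -- controller, indicator, ...
    units     TEXT,
    hi_limit  REAL,
    lo_limit  REAL);

CREATE TABLE loop (
    loop_id   TEXT PRIMARY KEY,
    tag       TEXT REFERENCES instrument(tag));

-- a view forming part of the overall database application
CREATE VIEW alarm_limits AS
    SELECT tag, hi_limit, lo_limit FROM instrument;
""")
db.execute("INSERT INTO instrument VALUES ('FIC-101', 'controller', '%', 95.0, 5.0)")
```

Keeping the alarm limits in a view rather than a second table avoids storing the same attribute twice, in line with the elimination of redundant data discussed in the introduction.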
COMPUTER CONTROL INSTRUMENTS

Indicators: with deviation alarm
Controllers: PID, with dead band, PID n-type, with batch switch, on-off, three-position on-off, pulse duration, PD, blending PI
Manual loaders: with input indicator, auto/manual
Ratio set unit: ratio controller
Signal selectors: autoselector, signal selector
Selector switches: three-pole three-position, data set 7/14 data
Computing units: first-order lag/lead, velocity limiter, dead-time compensation, moving average, line-segment, special line-segment, calculating
Data set units: with input indicator, 6/13-zone program set, 13-step program set
Batch set unit: 1/2 batch data set
Data acquisition unit: sampling and memory modules
Station data link: station interconnection
Motor control unit: 2/3-position motor control unit
Other computers: instrument database for communication with other computers
Fig. 6. Feedback control single instrument panel parameter definition

Modern symbolic programming languages and process control operations dictate special computational functions. Such a
minimal database of a single instrument is given in fig. 6. Large control systems need common and special control functions to accomplish efficient on-line active process control schemes. The common feedback control functions are:

COMMON FUNCTIONS and DESCRIPTION
Input signal conversion: linear input, square root, non-linear input, pulse train input, digital filter
Alarm check functions: input open check, high/low limit alarm check, velocity alarm check, deviation alarm check
Compensation / totalizer: compensation / totalizer functions
Compensation control: input / output compensation
Output signal processing: output open check, output velocity limits, high/low output limits, pulse width output processing, on-off output processing, output high/low limit check, multipoint analog control I/O card output processing, connection to loop display unit
Loop connection: signal transmission function, control output signal transmission function, loop status/instrument mode switching
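The alarm check functions listed above can be sketched as simple per-scan tests. All limit values, the scan period and the open-input threshold below are illustrative assumptions, not vendor defaults:

```python
def alarm_checks(pv, sp, prev_pv, dt,
                 lo=5.0, hi=95.0, max_rate=10.0, max_dev=8.0,
                 open_threshold=-6.25):
    """Return the set of active alarms for one scan of a loop.

    pv: process variable (%), sp: setpoint (%), prev_pv: PV on the
    previous scan, dt: scan period in seconds.
    """
    alarms = set()
    if pv < open_threshold:                  # input open (burn-out) check
        alarms.add("INPUT_OPEN")
    if pv > hi:                              # high limit alarm check
        alarms.add("HIGH")
    if pv < lo:                              # low limit alarm check
        alarms.add("LOW")
    if abs(pv - prev_pv) / dt > max_rate:    # velocity (rate-of-change) check
        alarms.add("VELOCITY")
    if abs(pv - sp) > max_dev:               # deviation alarm check
        alarms.add("DEVIATION")
    return alarms
```

A field control station would run such checks on every scan and pass the resulting alarm set to the alarm summary panel of fig. 5.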
STANDARD CONTROL FUNCTIONS DATA BASE
Summation: hour, minute, second, with low-input (<10%) cut
Square root extraction: F = k·√(ΔP·ρ), for differential flowmeters
Temperature-pressure correction: pressure/temperature compensation of the measured flow, of the form Fc = F·√[(p + Pf)/(t + Tf)]
Average value: (x1 + x2 + ... + xn)/n
Logical OR/AND/EXOR/NOT: logical functions on 16 bits
Absolute / rounded value
Comparison: larger / smaller of (absolute) values
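Two of these standard functions sketched in Python; the constants k and ρ and the placement of the low-input cut on the extraction (rather than on the summation, where the source lists it) are illustrative assumptions:

```python
import math

def sqrt_extraction(dp, k=1.0, rho=1.0, low_cut=0.10):
    """Flow from a differential-pressure signal: F = k * sqrt(dp * rho).

    dp is the differential-pressure input normalised to 0..1; below the
    10 % low-input cut the output is forced to zero to suppress noise
    near zero flow.
    """
    if dp < low_cut:
        return 0.0
    return k * math.sqrt(dp * rho)

def average(samples):
    """Average value (x1 + x2 + ... + xn) / n."""
    return sum(samples) / len(samples)
```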
Computerized acquisition recorders offer scan functions with normal / high speed for historical trends.

Real Time Trend Recording (number of recording points / data sampling period / recording time):
Real time: 108 points / 10 sec / 1 hr
Historical: 108, 216, 432 or 864 points / 1, 2, 5 or 10 min / 1, 2, 4, 5 or 10 days
Flexible disk: 6 per station / 10 sec / continuous
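A minimal sketch of such a trend recorder, assuming a fixed-size circular buffer; the capacity and sampling period mirror the real-time figures above but are otherwise arbitrary:

```python
from collections import deque
import time

class TrendRecorder:
    """Fixed-capacity trend buffer: stores (timestamp, value) pairs and
    discards the oldest sample once the recording capacity is reached."""

    def __init__(self, points=108, period_s=10):
        self.period_s = period_s            # data sampling period
        self.buf = deque(maxlen=points)     # number of recording points

    def sample(self, value, t=None):
        self.buf.append((t if t is not None else time.time(), value))

    def history(self):
        return list(self.buf)

rec = TrendRecorder(points=3)
for i, v in enumerate([10.0, 10.5, 11.0, 11.2]):
    rec.sample(v, t=i * rec.period_s)
# only the newest 3 samples survive in the buffer
```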
Diagnostics, system information and utilities database tests include:
1. performance information database
2. database logging and reporting
3. fault tracking and analysis
4. configuration database
5. LAN test and configuration results
The test log file maintains a history of hardware errors and facilitates fault trend analysis. An instrument maintenance management system (IMMS) includes on-line calibration, monitoring, self-documentation, downloading/uploading between the computer and local intelligence, reliability and instrument fault servicing, P&IDs, the instrument index, loops, the cable scheduler, data sheets, conclusions for future investments and orders, etc. It makes calibration more reliable and the technicians more productive. Calibration criteria assembled in a database are downloaded into a portable calibrator on a per-instrument-tag basis. When connected to the corresponding instrument in the field, the calibrator generates the stored values for error-free calibration; at the same time it captures stored data for computer uploading. Report formats resident in the software facilitate compliance record keeping. Long-term benefits include systematic, error-free and easy calibration. A good control system increases the abilities of a process, while an optimal control system increases the abilities of the operators [6], [7], [8], [9], [10], [11]. When acquiring large amounts of data, or data over long periods of time, we are more interested in significant changes in data values, monitored by alarms and events and analyzed at a later date, i.e. we generate an alarm, store it and recall it later. Relevant information may include when the alarm was triggered, who acknowledged it, and at what time it
was acknowledged. Expert databases include relational knowledge data acquired in operations, maintenance, service, interlocks, overall industrial performance, etc.
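Such an alarm log, recording when each alarm was triggered, who acknowledged it and at what time, can be sketched in SQL through sqlite3; the schema, tag names and timestamps are illustrative:

```python
import sqlite3, time

# Minimal alarm/event log for store-and-recall of alarm history.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE alarm_log (
    tag        TEXT,
    message    TEXT,
    raised_at  REAL,
    acked_by   TEXT,
    acked_at   REAL)""")

def raise_alarm(tag, message, t=None):
    db.execute("INSERT INTO alarm_log (tag, message, raised_at) VALUES (?, ?, ?)",
               (tag, message, t if t is not None else time.time()))

def acknowledge(tag, operator, t=None):
    # stamp only alarms for this tag that are still unacknowledged
    db.execute("""UPDATE alarm_log SET acked_by = ?, acked_at = ?
                  WHERE tag = ? AND acked_at IS NULL""",
               (operator, t if t is not None else time.time(), tag))

raise_alarm("FIC-101", "HIGH limit", t=100.0)
acknowledge("FIC-101", "operator_A", t=160.0)
```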
Fig. 7. 3-D plant CAD database (control/process loops, cabling, valves etc.) design and documentation
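The preference, noted above, for significant changes over raw samples can be implemented as node-level report-by-exception filtering; the deadband value and the send callback here are hypothetical:

```python
class ReportByException:
    """Forward a sample to the central store only when it differs from
    the last reported value by more than a deadband."""

    def __init__(self, send, deadband=0.5):
        self.send = send            # callback that publishes the sample
        self.deadband = deadband
        self.last = None

    def sample(self, tag, value):
        if self.last is None or abs(value - self.last) > self.deadband:
            self.send(tag, value)   # only significant changes are sent
            self.last = value

sent = []
node = ReportByException(lambda tag, v: sent.append((tag, v)))
for v in [10.0, 10.2, 10.4, 11.1, 11.2]:
    node.sample("TI-205", v)
# only the initial value and the significant change are forwarded
```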
Alarm messages include system alarms, upload errors and recovery, communication bus failure and recovery, card failure and recovery, duplex control unit reverted to normal, equalize error, self-diagnosis error, station internal data bus failure and recovery, blown fuse, power supply card failure and recovery, I/O card failure and recovery, disk ready/not ready/error, station failure, controller out of/in service, printer alarm, annunciator message, sequence message, input/output process alarm, etc. (or all of the above reverted to normal). Instruments present status codes and colors in alarms or on change of mode:

Tag type / Status / Display colour / Flashing
Feedback control instruments:
  Normal: Green
  Under calibration: Cyan
  Input open: Red
  Output open: Red
  Input high/low limit alarm: Red
  Input deviation alarm: Yellow, flashing (YES)
  Velocity (rate) limit alarm: Yellow
  MV high/low limit alarm: Yellow
  Alarm bypass status: Blue
  Alarm display masked: Blue
  Instrument has no alarm: Blue
Sequence control:
  Status does not affect color: Blue, not flashing (NO)
Annunciator:
  ON status: Red, flashing (YES)
  Element OFF status: Green
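The status-to-color mapping above can be held in the database as a simple lookup table; the dictionary below mirrors the feedback control instrument rows as a sketch, with flashing assumed off wherever the source does not specify it:

```python
# (colour, flashing) per instrument status; flashing is only stated in
# the source table for the input deviation alarm.
STATUS_DISPLAY = {
    "normal":                  ("green",  False),
    "under_calibration":       ("cyan",   False),
    "input_open":              ("red",    False),
    "output_open":             ("red",    False),
    "input_hi_lo_limit_alarm": ("red",    False),
    "input_deviation_alarm":   ("yellow", True),
    "velocity_limit_alarm":    ("yellow", False),
    "mv_hi_lo_limit_alarm":    ("yellow", False),
    "alarm_bypass":            ("blue",   False),
    "alarm_display_masked":    ("blue",   False),
    "no_alarm":                ("blue",   False),
}

def display_for(status):
    """Return (colour, flashing) for an instrument status code."""
    return STATUS_DISPLAY[status]
```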
Tag marks carry attributes, colors and status. Standard software packages such as dBase IV or MS Excel 2010x enhance connectivity; they include groups of models, monitored measurands, alarms, trips and interlocks, data, trends, system engineering changes, system operation changes, operator changes, etc.

An operationally or spatially distributed control system is administered by the database in a distributed way around the process unit. We monitor and control systems that can easily integrate and optimize the processes running on each machine and over the network, creating a more reliable, higher-performance system. Distributed systems can be separated into the system backbone and the nodes. At the top level of a distributed system is the backbone (the key servers and the network). The software manages network transfers, data management, data visualization, alarms and events, and security, and communicates with the hardware through a variety of standard industry communication protocols and next-generation machines. As a SCADA system grows, data are stored and monitored centrally with live communication, using transparent software tools integrated with the operating system's native networking technology, which maximize throughput and keep the system stable and reliable through network disruptions. OLE for Process Control
(OPC) is an open industry-standard communication interface. SCADA tasks are handled by servers with larger processors and more efficient distributed-computing code. By off-loading tasks such as data acquisition, analysis and control (control, safety shutdown and intelligence) to the node level, we use the available resources more efficiently. Node-level execution gives faster response times by eliminating networking dependencies, and frees the servers to monitor a larger system. Small amounts of data can be stored in text or spreadsheet files, while larger amounts need sophisticated data storage formats. The main database types are relational and streaming databases. Relational databases are flexible but not optimized for space and throughput; streaming databases are designed to store large amounts of data quickly and are suitable for measurement and control applications. Operator visibility of data from the server during acquisition (live data), or viewing of historical data with a built-in OPC server/client, is possible. Integrating components into an existing or new backbone is difficult and time consuming; open software tools using industry protocols such as OPC and TCP/IP make integration easier for the end user. Access to sensitive data and file or database modification raises the issues of security, security codes and user profiles. Real-time operating systems offer deterministic performance, high reliability, and stand-alone, robust operation through traditional text-based languages or customized SCADA.

CONCLUSIONS: Control functions and advanced timing and synchronization deliver a highly productive platform for engineers to create deterministic embedded systems. SCADA provides powerful analysis libraries, visualization routines, and algorithms ranging from basic math to advanced signal processing. A control database and its functions have been proposed and designed for large-scale industrial control systems.

REFERENCES
1. YEW, Distributed Process Control System, Field / Operator Control Stations, Technical Information, 2005.
2. AutoPLANT Instrumentation, Data Base Software, 1993.
3. Göpel, W., Hesse, J., Sensors: A Comprehensive Survey, Vols. I-VIII, VCH, Germany, 1999.
4. DiagSoft, QAPlus: System Diagnostics and Performance Testing, DiagSoft Inc., 1993.
5. Wetherill, G. B., Brown, D. W., Statistical Process Control: Theory and Practice, Chapman & Hall, 1991.
6. Shlaer, S., Mellor, S., Object-Oriented Systems Analysis, Prentice Hall, 1988.
7. Bernstein, P. A., Hadzilacos, V., Goodman, N., Concurrency Control and Recovery in Database Systems, Addison-Wesley, 1987.
8. Date, C. J., Darwen, H., A Guide to the SQL Standard, 4th ed., Addison-Wesley, 1997.
9. Gray, J., The Benchmark Handbook for Database and Transaction Processing Systems, Morgan Kaufmann, 1993.
10. Halpin, T., Conceptual Schema and Relational Database Design, Prentice Hall, 1995.
11. Snodgrass, R., "Temporal Support in Standard SQL", Database Programming & Design, 11: 44-48, 1998.