ID101: Medium-speed feedback software based on the existing control system

A. Taketani, T. Fukui, K. Kobayashi, T. Masuda, R. Tanaka, T. Wada, A. Yamashita, T. Ohshima

SPring-8, Kamigori, Ako-gun, Hyogo 678-12, Japan

The Equipment Manager (EM) model has been employed in the SPring-8 storage-ring control system. It is a client-server system: the EM server accepts a human-oriented command from a client on an operator console via the network and accesses VME I/O modules. Each command execution typically takes about 30 msec. We have developed a new framework to reduce the feedback time, the EM agent (EMA) software scheme. It is based on the original EM scheme and makes it possible to describe user applications that run at higher speed. Its features are as follows: (1) it can run on the CPU board alongside the other existing control programs; (2) the feedback-loop algorithm is described with the human-oriented commands already used in the control system; (3) each I/O typically takes 1 msec; (4) the VME I/O library already developed can be reused; (5) control commands and feedback parameters are sent through the existing control system; (6) feedback parameters can be set even while the feedback loop is running. The framework of the system and an example of RF voltage control are presented in this paper.
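
As a rough illustration of this scheme (a minimal sketch only: the abstract does not give the EMA API, so every identifier below, including ema_read, ema_write and FbParam, is hypothetical), a proportional feedback loop of the kind described might look as follows in C:

    /* Minimal sketch of an EMA-style feedback loop; all names invented. */
    typedef struct {
        double gain;    /* loop gain, settable while the loop runs        */
        double target;  /* desired monitor reading                        */
        int    run;     /* cleared through the control system to stop     */
    } FbParam;

    extern double ema_read(const char *channel);            /* ~1 msec VME read  */
    extern void   ema_write(const char *channel, double v); /* ~1 msec VME write */

    void feedback_loop(volatile FbParam *p)
    {
        while (p->run) {
            double v = ema_read("rf/voltage/monitor");
            /* proportional correction; p->gain and p->target may be
               changed asynchronously through the existing control system */
            ema_write("rf/voltage/correction", p->gain * (p->target - v));
        }
    }

With a 1 msec cost per I/O, a loop of this shape would close at roughly the "medium speed" the title refers to, an order of magnitude faster than the 30 msec per-command path through the EM server.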

Submitted by: Atsushi Taketani
Full address: SPring-8, Kamigori, Ako-gun, Hyogo 678-12, Japan
E-mail address: taketani@spring8.or.jp
Fax number: +81-7915-8-0850
Keywords: equipment manager, feedback, VME


ID102: THE VLT CONTROL SOFTWARE DEVELOPMENT AND INSTALLATION

G.RAFFI

European Southern Observatory (ESO), Karl-Schwarzschild-Str. 2 D-85748 Garching bei Muenchen, Germany

The Very Large Telescope (VLT) control software installation is ongoing at the new Paranal Observatory of ESO in Chile. The whole commissioning process will last until the year 2000, when all four VLT 8-meter telescopes will be completed. The VLT control software approach has been to use standard commercial and public-domain products (Rtap by HP, Tcl/Tk) as the basis. A very comprehensive software layer, called the VLT common software, was added on top before any specific control software was written. This has been an in-house development, provided by a team of about 20 people. The VLT control software now amounts to about one million lines of code. The common part (VLT common software) is distributed in two releases per year to Contractors and Consortia of Institutes associated with the VLT project, who are requested to use it in their VLT developments. The advantage of this approach is clear in terms of uniformity and maintainability of the software. It has, however, raised a number of needs, such as backwards compatibility and automatic regression testing (test scripts amount to almost another million lines). A novel aspect of the VLT software in general is that the VLT control software is well integrated into an end-to-end Data Flow concept. The whole enables automatic execution of observations, from proposal preparation to archiving, and is suitable for service observing. A relevant feature of the VLT software at this stage is that it has been validated (at about 90%) by re-engineering the New Technology Telescope (NTT) at the La Silla Observatory in Chile. This was done by an independent ESO team, and the NTT is now back in service operation, providing plenty of useful feedback at every new release. This gives us confidence that, beyond the obvious hardware differences between telescopes, the VLT software will also work properly at the VLT.

Submitted by : G.RAFFI
Full address: European Southern Observatory (ESO), Karl-Schwarzschild-Str. 2 D-85748 Garching bei Muenchen, Germany
Email: graffi@eso.org
Keywords: engineering, status report, control software, software management

Participation: To be confirmed (I am afraid a final confirmation will only be possible much later, in relation to the actual status of VLT integration)


ID103: Applying Plant Information Management Concepts to Beamline Control Systems

Roberto Pugliese, Fulvio Billè, Alessandro Abrami, Juray Krempasky, Renata Krempaska, Dario Giuress, Luigino Battistello, Rudi Sergo

Beamline Control Group, Experimental Division, Sincrotrone Trieste

Software developers in advanced research facilities often face the problem of integrating a horde of different instruments, commercial tools and independent pre-existing systems in a possibly distributed, heterogeneous environment. Moreover, experimental scientists ask for more and more advanced features. While simple "process control" once was enough, the automation arena has now broadened dramatically, and true plant information management would really be welcome. The paper describes the Elettra beamline plant information management architecture: a set of tools based on Distributed Objects, Database, Java and WWW technologies. The tools integrate smoothly with the Elettra beamline control system and provide advanced features such as historical data management, event tracking, advanced diagnostics, document management, quality control and scheduled maintenance.

Submitted by: Roberto Pugliese
Full Address: Beamline Control Group Experimental Division Sincrotrone Trieste S.C.p.A. Strada Statale per Basovizza 14 km 163,5 34012 Trieste Italy
E-mail address: Roberto.Pugliese@elettra.trieste.it
Fax number: +39-40 3758565
Keywords: Databases, Distributed Objects, Java, WWW


ID104: Control System Reliability at Jefferson Lab

Karen S. White, Hari Areti, Omar Garza

Thomas Jefferson National Accelerator Facility

At Thomas Jefferson National Accelerator Facility (Jefferson Lab), the availability of the control system is crucial not only to the operation of the accelerator for experimental programs, but also for machine development and maintenance activities. Jefferson Lab's control system (EPICS) uses over 70 VxWorks-based I/O controllers, 15 Unix workstations, 60 VME crates, 100 CAMAC crates and a variety of VME and CAMAC modules. The software consists of nearly 100 applications, some of which are highly complex. Down time attributable to the control system includes the time to troubleshoot and repair the problem and the time to restore the machine to scheduled program operation. This paper describes the availability of the control system during the last year, the contributors to down time and the response to problems. Strategies for improving the robustness of the control system are detailed. These include changes in hardware, software, procedures and processes. The improvements range from routine preventive maintenance of the hardware to improving our ability to detect, predict and prevent problems. This paper also describes the user-friendly tools being developed to assist in control system troubleshooting, maintenance and failure recovery processes.

Submitted by: Karen S. White
Full address: 12000 Jefferson Avenue, Newport News, VA 23606
E-mail address: karen@jlab.org
Fax number: 757-269-7049
Keywords: Reliability, Control Systems, EPICS Operations


ID105: Overview Of Control System For Jefferson Lab's High Power Free Electron Laser

A. S. Hofler, A. C. Grippo, M. S. Keesee, and J. Song

Thomas Jefferson National Accelerator Facility

In this paper the current plans for the control system for Jefferson Lab's IRFEL are presented. The goals for the IRFEL control system are fourfold: 1) to use EPICS and EPICS-compatible tools, 2) to use VME and Industry Pack interfaces for FEL-specific devices such as controls and diagnostics for the drive laser, high power optics, photocathode gun, and electron-beam diagnostics, 3) to migrate Continuous Electron Beam Accelerator Facility (CEBAF) technologies to VME when possible, and 4) to use CAMAC solutions for systems that duplicate CEBAF technologies such as SRF and DC magnets. This paper will describe the software developed for the FEL-specific devices and provide an overview of the FEL control system. * This work was supported by U.S. DOE Contract No. DE-AC05-84ER40150.

Submitted by: Alicia S. Hofler
Full address: 12000 Jefferson Avenue, Newport News, VA 23606
E-mail address: hofler@cebaf.gov
FAX number: (757)269-7049
Keywords: FEL (Free Electron Laser), Control System Overview, EPICS Operations


ID106: NEW DESIGN OF THE PSI ACCELERATOR CONTROL SYSTEM DATABASE

T.Blumer, D.Anicic, H.Lutz

Paul Scherrer Institut Villigen, Switzerland

The database for the PSI accelerators contains all static information for the accelerators. The data resides in an ORACLE relational database. In the present implementation, nearly all entries in the tables have to be maintained individually and manually; this mechanism has now grown beyond control. A new design, making full use of the object nature of the data, with inheritance and specialization, is presented.

Submitted by: T.Blumer, D.Anicic, H.Lutz
Full address: Paul Scherrer Institut Villigen, Switzerland
E-mail: Thomas.Blumer@psi.ch


ID107: REALTIME CONTROL TOOLS IN THE PSI ACCELERATOR CONTROL SYSTEM

T.Blumer, D.Anicic, I.Jirousek, A.Mezger

Paul Scherrer Institut Villigen, Switzerland

The distributed control system for the PSI accelerators provides optimized input/output functions for closed-loop control. These basic functions are then used to build generic applications for global online beam control. Some of these applications, including the basic mechanisms used, are explained. Questions about testing and the quality of these closed-loop controls are discussed. The resulting performance, timing and real-time response in the real system and under normal load are presented.

Submitted by: T.Blumer, D.Anicic, I.Jirousek, A.Mezger
Full address: Paul Scherrer Institut Villigen, Switzerland
E-mail: Thomas.Blumer@psi.ch


ID108: Implementation and Use of a Tcl/Tk API for the Accelerator Control System of The Svedberg Laboratory

L. Thuresson, V. Ziemann

The Svedberg Laboratory, Uppsala, Sweden

The installation of a Tcl/Tk Application Program Interface (API) for The Svedberg Laboratory accelerator control system has greatly simplified the development of high-level applications for steering, data acquisition and control. The Tcl scripting language has a well-defined C interface that makes it easy to add user extensions in compiled code. The Tk toolkit provides a basic widget set for creating platform-independent Graphical User Interfaces (GUIs). Third-party extension packages add high-level widgets that further reduce development time for GUI-based applications. We describe the implementation and discuss problems arising when maintaining a Tcl installation incorporating a large number of extension packages. Finally, we describe some of the applications written using this framework, such as automatic beam alignment for radiation therapy and a GUI-based interface for operating and testing the personnel safety radiation protection interlock system.
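
The C interface mentioned above is Tcl's standard extension mechanism. As a minimal sketch, a control-system command could be registered as follows; Tcl_CreateCommand and its callback signature are the genuine Tcl 7.x API, while read_magnet and the command name "magnetread" are invented for illustration:

    #include <stdio.h>
    #include <tcl.h>

    extern double read_magnet(const char *name);   /* hypothetical control-system call */

    static int MagnetReadCmd(ClientData cd, Tcl_Interp *interp,
                             int argc, char *argv[])
    {
        char buf[32];

        if (argc != 2) {
            Tcl_SetResult(interp, "usage: magnetread name", TCL_STATIC);
            return TCL_ERROR;
        }
        sprintf(buf, "%g", read_magnet(argv[1]));
        Tcl_SetResult(interp, buf, TCL_VOLATILE);
        return TCL_OK;
    }

    int Control_Init(Tcl_Interp *interp)
    {
        Tcl_CreateCommand(interp, "magnetread", MagnetReadCmd,
                          (ClientData) NULL, (Tcl_CmdDeleteProc *) NULL);
        return TCL_OK;
    }

Once such a package is loaded, scripts and GUIs use the new command directly, e.g. set b [magnetread QD01].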

Submitted by: Leif Thuresson
Full address: The Svedberg Laboratory P.O. Box 533 751 21 Uppsala Sweden
E-mail address: leif.thuresson@tsl.uu.se
Fax number: +46 18 18 38 33 ( After 27/6-97 fax number will change to +46 18 471 38 33 )
Keywords: Tcl/Tk GUI


ID109: Alarms processing software at Fermilab Tevatron

Seung-chan Ahn

A new distributed alarms-processing software system at the Fermilab Tevatron is described. The software consists of four components, each with clearly defined roles: ALARM_DRIVER, ALARM, ALARM_DB and ALARM_DAEMON. ALARM_DRIVER handles all communications with the front-ends via the Fermilab-developed ACNET network protocol. ALARM_DRIVER forks two subprocesses: ALARM, which processes alarms, and ALARM_DB, which accesses the device database. Communication among the three components uses pipes. ALARM_DAEMON interacts with ALARM and various applications. The communication protocol employed between ALARM and ALARM_DAEMON is UDP multicast for general alarms and TCP/IP for a few specific, non-repetitive requests. All components run as detached background processes under the VMS operating system. ALARM_DAEMON has already been ported to Unix platforms; ALARM_DRIVER and its child processes will be ported to Unix in the future. Alarm applications are being constructed using Java.
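
As a hedged sketch of the UDP multicast path between ALARM and ALARM_DAEMON, the fragment below shows how an alarm record might be sent to a multicast group using standard BSD sockets; the group address, port and AlarmMsg layout are invented, not Fermilab's actual wire format:

    #include <string.h>
    #include <unistd.h>
    #include <sys/socket.h>
    #include <netinet/in.h>
    #include <arpa/inet.h>

    typedef struct {
        char device[32];   /* ACNET device name (illustrative layout) */
        int  severity;     /* alarm severity code                     */
    } AlarmMsg;

    int send_alarm(const AlarmMsg *msg)
    {
        struct sockaddr_in grp;
        int rc;
        int s = socket(AF_INET, SOCK_DGRAM, 0);

        if (s < 0) return -1;
        memset(&grp, 0, sizeof grp);
        grp.sin_family      = AF_INET;
        grp.sin_addr.s_addr = inet_addr("239.128.4.1");  /* invented group */
        grp.sin_port        = htons(9750);               /* invented port  */

        rc = sendto(s, (const void *) msg, sizeof *msg, 0,
                    (struct sockaddr *) &grp, sizeof grp);
        close(s);
        return rc;
    }

Multicast fits the general-alarm case because one send reaches every listening application; the few non-repetitive requests go over TCP/IP precisely because they need reliable, point-to-point delivery.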

Submitted by: Seung-chan Ahn
Address: Fermilab MS 347, Batavia, IL 60510-0500, U.S.A.
E-mail: ahn@fnal.gov
FAX: (1) 630-840-3093
Keywords: ALARM, TEVATRON


ID110: Automated Task Scheduling using Multiple FSMs at Fermilab

L. Carmichael

Fermi National Accelerator Laboratory

This paper describes the implementation of the Time-Line Generator (TLG), a task scheduler that utilizes concurrently running Finite-State Machines (FSMs) to generate current task schedules and to test new task schedules in order to ensure that they are valid and fall within a user-defined safety envelope. Tasks, such as proton injection and p-bar production, are denoted by 8-bit event masks that are generated at a 15 Hz frequency. The goal of this work is to develop a facility that provides a level of automation to complex event-scheduling operations and enables these operations to be performed efficiently and at an acceptable rate. Event schedules, or time-lines, are specified in terms of a set of pre-defined modules. Each module is represented by a set of atomic events and a list of attributes which serve to define inner-module event spacing, spacing between modules and additional module functionality. Once a time-line is defined, it is downloaded to the FSMs operating in the TLG. The FSMs then test each time-line to ensure that the conditions imposed by the module attributes and a user-specified safety envelope are met. Once a time-line is deemed acceptable, its events are played by an FSM. The concurrency and modularity of the FSMs speed up the process of testing and playing time-line events and also allow new tests and functionality to be easily added to the architecture of the TLG. The hardware component of this architecture consists of a SHARC 6000 VME card with multiple DSPs supporting the concurrent processing utilized by the FSMs.
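
A minimal sketch of one such FSM follows; the types and tick routine are invented for illustration, but they show the idea of emitting an 8-bit event mask per 15 Hz tick while enforcing a module's inner-event spacing:

    typedef unsigned char EventMask;   /* 8-bit event mask, as in the TLG */

    typedef struct {
        const EventMask *events;   /* atomic events of the module            */
        const int       *spacing;  /* ticks to wait after each event         */
        int              n;        /* number of events in the module         */
    } Module;

    typedef struct { int i; int wait; } FsmState;   /* start with i = wait = 0 */

    /* Called once per 15 Hz tick; returns the mask to play, or 0 if idle. */
    EventMask fsm_tick(FsmState *s, const Module *m)
    {
        if (s->i >= m->n) return 0;              /* module finished          */
        if (s->wait > 0) { s->wait--; return 0; } /* enforcing event spacing */
        s->wait = m->spacing[s->i];
        return m->events[s->i++];
    }

Because each machine carries its own small state, many of them can be stepped independently on the card's DSPs, which is what makes concurrent testing and playing of time-lines cheap.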

Submitted by: Linden Carmichael
Full address: M.S. #347, P.O. Box 500, Fermi National Accelerator Laboratory, Batavia, IL 60510
E-mail address: carmichael@cns25.fnal.gov
Fax number: 630-840-3093
Keywords: Scheduling, Automation, FSM, DSP


ID112: CIS Controls at IUCF

J.C. Collins, Wm.P. Jones, Wm. Manwaring

Indiana University Cyclotron Facility (IUCF)

CIS is the 200 MeV Cooler Injector Synchrotron now being commissioned at IUCF. This paper gives a broad outline of the entire control system, emphasizing its software aspects. While this is a modest project done with a small budget and staff, it includes control of an ion source, an RFQ-Linac pre-injector, a synchrotron, associated beamlines and an interface to the existing IUCF accelerators. The controls platform consists of VME hardware (using modules purchased commercially and developed in-house), DEC Alpha workstations, X-terminals and the Vsystem software. We discuss our hardware-software interfaces, our experience with operator-generated control screens, diagnostics displays, our timing system and ramping.

Submitted by: John C. Collins
Full address: IUCF, 2401 Milo B. Sampson Lane, Bloomington, IN 47408-1368, USA
Email: collins@iucf.indiana.edu
Keywords: software, Vista, VME, synchrotron


ID113: An automatic procedure to find and set the shift phases for the superconducting resonators in the ALPI accelerator

Stefania Canella, Marco Poggi

Istituto Nazionale di Fisica Nucleare Laboratori Nazionali di Legnaro

ALPI is a linear post-accelerator of a 15 MV XTU tandem, and its accelerating elements are 56 quarter-wave superconducting resonators. The LINAC may boost beams of several ion species, with different charge states, at various injection energies. Most ALPI resonators work at 160 MHz, but in the low-beta section four resonators and a buncher have an operating frequency of 80 MHz. All the superconducting resonators operate in self-excited loop mode and are independently phased, through digital phase shifters, with respect to a common reference at 160 and 80 MHz. The resolution of the digital phase shifters is 1.4 degrees. The LINAC setup procedure involves the magnets, the diagnostics and the RF control systems, and needs from 10 to 48 hours of continuous work; of this, from 5 to 10 hours is usually needed to find the shift phases for the cavities. A procedure is now under development to find and set automatically the shift phase for each accelerating or bunching resonator and so, hopefully, to decrease the time and manpower needed for the whole LINAC setup.
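
Since the procedure is still under development, the following is only a speculative sketch of the basic idea: step a cavity's digital phase shifter through its 1.4-degree settings and keep the setting that maximizes some measured beam response (e.g. energy gain). Both set_phase and beam_response are hypothetical names standing in for the RF and diagnostics control-system calls:

    extern void   set_phase(int cavity, double degrees);  /* hypothetical */
    extern double beam_response(void);                    /* hypothetical */

    double find_shift_phase(int cavity)
    {
        double best_phi = 0.0, best_r = -1.0e30;
        double phi, r;

        for (phi = 0.0; phi < 360.0; phi += 1.4) {   /* 1.4-degree steps */
            set_phase(cavity, phi);
            r = beam_response();
            if (r > best_r) { best_r = r; best_phi = phi; }
        }
        set_phase(cavity, best_phi);   /* leave the cavity at the optimum */
        return best_phi;
    }

Even a naive scan like this, repeated over 56 resonators, illustrates why automation matters: done by hand, the same search costs the 5 to 10 hours quoted above.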

Submitted by: Stefania Canella
Full address: Istituto Nazionale di Fisica Nucleare, Laboratori Nazionali di Legnaro, Via Romea 4, 35020 Legnaro, Italy
Fax: 39.49.8292514
Email: canella@lnl.infn.it
Keywords: control and supervisory systems, automation


ID114: LIONs at the Stanford Linear Accelerator Center

Richard W. Zdarko, Bob Simmons

Stanford Linear Accelerator Center

The term LION is an acronym for Long Ionization Chamber. This is a distributed ion chamber used, in lieu of many discrete ion chambers, to monitor the secondary ionization along the shield walls of a beam line caused by incorrectly steered charged-particle beams. A cone of ionizing radiation emanating from a point source as a result of incorrect steering intercepts a portion of 1-5/8 inch Heliax cable (about 100 meters in length) filled with argon gas at 20 psi and induces a pulsed current which is proportional to the ionizing charge. This signal is transmitted via the cable to an integrator circuit whose output is directed to an electronic comparator, which in turn is used to turn off the accelerated primary beam when preset limits are exceeded. This device is used in the Stanford Linear Accelerator Center's (SLAC) Beam Containment System (BCS) to prevent potentially hazardous ionizing radiation, resulting from incorrectly steered beams, in areas which might be occupied by people. This paper describes the design parameters and the operational experience gained in the Final Focus Test Beam (FFTB) area of the Stanford Linear Accelerator Center.

Author: Richard W. Zdarko
Address: 2575 Sand Hill Road, MS-050, Menlo Park, CA 94025
E-mail:
FAX: 415-926-3800
Keywords: Ionization Chamber, Beam Containment System


ID115: An Accelerator Controls Network Designed for Reliability and Flexibility

William P. McDowell and Kenneth V. Sidorowicz

Advanced Photon Source, Argonne National Laboratory, 9700 Cass Ave., Argonne, IL 60439, USA

The Advanced Photon Source (APS) accelerator control system is a typical modern system based on the standard control system model, which consists of operator interfaces, a network, and computer-controlled interfaces to hardware. The network provides a generalized communication path between the host computers, operator workstations, input/output crates, and other hardware that comprise the control system. Because the network is an integral part of any modern control system, its performance determines many characteristics of the control system. This paper describes the methods used to provide redundancy for various APS network system components as well as the methods used to provide comprehensive monitoring of this network. The effect of archiving tens of thousands of data points on a regular basis, and the resulting impact on the controls network, will be discussed. Metrics are provided on the performance of the system under various conditions.

Submitted by: William P. McDowell
Full address: Advanced Photon Source, Building 400, Argonne National Laboratory, 9700 Cass Ave., Argonne, IL 60439, U.S.A.
E-mail address: wpm@aps.anl.gov
Fax number: 630-252-6123
Keywords: Controls Network Reliability


ID116: Examples of Applications of Industrial Control Systems (PLCs) for Vacuum Equipment

R. Gavaggio, P. Strubin

CERN

This paper describes some developments, in the domain of vacuum controls, using industrial control systems (PLCs). After showing the different advantages and drawbacks of the PLC for vacuum applications, when compared with a specific solution, we mention the main points that led us to make this choice. Then, we describe some realisations (pumping groups and test benches for vacuum chambers) using PLCs and the related fieldbuses to interconnect these PLCs to the upper layer of the control system. Finally, we sketch out one of the possible solutions to control some vacuum equipment of the future LHC machine.

Submitted by: Richard GAVAGGIO
Full address: CERN div. LHC-VAC, CH-1211 GENEVA 23, SWITZERLAND
Email address: Richard.Gavaggio@cern.ch
Fax number: +41 22 767 51 00
Keywords: PLC, Vacuum, Industrial Control


ID118: A Comparison of Vsystem and EPICS

P. Clout, V. Martz, R. Rothrock and R. Westervelt

Vista Control Systems, Inc.

Both Vsystem and EPICS have gained considerable acceptance for use in experimental physics control systems. While other commercial packages have also been used, their use is much more limited, generally to one or two institutes. While both Vsystem and EPICS originate from the same division at Los Alamos National Laboratory, they are in fact different in their assumptions and capabilities. This paper will attempt a fair comparison of the two packages so that in the future people can make the best choice for their particular project. The comparison will emphasise the technical issues but will also point out other advantages of each package.

Submitted by: Peter Clout
Full address: Vista Control Systems, Inc., 134B Eastgate Drive, Los Alamos, NM 87544-3304
E-mail: clout@vista-control.com
Fax number: (505) 662-2484
Keywords: Software, Tools, Control, System


ID119: The Control System for the 10.5 MeV PET-Isotope Production Accelerator at Fermilab*

Elliott McCrory

Fermi National Accelerator Laboratory,Batavia, IL 60510 USA

The PET-Isotope production accelerator control system is based on the Fermilab-standard Internet Rack Monitor (IRM), with user consoles running through LabVIEW (tm, National Instruments) and Tcl/Tk applications on Macintosh and Sun/UNIX workstations. Extensive heuristic displays have been created in LabVIEW, with data obtained directly from the IRMs over the Internet. These displays present the accelerator components logically and schematically to the user, providing a descriptive and useful interface to the accelerator. Additionally, these applications provide intelligent assistance for routine procedures, e.g., turning on the RF. The basic control features of alarm reporting, data logging and save/restore are implemented through C++ programs, driven by the Tcl/Tk user interface. Some critical functionality has been implemented in the IRMs as "local applications," run at 10 Hz, to regulate such things as the resonant cavity frequency and the beam-line magnets' fields using PID algorithms.

* Fermilab is operated by the Universities Research Association under contract with the Department of Energy, contract number DE-AC02-76CH03000.
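
As a generic illustration of the 10 Hz PID local applications mentioned above (a textbook PID step, not Fermilab's actual IRM code; the struct layout and gains are invented):

    typedef struct {
        double kp, ki, kd;    /* PID gains (invented values set elsewhere) */
        double integ, prev;   /* integrator state and last error           */
    } Pid;

    /* One PID step; dt = 0.1 s for a 10 Hz local application. Returns the
       correction to apply to, e.g., a cavity tuner or magnet setpoint. */
    double pid_step(Pid *p, double setpoint, double reading, double dt)
    {
        double err   = setpoint - reading;
        double deriv = (err - p->prev) / dt;

        p->integ += err * dt;
        p->prev   = err;
        return p->kp * err + p->ki * p->integ + p->kd * deriv;
    }

Running such a step locally in the IRM, rather than round-tripping through a console application, is what keeps the regulation loop deterministic at 10 Hz.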

Submitted by: Elliott McCrory
Full address: MS 307, PO Box 500, Fermilab, Batavia IL 60510 USA
E-mail: mccrory@fnal.gov
FAX number: 630-840-4552
Keywords: IRM, LabVIEW, TCL-TK, application


ID120: EPICS: Extensible Record and Device Support

M. R. Kraimer and L. R. Dalesio

Argonne National Laboratory and Los Alamos National Laboratory

An important feature of an EPICS (Experimental Physics and Industrial Control System) based IOC (Input/Output Controller) is extensible record and device support. This feature allows each site to add custom record types and device support. This paper discusses the current support and also some thoughts for making it even more flexible.
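
As a concrete example of this extensibility, a site-supplied device support module for the analog input (ai) record follows the published EPICS device support entry table (DSET) pattern. The sketch below is minimal: the table shape is the standard one, but the name devAiExample is invented and the hardware read is stubbed out.

    #include <devSup.h>
    #include <aiRecord.h>

    static long init_record(struct aiRecord *pai)
    {
        return 0;             /* parse pai->inp and set up the device here */
    }

    static long read_ai(struct aiRecord *pai)
    {
        pai->val = 0.0;       /* a real version reads the hardware here */
        pai->udf = 0;
        return 2;             /* 2 = VAL set directly; skip conversion  */
    }

    struct {
        long      number;
        DEVSUPFUN report;
        DEVSUPFUN init;
        DEVSUPFUN init_record;
        DEVSUPFUN get_ioint_info;
        DEVSUPFUN read_ai;
        DEVSUPFUN special_linconv;
    } devAiExample = {
        6, NULL, NULL,
        (DEVSUPFUN) init_record, NULL,
        (DEVSUPFUN) read_ai, NULL
    };

Record support dispatches through this table at run time, which is why a site can add new device types without touching the core IOC code.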

Submitted by: Martin R. Kraimer
Full address: Martin R. Kraimer Argonne National Laboratory 9700 South Cass Avenue Argonne, Illinois 60439-4803 U.S.A.
E-mail address: mrk@aps.anl.gov
Fax Number: (630)252-6123
Keywords: EPICS IOC