
Page 1: ITER CODAC

Instituto de Plasmas e Fusão Nuclear
Instituto Superior Técnico
Lisbon, Portugal
http://www.ipfn.ist.utl.pt

B. Gonçalves | Lisbon, February 8, 2010 | Diagnostics & Data Acquisition

ITER CODAC

Bruno Soares Gonçalves
[email protected]

http://ipfn.ist.utl.pt/EU-PhD/

Page 2: ITER CODAC


CODAC

COntrol, Data Access and Communication system

Page 3: ITER CODAC


Large-scale experiments: control and data acquisition challenges

The next generation of physics experiments will
– be highly complex
– raise new challenges in the field of control and automation systems
– demand a well-integrated, interoperable set of tools with a high degree of automation
– deliver and process data at rates of up to hundreds of GBytes/s
– deploy and integrate systems with different degrees of complexity and provenance

Page 4: ITER CODAC


Commercial technology will likely meet the basic requirements that physics experiments can leverage for building future control systems.

Page 5: ITER CODAC


However…

Future systems are envisioned to be more than an order of magnitude larger than those of today.

Page 6: ITER CODAC


ITER hardware developers/providers should be ready to
– provide a well-integrated, interoperable set of tools
– deploy and integrate systems with different degrees of complexity and provenance

Page 7: ITER CODAC


More challenging will be…

providing robust, fault-tolerant, reliable, maintainable, secure and operable control systems.

Page 8: ITER CODAC


Large-scale experiments: R&D on control and data acquisition

R&D activities target
– self-triggered front-end electronics with adequate output bandwidth and data processing
– MIMO controllers with efficient resource sharing between control tasks on the same unit
– massive parallel computing capabilities

Page 9: ITER CODAC


Is ITER any different?

ITER CODAC is a challenging endeavour

• ITER will generate a huge quantity of experimental data

– 150 plant systems
– 1,000,000 diagnostic channels
– 300,000 slow control channels
– 5,000 fast control channels
– 40 CODAC systems
– 5 Gb/s data
– 3 PB/year of data (e.g. 12 IR cameras in a 10-minute discharge: 1.728 TBytes – see the quick check below)
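As a rough consistency check on the IR camera example (a minimal sketch; only the 12-camera, 10-minute, 1.728 TB totals come from the slide, the equal per-camera split is an assumption):

# Back-of-the-envelope check of the IR camera example quoted above.
cameras = 12
discharge_s = 10 * 60            # 10-minute discharge
total_bytes = 1.728e12           # 1.728 TBytes

aggregate_rate = total_bytes / discharge_s     # bytes/s for all cameras
per_camera_rate = aggregate_rate / cameras     # bytes/s per camera, assuming equal shares

print(f"aggregate: {aggregate_rate / 1e9:.2f} GB/s")    # ~2.88 GB/s
print(f"per camera: {per_camera_rate / 1e6:.0f} MB/s")  # ~240 MB/s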

In addition...

ITER will require a far higher level of availability and reliability than previous/existing tokamaks.

Page 10: ITER CODAC


Why do we need it?

Real-time measurement and control of magnetically confined plasmas is a critical issue for the safe operation and high performance of fusion reactors.

It ensures:

• Reliability: control of the plasma dynamics over long times
• Reproducibility: avoidance of deviations from the reference scenario
• Plasma & machine protection: avoidance of instabilities & disruptions

Page 11: ITER CODAC

Protection to the… investment, environment and components.

Page 12: ITER CODAC


What are the functions of the control systems?

Fusion machines have a wide range of Plasma Control Systems

– plasma control tools: to reach and reproduce scenarios which cannot be programmed

– operation supervisor: to protect Plasma and Machine

Both thrusts will be steered by the need to satisfy regulatory requirements while effectively controlling a burning plasma.

Page 13: ITER CODAC


What are the inherent constraints?

Fusion reactor sites will be Nuclear Registered sites

• Similar to a nuclear power station in terms of quality, audit trails, etc.
• Will need a control system architecture within the licensing constraints
• Requires tools to guarantee safety, protection of investment and guaranteed operation
• Hostile environment for measurements, networks, electronics – human access will be restricted
• Will require a far higher level of availability and reliability, unforeseen in previous/existing tokamaks

Page 14: ITER CODAC


Are there similarities with fission reactors?

Operation supervisory tools will likely be similar to the ones in use in fission reactors
– slow response time of the control system, of the order of hundreds of milliseconds
– typical functionality of an industrial SCADA (Supervisory Control And Data Acquisition)
• displays on mimic diagrams,
• trending,
• warning and alarm handling,
• manual triggering of commands or changes to set-points.

However, fusion reactors are expected to explore more advanced operation scenarios capable of
– sustaining a steady-state plasma for a long duration
– suppressing plasma instabilities almost completely.

Page 15: ITER CODAC


ITER Challenges: Inherent Constraints

ITER is assumed to be totally legacy-free for hardware and software
• Methodologies will have to be tested and proved on existing fusion devices before implementation on ITER
• It will be necessary to take informed decisions based on technology progress
• Maintenance will be an issue and proliferation of technologies must be avoided

ITER is an international project
• In-kind procurement world-wide
• Integration of Plant Systems from all participants
• The implications of in-kind delivery of subsystems need to be recognised
• Powerful remote access networks
• Remote access security

Page 16: ITER CODAC


ITER Challenges: Scientific Exploitation

ITER will generate huge quantities of experimental data – PBytes per year (but still less than the LHC) (e.g. 12 IR cameras' data resulting from a 10-minute shot: 1.728 TBytes)

ITER will provide tools for continuously accessing and analysing data during a pulse – requires data indexing by events

ITER will have a very strong, flexible set of diagnostics and tools for optimising the performance during a pulse – adequate tools and methodologies need to be developed

ITER will have a limited number of pulse cycles and an unlimited number of ideas to be tested – will schedule and reschedule many activities during a pulse

ITER will evolve both equipment and ideas over 20 years – a lifetime of 30 years including procurement – evolution must be built into CODAC

Page 17: ITER CODAC


What is required for advanced operation scenarios?

For fusion burn control it is essential to simultaneously integrate
– multiple measurements from different sensors,
– real-time plasma modelling from several tools,
– multiple actuators.

This requires fast control loops with time constraints of the order of tens of microseconds.

Furthermore, the control systems must be robust and fault-tolerant.

Page 18: ITER CODAC


ITER CODAC

CODAC provides the COntrol, Data Access and Communication functions for ITER, allowing integrated operation.

This includes:

•continuously monitoring the Plant Systems;

•displaying their status to operators including alarms;

•preparing and automating scheduled operations (including plasma pulses);

•recovering data from Plant Systems;

•storing and making all the data available.

CODAC uses multiple logical and physical networks to segregate these disparate functions.

Page 19: ITER CODAC


ITER: as seen by CODAC

• CODAC integrates ALL ITER Plant Systems
• Many networks: operation, interlocks, safety
• CODAC functions are like those of present tokamaks

[Diagram: typical CODAC functions – time & synchronisation, data access, gateway, plant monitoring, operation scheduling and plant operation – connected through the CODAC, Interlock and Safety Networks (with the Central Interlock System, Central Safety System and access control) to the ITER plant systems: coil power supply & distribution, TF, PF, correction and CS coil systems, magnet I&C, diagnostic systems, additional heating & power supply systems (EC, NB, IC and LH H&CD), gas injection, pellet injector, glow discharge, vacuum pumping, remote handling, component cooling & chilled water, tokamak cooling water, heat rejection, tritium plant, cryoplant, cryostat & VV pressure suppression (FPSS, VVPSS), steady-state power supply, liquid and gas distribution & compressors, radiation & environmental monitoring and miscellaneous instrumentation systems, plus external Experiment Sites.]

Page 20: ITER CODAC


ITER: Instrumentation and Control

I&C is in 3 clear tiers
– Safety: protects personnel, population and environment
– Interlock: protects the ITER investment
– CODAC: operates ITER

I&C is in 2 layers
– Plant Systems: local responsibility
– Networks: when responsibility lies across Plant Systems

Page 21: ITER CODAC


ITER Instrumentation & Control System physical architecture

Page 22: ITER CODAC


CODAC

CODAC shall provide:

• Supervisory control functions project-wide, and the interface specifications between CODAC and Plant Systems.

• Central data management functions i.e. data archiving, data monitoring and visualization functions.

• The PSH (Plant System Host) functional profile and data exchange software for the asynchronous communication interface between CODAC and Plant Systems.

• Mini CODAC as a tool to carry out FAT (Factory Acceptance Test)

• Specifications for the Network Interface Units with their interface specifications.

• Self-description schema and tool for Plant System I&C designers.

• Functional mimic diagrams for Plant System data monitoring, trends, plasma discharge preparation, sequencing, and data display.

• Functions for global and plant operating states management, plasma control, data recording with time stamp, data marshalling, data archiving...

• Capability to log all commands and state transitions along with all the associated data with time stamping.

Page 23: ITER CODAC


ITER: Plant systems I&C

[Diagram: Plant Systems I&C – each Plant System comprises a Plant System Host (PSH), Plant System Controller, Signal Interface and Sensors/Actuators, with the PSH connected over the Plant Operation Network (PON) to the CODAC System / Mini-CODAC.]

Page 24: ITER CODAC


Network Interface between plant system I&C and central I&C systems

[Diagram: Plant System I&C network interfaces – the Plant Control System (PSH, Plant System Controller, Signal Interface), the Plant Interlock System (PIS Controller) and the Plant Safety System (PSS Controller) connect through interfaces on a Network Panel to the central networks: PON, SDN, TCN, AVN, CIN and CSN.]

Page 25: ITER CODAC


CODAC and Plant Systems I&C architecture

[Diagram ITER_CODAC_PCDH_Figures_Visio_0003: CODAC Systems connected over the CODAC Networks to Plant Systems A, B and C, each comprising a Plant System Host, an I&C Bridge, PLC/PC subsystems, digitizers and equipment.]

CODAC Systems

Provide supervisory functions for ITER plant operation and plasma experiments: overall plant operating status monitoring, data archiving and alarm handling, the HMI, and remote experiment handling functions.

Modular design with Dual Redundancy required

Page 26: ITER CODAC


CODAC components

The principal CODAC System is the Supervisory Control System

The Supervisory Control System:

• dynamically allocates any required resources to an ITER operation Task.

• manages a dynamically evolving set of concurrent activities, each of which is driven by an Operation Schedule.

The Operation Schedule

• is prepared by Schedule Preparation and each Operation Schedule requires Schedule Validation before becoming executable.

• is executed by Schedule Execution once the resources are made available by the SCS (Supervisory Control System) – a schedule life-cycle sketch follows below.

There is a strong interface between scheduling and ITER operation planning
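A minimal sketch of this schedule life-cycle (prepare, validate, execute once resources are allocated), purely illustrative; the class, state and resource names below are assumptions, not CODAC definitions:

from enum import Enum, auto

class ScheduleState(Enum):
    PREPARED = auto()
    VALIDATED = auto()
    EXECUTING = auto()
    DONE = auto()

class OperationSchedule:
    """Illustrative Operation Schedule life-cycle: prepared, then validated,
    then executed once the supervisory layer has allocated the resources."""
    def __init__(self, name, required_resources):
        self.name = name
        self.required_resources = set(required_resources)
        self.state = ScheduleState.PREPARED

    def validate(self):
        # Placeholder for Schedule Validation checks.
        self.state = ScheduleState.VALIDATED

    def execute(self, allocated_resources):
        if self.state is not ScheduleState.VALIDATED:
            raise RuntimeError("schedule must be validated first")
        if not self.required_resources <= allocated_resources:
            raise RuntimeError("resources not allocated by the supervisor")
        self.state = ScheduleState.EXECUTING
        # ... the scheduled operation would run here ...
        self.state = ScheduleState.DONE

# Usage: a supervisory layer allocates resources, then the schedule executes.
schedule = OperationSchedule("plasma_pulse_demo", {"gas_injection", "pf_coils"})
schedule.validate()
schedule.execute(allocated_resources={"gas_injection", "pf_coils", "diagnostics"})
print(schedule.state)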

Page 27: ITER CODAC


CODAC components

The status of ITER is obtained from Plant Monitoring, which also generates a data stream for Data Logging.

• The maximum refresh frequency proposed is 3 Hz, corresponding to human reaction time (a refresh-rate sketch follows below).

•Minimum rate is 0.1 Hz to ensure a continuous record and continuous functionality checking.

Monitoring data are available in the Experiment Sites to enhance contact with operation.

The functionality is typical of an industrial SCADA (Supervisory Control And Data Acquisition)

•displays on mimic diagrams,

•trending,

•warning and alarm handling,

•manual triggering of commands or changes to set-points.
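A minimal sketch of how a plant-monitoring poller might clamp its refresh rate to the 0.1–3 Hz band quoted above; the channel names and the read function are hypothetical stand-ins, not CODAC interfaces:

import time

MAX_REFRESH_HZ = 3.0    # upper bound: matches human reaction time
MIN_REFRESH_HZ = 0.1    # lower bound: guarantees a continuous record

def read_channel(name):
    # Hypothetical stand-in for a real plant-system read.
    return 0.0

def monitor(channels, requested_hz, cycles=3):
    """Poll channels at a rate clamped to the monitoring band."""
    rate = min(MAX_REFRESH_HZ, max(MIN_REFRESH_HZ, requested_hz))
    period = 1.0 / rate
    for _ in range(cycles):
        sample = {name: read_channel(name) for name in channels}
        print(sample)               # would feed trending / alarm handling
        time.sleep(period)

monitor(["coolant_temperature", "vessel_pressure"], requested_hz=10.0)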

Page 28: ITER CODAC


Central data management functions

• CODAC shall provide data logging, data monitoring, data archiving and data visualisation functions within the Main Control Room.

• Data are generated by different parts of the ITER plant at different sampling rates according to the information itself and according to the operational state of ITER.

• The notion of a pulse archive also disappears, replaced by the data of a particular pulse being an identifiable time interval in a continuous data retrieval stream.

• Storage strategies are required for efficient recovery of data taken at different sampling rates (0.1 Hz to 1 MHz), separating the total data flow into suitable streams to optimize the retrieval of archived information (a minimal retrieval sketch follows below).
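A minimal sketch of that idea, assuming nothing about the real CODAC storage layer: data are appended to per-rate streams as (timestamp, value) pairs, and a "pulse" is retrieved simply as a time interval over a continuous stream:

import bisect

class ArchiveStream:
    """One archive stream for a given sampling rate; stores (timestamp, value)."""
    def __init__(self, name, sampling_hz):
        self.name = name
        self.sampling_hz = sampling_hz
        self.timestamps = []   # appended in time order, hence sorted
        self.values = []

    def append(self, timestamp, value):
        self.timestamps.append(timestamp)
        self.values.append(value)

    def interval(self, t_start, t_stop):
        """Retrieve all samples whose timestamps fall in [t_start, t_stop)."""
        lo = bisect.bisect_left(self.timestamps, t_start)
        hi = bisect.bisect_left(self.timestamps, t_stop)
        return list(zip(self.timestamps[lo:hi], self.values[lo:hi]))

# A "pulse" is just a time interval over the continuous streams.
slow = ArchiveStream("vessel_temperature", sampling_hz=0.1)
for t in range(0, 100, 10):
    slow.append(float(t), 300.0 + t * 0.01)

pulse_data = slow.interval(t_start=20.0, t_stop=60.0)
print(pulse_data)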

Page 29: ITER CODAC


Issues: Operation Software

Issue: operation for longer periods, or even continuous operation

Actions:
• The operation should be considered continuous and the ageing 'shot' paradigm shall be replaced
• Event-driven support, where data are acquired or actions performed only when relevant events occur
– data indexing using events and time-stamps, with synchronism of all digitizer and generator endpoints (absolute time) – see the event-indexing sketch after this table
• Provide tools for continuously accessing and analysing data during a pulse
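A minimal sketch of event-driven acquisition with event and time-stamp indexing; the event names and the use of the system clock as the absolute-time source are assumptions for illustration:

import time

class EventIndexedStore:
    """Records data only when a relevant event fires, indexed by event name
    and absolute time-stamp, so intervals can later be located by event."""
    def __init__(self):
        self.records = []                      # (timestamp, event, payload)

    def on_event(self, event, payload):
        self.records.append((time.time(), event, payload))

    def by_event(self, event):
        return [r for r in self.records if r[1] == event]

store = EventIndexedStore()
store.on_event("heating_phase_start", {"power_MW": 20})
store.on_event("elm_detected", {"amplitude": 0.7})
store.on_event("heating_phase_end", {"power_MW": 0})

print(store.by_event("elm_detected"))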

Page 30: ITER CODAC


CODAC and Plant Systems I&C architecture

[Same architecture diagram as on Page 25, here highlighting the CODAC Networks.]

CODAC Networks

Provide ITER-wide physical and logical interconnections between CODAC and the Plant Systems.

The roles and functions of networks are defined according to their performance requirements.

Page 31: ITER CODAC


CODAC Networks

The CODAC architecture is based on distributed systems connected over a set of complementary asynchronous and synchronous networks.

Each Plant System and CODAC System can communicate over one or more networks.

•Asynchronous general purpose Plant Operation Network (PON) provides the backbone of CODAC communication for most CODAC data traffic.

•General ITER Networks (GIN) used to connect between the Plant Operation Zone and external Experiment and Analysis Sites.

Networks used will depend on the required functionality, volume of data, bandwidth and latency.

Page 32: ITER CODAC


High Performance Networks

Some functions of CODAC require deterministic, hard real-time communication and synchronization between distributed nodes.

These requirements are addressed by the CODAC High Performance Networks

•Synchronous DataBus Network (SDN)

•Time Communication Network (TCN)

•Event Distribution Network (EDN)

•Audio Video Network (AVN)

Page 33: ITER CODAC


Issues: Timing, Synchronization, Events and Synchronous Data Transport Networks

Issue: distribution of timing signals from a central unit to sub-system timing decoder cards, from which the timing signals are subsequently connected to digitizers and control processors

Actions:
• Include timing, synchronization and event decoders directly on the digitizers and control processors
– Higher reliability
– Reduced commissioning and maintenance effort
– Possibility of performing the timing and synchronization of control processes over the synchronous network (IEEE 1588)

Issue: absolute time concept

Actions:
• CODAC shall provide an absolute time broadcast; triggers are timing events; clocks are recovered from the synchronous data (optical) links used for event transport
– Autonomous local absolute-time clock units on the digitizers, synchronized through a fast network (a time-stamping sketch follows this table)

Issue: new diagnostics and plasma controllers may require an updated real-time control and monitoring hardware infrastructure

Actions:
• Specify a new synchronous network infrastructure
– Lower-latency distribution of plasma variables and events (< 2 µs)
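A minimal sketch of the absolute-time idea: every sample carries an absolute time-stamp derived from a shared clock, so streams from different digitizers can be aligned without per-channel hardware triggers. The offset-corrected clock below is a stand-in for a real IEEE 1588 (PTP) disciplined clock, not an implementation of the protocol:

class SynchronizedClock:
    """Stand-in for a PTP-disciplined local clock: a free-running counter
    plus an offset that the synchronization protocol would keep updated."""
    def __init__(self, offset_ns=0):
        self.offset_ns = offset_ns
        self._local_ns = 0

    def tick(self, dt_ns):
        self._local_ns += dt_ns

    def now_ns(self):
        return self._local_ns + self.offset_ns   # absolute time estimate

def acquire(clock, raw_samples, period_ns):
    """Attach an absolute time-stamp to each raw sample."""
    stamped = []
    for value in raw_samples:
        stamped.append((clock.now_ns(), value))
        clock.tick(period_ns)
    return stamped

# Two digitizers with different (small) residual offsets still produce
# streams that can be merged on the absolute time axis.
adc_a = SynchronizedClock(offset_ns=120)
adc_b = SynchronizedClock(offset_ns=-80)
stream_a = acquire(adc_a, [0.1, 0.2, 0.3], period_ns=1_000)
stream_b = acquire(adc_b, [5.0, 5.1, 5.2], period_ns=1_000)
merged = sorted(stream_a + stream_b)   # align by absolute time-stamp
print(merged)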

Page 34: ITER CODAC


CODAC and Plant Systems I&C architecture

[Same architecture diagram as on Page 25, here highlighting the Plant Systems.]

Plant Systems provide data acquisition, operation & control, status/alarm monitoring, and data communication functions with the CODAC systems.

They also have local autonomous operation control, independent of CODAC.

Plant Systems communicate only through CODAC.

Page 35: ITER CODAC


CODAC and Plant Systems I&C architecture

[Same architecture diagram as on Page 25, here highlighting the Plant System Host.]

Plant System Host: presents a standard image of the Plant System to CODAC.

It is the single point of entry for asynchronous communication with CODAC (data exchange).

It controls the data flow, interprets all commands and passes them to the Subsystem Controller for the necessary actions.

Page 36: ITER CODAC


Plant System supplier's responsibility

Plant System suppliers shall:

– Provide self-description data of their Plant System I&C and shall receive interface requirements from the CODAC.

– Provide and implement applications for their Plant System monitoring, data acquisition, autonomous operation and control functions.

– Provide Plant System simulators/Plant subsystem simulators.

– Carry out Factory Acceptance Tests (FAT) using Mini-CODAC as a testing tool. The Plant System supplier is also responsible for carrying out installation, commissioning and Site Acceptance Tests (SAT) at the IO site.

Page 37: ITER CODAC


Mini-CODAC

Development of a Mini-CODAC is mandatory:
• required to test all CODAC concepts prior to full development
• provides a development and test tool for Plant System designers

Mini-CODAC is a CODAC emulator for development of Plant Systems

• Tool for carrying out functional testing of the Plant System to certify Plant System functional integration.

• Scalable functionality to achieve limited performance testing of the Plant System interfaces with CODAC.

• Used to carry out Provisional Acceptance Tests (and for repeating the FAT if needed) to prove and verify that all plant cabling and connections have been terminated correctly and that the input and output schedule is as required by the design.

• Does not define the technical functionality and test processes of Plant Systems but defines and provides environment with limited performance to facilitate integration testing of Plant Systems with CODAC.

Page 38: ITER CODAC


Plant System Simulator

[Diagram ITER_CODAC_PCDH_Figures_Visio_0005: Mini-CODAC connected to Plant System #1 and #2 Simulators and to a Plant System simulator, which contains a PSH and a Plant Subsystem simulator.]

Used to test the behaviour of Plant Systems during different phases of integration (Factory Acceptance Tests, Commissioning and Site Acceptance Tests and also during plant operation).

Plant System / sub-system simulators should be based on the self-description data from the individual Plant System / sub-system supplier.

Page 39: ITER CODAC


CODAC and Plant Systems I&C architecture

[Same architecture diagram as on Page 25, here highlighting the Equipment.]

Equipment: may not communicate in a project-wide standardised form.

Configured hierarchically according to the individual Plant System design.

Not procured for direct interface with CODAC, but subject to an integration procurement arrangement to be delivered as a Plant System component.

(sensors, actuators, instrumentation, electronics, modules, racks, cabling, wiring,…)

Page 40: ITER CODAC


Sub-systems and equipment

Control System Interface Definition:
• Optics alignment controller interfaces
• Laser alignment unit interfaces
• Lasers controller interfaces
• Detection system controller interfaces
• Windows monitoring unit interfaces
• Inner-wall monitoring unit interfaces
• Control and monitoring software specification
• Local controller hardware specification

[Diagram: LIDAR control and data acquisition as an example – 1064 nm and 532 nm lasers with a calibration laser and beam combiner; laser and collected-light transmission lines passing the biological shield labyrinth, cryostat window, primary vacuum window and radial port labyrinth into the vacuum vessel (blanket penetration, primary imaging mirror, mirrors M4/M5 on linear servomotors); alignment optics and laser alignment feedback controllers; a laser controller with triggering, fault detection and monitoring lines; a detection system controller with filter spectrometer, gated fast detectors (MCP PMTs) and data acquisition at 20 GSPS, 10-bit, 6 GHz, 500 ps; plus windows and inner-wall monitoring units, all under local control.]

Diagnostic systems will have local controllers and data acquisition which may have to be developed to meet the requirements

Page 41: ITER CODAC


Issues: Data Acquisition

Issue: the size of the data collected in each shot is increasing (ITER will generate huge quantities of data, e.g. 12 IR cameras' data resulting from a 10-minute shot: 1.728 TBytes)

Actions:
• Implement faster data transport to comply with the cycle time
• Higher-speed real-time pulse processing, both during and after the shot

Issue: high sampling rates and data bandwidth

Actions:
• Data reduction techniques directly in the sub-system, preferably on the digitizer modules (see the reduction sketch after this table)
– Increase the transient recorder time window
– Solve local bandwidth bottlenecks
– Improve CODAC performance, since data reduction tasks will be parallelized, lessening the data management burden on the CODAC infrastructure and thus helping to meet the operation delay constraints
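A minimal sketch of one such on-digitizer reduction, assuming simple block averaging as the example technique; real digitizer firmware would likely use more elaborate schemes (lossless compression, event selection), and nothing here is a CODAC-defined algorithm:

def decimate_by_averaging(samples, factor):
    """Reduce the data rate by averaging non-overlapping blocks of samples.
    Runs conceptually 'on the digitizer', before data hit the network."""
    if factor < 1:
        raise ValueError("factor must be >= 1")
    reduced = []
    for start in range(0, len(samples) - factor + 1, factor):
        block = samples[start:start + factor]
        reduced.append(sum(block) / factor)
    return reduced

# A 1 MHz raw stream reduced 100x to a 10 kHz stream before transport.
raw = [float(i % 50) for i in range(1_000_000)]
reduced = decimate_by_averaging(raw, factor=100)
print(len(raw), "->", len(reduced), "samples")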

Page 42: ITER CODAC


Issues: Instrumentation

Issue: multitude of different hardware platforms to maintain

Actions:
• A 'generic' sub-system able to perform one or more of the (fast) local control, feedback and data acquisition tasks
• Develop for easy deployment and maintenance

Issue: higher processing power requirements for local data reduction techniques, fast real-time local control or other specialized algorithms on the digitizers

Actions:
• Use of processors with parallel processing capabilities (multicore, FPGAs, …), interconnected through a (multiple) full-mesh topology, low-latency, high-bandwidth network for the transport of variables and data streams

Issue: ease the programming of local real-time processing algorithms by non-specialized users

Actions:
• Development of a set of helper tools for easier programming, simulation and deployment of code on reconfigurable devices (Field Programmable Gate Arrays, FPGAs) and multi-core processors

Page 43: ITER CODAC


Issues: Instrumentation (cont.)

Issue: improve system reliability

Action:
• Standards-based instrumentation with inherent redundancy and suitable mechanical/thermal characteristics (e.g. Advanced Telecommunications Computing Architecture, ATCA)

Issue: local and global management of hardware operation is required to improve maintainability

Action:
• Implementation of a standards-based, improved hardware management interface (e.g. Intelligent Platform Management Interface (IPMI), Shelf (crate) Management (ShM))

Page 44: ITER CODAC


Issues: Instrumentation (cont.)

Issue: easier installation/replacement of hardware modules

Action:
• Hardware interface designed for 'plug-and-play' and 'hot-swap' (e.g. ITER will evolve both equipment and ideas over 20 years, therefore evolution must be built into CODAC)

Issue: improve the speed of sub-system commissioning and integration into the global CODAC

Actions:
• Design the local sub-system control and data acquisition software for autonomous operation
• Preliminary test operations during development can be performed autonomously

Issue: scalability

Page 45: ITER CODAC


CODAC components

Plasma Control is implemented as a specific Operation Schedule to maximise reuse of automation and plasma control tools.

• General feedback control, including Plasma Control, uses a Synchronous DataBus to communicate data converted to physics units, including an estimate of the error on each signal and its status (a signal-record sketch follows below).

•Evaluation of plasma diagnostic information is local in the diagnostic Plant System if this is straightforward.

•Information is collected over the Synchronous DataBus for analysing data from multiple Plant Systems and finally transmitted over the Synchronous DataBus to the actuators.
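A minimal sketch of the kind of record such a synchronous bus might carry per signal – value in physics units plus an error estimate, a status flag and a time-stamp. The field names are assumptions for illustration, not the ITER SDN format:

from dataclasses import dataclass
from enum import Enum

class SignalStatus(Enum):
    OK = "ok"
    SUSPECT = "suspect"
    INVALID = "invalid"

@dataclass(frozen=True)
class SynchronousSignal:
    """One physics-unit sample published on a synchronous databus."""
    name: str             # e.g. "plasma_current"
    value: float          # physics units (e.g. MA)
    error: float          # estimated uncertainty, same units
    status: SignalStatus  # validity flag for downstream controllers
    timestamp_ns: int     # absolute time-stamp

sample = SynchronousSignal(
    name="plasma_current",
    value=14.9,
    error=0.05,
    status=SignalStatus.OK,
    timestamp_ns=1_000_000_000,
)
# A controller would typically discard or down-weight non-OK samples.
usable = sample if sample.status is SignalStatus.OK else None
print(usable)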

Page 46: ITER CODAC


Aim of Control Systems in fusion: JET as an example

JET has a wide range of Plasma Control Systems

[Diagram: JET real-time measurement and control network – diagnostics (magnetics, CXS Ti(R), MSE pitch(R), ECE Te(R), LIDAR Ne&Te(R), X-ray Ti(0), interferometry/polarimetry, VUV impurities, neutrons and X-rays, visible H/D/T, Dα, bremsstrahlung and ELM monitors), real-time analysis (flux surfaces EQX, q profile, confinement, Simulink code, EQX kinetic map), the R-T Signal Server and R-T Controller, shape & current control (PPCC), wall load and coil protection, and actuators (NBI, ICRH, LHCD, gas + pellets, PF coils, TAE/EFCC), connected by an ATM communications network with some analogue links. Colour code: pale blue = diagnostic, sky blue = analysis, red = heating / fuelling / magnets & power, yellow = PPCC (XSC), green = RTMC.]

To reach and reproduce scenarios which cannot be programmed
• Quasi-Steady State Experiments
• Magnetic and Kinetic Profile & ITB Experiments
• Mode Conversion Experiments
• Radiation and Impurity Experiments
• MHD Experiments

To protect Plasma and Machine
• NB Shine-through (WALLS, PEWS and NBLM)
• LHCD Launcher (Monitor Iron and Radiation)
• Avoid Disruptions
• Minimize waste (Neutrons, Tritium)

Page 47: ITER CODAC


Limitations of existing Control Systems: JET as an example

High maintenance
• Proliferation of interfaces
• Should converge on modern instrumentation, computer, etc. standards

Lack of commonality and functionality between different devices' RTMC systems
• e.g. JET PPCC is not simply exportable to other devices and vice-versa

Lack of flexibility
• Integration of new equipment and physics into the existing infrastructure is time-consuming

Lack of good transport and integrated models and tools
• Integrated development environment and interchange formats

Future developments should acknowledge these issues

Page 48: ITER CODAC


Issues: Control Systems

Issue: higher algorithm complexity and higher number of input signals for plasma control

Actions:
• Use of MIMO plasma controllers
• Higher real-time processing power

Issue: lower loop delays for time-critical real-time control

Actions:
• Faster, lower-latency real-time local synchronous network for variable sharing among processors

Issue: faster design cycle of RT control systems

Actions:
• Develop an improved integrated framework for code development, simulation and deployment

Page 49: ITER CODAC


How important are the control systems? Vertical Stabilization as an example

• Loss of control occurs if the plasma reaches the vessel protection tiles.

•Dedicated MIMO systems are designed to make the plasma vertically stable so that other controllers can successfully control the plasma position and shape.

From F. Sartori, IEEE CONTROL SYSTEMS MAGAZINE, APRIL 2006

Elongated plasmas are vertically unstable

Page 50: ITER CODAC


How important are control systems? ITER Vertical Position Control

• Loss of vertical plasma position control in ITER will cause thermal loads on Plasma Facing Components of 30-60 MJ/m² for ~0.1 s (a quick arithmetic check follows this list)
– PFCs cannot be designed to sustain such (repetitive) thermal loads

• Vertical Displacement Events also generate the highest electromagnetic loads
– A phenomenological extrapolation of horizontal forces from the worst JET cases implies horizontal loads of ~45 MN on the ITER vacuum vessel
– An MHD wetted-kink model developed to simulate the horizontal loads predicts ~20 MN
– Vertical loads ~90 MN

Vertical plasma position control in ITER must be robust and reliable, to ensure that a loss of vertical position control is a very unlikely event.
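As a quick worked number behind those figures (a sketch only, assuming the energy arrives roughly uniformly over the quoted ~0.1 s):

# Average heat flux implied by depositing 30-60 MJ/m^2 in ~0.1 s.
energy_density_mj = (30.0, 60.0)   # MJ/m^2, from the slide
duration_s = 0.1                   # ~0.1 s, from the slide

for e in energy_density_mj:
    flux_mw = e / duration_s       # MW/m^2 (MJ per m^2 per second)
    print(f"{e:.0f} MJ/m^2 over {duration_s} s -> {flux_mw:.0f} MW/m^2")
# ~300-600 MW/m^2, far above what PFCs can be designed to sustain repetitively.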

Page 51: ITER CODAC


JET Vertical Stabilization: Specifications of the MIMO architecture

• Reduction of
– the loop delay on the signal acquisition/generation endpoints (down to 10 µs)
– the delay on the data interconnect links from and to the processing unit
– the delay on the analogue signal path (analogue filters)

• High processing power
– on the acquisition/generator endpoints
– on the system controller
– improvement of MIMO algorithm performance (a minimal MIMO update sketch follows this list)

• Synchronization of all digitizer/generator endpoints

• Architecture designed for maintainability, upgradeability and scalability

• Low cost per channel

• Low risk of implementation and testing
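A minimal sketch of what a linear MIMO update boils down to at each control cycle: the actuator commands are a matrix-vector product of a gain matrix with the measurement vector. The 4 × 192 shape mirrors the 192-input example on the following slides; the gain values themselves are placeholders, not a real JET controller:

def mimo_update(gain, measurements):
    """One control cycle of a linear MIMO controller: u = K * y.
    gain is an (n_actuators x n_measurements) matrix as nested lists."""
    return [sum(k * y for k, y in zip(row, measurements)) for row in gain]

n_measurements = 192          # e.g. magnetic probe signals
n_actuators = 4               # e.g. amplifier voltage requests
# Placeholder gain matrix; a real controller would be designed offline.
gain = [[0.001 * (i + 1)] * n_measurements for i in range(n_actuators)]

measurements = [0.5] * n_measurements          # one cycle's input vector
commands = mimo_update(gain, measurements)     # must finish well inside the loop period
print(commands)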

Page 52: ITER CODAC


Does it work? ATCA @ JET Vertical Stabilization

[Photos: front and rear views of the ATCA controller; 192 input signals.]

Page 53: ITER CODAC


Does it work? ATCA @ JET Vertical Stabilization

• 192 signals acquired by ADCs and transferred at each cycle
• 50 µs control loop cycle time with jitter < 1 µs
• Always in real time (24 hours per day)
• 1.728 × 10^9 50 µs cycles per day (a quick arithmetic check follows)
• Crucial for ITER's very long pulses

[Photo: front view of the ATCA controller; 192 input signals.]
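A quick check that the cycle count quoted above is consistent with the 50 µs period (simple arithmetic, nothing JET-specific):

seconds_per_day = 24 * 60 * 60        # 86 400 s
cycle_s = 50e-6                       # 50 microsecond control loop
cycles_per_day = seconds_per_day / cycle_s
print(f"{cycles_per_day:.3e} cycles/day")   # 1.728e+09, as quoted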

Page 54: ITER CODAC


ITER relevant technology: ATCA

ATCA is the most promising architecture to substantially enhance the performance and capability of existing standard systems

• It is designed to handle tasks such as
– event building,
– feature extraction,
– high-level trigger processing,
and offers TeraOPS of processing power in a single sub-rack.

• It is the first commercial open standard designed for
– high throughput,
– high availability.

It is of high interest for physics data acquisition and attractive for experiments requiring a very high up-time.

Page 55: ITER CODAC


Demanding solutions require robust hardware: ATCA is an ITER-relevant solution

ATCA is the most promising architecture to substantially enhance the performance and capability of existing standard systems

• It is designed to handle tasks such as
– event building,
– feature extraction,
– high-level trigger processing,
and offers TeraOPS of processing power in a single sub-rack.

• It is the first commercial open standard designed for
– high throughput,
– high availability.

It is of high interest for physics data acquisition and attractive for experiments requiring a very high up-time.

Page 56: ITER CODAC


Why ATCA?

The ATCA platform is gaining traction in the physics community because of:
• Advanced communication bus architecture (serial gigabit replacing parallel buses)

• very high data throughput options and its suitability for real-time applications

• Agnostic backplane that accepts several serial switch network protocols

• Scalable shelf capacity to 2.5 Tb/s

• Scalable system availability to 99.999%

• Robust power infrastructure (distributed 48V power system) and large cooling capacity (cooling for 200W per board)

• High levels of modularity and configurability

• Ease of integration of multiple functions and new features

• The ability to host large pools of DSPs, NPs, processors and storage

• The ability to host multiple controllers and storage on a shelf

• High security and regulatory conformance

• Reliable, full redundancy support

• Reliable mechanics (serviceability, shock and vibration)

• Hardware management interface

Page 57: ITER CODAC


ATCA @ JET Vertical Stabilization: "the star of the show"

IPFN’s ATCA-MIMO-ISOL

Page 58: ITER CODAC


Does it work?

VS controller in action - Vertical kicks experiment

Page 59: ITER CODAC


And when something fails…

Page 60: ITER CODAC


CODAC milestones

• First plasma assumed to be 2019

• All functionality available – end 2015

• Control room and site distribution operational – end 2013

• First integration – end 2013

• Mini-CODAC operational – end 2010

• Plant System Host – end 2010

These dates may be revised (and certainly not to earlier ones).

Page 61: ITER CODAC


Closing remarks

"These days, building the best system isn't enough. That's the price of entry."

Ann Livermore, HP

For fusion reactors the remaining price should be paid on…

• Controllability
• Robustness
• Fault tolerance
• Reliability
• Maintainability
• Security
• Operability

Future real-time systems have to fit the bill!

Page 62: ITER CODAC


Summary

ITER CODAC is a challenging endeavour

Page 63: ITER CODAC


A professor is only as good as the questions he raises in his pupils' minds...

EU PhD – Not just a PhD