INS-DAS-17
System Responsibilities for the UltraDAS

Issue 2.1, written on 7th October 1998 
Guy Rixon, ATC
gtr@mrao.cam.ac.uk 
 

1. Introduction

1.1. Purpose of this document

An analysis is presented that explores the user requirements [1] for the UltraDAS. Use cases describe the supported operations. A table of system responsibilities (roughly equivalent to "software requirements" in PSS-05 [2]) is given, and this can form the basis of system testing.

1.2 Document history

Issue 1.1, 1998-08-07:
Original document.
Issue 2.1, 1998-10-07:
The set of states for a camera is changed: the "engineering mode" state is dropped.  The way the engineering interface works is different: it can now be used at the same time as the observing system.  The use cases concerned with starting up the system have been simplified.  The use case start one camera has been withdrawn (it is absorbed into other cases) and there is a new use-case tune camera.

1.3. Scope of the system

The UltraDAS controls ING's detectors, both optical and IR, through SDSU detector controllers. It provides these services: The UltraDAS does not log observations: that is done by the ING Data Manager to which the UltraDAS is interfaced.

The UltraDAS consists of the server and controller software, the client software, the status displays and the automatic image-display. All are parts of the overall observing system, and the client software and displays are specifically part of the central intelligence of the observing system. The shorthand term DAS is used below to mean the server and controller software, since those parts are associated with a DAS computer.

1.4. References

  1. User requirement document for the UltraDAS data-acquisition system,
     ING document INS-DAS-16 by Dennis Armstrong.
  2. Software Engineering Standards by C. Mazza et al., pub. Prentice Hall 1994,
     ISBN 0-13-106568-8. (This is a published version of the ESA standard PSS-05.)
  3. Architecture for the ING observing-system, ING document OBS-ARCH-1
     by Guy Rixon.
  4. Remote procedure calls for DRAMA clients, ING document OBS-RPC-1
     by Guy Rixon, RGO.
  5. WWW site of the CCD laboratory at San Diego State University,
     http://mintaka.sdsu.edu/ccdlab/
  6. WWW site of the CCD laboratory at San Diego State University,
     http://mintaka.sdsu.edu/ccdlab/
  7. Guide to writing DRAMA tasks, AAO document DRAMA_GUIDE by Tony Farrell.
  8. User's guide to the message log and message display, ING document
     OBS-TALK-4 by Guy Rixon, RGO.
 

2. General description

2.1. Relation to other projects

The UltraDAS is to build on the qualified success of the Data-Cell DAS, as used at the INT from 1996 to 1998. The new system is expected to keep the look and feel of its predecessor while improving performance, reliability and breadth of function. In order to meet the tight timescales for first release, the UltraDAS is required to reuse large amounts of code from the Data-Cell DAS and will thus have to include DRAMA.

The UltraDAS will be an important part of the programme to replace the VAX-ADAM system on the WHT. It may be the first part of that programme to release software.

The second phase of the INGRID development is expected to use the UltraDAS. IAC's LIRIS project will probably use UltraDAS too. The NAOMI project requires INGRID in its phase-2 form and so depends on the UltraDAS.

The 2-chip-mosaic camera for the WHT requires the UltraDAS to reach its planned performance.

Future detectors for ING are all expected to use SDSU controllers through the UltraDAS. Some planned cameras have large mosaics of detectors with many readout channels each. To accommodate these, the DAS has to be able to operate more than one detector controller per camera.

UltraDAS itself depends on the continued support for DRAMA. If ING chooses to dispense with DRAMA, then a large part of the UltraDAS will have to be replaced.

UltraDAS depends critically on the production of a PCI interface by the CCD laboratory at SDSU.

2.2. Environment and constraints

SDSU mark-II controllers [5] are a requirement. They are the reason that the UltraDAS is being developed. This means that the controller software runs on Motorola DSPs, with no operating system. The interface to the host computer must be one of SDSU's products and the only one suitable for the required speeds is the new interface for the PCI bus [6].

The project brief requires some of the code from the Data-Cell DAS to be re-used. This means that the server software must run on Solaris. To host the interfaces to the detector controllers in a PCI bus, the DAS computer must be from the Sun Ultra series or a clone of that kind.

To meet the release timescales, the architecture of the observing system must follow directly from the work on the INT and JKT [3]; there is no time to develop or import an alternative. UltraDAS assumes that:

Logging and archiving of observations is done outside UltraDAS by the ING data-manager.

2.3. Model description

2.3.1 Actors

These are the human players: These are the non-human actors that appear in the use cases:

2.3.2. States of a camera

A camera can be in one of several states [SR1, SR1.1]:
Disconnected:
powered down, or not connected to the host computer.
Unclaimed:
cabled to DAS computer and powered up, but not in use by any software.
Unrecognized:
as "unclaimed", but the camera reports an identity for which the system has no profile; the camera cannot be used.
On-line:
contactable by clients on the system computer.
Idle:
in communication with the system computer but not executing a command.
Busy:
executing a command on behalf of the system computer.
Laboratory mode:
in use by a laboratory tool such as ccdtool (see the section on UIs, below).
There is restricted movement between the states [SR1.2], reflecting the fact that only one of the three sets of control software can use the camera at a time [SR1.3]. The allowed transitions are shown in the state-transition diagram.

When a camera goes from "unclaimed" to "on-line", it is ready for use. It has initialized itself from its profiles. Observers can send commands to the camera after it goes from "on-line" to "idle" [SR1.4].

The camera is "busy" while it is executing a user command during observing. The natural states of a detector - clearing, exposing, reading out, etc. - are all sub-states of "busy". The exact sub-state doesn't have much effect on the usage since the camera has to go back to "idle" before the user can do anything else.

Apart from moving simultaneously to the "unclaimed" state if the DAS computer crashes or is rebooted [SR1.5], cameras change state independently. There are no significant states of the DAS as a whole once its computer is running.
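
As an illustration, the states and transition rules above can be captured in a small state table. The following C sketch is indicative only: the state names come from the list above, but the transition table itself is an assumption, since the state-transition diagram is not reproduced in this document.

    /* Sketch only: a minimal encoding of the camera states of section 2.3.2
     * and one plausible reading of the restricted transitions [SR1.2]. */
    #include <stdio.h>

    typedef enum {
        CAM_DISCONNECTED,   /* powered down or not cabled to the DAS computer */
        CAM_UNCLAIMED,      /* cabled and powered, not in use by any software  */
        CAM_UNRECOGNIZED,   /* reports an identity with no matching profile    */
        CAM_ONLINE,         /* contactable by clients on the system computer   */
        CAM_IDLE,           /* in communication, not executing a command       */
        CAM_BUSY,           /* executing a command for the system computer     */
        CAM_LABORATORY      /* claimed by a laboratory tool such as ccdtool    */
    } CameraState;

    /* Assumed transition table; the real one is defined by the diagram. */
    static int transition_allowed(CameraState from, CameraState to)
    {
        switch (from) {
        case CAM_DISCONNECTED: return to == CAM_UNCLAIMED;
        case CAM_UNCLAIMED:    return to == CAM_ONLINE ||
                                      to == CAM_UNRECOGNIZED ||
                                      to == CAM_LABORATORY ||
                                      to == CAM_DISCONNECTED;
        case CAM_UNRECOGNIZED: return to == CAM_UNCLAIMED ||
                                      to == CAM_DISCONNECTED;
        case CAM_ONLINE:       return to == CAM_IDLE || to == CAM_UNCLAIMED;
        case CAM_IDLE:         return to == CAM_BUSY || to == CAM_UNCLAIMED;
        case CAM_BUSY:         return to == CAM_IDLE || to == CAM_UNCLAIMED;
        case CAM_LABORATORY:   return to == CAM_UNCLAIMED;
        }
        return 0;
    }

    int main(void)
    {
        /* Example: a camera cannot jump straight from "unclaimed" to "busy". */
        printf("unclaimed -> busy allowed? %d\n",
               transition_allowed(CAM_UNCLAIMED, CAM_BUSY));
        return 0;
    }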

2.3.3. Data flows

This document does not contain a full and detailed analysis by data-flow diagrams in the Yourdon/Ward-Mellor/Hatley-Pirbhai styles because they wouldn't add much to the descriptions in the use cases. However, three particular flows need illustration as they cross the boundaries of the UltraDAS.

The handling of pixels [SR2.1] (separately from other data) occurs entirely within the UltraDAS. In abstract terms, the sources of data are the detector groups. More exactly, the video boards of the detector controllers are on the boundary of the system and are the proximate sources [SR2.1.1]. The destination of the data from an observation is a FITS file on the disk server [SR2.1.2]. In the most general case (and one has to be general at this level to take account of the range of cameras that the UltraDAS supports) the pixel processing consists of a pipeline with a signal-processing part [SR2.1.3] and a bulk-data-processing part [SR2.1.4], separated by a buffer [SR2.1.5]. Signal processing defines the values of individual pixels and data handling arranges the pixels in a recognized raster format, taking into account windowing and binning of the readout.

This is shown in the data-flow diagram for pixels.

The nature of the detectors means the detector controller has to do the signal processing in real time without compromising detector performance [SR2.1.3]. The bulk manipulation of pixels has to be done in the DAS computer (the controllers don't have enough memory or processing power) [SR2.1.4], and the latter machine can't work to hard real-time constraints. Hence, the pipeline has to have a buffer in the middle [SR2.1.5]. Changing the pixel format from 16-bit unsigned integers to FITS format (signed 16-bit integers or 32-bit reals) could be done on either side of the buffer, but is shown here as part of the signal processing.
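
The format change is the one point in the pipeline that can be illustrated without assuming anything about the controller or DAS internals. The sketch below shows the conventional FITS treatment of unsigned 16-bit pixels (stored as signed 16-bit integers with BZERO = 32768); where this conversion runs relative to the buffer remains a design choice, as noted above.

    /* Sketch only: raw camera pixels (unsigned, 0..65535) rewritten as
     * FITS 16-bit integers with BZERO = 32768 and BSCALE = 1. */
    #include <stdint.h>
    #include <stdio.h>

    /* Convert one buffer-load of raw pixels to FITS 16-bit integers. */
    static void raw_to_fits16(const uint16_t *raw, int16_t *fits, size_t n)
    {
        size_t i;
        for (i = 0; i < n; i++) {
            /* stored_value = raw_value - BZERO, so that
             * raw_value = stored_value * BSCALE + BZERO on read-back */
            fits[i] = (int16_t)((int32_t)raw[i] - 32768);
        }
    }

    int main(void)
    {
        uint16_t raw[3]  = { 0, 32768, 65535 };
        int16_t  fits[3];

        raw_to_fits16(raw, fits, 3);
        printf("%d %d %d\n", fits[0], fits[1], fits[2]);  /* -32768 0 32767 */
        return 0;
    }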

The data for FITS headers come into the system from the cameras [SR2.2.1], the TCS [SR2.2.2] and the ICS [SR2.2.3]. Some (e.g. the title of the observation) come directly from the observer as part of the command that starts an observation [SR2.2.4].

The TCS and ICS produce complete FITS packets which the system can copy verbatim into a FITS header. This is a standard arrangement of the observing system which needs to be preserved in the UltraDAS [SR2.2.5].

The packets describing the camera and the formatting of the output image are produced in the DAS [SR2.2.6].

The world-coordinate-system packet combines pointing data from the TCS, geometry data from the TCS and configuration/format data from the camera; the latter are more numerous. To avoid unnecessary coupling, this packet is made inside the UltraDAS [SR2.2.7]. It would be simpler to make the WCS packet in a post-processing chain (outside the system), but it is more desirable to have the data made in time for use in the automatic display.

These flows and transformations are shown in the data-flow diagram for FITS headers.
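
A minimal sketch of the header assembly follows. It assumes, purely for illustration, that each packet is a block of ready-made 80-character FITS cards which the system concatenates; the real packet format and the transport between tasks are not defined here.

    /* Sketch only: TCS and ICS packets are copied verbatim [SR2.2.5];
     * the DAS adds its own cards and the terminating END card. */
    #include <stdio.h>
    #include <string.h>

    #define CARD_LEN 80

    /* Append one 80-character card, blank-padded, to the header buffer. */
    static size_t append_card(char *header, size_t used, const char *card)
    {
        size_t len = strlen(card);
        memset(header + used, ' ', CARD_LEN);
        memcpy(header + used, card, len < CARD_LEN ? len : CARD_LEN);
        return used + CARD_LEN;
    }

    int main(void)
    {
        char header[36 * CARD_LEN];   /* one FITS header block of 36 cards */
        size_t used = 0;

        /* Cards made in the DAS (camera and format description). */
        used = append_card(header, used, "SIMPLE  =                    T");
        used = append_card(header, used, "BITPIX  =                   16");

        /* Hypothetical cards copied verbatim from a TCS packet. */
        used = append_card(header, used, "RA      = '12:34:56.7'");
        used = append_card(header, used, "DEC     = '+12:34:56'");

        used = append_card(header, used, "END");
        printf("header uses %zu of %zu bytes\n", used, sizeof header);
        return 0;
    }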

The data-flow diagram for completed observations shows how the UltraDAS fits into the observatory's data-management process.

The boxes on the right of the diagram represent the observatory's data products. The UltraDAS can produce none of these by itself; they require interoperation with the disk server [SR2.3.1], the data manager [SR2.3.2] and the FITS-tape packages [SR2.3.3].

The boxes on the left are the displays produced to aid observing. The UltraDAS produces the image display, but the display of the night report comes from the data manager. The image display could be arranged either as a co-operation with the disk server [SR2.3.4] or as a direct data-feed from the DAS [SR2.3.5].

The flow of observations emphasizes the central role of the disk server. This installation is a single point of failure for all outputs. The organization and naming of files on the disk server has to be an agreed standard [SR2.3.6] between the UltraDAS, the data-manager and the tape packages. The users also have to know the location of files to do local data-reduction.
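
By way of illustration only, the sketch below builds a file path of the kind the agreed standard [SR2.3.6] would have to define. Only the /obsdata mount point is taken from this document (see SR17.3 below); the per-night directory and run-number file name are hypothetical stand-ins for whatever is agreed between the UltraDAS, the data manager and the tape packages.

    /* Sketch only: constructing a hypothetical observation-file path on
     * the disk server.  The naming scheme shown is not the agreed one. */
    #include <stdio.h>

    int main(void)
    {
        char path[256];
        const char *night = "19981007";   /* hypothetical per-night directory       */
        int run_number = 123456;          /* hypothetical running observation count */

        snprintf(path, sizeof path, "/obsdata/%s/r%06d.fit", night, run_number);
        printf("%s\n", path);             /* /obsdata/19981007/r123456.fit */
        return 0;
    }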

2.3.4. Concepts and associations

A small number of association diagrams in the UML notation are provided here. They mainly serve to show the one-to-many relationships between parts of the system.

The first diagram defines what constitutes "a camera" [SR3.1]. Note the concept of detector group to describe the detectors operated by one controller.

The second diagram illustrates how the detector controllers of a camera are connected to the DAS. It shows the number of parallel readouts that the DAS is required to support [SR3.2]. Note that one DAS (there could be more than one DAS per observing system!) can either serve a few single-channel cameras or one camera with several detector groups, but not both at once. The hardware prevents more than one multi-detector camera from being used at once.

The third diagram shows how the system identifies the attached cameras and determines their configuration [SR3.3]. By the use of plugs carrying an ID code, the connections between detector groups, detector controllers and the DAS can be determined in software and used to set the configuration by matching the IDs against a library of profiles. The system is almost self-configuring, but it may need to take account of the instrument in use. This latter information comes in from the CIA and modifies the choice of profile. If there are no instrument-specific features, or if those features can be embodied as alternatives within one profile, then the system becomes much simpler to work with.

The engineering profiles are written and filed when a new camera is released onto the system. They define the settings of the camera that only a specialist can set; in particular, the profile defines which controller software to run [SR3.4].

The user profiles modify the defaults in the engineering profiles for user-selected settings such as the size and position of windows. The user profiles are the system's memory of the preferred settings and are built up as the observers configure a camera for observing [SR3.5].
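
The two-layer profile scheme can be sketched as follows. The ID codes, profile fields and default values are invented for illustration; only the structure (an engineering profile selected by ID-plug code, with a user profile overriding selected defaults) is taken from the text.

    /* Sketch only: matching ID-plug codes against a library of profiles
     * and overlaying user settings on engineering defaults. */
    #include <stdio.h>
    #include <string.h>

    typedef struct {
        const char *id_code;       /* code read from the camera's ID plug     */
        const char *dsp_program;   /* controller software to load [SR3.4]     */
        int         readout_speed; /* engineering default, kHz (illustrative) */
    } EngProfile;

    typedef struct {
        int window_x, window_y, window_nx, window_ny;  /* user-chosen window  */
        int readout_speed;         /* 0 means "keep engineering default"      */
    } UserProfile;

    static const EngProfile library[] = {
        { "EEV42-80-A", "ccd_2chan.lod", 200 },   /* hypothetical entries */
        { "HAWAII-1-B", "ir_4chan.lod",  500 },
    };

    static const EngProfile *find_profile(const char *id_code)
    {
        size_t i;
        for (i = 0; i < sizeof library / sizeof library[0]; i++)
            if (strcmp(library[i].id_code, id_code) == 0)
                return &library[i];
        return NULL;   /* no profile: the camera becomes "unrecognized" */
    }

    int main(void)
    {
        const EngProfile *eng = find_profile("EEV42-80-A");
        UserProfile user = { 100, 100, 1024, 1024, 0 };

        if (eng == NULL) {
            printf("unrecognized camera\n");
            return 1;
        }
        /* The user profile modifies the engineering defaults [SR3.5]. */
        int speed = user.readout_speed ? user.readout_speed : eng->readout_speed;
        printf("load %s, readout at %d kHz, window %dx%d\n",
               eng->dsp_program, speed, user.window_nx, user.window_ny);
        return 0;
    }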

2.3.5. User interfaces

There are three separate user interfaces for the detectors involved with the UltraDAS.

The laboratory software is expected to be the program ccdtool or one of its variants. These programs have built-in graphical interfaces [SR4.1].

The engineering software, used to investigate and configure a camera separately from the observing system, will be a program with a built-in graphical interface [SR4.2]. This tool consists of:

This interface allows quicker cycles of changes and tests than could be achieved by stopping and starting a camera from within the observing system. The interface also allows write access to the engineering profiles which the observing system does not.

The observer's interface to the UltraDAS is a varied set of GUI programs [SR4.3]. The DAS display is merged with the mimics and GUIs for the instrumentation [SR4.3.1]; e.g., the wide-field camera mimic has displays for the filter wheel, shutter and CCDs in the same window. Hence, the graphic design of the display is expected to differ from instrument to instrument, forcing the production of multiple GUIs. To make these programs more manageable, there must be a standard interface into the DAS from which state data can be read [SR4.3.2] and a standard interface for giving commands [SR4.3.3].
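
The sketch below suggests the kind of state record and single command entry-point that the standard interfaces [SR4.3.2, SR4.3.3] imply. The field names and the command string are illustrative; in the real system these would be DRAMA parameters and actions rather than the plain C shown here.

    /* Sketch only: a shared state record and one generic command call
     * that every per-instrument mimic could use. */
    #include <stdio.h>

    typedef struct {
        char  state[16];       /* "idle", "busy", ...                       */
        char  current_cmd[32]; /* command being executed, if any            */
        float exposure_left;   /* seconds remaining in the current exposure */
    } CameraStatus;

    /* A single generic command interface used by every instrument GUI. */
    static int das_send_command(const char *camera, const char *command)
    {
        printf("send to %s: %s\n", camera, command);
        return 0;   /* a real implementation would queue a DRAMA action */
    }

    int main(void)
    {
        CameraStatus status = { "idle", "", 0.0f };
        printf("camera is %s\n", status.state);
        /* Illustrative command syntax only. */
        return das_send_command("wfc", "run 30 'flat field'");
    }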

The engineering interface is a simple client of the DAS that can be run on the DAS computer.  The observer interface
is part of the CIA and has to be run on the system computer.

The engineering and observer interfaces can be "attached" to a camera at the same time in the sense that both can display the camera's state.  Only one interface should send commands at a time; the interlocking between engineering and observing commands will be minimal.  The two interfaces can send commands alternately.

2.3.6. Use cases

These are the supported operations for observing: These are the recognized engineering operations:

3. Specific responsibilities

These responsibilities refer back to elements of the model description above: they link to the tags (SRn) in the text. This section is meant to define the high-level tests to be made on the system before it is delivered.
Responsibilities from section 2.3.2:
Responsibility  Associated URs 
SR1  A camera must exhibit explicit states  none 
SR1.1  Standard states for a camera  none 
SR1.2  State transitions are restricted  none 
SR1.3  One control-program per camera at a time  13 
SR1.4  Camera initializes on entry to "on-line" state  none 
SR1.5  On reboot of DAS, all cameras go to "unclaimed" none 
 
Responsibilities from section 2.3.3:
Responsibility  Associated URs 
SR2  Handle the main data-flows  17 
SR2.1  Pixel flows through DAS  17 
SR2.1.1  Interface to CCDC video board  17 
SR2.1.2  Interface to disk server  none 
SR2.1.3  Signal processing in real time  17 
SR2.1.4  Bulk-data handling for pixels  17, 25 
SR2.1.5  Buffer pixels between CCDC and DAS  17, 25 
SR2.2  Handle data flows for the FITS headers  17 
SR2.2.1  Take data from the TCS  17 
SR2.2.2  Take data from the ICS  17 
SR2.2.3  Use the current state-data of the camera  17 
SR2.2.4  Get data from the observer's run command  17 
SR2.2.5  Copy TCS and ICS packets verbatim  17 
SR2.2.6  Produce detector and ccdproc packets  17 
SR2.2.7  Produce WCS packet  17 
SR2.3  Observation flows ex UltraDAS  17, 18, 19 
SR2.3.1  Interoperate with external disk server  17 
SR2.3.2  Interoperate with data manager  18 
SR2.3.3  Interoperate with FITS-tape packages  none 
SR2.3.4  Image display is fed directly from DAS  19 
SR2.3.5  Image display is fed from disk server  19 
SR2.3.6  Organization of directories follows a standard  none 
 
Responsibilities from section 2.3.4:
Responsibility  Associated URs 
SR3  System embodies listed concepts  none 
SR3.1  Part of a camera with system support  none 
SR3.2  Supported number of detectors etc.  11, 24 
SR3.3  System auto-configures for attached cameras  none 
SR3.4  Configuration is defined in eng. profiles 
SR3.5  Configuration is modified in user profiles 
 
Responsibilities from section 2.3.5:
Responsibility  Associated URs 
SR4  Provide separate engineering and observing interfaces 13 
SR4.1  Provide for laboratory work off the telescope  13 
SR4.2  Provide for engineering work on the telescope  13 
SR4.2.1  Display the camera's state  13, 14 
SR4.2.2  Allow editing of the engineering profile  none 
SR4.2.3  Provide local controls for a few operations  13 
SR4.2.4 Allow selective starting and stopping of individual cameras. 13
SR4.3  Provide for observing control  13 
SR4.3.1  Integrate UltraDAS displays with other mimics  none 
SR4.3.2  Use a standard interface for state data  2.1, 2.2 
SR4.3.3  Use a standard interface for commands  2.1 
 
Responsibilities defined in use case prepare to observe:
Responsibility  Associated URs 
SR5.1  Bring up DAS within 5 minutes  None 
SR5.2  Bring up TCS within 5 minutes  None 
SR5.3  Bring up ICS within 5 minutes  None 
SR5.4  Total start time <10 minutes  None 
 
Responsibilities defined in use case observe with a CCD:
Responsibility  Associated URs 
SR6.1  Remember readout parameters  4, 5, 6, 7, 8, 9 
SR6.2  Standard run  2, 10 
SR6.3  Multrun  2, 10 
SR6.4  Selectable readout channels  2, 11 
SR6.5  Bias frames  2, 10 
SR6.6  Dark frames  2, 10 
SR6.7  Scratch runs  2, 10 
SR6.8  Glance runs  2, 10 
 
Responsibilities defined in use case observe with INGRID
Responsibility  Associated URs 
SR7  Support INGRID 
SR7.1  Remember readout parameters  4, 5, 6, 7, 8 
SR7.2  Standard run (MNDR)  2, 10 
SR7.3  Multrun (hypercubic MNDR)  2, 10 
 
Responsibilities defined in use case observe with Taurus
Responsibility  Associated URs 
SR8.1 Remember readout parameters  4, 5, 7, 8, 9 
SR8.2 Remember etalon steps per observation
SR8.3 Standard run 2, 10 
SR8.4  Multrun  2, 10 
SR8.5  Bias frame  2, 10 
SR8.6  Scratch run  2, 10 
SR8.7  Glance runs  2, 10 
SR8.8  Default settings 
 
Responsibilities defined in use case observe with CCD photometer:
Responsibility  Associated URs 
SR9.1  Remember readout parameters  4, 5, 7, 8, 9 
SR9.2  Remember readouts per observation 
SR9.3  Standard run  3, 10 
SR9.4  Glance runs  2, 10 
SR9.5  Alternative mode using a shutter  2, 10 
 
Responsibilities defined in use case introduce new detector:
Responsibility  Associated URs 
SR10.1  Understand camera through standard files 
SR10.2  Camera goes to "unclaimed" on power up  none 
SR10.3  Camera goes to "on-line" when ordered to start  none 
SR10.4  Camera goes to "idle" when CIA starts  none 
 
Responsibilities defined in use case change camera
Responsibility  Associated URs 
SR11.1  Camera goes to "disconnected" on power-down  none 
SR11.2  Camera goes to "unclaimed" on power-up  none 
SR11.3  Camera goes to "on-line" when ordered to start  none 
SR11.4  Camera goes to "idle" when CIA starts  none 
 
Responsibilities defined in use case start DAS:
Responsibility  Associated URs 
SR12.1  Display menu of sub-system configurations  none 
SR12.2  Shut down all cameras if wrong version running  none 
SR12.3  List state of all cameras  none 
SR12.4  Allow sequential start of multiple cameras  none 
 
Responsibilities defined in use case start a camera:
Responsibility  Associated URs 
SR13.1  Load server software  none 
SR13.2  Load controller software  none 
SR13.3  Initialize from engineering profile  none 
SR13.4  Initialize from observing profile  none 
SR13.5  Camera goes to "on-line"  none 
SR13.6  Handle cameras claimed by other software  none 
 
Responsibilities defined in use case start CIA and ICS:
Responsibility  Associated URs 
SR14  Start CIA  none 
SR14.1  Allow choice of configuration  none 
SR14.2  Confirm observer's identity  20 
SR14.3  Select software for chosen configuration  none 
SR14.4  Locate and connect all chosen cameras  none 
SR14.5  Chosen cameras go to "idle"  none 
SR14.6  Start DAS displays  13 
 
Responsibilities defined in use case set readout parameters:
Responsibility  Associated URs 
SR15.1  Set binning 
SR15.2  Set a window 
SR15.3  Set readout gain 
SR15.4  Select readout channels  11 
SR15.5  Set readout speed 
SR15.6  Set preflash time  none 
SR15.7  Defaults for settings 
 
Responsibilities defined in use case make FITS header:
Responsibility  Associated URs 
SR16.1  Signal ICS/TCS to start packets  17 
SR16.2  Make standard part of header  17 
SR16.3  Signal ICS/TCS to finish packets  17 
SR16.4  Read ICS/TCS packets  17 
SR16.5  Make observation packet  17 
SR16.6  Make WCS packet  17 
SR16.7  Assemble header  17 
SR16.8  Defaults for missing packets  17 
SR16.9  Default WCS  17 
 
Responsibilities defined in sequence save in archive file:
Responsibility  Associated URs 
SR17.1  Rename observation file to the standard naming scheme.  17 
SR17.2  Store observation in standard directory.  17 
SR17.3  Use /obsdata  17 
SR17.4  Use alternate disks  17 
SR17.5  Make one file per readout channel  11 
 
Responsibilities defined in sequence save in scratch file:
Responsibility  Associated URs 
SR18  Make a scratch file  10 
SR18.1  Rename observation file to the standard naming scheme.  17 
SR18.2  Overwrite existing files as necessary.  17 
SR18.3  Store observation in standard directory.  17 
SR18.4  Use /obsdata  17 
SR18.5  Use alternate disks  17 
SR18.6  Make one file per readout channel  17 
 
Responsibilities defined in use case set IR readout-parameters:
Responsibility  Associated URs 
SR19.1  Set reads per MNDR  10 
SR19.2  Set a window 
SR19.3  Set readout gain 
SR19.4  Select readout channels  11 
SR19.5  Set readout speed 
SR19.6  Defaults for settings 
 
Responsibilities defined in sequence MNDR:
Responsibility  Associated URs 
SR20.1  Reset detector  10 
SR20.2  Zero pixels  10 
SR20.3  Read n into first plane  10 
SR20.4  Integrate  10 
SR20.5  Read into second plane  10 
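
The ordering of the MNDR steps can be sketched as follows. The stub functions stand in for the real controller operations, and the averaging of the n reads and the differencing of the two planes are assumptions about how the data are used; only the sequence of steps comes from the table above.

    /* Sketch only: the MNDR sequence with stub operations. */
    #include <stdio.h>

    static void reset_detector(void) { puts("reset detector"); }            /* SR20.1 */

    static void zero_pixels(double *p, int n)                               /* SR20.2 */
    {
        int i;
        for (i = 0; i < n; i++) p[i] = 0.0;
    }

    /* One non-destructive read; a constant simulated level stands in for
     * the pixel values that the controller would really deliver. */
    static void read_nondestructive(double *plane, int npix, double level)
    {
        int i;
        for (i = 0; i < npix; i++) plane[i] += level;
    }

    static void integrate(double seconds) { printf("integrate %.1f s\n", seconds); } /* SR20.4 */

    int main(void)
    {
        enum { NPIX = 4 };
        double first[NPIX], second[NPIX];
        int n_reads = 4, i;

        reset_detector();                                /* SR20.1 */
        zero_pixels(first, NPIX);                        /* SR20.2 */
        zero_pixels(second, NPIX);
        for (i = 0; i < n_reads; i++)                    /* SR20.3: read n into first plane */
            read_nondestructive(first, NPIX, 100.0);
        integrate(10.0);                                 /* SR20.4 */
        for (i = 0; i < n_reads; i++)                    /* SR20.5: read into second plane  */
            read_nondestructive(second, NPIX, 600.0);

        /* Assumed post-processing: the mean plane difference is the signal. */
        for (i = 0; i < NPIX; i++)
            printf("%g ", (second[i] - first[i]) / n_reads);
        putchar('\n');                                   /* prints 500 per pixel */
        return 0;
    }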
 
Responsibilities defined in sequence read and store pixels:
Responsibility  Associated URs 
SR21.1  Read into FITS image  17 
SR21.2  Apply windows 
SR21.3  Apply binning 
SR21.4  Apply transformations  25 
SR21.5  Reformat each pixel  17 
SR21.6  Use a separate image for each readout channel  11 
SR21.7  Allow up to 7 dimensions in FITS image 
SR21.8  Allow coadding  10 
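
As an example of the bulk-pixel operations listed above, the sketch below bins a small frame by 2x2 on the host [SR21.3]. Windowing and coadding would be handled by similar loops over the same buffer; whether binning is done on the chip or in the DAS is not constrained here.

    /* Sketch only: host-side binning of a raw readout. */
    #include <stdint.h>
    #include <stdio.h>

    /* Sum bx-by-pixel blocks of the raw frame into the output frame. */
    static void bin_frame(const uint16_t *in, int nx, int ny,
                          uint32_t *out, int bx, int by)
    {
        int ox, oy, x, y;
        for (oy = 0; oy < ny / by; oy++)
            for (ox = 0; ox < nx / bx; ox++) {
                uint32_t sum = 0;
                for (y = 0; y < by; y++)
                    for (x = 0; x < bx; x++)
                        sum += in[(oy * by + y) * nx + (ox * bx + x)];
                out[oy * (nx / bx) + ox] = sum;
            }
    }

    int main(void)
    {
        uint16_t in[4 * 4];
        uint32_t out[2 * 2];
        int i;

        for (i = 0; i < 16; i++) in[i] = (uint16_t)i;
        bin_frame(in, 4, 4, out, 2, 2);
        printf("%u %u %u %u\n", out[0], out[1], out[2], out[3]);  /* 10 18 42 50 */
        return 0;
    }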
 
Responsibilities defined in use case inspect observations:
Responsibility  Associated URs 
SR22.1  Provide an automatic image-display  19 
SR22.2  Provide an xgterm interface  19 
SR22.3  Display each image within 10s of observation  19 
SR22.4  Display world coordinates where possible  19 
SR22.5  Allow restricted IRAF work on raw data  19 
 
 
Responsibilities defined in use-case tune camera:
Responsibility Associated URs
SR23.1 Display states of all cameras 13
SR23.2 Display detailed state of chosen camera 13, 14
SR23.3 Do test exposures 10, 11
SR23.4 Display engineering profile 4
SR23.5 Edit, check and reload engineering profile 4