LCTPC Software WP Meeting

Europe/Zurich
To participate go to http://evo.caltech.edu. The meeting is booked in the ILC community. Or use the phone bridge:

Phone Bridge ID: 319878
EVO Phone Bridge Telephone Numbers:
- USA (Caltech, Pasadena, CA): +1 626 395 2112
- Switzerland (CERN, Geneva): +41 22 76 71400
- Slovakia (UPJS, Kosice): +421 55 234 2420
- Italy (INFN, several cities): http://server10.infn.it/video/index.php?page=telephone_numbers (enter '4000' to access the EVO bridge)
- Germany (DESY, Hamburg): +49 40 8998 1340
Martin Killenberg (Physikalisches Institut - University of Bonn)
Description
Discussion about the software status and requirements for the Large Prototype.

Attending:

Jason Abernathy
Klaus Dehmelt (DESY, phone bridge)
Ralf Diener (DESY, phone bridge)
Keisuke Fujii
Lea Hallermann (DESY, phone bridge)
Xavier Janssen
Jochen Kaminski
Dean Karlen
Martin Killenberg
Claus Kleinwort (DESY, phone bridge)
Oliver Schäfer
Ron Settles (phone bridge)
Stephen Turnbull (phone bridge)
Adrian Vogel
Peter Wienemann

Martin presented the current status of the software (see slides). The following points were discussed during the presentation:
  • There is a pedestal calculator inside the MarlinTPC package, but it is not compliant with the pedestals defined in the conditions data. It is old code which was written before the conditions data were implemented, and it has to be adapted.
  • The calibration data will be calculated in special "calibration runs" and written to the conditions database. Afterwards the reconstruction and analysis are performed using these data. The calibration data should also carry a time stamp recording when they were created, which is useful in case of recalibration.
  • The time stamp in the Tracker(Raw)Data is not taken into account in the current implementation. Some DAQ systems provide this information, and in this case it certainly should be used (still to be implemented). It might be necessary to distinguish between different DAQ systems for this, which is done by using special names for the data collections.
  • In case a per-hit reconstruction is performed, a correction of E- and B-field inhomogeneities (incl. ExB effects) as well as a pad response correction is needed. In case of the global likelihood fit this is included in the likelihood function, so no separate correction step is needed. In both cases the required information (field settings, diffusion, ...) is provided as conditions data.
  • The reconstruction should provide the momentum of the particle. A momentum processor needs to be implemented.
  • The geometric mean processor needs frequent re-fitting of the track (once for every hit), which is time consuming. With a sufficiently large number of hits on the track, the bias introduced by fitting the track only once with all the hits is small. In this case the "BiasedResidualsProcessor" should be used for performance reasons. (N.B. one could apply a correction factor sqrt( n / (n - DoF) ) to the biased value.)
  • In case of an inhomogeneous B-field (and of energy loss) the trajectory might not be a perfect helix. In this case the track should consist of helix segments. Can the LCIO track model handle this? The fitting processors also have to be adapted to this track model.
  • The likelihood processor must be able to work across several modules; the track finder will only provide track candidates in each single module, which have to be combined before the track fitting.
  • Suggestion: the conditions database should be hosted at DESY. Claus Kleinwort has started to work on the database for the test beam slow control.
  • All conditions data classes containing per channel information will have to be extended by a field containing the module ID. For the LCIO data classes the CellID1 can be used.
  • Missing in FieldSettings: a flag indicating whether GEMs or Micromegas were used.
  • The applied voltages should be stored rather than the fields (be as basic as possible). This allows the fields to be recalibrated with more accurate values of the drift distance or the GEM spacing if needed.
  • The trigger and beam conditions (cosmic or beam trigger, electron energy) should also be stored as conditions data.
  • The conditions data do not have to be stored with every event. The ConditionsProcessor monitors the database and provides the correct information for each event when needed.
  • There was a discussion about where the trigger information, which can change event by event, should go. Putting it into the event header was considered, but there is no event header in LCIO (only a file and a run header). This means the per-event data can be written either to the same file as the data or into a separate LCIO file. It is treated as conditions data, so it only has to be stored when it changes. Again, the ConditionsProcessor provides the information for all events.
  • Currently no class is foreseen for the trigger information; it has to be defined.
  • There are (at least) two trigger subsystems:
    • The cosmic trigger electronics, which combines the information from the scintillator slabs for the cosmics. This unit can provide the information which of the scintillators has fired.
    • The Trigger Logic Unit (TLU), which gets one NIM signal from the cosmic trigger plus the signals from the beam trigger. It combines these signals, provides the DAQ trigger and coordinates the handshake between the different DAQ subsystems (mutual inhibit). Can the TLU also provide the information which scintillator has fired?
  • Where is the position of the TPC with respect to the beam and the magnet stored? Where should the offsets between the different readout modules be stored? This has to be implemented when extending GEAR for multiple modules.
  • The conditions data should be written automatically by the slow control system, not manually. Everything should be automated as much as possible (for instance, for every data run the required information which has to be entered by the user should be requested before the run is started).
  • Where are the silicon hodoscope data reconstructed? The reconstruction should also be done with Marlin, so the LCIO data streams should be merged as early as possible (EUDAQ Data Collector?) so that there is a common reconstruction of silicon and TPC data in one Marlin run.
  • The magnetic field map (as well as a map with E-field corrections) will be stored as conditions data. In GEAR only a homogeneous B-field is foreseen.
  • Stephen Turnbull has started to work on an event display. He will look at the event display thread on the forum to see what has been discussed and what is needed.
  • Martin will start discussion threads on the forum ( forum.linearcollider.org ) about TPC conditions data and about which default analyses should be implemented.
  • Depending on the discussions in the forum we will decide whether to have the next meeting in two weeks (2008-05-14) or in four weeks (2008-05-28). Martin will announce this via e-mail on the MarlinTPC mailing list next week (2008-05-14).
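For illustration, the per-channel pedestal calibration discussed above can be sketched as follows. This is a standalone sketch, not actual MarlinTPC code; the struct and function names are invented here. Pedestal = mean ADC value of a channel in signal-free samples, noise = RMS of the same samples; a calibration processor would write these per-channel values to the conditions database.

```cpp
#include <cmath>
#include <vector>

// Per-channel calibration constants as a calibration run would produce them
// (illustrative type, not an actual MarlinTPC conditions data class).
struct PedestalValue {
    double mean;  // pedestal: mean ADC count in signal-free samples
    double rms;   // noise: RMS of the same samples
};

// Compute pedestal and noise for one channel from signal-free ADC samples.
PedestalValue calculatePedestal(const std::vector<double>& adcSamples) {
    double sum = 0.0, sumSq = 0.0;
    for (double adc : adcSamples) {
        sum   += adc;
        sumSq += adc * adc;
    }
    const double n = static_cast<double>(adcSamples.size());
    const double mean = sum / n;
    const double variance = sumSq / n - mean * mean;
    return { mean, std::sqrt(variance > 0.0 ? variance : 0.0) };
}
```

Attaching a creation time stamp to each such record, as suggested above, would then allow recalibrated values to coexist with the originals in the database.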
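The correction factor for biased residuals mentioned in the BiasedResidualsProcessor point can be written as a one-line helper (standalone sketch, function name invented here): with n hits on the track and DoF fit parameters, the biased residual width is scaled up by sqrt( n / (n - DoF) ) to approximate the unbiased one.

```cpp
#include <cmath>

// Approximate correction from biased to unbiased residual width:
// sigma_unbiased ≈ sqrt( n / (n - DoF) ) * sigma_biased,
// where n is the number of hits on the track and DoF the number of
// fit parameters. Avoids re-fitting the track once per hit.
double biasCorrectionFactor(int nHits, int nFitParameters) {
    return std::sqrt(static_cast<double>(nHits) /
                     static_cast<double>(nHits - nFitParameters));
}
```

The factor is always greater than 1 and approaches 1 for long tracks, which is why the bias from fitting only once is small when there are many hits.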
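The point about storing voltages rather than fields can be made concrete with a minimal sketch (invented function name, illustrative numbers): if the conditions database holds the applied voltage, the field can be recomputed at analysis time from the latest geometry measurement, here under a uniform-field approximation E = U / d.

```cpp
// Derive the drift field at analysis time from the stored voltage and the
// (possibly recalibrated) drift distance, assuming a uniform field E = U / d.
// Storing U instead of E means an improved measurement of d can be applied
// retroactively without touching the stored conditions data.
double driftField(double voltageV, double driftDistanceM) {
    return voltageV / driftDistanceM;  // result in V/m
}
```

The same reasoning applies to the GEM voltages and spacings: the raw, directly measured quantity goes into the database, derived quantities are computed on demand.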
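The idea of carrying the module ID in CellID1, mentioned in the conditions data point, amounts to bit-packing. The following is purely illustrative: the bit layout and helper names are invented for this sketch and are not the actual LCIO/MarlinTPC encoding.

```cpp
#include <cstdint>

// Illustrative packing of a module ID and a channel ID into one 32-bit
// word, as a second cell ID word could be used to carry the module
// information. Invented layout: upper 16 bits module, lower 16 bits channel.
uint32_t packCellID(uint32_t moduleID, uint32_t channelID) {
    return (moduleID << 16) | (channelID & 0xFFFFu);
}

uint32_t moduleOf(uint32_t cellID)  { return cellID >> 16; }
uint32_t channelOf(uint32_t cellID) { return cellID & 0xFFFFu; }
```

Whatever layout is chosen, it has to be used consistently by all conditions data classes that hold per-channel information.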