HQM - Hamburg Quasar Monitoring
HQM - OBSERVATIONS - 1
The primary observing equipment was a CCD camera in the Cassegrain focus of the 1.23 m telescope of the DSAZ observatory on Calar Alto, operated by the Max Planck Institute for Astronomy in Heidelberg.
Through cooperation with other observers and from observations at other telescopes we have also used data from Calar Alto (1.23 m, 2.2 m, 3.5 m), the Nordic Optical Telescope (NOT) at La Palma, ESO (2.2 m, NTT, 3.6 m, Danish 1.54 m, Dutch 0.9 m), the 70 cm telescope at the Landessternwarte Heidelberg, HST, the SAO 6 m BTA, CFHT, and the Las Campanas 1.0 m.
During the lifetime of the program CCD technology evolved, and chips with different pixel sizes, quantum efficiencies, spectral responses and chip sizes were used. Sometimes it became necessary to observe in binned mode; sometimes cameras were mounted rotated or the readout was flipped, so that a total of 130 different camera configurations entered the HQM data pool.
As the quantum efficiency of the CCDs was highest in the red, we chose the Johnson R filter as our primary filter. In the case of extraordinary events we also used Johnson B and V. A uniform Johnson R filter characteristic was not achievable throughout the campaign; even at Calar Alto different filters with Johnson R characteristics were used. The specifications of filters from cooperating observers usually remained unknown, but certainly fell within the passband of a broad R filter.
The CCD frames have been reduced using standard flatfield, bias and dark corrections. For some very early data from 1985 and 1986 an additional stripe correction became necessary, caused by technical problems at that time.
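These standard corrections can be sketched as follows. This is a minimal illustration, not the original HQM reduction code; the function name and the exposure-time scaling of the dark frame are assumptions for the sketch:

```python
import numpy as np

def reduce_frame(raw, bias, dark, flat, exptime, dark_exptime):
    """Apply standard bias, dark and flatfield corrections to a raw CCD frame.

    All inputs are 2-D arrays of the same shape; exptime and dark_exptime
    are the exposure times (seconds) of the science and dark frames.
    """
    # Dark current per frame: dark frame minus bias, scaled to the
    # science exposure time (assumes dark current is linear in time).
    dark_current = (dark - bias) * (exptime / dark_exptime)
    corrected = raw - bias - dark_current
    # Divide by the normalized flatfield to remove pixel-to-pixel
    # sensitivity variations without changing the overall flux scale.
    norm_flat = flat / np.median(flat)
    return corrected / norm_flat
```

A uniform frame with bias 100 ADU, 10 ADU of dark current and a flat science signal of 50 ADU would come out as a flat 50 ADU image.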
In general observations were carried out with two consecutive exposures, the first of 100 seconds, the second with a longer exposure time between 300 and 1000 seconds. Both frames were reduced independently and should give the same photometric value within the error bars. This procedure was used to control the automatic process and to eliminate possible data corruption from cosmic rays, dust, or fit errors. During the first years of HQM, when the pointing of the telescope was poor and the guiding TV unit had only low sensitivity, the short exposure was used online to calculate the pointing offset for the telescope.
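A consistency check of this kind might look like the following sketch. The function name, the 3-sigma threshold and the quadratic combination of the two error bars are assumptions, not details given in the text:

```python
def exposures_consistent(mag_short, err_short, mag_long, err_long, nsigma=3.0):
    """Check that the short and long exposure give the same magnitude
    within their combined error bar (assumed rejection threshold: nsigma)."""
    # Combine the independent errors of the two measurements in quadrature.
    combined_err = (err_short ** 2 + err_long ** 2) ** 0.5
    return abs(mag_short - mag_long) <= nsigma * combined_err
```

A pair of frames failing this check would be flagged for inspection, e.g. for a cosmic-ray hit on the object in one of the two exposures.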
We have used relative photometry in a semi- and fully automatic procedure.
The result is a lightcurve with relative photometric values within the set of observed HQM frames.
Two initial calibration steps were used.
One calibrated frame immediately calibrates all other frames in the lightcurve of an object. By comparing the calibrated magnitudes with the relative photometric values we receive a sample of calibration constants "const" for each night with observations. A photometric night is indicated by a set of very homogeneous const values. By analyzing the const values we used an iterative photometric network to calibrate all other lightcurves.
From nights with homogeneous const values we calculated the photometric values for not yet calibrated frames, which added new calibration constants for other nights, so that the network kept growing. This procedure was repeated until all lightcurves were calibrated. As this method is self-controlled, we obtained very reliable photometric values.
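The growth of such a calibration network can be sketched as follows. This is a deliberately simplified illustration, not the HQM implementation: it assumes each object has a single calibrated magnitude (ignoring variability and per-lightcurve offsets), and the function names, the median const and the spread limit are assumptions:

```python
import statistics

def grow_network(inst_mags, seed, spread_limit=0.05):
    """Bootstrap photometric calibration through a night/object network.

    inst_mags: dict night -> dict object -> instrumental magnitude,
    seed: dict object -> known calibrated magnitude.
    Model: calibrated magnitude = instrumental magnitude + nightly const.
    """
    known = dict(seed)
    changed = True
    while changed:
        changed = False
        for night, objs in inst_mags.items():
            # Calibration constants from every already calibrated object.
            consts = [known[o] - m for o, m in objs.items() if o in known]
            if not consts:
                continue
            # Inhomogeneous consts indicate a non-photometric night: skip it.
            if len(consts) > 1 and statistics.pstdev(consts) > spread_limit:
                continue
            const = statistics.median(consts)
            for o, m in objs.items():
                if o not in known:
                    # Newly calibrated objects seed other nights in turn.
                    known[o] = m + const
                    changed = True
    return known
```

Starting from one calibrated object, each photometric night transfers the calibration to further objects, whose other nights then contribute new const values, until the loop adds nothing more.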
HQM was designed in 1988 to run nearly automatically. Data taken by the astronomer were sent immediately after readout from the telescope control computer to a Micro-VAX computer, which automatically started the reduction. The sets of flatfields, biases and darks had already been loaded into the database. The reduction ended with the display of the new lightcurve on the screen. For a 500x500 chip the computer needed around 3 to 5 minutes, for a 1K chip between 5 and 10 minutes. The astronomer was thus able to decide, already from the first short exposure, whether to add another exposure in other color bands or to switch to the next target. After the telescope had been repointed, the system was ready for the next frame.
Later, after the observational campaign, a second identical computer system in Hamburg was used to eliminate uncertainties within the lightcurves, e.g. to handle frames taken under bad observing conditions caused by clouds, moonlight or other effects.
In Hamburg the raw data were re-reduced and written to the image data archive. Each photometric reduction added an entry to the photometric library for each star, and all fit results were written into the frame-specific fit result database. Telescope and observational parameters (pointing, seeing, temperature) were stored in other datasets.
Data, graphics and images may not be used without permission.
Contact: Jochen Schramm, Hamburger Sternwarte.