- T. Jarrett -
Work on the three-channel GalWorks
is underway and on schedule. Eric Copet of the DENIS project is
currently at IPAC and brought I, J, and Ks images from scans of four
common 2MASS/DENIS fields; the I-band information will be useful as
"truth" data for GalWorks development. R. Cutri and E. Copet are doing
comparisons of point sources to investigate the relative
sensitivities and completeness of 2MASS and DENIS.
- J. Fowler -
A high-density scan processed by
PIXCAL/DFLAT/FREXAS (scan 074 of 95-05-04) terminated with an error
message from FREXAS indicating that the APPHOT subroutine returned an
error code of -3. This is documented in SIS PRO02 as meaning that too
few good pixels were found in the annulus. APPHOT requires pixel values
to be greater than -100 to be usable (among other criteria). B. Light
determined that the first Read1 frame had pixel values mostly between
-250 and -150, which suggests that the dark frame is erroneous. An
attempt was made to run the scan with the APPHOT limit set to -300;
this time
APPHOT appeared to get into an infinite loop while processing the
eighth Read1 frame, which appeared similar to the first seven. The
APPHOT limit was set back to -100, and a change was made to DFLAT to
subtract the frame median value from each Read1 frame as a default
option. This allowed the scan to be processed, and the results appear
negligibly different from the proto-pipeline results, which used some
techniques designed by G. Kopan that will be examined when he returns
to IPAC on August 19. These techniques appear to include forcing the
Read1 frame median to some value other than zero based on the
Read2-Read1 median and the ratio of exposure times, but this is not yet
certain and must be elucidated by Gene. The DFLAT code will be modified
to use similar techniques. For now, scans will be processed using the
zero-median code, and the effect of this on previously successful
low-density J and K scans will be investigated. [Note added in proof:
these checks were made on scans 084 and 094 of 95-05-03, and only
negligible differences were found.]
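For concreteness, here is a minimal sketch of the two median-handling
options discussed above (Python for illustration only; the actual
DFLAT code is not shown, and the scaled-median recipe attributed to
G. Kopan is an assumption pending his clarification):

    import numpy as np

    def zero_median(read1_frame):
        """Current default: subtract the frame median so a dark-frame
        offset (as seen in scan 074) cannot push most pixels below the
        APPHOT usability limit of -100."""
        return read1_frame - np.median(read1_frame)

    def scaled_median(read1_frame, read21_median, exposure_ratio):
        """Hypothetical variant of the proto-pipeline technique: force
        the Read1 median to the Read2-Read1 frame median scaled by the
        ratio of exposure times, rather than to zero."""
        target = read21_median * exposure_ratio
        return read1_frame - np.median(read1_frame) + target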
Analysis of the 3x3x3 test grid for flattening studies is underway;
this grid crosses the three bands with scans at three different
densities, with each combination processed with three flattening
control parameters: pure symmetric trimmed averaging, pure RMDR trimmed
averaging (Recursive Median Distance Rejection asymmetric trimming),
and a 50-50 hybrid.
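A rough sketch of the two pure trimming schemes follows (Python for
illustration; the exact RMDR recursion and the form of the 50-50
hybrid are assumptions until the study is documented):

    import numpy as np

    def symmetric_trimmed_mean(values, trim_frac=0.1):
        """Drop the same fraction of samples from each end of the
        sorted stack, then average what remains."""
        v = np.sort(np.asarray(values, dtype=float))
        k = int(len(v) * trim_frac)
        return v[k:len(v) - k].mean()

    def rmdr_trimmed_mean(values, trim_frac=0.1):
        """Assumed reading of Recursive Median Distance Rejection:
        repeatedly discard the single sample farthest from the current
        median, so rejection need not balance between the two tails."""
        v = list(map(float, values))
        for _ in range(int(len(v) * trim_frac)):
            med = np.median(v)
            v.pop(max(range(len(v)), key=lambda i: abs(v[i] - med)))
        return float(np.mean(v))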
BandMerge has been coded and run successfully on simulation data;
current liens are the use of as-yet-undefined status flags and all of
the statistical analysis code. Automated comparison of BandMerge output to
simulation input has been implemented. Carefully hand-tweaked confusion
cases are planned to exercise all code branches.
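The automated comparison amounts to matching BandMerge output sources
back to the simulation truth list. A minimal sketch of such a matcher
(Python; the greedy nearest-neighbor strategy, the names, and the
tolerance are illustrative assumptions, not the implemented code):

    import numpy as np

    def match_to_truth(merged, truth, tol_arcsec=2.0):
        """Pair each merged source with the nearest truth source within
        tol_arcsec; leftover merged sources are candidate spurious
        merges, leftover truth sources are misses. Inputs are (x, y)
        positions in arcsec on a small, already-projected field."""
        merged = np.asarray(merged, dtype=float)
        truth = np.asarray(truth, dtype=float)
        pairs, used = [], set()
        for i, (x, y) in enumerate(merged):
            d = np.hypot(truth[:, 0] - x, truth[:, 1] - y)
            j = int(np.argmin(d))
            if d[j] <= tol_arcsec and j not in used:
                pairs.append((i, j))
                used.add(j)
        misses = [j for j in range(len(truth)) if j not in used]
        return pairs, misses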
- J. White & R. Beck -
A "fake tape" was received
from M. Rudenko. It has been checked out, and a number of small changes
to the Rev. J Obs/IPAC I/F Document have resulted, most of which have
to do with case sensitivity. Revised files will be ftp'd. [Note added
in proof: J. White FedEx'd two DLT tapes to M. Rudenko, who sent the
revised data back on tape instead of using ftp.]
Skeletal EXEC and PCP scripts exist and are in testing. Various plans
for reducing tape costs are being investigated with R. Cutri.
- R. Cutri -
Several variations on a tape-cost-reduction plan are being studied.
One involves some use of Exabyte
tapes; a more desirable one uses only DLT tapes, mostly of the new
DLT7000 format, which is said to use the same physical tapes as the
existing DLT3000 format but to have twice the capacity, i.e., 40 GB
uncompressed versus 20 GB. This plan involves each observatory writing
two DLT3000 tapes per night; one is archived at the observatory, and
the other is sent to IPAC. Once read and verified at IPAC, the data are
written to duplicate stacked DLT7000 tapes for archival purposes, and
the DLT3000 tape is sent back to the observatory to be reused. One of
the duplicate DLT7000 tapes is sent to UMASS. Processed data are
written onto a single stacked DLT7000 tape. If that tape is later found
to be unreadable, the raw data will be reprocessed. About two nights'
data should fit onto one stacked DLT7000 archive tape, and about three
nights' processed data should fit onto one stacked DLT7000 tape. A
dedicated sparc will be needed for the tape processing; this is
expected to be a single-CPU machine. Some special software will be
needed to recover observatory format data from stacked archived tapes
when reprocessing is needed. It would also be desirable to install the
2MAPPS pipeline through CALMON on the observatory sparc for faster
identification of unacceptable data (some concerns were voiced about
maintaining two versions of 2MAPPS). With current estimates of data
volume, this plan could reduce tape costs from the initial estimate of
$500,000 (5000 @ $100 each) to somewhere between $60,000 and $120,000
(1200 @ $50 to $100). [Note added in proof: in a later conversation
with Artecon, R. Cutri received some less optimistic cost and schedule
information that conflicted with what he had been told previously by a
different representative.]
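For reference, the arithmetic behind the quoted cost figures, plus an
illustrative tape-count helper (Python; the packing factors are the
estimates given above, and any night count supplied is a placeholder):

    initial_estimate = 5000 * 100    # $500,000
    revised_low      = 1200 * 50     # $60,000
    revised_high     = 1200 * 100    # $120,000

    def dlt7000_tapes(nights):
        """Stacked DLT7000 count for a given number of survey nights:
        about two nights of raw data per archive tape (written in
        duplicate, one copy for UMASS) plus about three nights of
        processed data per tape."""
        raw_archive = 2 * -(-nights // 2)   # duplicates, ceil(n/2)
        processed = -(-nights // 3)         # ceil(n/3)
        return raw_archive + processed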
- B. Light -
Comparisons of KAMPHOT and PROPHOT
results are underway for the same protocam data processed with the same
thresholds and PSFs. For some reason as yet unknown, PROPHOT has a
higher incidence of reduced chi-square values greater than 2.
A new subroutine named APCOM is being written to combine
multiple-apparition single-frame aperture photometry results into a single
magnitude and uncertainty for each source. This will be used by POSFRM
for Read1 detections and by PROPHOT for Read2-Read1 detections.
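One standard recipe for such a combination, not necessarily the APCOM
algorithm, is an inverse-variance weighted mean of the single-frame
magnitudes (Python sketch; names are illustrative):

    import numpy as np

    def combine_apparitions(mags, sigmas):
        """Combine per-apparition magnitudes and 1-sigma errors into a
        single magnitude and formal uncertainty via inverse-variance
        weighting."""
        m = np.asarray(mags, dtype=float)
        w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
        mag = (w * m).sum() / w.sum()
        return mag, 1.0 / np.sqrt(w.sum())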
- T. Evans -
The MAPCOR SDS is being written, and
work is being done on defining the status flags for point sources. An
updated SIS MAP01 is being prepared.
Three presentations on candidate database systems have taken place, and
the analysis group is currently digesting the details. One more
presentation is planned, after which a decision on which system to
employ will be made.
- H. McCallon -
Much work on low-density scans has
now been done. Two effects are seen in the offsets between Read1 and
Read2-Read1 positions. One is fairly systematic over the duration of a
scan, is probably due to some slowly varying peculiarity in the action
of the scanning secondary, and can probably be modeled adequately with
a linear function of frame number; the other is apparently due to
speckling, is random, and degrades significantly as the seeing gets
worse. An error model for Read1 positions has been worked out in
collaboration with G. Kopan. The test field used was scanned seven
times with considerable variation in seeing. The fact that the Read1
positions are less reliable than hoped indicates that the relative
weighting of Read1 and Read2-Read1 position information will probably
have to shift toward brighter Read2-Read1 detections.
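A minimal sketch of separating the two offset components described
above (Python; the names and the linear model are illustrative of the
approach, not the operational code):

    import numpy as np

    def fit_offset_trend(frame_numbers, offsets):
        """Fit the slowly varying Read1 vs. Read2-Read1 offset as a
        linear function of frame number (the scanning-secondary term);
        the scatter of the residuals then estimates the random,
        seeing-dependent speckle term."""
        f = np.asarray(frame_numbers, dtype=float)
        d = np.asarray(offsets, dtype=float)
        slope, intercept = np.polyfit(f, d, 1)
        residuals = d - (slope * f + intercept)
        return slope, intercept, residuals.std(ddof=2)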
The dependence of position reconstruction on astrometric catalogs was
studied by varying the catalogs and starting conditions. Significant dependence
was found, as expected; the bottom line is that the current results
based on the most reasonable assumptions are at least as good as the
Level 1 Requirements specify, and better results are expected with the
superior astrometric catalog information anticipated to be available
for the survey.