- Hemisphere Classification --
The hemisphere-classification splinter group's decisions were reported by J.
Fowler. As discussed last week, this group was tasked with proposing
procedures for tracking the distinction between the northern and southern
hemispheres for all data affected by that distinction. The group was composed
of J. Fowler, G. Kopan, and J. White. The recommendations, which were
accepted by the members present, were as follows:
(A.) The directory structure will incorporate the hemisphere distinction
at the highest level. Directories named "north" and "south" will have the
date directories underneath them, continuing with a directory structure
similar to that used for protocamera data analysis. Thus, for example,
data taken on observation date 97-06-12 in both hemispheres will be kept
separate by placing the data from the northern-hemisphere observatory in
and under the directory "north/97-06-12" and the data from the
southern-hemisphere observatory in and under the directory
"south/97-06-12".

(B.) TAPELOAD will identify the hemisphere from the contents of the
observatory tape it is processing, create the directory-tree extensions
needed, and set up hemisphere-dependent data files for the 2MAPPS
subsystems in those parts of the tree where the subsystems will look for
them. Thus, for example, POSFRM will obtain the hemisphere-dependent
focal-plane distortion model by reading it from a directory accessed via
a relative offset from its default directory (e.g., "../../distmodel"),
without having to know which hemisphere is actually involved. (A sketch
of this directory logic follows these recommendations.)

(C.) Subsystems that require explicit knowledge of the hemisphere being
processed will obtain it directly from their NAMELIST input, which will
be set up for them by TAPELOAD in the same way that other
hemisphere-dependent data files are set up (i.e., item (B.) above). This
information will also be in the headers of all FITS files, so subsystems
that read such files can obtain the hemisphere information that way if
preferred.

(D.) Cognizant engineers whose subsystems involve hemisphere-dependent
data must identify the parameters involved to J. White and arrange for
the files to be installed in separate-hemisphere data bases for TAPELOAD
to use in setting up and loading the directories for subsequent
processing.
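The following is a minimal sketch of the directory logic in items (A.)
and (B.), in Python rather than actual 2MAPPS code; the function names,
the "/2mapps/data" root, and the POSFRM working-directory name are
illustrative assumptions:

    import os

    def setup_hemisphere_tree(root, hemisphere, obs_date):
        """Create the per-hemisphere date directory, e.g. "north/97-06-12"."""
        tree = os.path.join(root, hemisphere, obs_date)
        os.makedirs(tree, exist_ok=True)
        return tree

    def resolve_distortion_model(subsystem_dir):
        """Find the hemisphere-dependent focal-plane distortion model via a
        relative offset from the subsystem's default directory, so the
        subsystem never needs to know which hemisphere is involved."""
        return os.path.normpath(os.path.join(subsystem_dir, "../../distmodel"))

    # Data from the two observatories on the same observation date are
    # kept separate at the top level of the tree.
    north = setup_hemisphere_tree("/2mapps/data", "north", "97-06-12")
    south = setup_hemisphere_tree("/2mapps/data", "south", "97-06-12")

    # Hypothetical POSFRM default directory under the northern tree:
    posfrm_dir = os.path.join(north, "posfrm")
    print(resolve_distortion_model(posfrm_dir))
    # -> /2mapps/data/north/distmodel (the northern model, located without
    #    POSFRM knowing which hemisphere is being processed)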
- Photometric and Position Precision --
C. Beichman and R. Cutri have requested that precision requirements for
position and photometric information be placed in the FDD. J. Fowler
recommended a way to include these requirements, resulting in the following
paragraph, to be placed at the end of section 2.6 of the FDD, "Subsystem
Coding and Documentation Requirements" (this will appear in version 3.3 of
the FDD).
"To guarantee precision sufficient to support the accuracy requirements
specified in the 2MASS Level 1 Science Objectives and Specifications
document (see section 1.1, item 5), the following precision requirements
are levied on the 2MAPPS subsystems. Position coordinates and their
uncertainties shall be represented in disk files with a precision corre-
sponding to at least 0.02 seconds of arc, and computations involving
position coordinates and their uncertainties shall be done in double
precision as needed to guarantee conservation of this precision. This
implies file formats representing position parameters accurate to three
decimal places when represented in seconds of time, two decimal places
when represented in seconds of arc or camera pixels, and six decimal
places when represented in floating-point degrees. Photometric quantities
and their uncertainties shall be represented with a minimum precision of
0.001 magnitudes, with similar requirements regarding computational pre-
cision needed to guarantee conservation of this precision and correspond-
ing precision when other photometric representations are used."
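A quick sketch of the arithmetic behind the stated file-format precisions
follows (the conversions of 15 arcsec per second of time and 3600 arcsec
per degree are standard; the camera-pixel case is omitted because its
step in arcsec depends on the plate scale):

    # Verify that each format's least significant digit is no coarser
    # than the 0.02-arcsec requirement quoted above.
    ARCSEC_REQUIREMENT = 0.02

    formats = {
        # representation: (arcsec per unit, required decimal places)
        "seconds of time":        (15.0,   3),  # 1 s of time = 15 arcsec
        "seconds of arc":         (1.0,    2),
        "floating-point degrees": (3600.0, 6),  # 1 degree = 3600 arcsec
    }

    for name, (arcsec_per_unit, places) in formats.items():
        step = arcsec_per_unit * 10.0 ** (-places)  # smallest representable step
        print(f"{name}: step = {step:g} arcsec, "
              f"meets 0.02? {step <= ARCSEC_REQUIREMENT}")

Each step (0.015, 0.01, and 0.0036 arcsec, respectively) is finer than
the 0.02-arcsec requirement, confirming the stated decimal places.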
- Survey Strategy Requirements Document --
C. Lonsdale reported that she has prepared a draft version of the 2MASS
Survey Coverage Strategy and Plan document (this is the name used in the FDD,
section 1.1 item 4, where it is noted as TBD; the actual name may turn out to
be different). The draft will be distributed locally for comment by the end of
the week, and a version with revisions incorporated will be sent to the project
at the end of next week.
- Compression Method for Online Images --
T. Jarrett displayed sample coadded images of several fields compressed
to various degrees of file-size reduction by the HST compression
algorithm. Although the decision of which compression algorithm to use for
the online image data base is not final, the HST algorithm is acquiring
the reputation of a preferred standard, and it appears to be the front
runner for use at IPAC. This is a "lossy" compression algorithm, which is
acceptable for the online image data base because of its anticipated use
(i.e., visual field inspection, preparation of finding charts, and the
like, as opposed to detailed photometric analysis, for which the full
coadded images must be reloaded from tape once they have been moved
offline). The algorithm was developed by R. White and has been used for
some distributed HST images. The goal is to achieve a compression factor
of 100, so that the expected image data base size of 4 terabytes can be
reduced to an online data base of 40 gigabytes. The members present were
not quite satisfied with any of the examples at this compression level,
however; the qualitative acceptability threshold appeared to fall between
reduction factors of 20 and 50. Further consideration will be given to
the compression method and level of reduction to be used.
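The storage arithmetic above is summarized by the following sketch (the
4-terabyte figure is from the discussion; 1 terabyte is taken as 1000
gigabytes):

    TOTAL_TB = 4.0  # expected size of the coadded-image data base

    # Online data base size at the compression factors discussed.
    for factor in (20, 50, 100):
        online_gb = TOTAL_TB * 1000.0 / factor
        print(f"compression factor {factor:3d}: {online_gb:5.0f} GB online")

This prints 200, 80, and 40 GB, showing why factors below the desired 100
carry a significant storage penalty.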
- Blended Point-Source Processing --
R. Cutri reported that he is preparing a memo for distribution to the
science team describing the problems in handling blended point sources (see
last week's minutes) and various approaches to their solution. The memo will
also contain a proposal for implementing band-filling.