Help for MARSCAHV

PURPOSE:
To convert images from distorted (CAHVOR/CAHVORE) to linear (CAHV) geometry.

This is a multimission program derived from marsmos.
It supports any mission, instrument, and camera model supported by the
Planetary Image Geometry (Pig) software suite.

The program can accept a navigation file written by marsnav (for example).
This causes the output camera model to include the updated pointing, but in
most cases it has no effect on the output pixels.  Only CAHVORE input images
using a surface model other than the default SPHERE may show pixel
differences due to the presence of a nav file.

Radiometric correction is turned off by default; this may be turned on via 
the RAD keyword parameter.

The program will work with color images if such are given as input and the
BAND parameter is not specified; the number of output bands will equal the
maximum number of bands across all inputs.  If the BAND parameter is
specified, only that single band is processed and output.

EXECUTION:

marscahv inp=input.img out=output.img
where input.img is a VICAR file and output.img is the resulting VICAR
output file.

marscahv inp=left.img out=output.img stereo_partner=right.img
If stereo_partner is not given, the program tries to find the stereo partner
of the input image based on the mission, instrument, and related information
of the input image.  Supplying the stereo_partner parameter unconditionally
overrides that search.
Note that for a generic camera there is no way to tell what the stereo
partner might be, or even whether one exists.  Thus the stereo_partner
parameter is the ONLY way to specify the stereo pair for a generic image.

marscahv inp=sub.img out=output.img fullsize='('1024, 1024')'
The presence of the fullsize input parameter tells marscahv that the input
image is a subframe and that the full frame size is given by "fullsize".
This parameter is only necessary if the camera model specified in the
subframe is for a full-frame image.
Note that for MER this parameter is not necessary, since every subframe has
its own camera model, specific to its size and location within the full frame.


USAGE:
The purpose of marscahv is to remove the geometric distortion inherent in the
camera instruments, i.e. to "linearize" the image data.  This is usually
necessary to facilitate the correlation process, to line up a stereo pair for
viewing, etc.

In order to linearize an image, each 2D image coordinate must be projected
into 3D space using the non-linear model, then projected back into 2D space
using the linear model (and interpolated from there).  For CAHVOR-based
cameras (MER: pancam, navcam, MI), this process creates a linearized
(CAHV-based) image which is mathematically perfect according to the camera
models (with the exception of interpolation noise).  It is perfect because
any point in the CAHVOR image along the projected 3D ray will back-project
to the same point in the CAHV image, since both models project through the
same C point.
For CAHVORE-based cameras (MER: hazcams), however, that is not the case.  The
moving entrance pupil means that the program has to pick a specific point
along the projected ray, and which point is picked makes a difference in
the output.
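
To illustrate why the CAHVOR case is exact, here is a minimal sketch (not
the Pig API) of the standard CAHV projection, x = ((P-C).H)/((P-C).A),
y = ((P-C).V)/((P-C).A).  Any point along the ray is P = C + t*ray, so the
factor t cancels in both ratios and every choice of t lands on the same
output pixel:

  #include <cstdio>

  struct Vec3 { double x, y, z; };

  static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

  /* Standard CAHV projection of 3D point p to image (samp, line). */
  static void cahv_project(Vec3 p, Vec3 c, Vec3 a, Vec3 h, Vec3 v,
                           double *samp, double *line)
  {
      Vec3 d = { p.x - c.x, p.y - c.y, p.z - c.z };
      double range = dot(d, a);          /* distance along the optical axis */
      *samp = dot(d, h) / range;
      *line = dot(d, v) / range;
  }

  int main()
  {
      /* Toy model: C at the origin, A along +Z, arbitrary H and V. */
      Vec3 c = {0, 0, 0}, a = {0, 0, 1}, h = {400, 0, 512}, v = {0, 400, 512};
      Vec3 ray = {0.1, 0.2, 1.0};        /* some view ray through C */
      double ts[] = {1.0, 5.0, 50.0};
      for (int i = 0; i < 3; i++) {
          double t = ts[i], s, l;
          Vec3 p = { c.x + t*ray.x, c.y + t*ray.y, c.z + t*ray.z };
          cahv_project(p, c, a, h, v, &s, &l);
          printf("t=%5.1f -> samp=%.3f line=%.3f\n", t, s, l); /* same for all t */
      }
      return 0;
  }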

The only way to pick that specific point is to use some kind of
"surface model".  For example, a sphere picks the point at some radius
(default is 1 unit = 1 meter) from the camera center.  So, depending on which
surface model the program uses, the linearization process will give different
results, and it will always be an approximation.  The only "exact" solution
would be to model the actual surface.  But the approximation of the actual
surface is itself generated from linearized images produced by marscahv, so
the actual surface is not available at this stage.

For the MER hazcams, the entrance pupil moves only about 1 mm over the usable
range of the cameras, which is not very much.  And the effect decreases toward
the center of the image, becoming zero at the axis.  But, since the ground is
only ~0.5 meters away at the closest point, the "moving entrance pupil" effect
could be noticeable.

Labels will be written to the output image specifying all parameters
needed in order to reproject the image, and to convert pixel coordinates
into XYZ view rays in the output coordinate system.  See ???? for
details on what the label items mean.


OPERATION:
The program uses the appropriate camera model for the input image and
outputs an image using a camera model aligned for stereo viewing.
Each pixel in the output is transformed from the output to the input camera
model in the following steps (a sketch of this loop follows the list):
1. Each output pixel defines a unit vector.
2. We compute the intersection of this vector with a surface model.  This is
   by default a unit sphere (not exactly; see the note below) with the center
   at the unit vector's origin.
Note that in the CAHVORE case the process doesn't actually project the rays
out to a sphere.  It projects them from the ray's origin, which may not
coincide with the location of the C vector.  As a result, we are actually
projecting onto a sort of ovoid-shaped surface, slightly elongated in the
direction the entrance pupil moves.
3. This ground point is then ray traced back into the input camera image.
4. The DN value at that location in the input image is bilinearly
   interpolated and placed into the output location.
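
A minimal sketch of that per-pixel loop is below.  The helper names
(cahv_unproject, surface_intersect, project_to_input, bilinear_dn) and types
are hypothetical stand-ins for the Pig camera-model and surface-model
classes, not actual routine names:

  /* Sketch only: map each output pixel through the surface model into
   * the input image and resample it. */
  for (int line = 0; line < out_nl; line++) {
      for (int samp = 0; samp < out_ns; samp++) {
          Ray ray = cahv_unproject(out_cam, line, samp);      /* 1: pixel -> view ray */
          Vec3 ground = surface_intersect(surface, ray);      /* 2: hit surface model */
          double in_line, in_samp;
          project_to_input(in_cam, ground, &in_line, &in_samp); /* 3: back-project */
          if (in_bounds(in_img, in_line, in_samp))            /* 4: bilinear resample */
              out_img[line][samp] = bilinear_dn(in_img, in_line, in_samp);
      }
  }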

The UNSUB mode will "un-subframe" the image before linearizing it.
When this mode is on, the program looks up the nominal size of the camera
frame (using the camera mapping file), and sets the output to be that size.
It then puts the input in the right position in the output according to the
subframe start (FIRST_LINE and FIRST_LINE_SAMPLE labels for most missions).
From then on, it proceeds as if a full-frame image had been given.  This is
useful for cameras like the MSL Mastcam, where the differing zoom of the two
eyes often causes them to use different subframes.  Removing the subframe
puts both back in the same geometry again, so downstream programs that need
the same image size for both eyes can work.
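
A minimal sketch of the un-subframing step, assuming the FIRST_LINE and
FIRST_LINE_SAMPLE labels are 1-based offsets (full_img here is a zero-filled
buffer of the nominal camera size, and all names are placeholders):

  /* Copy the subframe to its original position in the nominal frame;
   * linearization then proceeds on full_img as if it were a full frame. */
  int line_off = first_line - 1;          /* 1-based label -> 0-based index */
  int samp_off = first_line_sample - 1;
  for (int l = 0; l < sub_nl; l++)
      for (int s = 0; s < sub_ns; s++)
          full_img[l + line_off][s + samp_off] = sub_img[l][s];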

Linearization Modes
-------------------

There are now multiple linearization modes, controlled by "cmod_warp" in
the POINT_METHOD parameter.  For example, point=cmod_warp=2 will set mode 2.

Mode 1:  This is the traditional method, used throughout most of MER and MSL
ops, and is still the default for those missions.  It assumes images come
from traditional stereo cameras, so the baseline is approximately parallel
to the image rows.  This works fine for most stereo cameras but does not work
well for odd cases such as long baseline, vertical stereo, repointed stereo,
or cross-instrument coregistration.

Mode 2:  This is a newer method, and is the default for InSight and Mars 2020.
It does not assume the rows match the baseline, so it will handle many more
cases, including long baseline, vertical stereo, repointed stereo, and
cross-instrument coregistration.  It can be used with MER and MSL by setting
cmod_warp=2 explicitly.

Mode 3 or PSPH:  This linearizes to a new model type developed by Todd Litwin,
called Planospheric or PSPH.  It is *not* a CAHV model; it is something
entirely different.  It is designed to have better performance at the center
of fisheye cameras (such as the hazcams).  Although this is fully supported
in Pig, it is not supported (as of this writing) in marsjplstereo, so getting
a first-stage correlation may be challenging.

Modes 2 and PSPH fully support input images of different sizes, such as
different subframes or different instruments, without the use of -unsub.
In order to do this, you'll need to specify an output size, so that both runs
of marscahv (making the left and right images) end up with the same size.  This
can be done via OUTSIZE, but can also be done via the MINSIZE or MAXSIZE
keywords, which set the output size to the minimum or maximum dimensions of
the two inputs.  This only applies if stereo_partner is provided.

One other note regards "cahv_fov".  This POINT_METHOD parameter selects
the field of view of the output: either max, min, or linear.  The min setting
sets the output FOV to the intersection of the inputs... at infinity.  So if
you are looking at infinity, or have no appreciable toe-in, cahv_fov=min will
give you the best image scale, with nothing but overlap.  However, if you have
significant toe-in and are not looking at infinity, it's possible that
cahv_fov=min may end up with no overlap.  For that reason, cahv_fov=max is
strongly recommended for non-traditional stereo, or for toed-in cameras (such
as the MSL Mastcam).

In summary, the following will give the maximum likelihood of success for
non-traditional stereo images:

$MARSLIB/marscahv left_in left_out stereo_p=right_in
				-max point='"cahv_fov=max,cmod_warp=2"'

Parallel Processing
-------------------
This program has been parallelized using OpenMP (OMP), which is built into
the g++ compiler.

By default the number of threads used equals the number of cores on the machine
where the program is being run.  Each image line is assigned to a different
core, with "dynamic" scheduling to keep the workload for each core similar.

Parallel processing can be disabled via the -OMP_OFF keyword.  The number
of threads can be controlled by setting the OMP_NUM_THREADS environment
variable before running the program.  There are numerous other OMP variables
that can be set; see the OMP documentation.  However, the number of threads
is the only one that is likely to be useful in most cases.
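
A minimal sketch of this scheme (not the actual marscahv source; process_line
is a placeholder for the per-pixel work shown under OPERATION):

  /* Compile with: g++ -fopenmp ...
   * Each output line is one unit of work; "dynamic" scheduling hands lines
   * to threads as they finish, balancing uneven per-line costs. */
  #pragma omp parallel for schedule(dynamic)
  for (int line = 0; line < out_nl; line++) {
      process_line(line);
  }

For example, setting OMP_NUM_THREADS=4 in the environment before running
limits the run to four threads, while the -OMP_OFF keyword disables the
parallelism entirely.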


HISTORY:
1994-04-30 jjl	Initial mpfmos, mpfcahv by J Lorre. 
1998-08    rgd	Multimission conversion of mpfmos to marsmos by B. Deen
2002-09    ozp	Adaptation to marscahv by O. Pariser
2012-09-03 rgd	Added -unsub capability
2016-05    rgd  Parallelization of code
2017-05    rgd	Added new linearization modes
2018-10    sl   Added color processing capability   
2020-02-18 wlb  IDS-7927 - replaced sprintf() calls with snprintf() calls; added unit test and log

COGNIZANT PROGRAMMER:  B. Deen


PARAMETERS:


INP

Input image.

OUT

Output image.

NAVTABLE

Corrected navigation filename.

STEREO_PARTNER

Left/right partner of input image.

FULLSIZE

Full frame size of input image.

OUTSIZE

Overrides size of output image.

AUTOSIZE

Sets output size to min or max of inputs.

OUTOFF

Overrides x/y offset values for output.

BAND

The BSQ band number to process; omit to process all bands.

NORMAL

Surface normal vector.

GROUND

Surface ground point.

RADIUS

Radius of a surface sphere.

SURFACE

The type of Mars surface model to use: INFINITY, PLANE, SPHERE, or MESH.

SURF_MESH

Mesh file for surface model.

SURF_CSFILE

File containing the coordinate system (CS) for the surface model.

RAD

Turns on or off radiometric correction.

DNSCALE

DN scaling factor.

CONFIG_PATH

Path used to find configuration/calibration files.

MATCH_METHOD

Specifies a method for pointing corrections.

MATCH_TOL

Tolerance value for matching pointing params in pointing corrections file.

POINT_METHOD

Specifies a mission-specific pointing method to use.

NOSITE

Disables coordinate system sites.

INTERP

Turns on or off the interpolation.

UNSUB

Turns on or off the unsub mode, which converts subframes to full frames.

OMP_ON

Turns on or off parallel processing (default: on)

DATA_SET_NAME

Specifies the full name given to a data set or a data product.

DATA_SET_ID

Specifies a unique alphanumeric identifier for a data set or data product.

RELEASE_ID

Specifies the unique identifier associated with the release to the public of all or part of a data set. The release number is associated with the data set, not the mission.

PRODUCT_ID

Specifies a permanent, unique identifier assigned to a data product by its producer.

PRODUCER_ID

Specifies the unique identifier of an entity associated with the production of a data set.

PRODUCER_INST

Specifies the full name of the entity associated with the production of a data set.

TARGET_NAME

Specifies a target.

TARGET_TYPE

Specifies the type of a named target.

RSF

Rover State File(s) to use.

DEBUG_RSF

Turns on debugging of RSF parameter.

COORD

Coordinate system to use

COORD_INDEX

Coordinate system index for some COORD/mission combos.

FIXED_SITE

Which site is FIXED for rover missions.

SOLUTION_ID

Solution ID to use for pointing correction.
