Special thanks to HP Labs, Kodak, ViA, and Kopin
Strictly speaking, the quantity actually measured in early systems was that of a single homodyne channel, which only approximated energy. In some later systems, energy was measured properly, with separate I and Q channels.
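The distinction can be illustrated numerically: the I/Q energy estimate I^2 + Q^2 is independent of the phase of the return, whereas a single homodyne channel measures only one quadrature, so its output varies with a phase it cannot observe. A minimal sketch (the amplitudes and phases here are hypothetical, not from any of the systems described):

```python
import math

def iq_energy(amplitude, phase):
    """Energy estimate from separate I and Q channels:
    I^2 + Q^2 does not depend on the return's phase."""
    i = amplitude * math.cos(phase)
    q = amplitude * math.sin(phase)
    return i * i + q * q

def homodyne_energy(amplitude, phase):
    """A single homodyne channel sees only one quadrature, so its
    'energy' estimate depends on the (unknown) phase of the return."""
    i = amplitude * math.cos(phase)
    return i * i

# The I/Q estimate stays constant as phase varies;
# the single-channel estimate swings between 0 and amplitude^2.
for phase in (0.0, math.pi / 4, math.pi / 2):
    print(iq_energy(1.0, phase), homodyne_energy(1.0, phase))
```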

An apparatus that embodies a user-interface of sufficient capability is said to meet the `Existential Criterion' [1].

An apparatus that embodies a user-interface of sufficient and integral physical proximity is said to meet the `Eudaemonic Criterion' [1].

An apparatus having both operational constancy and interactional constancy is said to meet the `Ephemeral Criterion' [1].
The term `WearComp' denotes a class of wearable, tetherless, computational apparatus with visual display means [10] that meets the three criteria above.

For a detailed historical account of the WearComp project, and other related projects, see [14][15].

The first wearable computers equipped with multichannel biosensors were built by the author during the 1980s, inspired by a collaboration with Dr. Ghista of McMaster University. More recently, in 1995, the author put together an improved apparatus based on a Compaq Contura Aero 486/33 with a ProComp 8-channel analog-to-digital converter, worn in a Mountainsmith waist bag, with sensors from Thought Technologies Limited. The author subsequently assisted Healey in duplicating this system for use in trying to understand human emotions [19].

Perhaps it may stop, or ``skip a beat,'' at first, but experience tells us that, on average, in the time following the event, our hearts beat faster when frightened.
This `lightspace' theory was first written up in detail in 1992 [24].
first modeling the motion as a projective coordinate transformation, and then estimating the residual epipolar structure or the like [33][34][35][36][37][38].
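A projective coordinate transformation of the image plane can be written as a 3x3 matrix (a homography) acting on homogeneous coordinates. The following sketch only illustrates how such a transformation maps points once it is known; the matrix here is a made-up example, not an estimate from real frames, and the estimation step itself is what the cited methods address:

```python
import numpy as np

# Hypothetical 3x3 homography relating two frames; in practice it
# would be estimated from the image data, not written down by hand.
H = np.array([[1.02, 0.01, 3.0],
              [-0.01, 0.99, -2.0],
              [1e-4, 2e-5, 1.0]])

def apply_homography(H, points):
    """Map Nx2 image points through a projective coordinate
    transformation, dividing out the homogeneous coordinate."""
    pts_h = np.hstack([points, np.ones((len(points), 1))])  # to homogeneous
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]                   # back to 2-D

# Where the four corners of a 640x480 frame land under this motion model:
corners = np.array([[0.0, 0.0], [640.0, 0.0], [640.0, 480.0], [0.0, 480.0]])
print(apply_homography(H, corners))
```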

The additional one-time download of a lens distortion map into WearComp's coordinate transformation hardware eliminates its lens distortion, which would otherwise be very large, owing to the covert, and therefore small, size of the lens, and to the engineering compromises necessary in its design. The Campbell method [42] is used to estimate the lens distortion for this one-time coordinate transformation map.
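The idea of a one-time coordinate transformation map can be sketched as a precomputed per-pixel lookup table that is then reused on every frame. The simple single-coefficient radial model below is only an illustrative stand-in; the paper's actual distortion estimate comes from the Campbell method [42], and the parameters here are hypothetical:

```python
import numpy as np

def build_undistortion_map(width, height, k1, cx, cy):
    """One-time per-pixel coordinate map for a toy radial distortion
    model x_d = x_u * (1 + k1 * r^2).  Illustrative only; the real
    map would come from an estimated lens model."""
    ys, xs = np.mgrid[0:height, 0:width].astype(float)
    x, y = xs - cx, ys - cy
    r2 = x * x + y * y
    # Where each corrected pixel should be fetched from in the raw image.
    map_x = cx + x * (1 + k1 * r2)
    map_y = cy + y * (1 + k1 * r2)
    return map_x, map_y

def remap(image, map_x, map_y):
    """Apply the precomputed map with nearest-neighbour lookup."""
    h, w = image.shape
    xi = np.clip(np.round(map_x).astype(int), 0, w - 1)
    yi = np.clip(np.round(map_y).astype(int), 0, h - 1)
    return image[yi, xi]

# Build the map once, then apply it to each incoming frame.
map_x, map_y = build_undistortion_map(64, 48, k1=1e-5, cx=32, cy=24)
frame = np.random.rand(48, 64)
corrected = remap(frame, map_x, map_y)
```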

Published in detail in [44].
also known as a group action or G-set [49].
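As a reminder of the standard definition being invoked here, a (left) group action of a group G on a set S is a map satisfying:

```latex
\begin{aligned}
&\cdot \,:\, G \times S \to S, \\
&e \cdot s = s \quad \text{for all } s \in S, \\
&(gh) \cdot s = g \cdot (h \cdot s) \quad \text{for all } g, h \in G,\ s \in S.
\end{aligned}
```

A set S equipped with such an action is a G-set; in the present context G is the group of coordinate transformations and S is the set of images on which they act.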
The resulting collection of images may be characterized by fewer parameters, through application of the Hartley constraint [50][27] to an estimate of the projective coordinate transformations.

For simplicity, all these methods of automatic exposure control are referred to as AGC in this paper, whether or not they are actually implemented using an Automatic Gain Control (AGC) circuit.
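Whatever the implementation, the behavior being lumped under "AGC" is a feedback loop that nudges gain (or exposure) until the measured level reaches a target. A minimal sketch, with made-up target and update-rate values standing in for whatever a particular camera actually uses:

```python
def agc_step(gain, measured_level, target_level, rate=0.1):
    """One update of a toy automatic gain control loop: multiply the
    gain by a fraction of the ratio between target and measured level,
    so the measured level converges geometrically to the target."""
    error = target_level / max(measured_level, 1e-9)
    return gain * error ** rate

# A dim scene: the loop raises the gain until the output hits the target.
gain = 1.0
scene_brightness = 0.2          # hypothetical fixed scene level
for _ in range(100):
    measured = gain * scene_brightness
    gain = agc_step(gain, measured, target_level=0.5)
print(gain * scene_brightness)  # settles near the target of 0.5
```

The fractional exponent `rate` plays the role of the loop's time constant: smaller values track more slowly but react less violently to sudden changes in scene brightness.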

To be strictly mathematically correct, the projective group is written (ax+b)/(cx+d), but d is nonzero in practical engineering problems (physically ``reasonable'' camera motion), so we may divide by d.
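Written out, the normalization this footnote describes is, for nonzero d:

```latex
p(x) \;=\; \frac{ax+b}{cx+d}
     \;=\; \frac{(a/d)\,x + (b/d)}{(c/d)\,x + 1}
     \;=\; \frac{a'x + b'}{c'x + 1},
\qquad a' = \tfrac{a}{d},\quad b' = \tfrac{b}{d},\quad c' = \tfrac{c}{d},
```

so the one-dimensional projective coordinate transformation is described by three parameters rather than four.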
This approximate model is used in the feedforward loop, while the exact nonparametric model is used in the feedback loop, as described in detail in [44][15].

In some embodiments of WearComp, only a portion of the visual field is mediated in this way. Such an experience is referred to as `partially mediated reality' [12].

Synesthesia [66] is manifest as the crossing of sensory modalities, as, for example, the ability (or, as some might call it, the disability) to taste shapes, see sound, etc.

The chirplet transform [67] characterizes the acceleration signature of Doppler returns, so that objects can be prioritized; e.g., those accelerating faster toward the wearer can be given higher priority, predicting imminent collision, etc.
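The underlying operation can be sketched as matched filtering against a bank of chirps: a Doppler return from an accelerating object has a linearly sweeping frequency, and the chirp rate that best correlates with the return indicates the acceleration. This toy version (synthetic echo, hand-picked rate grid, no noise) is in the spirit of the chirplet transform [67], not a reproduction of it:

```python
import numpy as np

def estimate_chirp_rate(signal, fs, rates):
    """Correlate the signal against unit chirplet templates
    exp(j*pi*rate*t^2) and return the best-matching chirp rate."""
    t = np.arange(len(signal)) / fs
    scores = [abs(np.vdot(np.exp(1j * np.pi * r * t * t), signal))
              for r in rates]
    return rates[int(np.argmax(scores))]

fs = 1000.0
t = np.arange(1000) / fs
# Synthetic Doppler return whose frequency sweeps upward at 80 Hz/s,
# as from an object accelerating toward the antenna.
echo = np.exp(1j * np.pi * 80.0 * t * t)
rates = np.arange(0.0, 200.0, 10.0)   # candidate chirp rates, Hz/s
print(estimate_chirp_rate(echo, fs, rates))   # -> 80.0
```

In a prioritization scheme like the one described, each tracked return would get such a chirp-rate estimate, and the returns would then be ranked by it.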


Steve Mann
Thu Jan 8 04:34:17 EST 1998