
FUSION: MORE THAN MULTISENSOR INTEGRATION


In the early 1990s I recalled, in a manuscript (publication #49 – click here), advocacy from years earlier that probably originated within the USAF: a sharp distinction was to be drawn between "multisensor integration" for low-speed, low-volume processing versus "sensor fusion" for high-speed, high-volume processing.  Unfortunately, the separate terminology never survived; the industry uses the same vocabulary for nav-update "fusion" at leisurely rates and for fusion of images (e.g., megapixels/frame of 3-byte RGB pixels at 30 frames/sec) requiring speeds expressed in GHz.
Terminology aside, a major task for the imaging field is to recognize and categorize the degradations. Combining tracks from different sensors is a major undertaking.  For obvious reasons (including, but by no means limited to, inertial nav error propagation and time-varying errors in imaging sensors), the complexity is compounded by motion.  I'll step back and consider that from a perspective unlike concepts in common acceptance.  For brevity I'll just cite some major ingredients here:


* Inertial nav is used to provide a connection between frames taken in succession from sensors in motion. INS error propagation over these short durations cannot correctly be based on antiquated nmi/hr modeling. There are literally dozens of gyro and accelerometer error contributors, most of which are motion-sensitive and often excluded from IMU specifications.  Overbounding is thus necessary for conservative design, in effect either compromising performance or increasing data-rate demand. For discussion of those error sources see the relevant sections of Chapter 4 in the book whose link appears in the next item.
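To make the short-duration point concrete, here is a minimal sketch of how per-frame position-error growth might be bounded. The random-walk model, the 50 micro-g/√Hz noise figure, and the 4x variance inflation factor are all illustrative assumptions of mine, not values from the text; the point is only that overbounding the noise spectral density directly inflates the conservative drift estimate.

```python
import numpy as np

# Hypothetical sketch: INS position-error growth over one image-frame
# interval, with accelerometer noise overbounded to cover the many
# unmodeled, motion-sensitive error contributors.  Numbers are illustrative.

def position_sigma(t, accel_noise_psd):
    """1-sigma position error (m) after time t (s) from white accelerometer
    noise with power spectral density accel_noise_psd (m^2/s^3).
    Variance of doubly integrated white noise grows as PSD * t^3 / 3."""
    return np.sqrt(accel_noise_psd * t**3 / 3.0)

frame_dt = 1.0 / 30.0              # 30 frames/sec
spec_psd = (50e-6 * 9.81) ** 2     # assumed 50 micro-g/sqrt(Hz) accelerometer
overbound = 4.0                    # assumed variance inflation for excluded terms

nominal = position_sigma(frame_dt, spec_psd)
conservative = position_sigma(frame_dt, overbound * spec_psd)
print(f"per-frame drift: nominal {nominal*1e6:.2f} um, "
      f"overbounded {conservative*1e6:.2f} um")
```

Doubling the 1-sigma bound (a 4x variance overbound) is the performance/data-rate trade the bullet describes: either accept the looser bound or update more often.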


* It is very important not to attribute sensor stabilization errors to any tracked object's estimated state (surprisingly, many operational designs violate that principle). Figure 9.3 on page 200 of my 2007 book shows a planar example for insight.

 

* Often not only the sensors but also the tracked objects are in motion; the masking of their signatures -- already a potential issue even when they are stationary -- is further aggravated when the motion is driven to thwart observation.


* Procedures for in-flight calibration, self-calibration, online temperature compensation, etc., are invoked. All measurement errors are then redefined to include only the residual amounts due to imperfect calibration.  One caveat: any large discrete corrections should occur between -- not within -- frames (e.g., a SAR coherent integration time).
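The between-frames caveat can be sketched as a deferral rule: corrections computed mid-frame are queued and applied only at frame boundaries, so nothing steps the measurement model inside a coherent integration interval. The scheduler class and correction value below are purely illustrative, not from any operational design.

```python
# Hypothetical sketch: a large discrete calibration correction is queued
# while a frame (e.g., a SAR coherent integration interval) is underway,
# and applied only at the frame boundary -- never mid-frame.

class CalibrationScheduler:
    def __init__(self):
        self.pending = 0.0   # correction computed mid-frame, not yet applied
        self.applied = 0.0   # correction currently in effect

    def request(self, correction):
        self.pending += correction      # defer; do not apply within the frame

    def frame_boundary(self):
        self.applied += self.pending    # the step occurs between frames
        self.pending = 0.0

sched = CalibrationScheduler()
sched.request(0.25)            # computed while a frame is still integrating
mid_frame = sched.applied      # unchanged mid-frame
sched.frame_boundary()
print(mid_frame, sched.applied)
```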


* The association problem, producing hypothetical tracks (e.g., due to crossing paths), defies perfect solution. Thus a sensor response from object `A' might be combined with a subsequent response from object `B' to produce an extraneous track characterizing neither. Obviously this becomes more unwieldy with increasing density of objects responding to the sensors.
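A minimal sketch of why crossing paths defy perfect solution: with greedy nearest-neighbor assignment inside a gate, two nearly coincident predictions can each capture the other object's detection. The function, coordinates, and gate size below are my illustrative assumptions, not a recommended association algorithm.

```python
import numpy as np

# Hypothetical sketch of the association step: assign each new detection to
# the nearest predicted track position within a gate.  With crossing objects,
# nothing prevents a detection from object B being assigned to the track
# started on object A, producing the extraneous track described above.

def associate(predictions, detections, gate):
    """Greedy nearest-neighbor association.
    predictions: dict track_id -> predicted (x, y)
    detections:  list of (x, y) measurements
    Returns dict track_id -> detection index (None if nothing gated)."""
    assignments, used = {}, set()
    for tid, pred in predictions.items():
        best, best_d = None, gate
        for j, det in enumerate(detections):
            if j in used:
                continue
            d = np.hypot(det[0] - pred[0], det[1] - pred[1])
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            used.add(best)
        assignments[tid] = best
    return assignments

# Two crossing objects: predictions nearly coincide, so either detection
# can gate to either track -- the ambiguity at the heart of the problem.
preds = {"A": (10.0, 10.0), "B": (10.5, 10.2)}
dets = [(10.4, 10.1), (10.1, 9.9)]
print(associate(preds, dets, gate=2.0))
```

With dense objects the number of such gating conflicts grows combinatorially, which is why the bullet calls the problem unwieldy.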


* Ironically, many of the objects that complicate the association task are of little or no interest. Rocks reflect radar transmissions. Animals respond to IR sensors. Metallic objects respond to both, which raises an opportunity to concentrate on metallic objects: accept information only from pixels with both radar and IR responses. Tracks formed after that screening will be far fewer and much more credible.
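The dual-sensor screen amounts to a logical AND over co-registered detection maps. A minimal sketch, assuming already-registered same-size images and illustrative thresholds:

```python
import numpy as np

# Hypothetical sketch of the screening described above: keep only pixels
# where BOTH the radar and IR images exceed their detection thresholds,
# so later track formation sees far fewer candidate objects.
# Image contents and thresholds are illustrative stand-ins.

rng = np.random.default_rng(0)
radar = rng.random((4, 4))    # stand-in for a co-registered radar image
ir = rng.random((4, 4))       # stand-in for a co-registered IR image

radar_hit = radar > 0.7       # per-sensor detection maps
ir_hit = ir > 0.7
both = radar_hit & ir_hit     # accept only pixels responding to both

print("radar detections:", int(radar_hit.sum()))
print("ir detections:   ", int(ir_hit.sum()))
print("joint detections:", int(both.sum()))
```

By construction the joint map can never contain more detections than either single-sensor map, which is the source of the "far fewer and much more credible" claim.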


* Registration of image data from IR (Az/El) and SAR (range/Doppler) cells must account for large differences in pixel size, shape, and orientation.  Although by no means trivial, in principle it can be done.


* Even if all algorithm development and processing implementation issues are solved, unknown terrain slopes will degrade the results. Also, undulations (as well as any structures present) in a swath will produce data gaps due to masking. How long a gap is tolerable before dropping an old track and reallocating its resources to a new one is another design decision.
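That gap-tolerance decision reduces to a simple coast-then-drop rule. The sketch below uses an illustrative limit of five missed frames; the actual tolerance is exactly the design decision the bullet leaves open.

```python
# Hypothetical sketch of the masking-gap decision: coast a track through
# missed frames, but drop it (freeing its resources for a new track) once
# the gap exceeds a chosen tolerance.  MAX_COAST_FRAMES is illustrative.

MAX_COAST_FRAMES = 5

def update_track(track, detected):
    """track: dict with a 'missed' frame count and an 'alive' flag."""
    if detected:
        track["missed"] = 0
    else:
        track["missed"] += 1
        if track["missed"] > MAX_COAST_FRAMES:
            track["alive"] = False   # drop; reallocate to a new track
    return track

track = {"missed": 0, "alive": True}
for hit in [True, False, False, False, False, False, False]:
    track = update_track(track, hit)
print(track)   # six consecutive misses exceed the tolerance
```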


* Imaging-transformation applicability (e.g., a 4x4 affine transform and/or a thin-plate spline) will depend on operational specifics.
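For readers unfamiliar with the first of those options, here is a minimal sketch of a 4x4 affine transform in homogeneous coordinates (rotation plus translation applied to a 3-D point). The rotation angle and translation are arbitrary illustrative values; a thin-plate spline would be chosen instead where local warping must be accommodated.

```python
import numpy as np

# Hypothetical sketch: build a 4x4 homogeneous (affine) transform from a
# 3x3 rotation and a 3-vector translation, and apply it to a 3-D point.

def affine_4x4(rotation, translation):
    """Assemble a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

theta = np.radians(30.0)                       # illustrative rotation
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
T = affine_4x4(Rz, translation=[1.0, 2.0, 0.0])

point = np.array([1.0, 0.0, 0.0, 1.0])         # homogeneous 3-D point
print(T @ point)
```

The appeal of the 4x4 form is that rotation, scaling, shear, and translation compose by plain matrix multiplication, which suits rigid or globally linear registration; it cannot represent the local distortions a thin-plate spline handles.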


When I was more heavily involved in this area, the processing requirements for fusing images while still in raster form would have been prohibitive.  Today's capabilities are far more advanced.

 

By James Farrell 09 May, 2023