Fusing ROV-based photogrammetric underwater imagery with multibeam soundings for reconstructing wrecks in turbid waters

Moreover, sounding measurement uncertainties not exceeding the IHO specification for Special Order Surveys (IHO 2008) provide a geometrically realistic representation of these objects. Unfortunately, the range and depression angle between the MBES and the object inevitably affect the sounding measurement accuracy and, more importantly in this use case, the ability to ensonify all parts of the object, especially for complex objects such as wrecks. A reliable assessment of the state of wrecks based solely on sounding measurements is therefore questionable. Professional divers and camera-equipped ROVs provide the necessary close-range in-situ inspection. However, due to poor visibility conditions, this inspection is highly localised. A thorough assessment of the state of large objects (e.g. wrecks) is necessarily the synthesis of many such localised inspections, which is inherently subjective and error-prone.
Series of close-range underwater images collected by a camera-equipped ROV over wrecks, whilst still localised, are much easier to amalgamate into a portrayal of the whole structure of a wreck. Cameras are portable and small and are thus easily mounted on ROVs or attached to divers. Data in occluded areas is collected by simply circling the object. Photogrammetric analysis carried out on these images results in geometrically correct three-dimensional point clouds characterised by high spatial and temporal resolution as well as colour information (Luhmann et al. 2020). When the imagery-based 3D point cloud is co-registered to the MBES soundings, the result is a fused data set characterised by higher data density and additional attribution (i.e. colour information). Moreover, targeting the ROV images at the areas with low sounding densities (i.e. occluded areas) allows for an improved assessment of the state of a wreck. Finally, being fixed to a common terrestrial reference system, the fused data set is easily transferable to the spatial data infrastructure of maritime administrations.
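The co-registration step mentioned above can be sketched in a few lines of code. The snippet below is only a minimal, hypothetical example of refining the alignment of an image-derived point cloud to MBES soundings with point-to-point ICP using the open-source Open3D library; it is not the processing chain used here, and the file names, voxel size and correspondence distance are assumed values.

```python
import numpy as np
import open3d as o3d

# Hypothetical input files: photogrammetric point cloud and MBES soundings,
# both already expressed in (approximately) the same terrestrial reference frame.
photo_cloud = o3d.io.read_point_cloud("wreck_photogrammetry.ply")
mbes_cloud = o3d.io.read_point_cloud("wreck_mbes.ply")

# Downsample to a common resolution to keep ICP stable and fast (assumed 0.10 m voxels).
photo_ds = photo_cloud.voxel_down_sample(voxel_size=0.10)
mbes_ds = mbes_cloud.voxel_down_sample(voxel_size=0.10)

# Refine the alignment with point-to-point ICP; the initial guess is the identity
# because both clouds are assumed to be roughly georeferenced already.
result = o3d.pipelines.registration.registration_icp(
    photo_ds, mbes_ds,
    max_correspondence_distance=0.5,  # assumed search radius in metres
    init=np.eye(4),
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())

# Apply the estimated rigid transformation and merge the clouds into one fused data set.
fused = photo_cloud.transform(result.transformation) + mbes_cloud
o3d.io.write_point_cloud("wreck_fused.ply", fused)
```

In practice, the photogrammetric cloud would first be scaled and roughly georeferenced, e.g. via control points or the ROV navigation solution, before such a refinement step.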
Fusing MBES data with information generated from cameras is thus highly desirable. However, underwater imagery suffers from many degrading and altering effects. These include multimedia effects: light travels through air, glass and water and is thus, according to Snell's law, refracted twice at the interfaces. This, by definition, renders the pinhole model invalid if no constructional corrections are employed. Strict modelling of the ray path has been developed, e.g. by Kotowski (1988), Maas (1995) and Jordt-Sedlazeck and Koch (2012). Several authors, on the other hand, found that when the camera is positioned close to a flat glass interface and oriented perpendicularly to it, refraction effects can be compensated by standard lens correction functions, as in Brown (1971), and strict modelling is only decisive in applications where the highest accuracy is demanded (Kotowski 1988; Przybilla et al. 1990; Shortis 2015; Kahmen et al. 2019).
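As a purely illustrative aside, the double refraction described above can be written out directly from Snell's law. The sketch below computes how a ray is bent at the air–glass and glass–water interfaces of a flat port; the refractive indices and the incidence angle are assumed values, not measurements from this survey.

```python
import math

# Assumed refractive indices for air, port glass and sea water.
N_AIR, N_GLASS, N_WATER = 1.000, 1.490, 1.340

def refract(theta_in_deg: float, n_in: float, n_out: float) -> float:
    """Refraction angle in degrees from Snell's law: n_in*sin(t_in) = n_out*sin(t_out)."""
    sin_out = n_in / n_out * math.sin(math.radians(theta_in_deg))
    return math.degrees(math.asin(sin_out))

# A ray leaving the lens at an assumed 20 deg off the optical axis is refracted
# twice: once entering the flat glass port and once leaving it into the water.
theta_air = 20.0
theta_glass = refract(theta_air, N_AIR, N_GLASS)
theta_water = refract(theta_glass, N_GLASS, N_WATER)

print(f"air: {theta_air:.2f} deg -> glass: {theta_glass:.2f} deg -> water: {theta_water:.2f} deg")
# The ray is bent towards the interface normal, which is why the simple
# pinhole model no longer holds without additional correction terms.
```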
Furthermore, the entrance pupil of a camera lens can be aligned with the centre of a hemispherical dome port. This accounts for refraction-induced image degradation, and possible residual errors are compensated by standard lens correction functions (Menna et al. 2016). In addition, optical degradation from wavelength-dependent light absorption, chromatic aberration or dispersion reduces image quality. This results in images with low contrast, colour cast, blur and haze (Wang et al. 2019). To account for these, several image enhancement and restoration algorithms have been developed over the years. These either take the actual image formation model into account (e.g. Akkaynak and Treibitz 2019) or employ appropriate image processing tools, such as histogram stretching, white balance shifts or gamma stretching, to increase contrast,
decrease colour cast, etc. In Bianco et al. (2015), the LAB method is introduced: using a grey-world assumption, the chromatic components of the LAB colour space are shifted towards the white point, and the luminance component is enhanced by histogram stretching with a cut-off. The method thus belongs to the latter kind of algorithms. Mangeruga et al. (2018) compared five state-of-the-art image enhancement algorithms for underwater photogrammetry and provided a metric for benchmarking them. They concluded that, for 3D reconstruction purposes, images enhanced with the LAB algorithm or the original images perform best on their data sets.
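To make the grey-world idea more concrete, the following sketch shows one possible, simplified implementation with OpenCV and NumPy: the chromatic a and b channels are shifted towards the neutral point and the luminance channel is stretched between percentile cut-offs. It illustrates the general principle only and is not the exact algorithm of Bianco et al. (2015); the percentile value is an assumption.

```python
import cv2
import numpy as np

def enhance_lab_greyworld(bgr: np.ndarray, cut: float = 1.0) -> np.ndarray:
    """Simplified LAB-based enhancement: grey-world chromatic shift plus luminance stretch."""
    lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    L, a, b = cv2.split(lab)

    # Grey-world assumption: the average chromaticity should be neutral (128 in 8-bit LAB),
    # so shift the a and b channels so that their means land on the white point.
    a -= a.mean() - 128.0
    b -= b.mean() - 128.0

    # Stretch the luminance channel between the lower and upper percentiles ("cut-off").
    lo, hi = np.percentile(L, (cut, 100.0 - cut))
    L = np.clip((L - lo) / max(hi - lo, 1e-6) * 255.0, 0, 255)

    out = np.dstack([L, np.clip(a, 0, 255), np.clip(b, 0, 255)]).astype(np.uint8)
    return cv2.cvtColor(out, cv2.COLOR_LAB2BGR)

# Hypothetical usage on a single ROV frame:
# frame = cv2.imread("rov_frame_0001.png")
# cv2.imwrite("rov_frame_0001_enhanced.png", enhance_lab_greyworld(frame))
```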
As photogrammetry cannot provide absolute positioning, further sensor data has to be combined with the imagery to obtain a georeferenced position and point cloud. Furthermore, imagery can be used by online algorithms that solve positioning and mapping of the environment in real time (simultaneous localisation and mapping, SLAM). These algorithms suffer from drift, as they can only take a limited number of data points and positions into account in order not to overflow memory and to keep computational complexity low. Hence, the solution drifts with time and distance travelled, requiring additional information for applications demanding high accuracy (Durrant-Whyte and Bailey 2006). Originating from the robotics community, SLAM is a broad research field, and several methods using various kinds of sensors have been proposed in recent years. State-of-the-art algorithms are proposed by Mur-Artal and Tardós (2017) and Engel et al. (2015), creating sparse and semi-dense point clouds, respectively. Provided pre-calibrated cameras, these algorithms are capable of computing point clouds and provide localisation within them in real time, depending on the image resolution. Furthermore, they automatically identify revisited areas and compute so-called loop closures, i.e. they create consistency between non-sequential parts of the data.
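To illustrate why such loop closures matter, the toy example below simulates drift in dead reckoning and the simplest conceivable correction once a revisited place is recognised. It is a deliberately naive sketch with assumed numbers and is not a description of the systems cited above.

```python
import numpy as np

# Minimal, purely illustrative 2D example: a vehicle drives a closed square loop,
# but every odometry step carries a small heading bias, so the dead-reckoned
# trajectory no longer closes on itself.
steps_per_side = 10
step_length = 1.0
heading_bias = np.deg2rad(1.0)               # assumed 1 deg/step gyro drift

headings = np.repeat([0.0, np.pi / 2, np.pi, 3 * np.pi / 2], steps_per_side)
headings = headings + heading_bias * np.arange(headings.size)
odometry = step_length * np.stack([np.cos(headings), np.sin(headings)], axis=1)

poses = np.vstack([[0.0, 0.0], np.cumsum(odometry, axis=0)])   # dead reckoning
closing_error = poses[-1] - poses[0]          # a loop closure says this should be zero
print("drift before loop closure:", closing_error)

# Simplest possible correction: spread the closing error linearly along the
# trajectory. Real SLAM systems instead optimise a full pose graph in which
# the loop closure is one additional constraint.
weights = np.linspace(0.0, 1.0, poses.shape[0])[:, None]
corrected = poses - weights * closing_error
print("residual after naive correction:", corrected[-1] - corrected[0])
```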
	        