WO1999066343A1 - Method for producing a 3d image - Google Patents
Method for producing a 3d image
- Publication number
- WO1999066343A1 PCT/NO1999/000176
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- waves
- volume
- image
- received
- acoustic
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8902—Side-looking sonar
- G01S15/8904—Side-looking sonar using synthetic aperture techniques
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/56—Display arrangements
- G01S7/62—Cathode-ray tube displays
- G01S7/6245—Stereoscopic displays; Three-dimensional displays; Pseudo-three dimensional displays
Definitions
- This invention relates to a method for producing an image of a submerged object, e.g. a shipwreck or the sea bottom.
- Acoustic sensors have become increasingly common in systems for underwater sensing and imaging. Sonars are often used, ranging from simple systems detecting echoes of an emitted pulse, through side scan sonars and two-dimensional multibeam sonar systems emitting and receiving signals along a line within a chosen angle, to three-dimensional acoustic cameras, such as those described in the articles "3D ACOUSTIC CAMERA FOR UNDERWATER IMAGING" by Rolf Kahrs Hansen and Poul Arndt Andersen in Acoustical Imaging, vol. 20, Plenum Press, New York, 1993, and "A 3D UNDERWATER ACOUSTIC CAMERA - PROPERTIES AND APPLICATIONS" by R.K. Hansen and P.A. Andersen.
- In imaging larger objects the two-dimensional sonar is normally used by moving the sonar over e.g. the sea bottom and scanning at an angle essentially perpendicular to the direction of movement. The data sampled along each line are combined to provide a segmented picture of the sea floor.
- A problem inherent in this solution is the difficulty in controlling the exact position of the sensor.
- A sonar unit being moved over the sea bottom is subject to drift because of current and wind and, if the sonar is carried by a boat, inaccuracies in the control system of the vessel.
- An advantageous feature of the invention is that each 3D segment contains data which are virtually insensitive to movements of the recording transducer unit, and that the 3D segments are combined in order to provide a larger 3D image.
- Another advantageous feature is that the 3D segments have a coordinate accuracy which is better than the accuracy of the position measurement system, due to corrections based upon the information content in the separate 3D image elements.
- Yet another advantageous feature of this invention is that each underwater object, due to the overlapping images, is insonified several times, from different angles of incidence.
- What is hidden at one angle of incidence may be visible at the next, thus providing a 3D image of the objects that to a certain degree comprises views of the inside or back side of the objects. Therefore, much more detail will be available in the combined image than in one separate 3D image, or in images composed of measurements from side scan sonars.
- Figure 1 illustrates the use of an acoustic 3D transducer unit for providing a 3D image.
- Figure 2 illustrates two partially overlapping areas.
- Figure 3 is a schematic illustration of the preferred method according to the invention.
- The camera 1 in figure 1 emits a number of pulses toward a selected volume 2.
- The pulses are reflected by different objects 3, and the receiver is capable of receiving the reflected pulses at a number of different sensors in a sensor matrix 4.
- The sensors are capable of sensing the phase, time and amplitude of the received signal from a chosen angular sector defined by two angles (see figure 2).
- Also comprised in the transducer unit 1 is computing means for calculating, based on the measurements of each sensor, the direction, strength and time of arrival of the reflected waves.
- The signal received from each direction, defined by the angular segments within the two angles of the sector, is analysed to detect the arrival of the peak amplitude or amplitudes of the signals received within that angular segment.
- The time of arrival of the peak amplitude indicates the distance to the object.
- A composite image may be obtained from the direction of the received signal and the distance to the reflecting object within each angular segment, as indicated in figure 1, where the measured volume comprises a number of planes 5 referring to different distances from the sensor array 4.
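To make the geometry concrete, the following minimal Python sketch (not taken from the patent) converts the peak arrival time measured in one angular segment into a 3D point, assuming a nominal sound speed of 1500 m/s and treating the two segment angles as direction angles relative to the array normal; the function name and conventions are illustrative only.

```python
import numpy as np

# Assumed nominal sound speed in sea water (m/s); the patent does not fix a value.
C_WATER = 1500.0

def segment_to_point(theta, phi, t_peak, amplitude, c=C_WATER):
    """Convert one angular segment's peak echo into an (x, y, z, intensity) point.

    theta, phi : the two angles defining the segment direction (radians),
                 measured from the array normal in two orthogonal planes
    t_peak     : two-way travel time of the peak echo (seconds)
    amplitude  : peak amplitude, kept as the point intensity
    """
    r = c * t_peak / 2.0                          # one-way distance to the reflector
    sx, sy = np.sin(theta), np.sin(phi)           # direction cosines along the array axes
    sz = np.sqrt(max(0.0, 1.0 - sx**2 - sy**2))   # component along the array normal
    return r * sx, r * sy, r * sz, amplitude
```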
- Two or more reflecting objects at different distances may in some cases be found within one angular segment, adding to the three-dimensional image. This may also indicate that the closest reflecting object is small, occupying only a part of the segment, that it has e.g. a hole transmitting part of the emitted pulse, or that it is partially transparent in the frequency range of the emitted acoustic signal.
- The methods for calculating these parameters are well known and will not be described in detail here.
- The computing is preferably performed using computer programs, implemented in hardware or software, inside the transducer unit. It may, however, also be performed later, or the received signals may be transmitted through a cable to a computer, e.g. on board a control vessel steering the transducer unit.
- Having registered a first image 2, the process may be repeated, producing a second acoustic image 6 at least partially overlapping the first image.
- Certain common features 3 in the common volume 7 may then be found, e.g. an object protruding towards the transducer. Assuming that the features 3 are stationary, the scale and relative positions of the images may be adjusted to provide a unitary three-dimensional image from the two images. This way any errors due to variations in the position or movement of the transducer unit may be reduced or eliminated.
- Preferably, three or more common features 3 are used when combining the images, thus reducing the chance of error if one of the chosen features turns out to be moving.
- The abovementioned features may be chosen manually or, preferably, automatically using available mathematical methods of any suitable type, e.g. by detecting the minimum distances in the overlapping parts of the images.
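As one possible realisation of this automatic alignment (the patent does not prescribe a particular algorithm), the sketch below assumes that at least three common features have already been matched between the overlapping volumes and estimates the rigid transform that brings image II onto image I using the standard Kabsch/SVD construction; all names are illustrative.

```python
import numpy as np

def estimate_rigid_transform(feats_i, feats_ii):
    """Estimate rotation R and translation t so that R @ p + t maps feature
    points of image II onto the corresponding features of image I.
    feats_i, feats_ii : 3 x N arrays of matched feature coordinates, N >= 3."""
    a = np.asarray(feats_i, dtype=float)
    b = np.asarray(feats_ii, dtype=float)
    ca, cb = a.mean(axis=1, keepdims=True), b.mean(axis=1, keepdims=True)
    h = (b - cb) @ (a - ca).T                  # 3x3 cross-covariance of centred features
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))     # guard against reflections
    rot = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    trans = ca - rot @ cb
    return rot, trans
```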
- The combination of two or more images uses the following characteristics of the acoustic image:
- Since the image is three-dimensional, it is represented by its x, y, and z coordinates, as well as by an intensity representing the acoustic target strength of the reflecting object or part of the object.
- Each point will map one-to-one onto a plane parallel to the two-dimensional receiving transducer array.
- This projection is called the lateral projection.
- The three-dimensional image produced by a single acoustic pulse has a time tag which is applied to all the points in the image. This three-dimensional image is called a 3D image element.
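A 3D image element and its lateral projection can be held in a very small data structure; the sketch below merely illustrates the characteristics listed above (coordinates, intensity, time tag), with field names that are assumptions rather than terms from the patent.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class ImageElement3D:
    """Points produced by a single acoustic pulse, with a common time tag."""
    xyz: np.ndarray        # (N, 3) point coordinates, z assumed along the array normal
    intensity: np.ndarray  # (N,) acoustic target strength per point
    time_tag: float        # acquisition time applied to all points

    def lateral_projection(self):
        """Project every point onto a plane parallel to the two-dimensional
        receiving array (here assumed to be the x-y plane)."""
        return self.xyz[:, :2], self.intensity
```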
- The preferred method for combining two images is as follows:
- S1. Thresholding to the lowest intensity limit defined by the user.
- S2. Let the first of the two images serve as the reference image (image I). The other image (image II) is then positioned in space according to the measured position data, if available.
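Steps S1 and S2 could look roughly as follows, reusing the ImageElement3D structure sketched above and an optional rotation/translation derived from the measured position data; this is a sketch under those assumptions, not the patent's implementation.

```python
import numpy as np

def threshold_and_position(elem, intensity_limit, rot=None, trans=None):
    """S1: discard points below the user-defined intensity limit.
    S2: place image II in space using measured position data, if available."""
    keep = elem.intensity >= intensity_limit
    pts = elem.xyz[keep]
    if rot is not None and trans is not None:
        pts = (rot @ pts.T).T + np.ravel(trans)   # rigid placement of image II
    return pts, elem.intensity[keep]
```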
- The correlation between the succeeding images may provide an indication of the movements of the transducer unit. If the positions of the measured objects are known, the position and movement of the transducer unit may be found. However, the position and orientation of the transducer unit are preferably measured using a navigation system, e.g. a GPS combined with an underwater positioning system. Thus the positions of the measured objects may be calculated.
- The method according to the invention may acquire images of objects within a chosen distance range from the transducer or sensor unit.
- Thus disturbances from objects closer than, or beyond, a certain distance from the transducers may be eliminated.
- This may for example be used when inspecting a structure close to and in front of an oil platform.
- The oil platform may then be removed from the picture, thus leaving a simpler image to be analysed.
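Such range gating amounts to keeping only the points whose distance from the transducer falls inside a chosen window, as in this small sketch (illustrative only; the transducer is assumed to sit at the origin of the point coordinates).

```python
import numpy as np

def range_gate(xyz, intensity, r_min, r_max):
    """Keep points with transducer distance in [r_min, r_max]; structures
    closer than r_min or farther than r_max (e.g. the platform behind the
    inspected object) are removed from the image."""
    r = np.linalg.norm(xyz, axis=1)
    keep = (r >= r_min) & (r <= r_max)
    return xyz[keep], intensity[keep]
```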
- The position and orientation of the acoustic transducer unit may also be kept constant, monitoring a chosen volume over time.
- Changes in the volume, such as the occurrence of a diver, may then be detected.
- The distance range may be chosen individually for each angular segment, providing a possibility to remove a structure protruding toward the transducer unit without affecting the range in other directions.
- The positions of the abovementioned common features in the images should not change. However, if changes are detected, these changes may be analysed using known methods to provide an indication of the nature and extent of the change. By removing the static features in the volume using the technique described above, only objects moving into the defined volume are detected.
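One simple way to realise this static-background removal, assuming a reference image of the unchanged volume has been recorded, is to flag every point in the current image that has no nearby counterpart in the reference; the matching radius below is an arbitrary assumption, and a KD-tree would normally replace the brute-force search.

```python
import numpy as np

def detect_new_objects(reference_xyz, current_xyz, tolerance=0.25):
    """Return current points farther than `tolerance` metres from every
    reference point, i.e. candidates for objects that have moved into
    the monitored volume (such as a diver)."""
    new_points = []
    for p in current_xyz:
        if np.linalg.norm(reference_xyz - p, axis=1).min() > tolerance:
            new_points.append(p)
    return np.asarray(new_points)
```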
- The preferred acoustic camera or transducer unit is known per se from the abovementioned articles and will not be described in detail here. It is, however, within the scope of this invention to use other acoustic cameras capable of three-dimensional imaging.
- The size of the angular sector from which the reflected signals are received may be chosen, e.g. by adjusting the acoustic frequency, as both the size of the volume and the resolution depend on the acoustic frequency relative to the size of said sensor matrix.
- The possible side lobes of the acoustic transducers may be controlled by choosing the distance between the transducers depending on the frequency used in the measurements.
- The relationships between these parameters are well known in the related technical art.
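For orientation only, the textbook rule of thumb relating these parameters is that the angular resolution of an aperture of size D at wavelength λ is on the order of λ/D; the sketch below just evaluates that relationship and is not taken from the patent.

```python
def approx_beamwidth_rad(frequency_hz, aperture_m, sound_speed=1500.0):
    """Rough angular resolution (radians) of a receiving aperture:
    wavelength divided by aperture size."""
    wavelength = sound_speed / frequency_hz
    return wavelength / aperture_m

# Example: a 0.1 m array at 500 kHz gives roughly 0.03 rad (about 1.7 degrees).
print(approx_beamwidth_rad(500e3, 0.1))
```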
- The succeeding images may be measured using different frequencies. Thus, if e.g. every third image is made over a relatively large angular sector and the rest of the images are made in a narrower volume, a composite image may be made having good resolution along a narrow portion of the image and poorer resolution over a wider angle.
- In this way two types of surveying may be made at the same time.
- The source and the receiver array are preferably separate transducers positioned in the same transducer or sensor unit. It is, however, within the scope of this invention to provide a transducer array capable of both emitting and receiving the acoustic waves.
- The calculations made according to the invention may be performed using software provided in computers positioned in the transducer unit or in another vessel.
- Necessary communication lines are provided, such as optical fibres, electrical conductors, or acoustic or electromagnetic communication through the surrounding water.
- The calculations may also be performed using hardware, e.g. specialized microprocessors, to obtain higher processing speed.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- Acoustics & Sound (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
- Toys (AREA)
- Holography (AREA)
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP99941901A EP1097393B1 (en) | 1998-06-19 | 1999-06-01 | Method for producing a 3d image |
AU55375/99A AU760693B2 (en) | 1998-06-19 | 1999-06-01 | Method for producing a 3D image |
US09/701,262 US6438071B1 (en) | 1998-06-19 | 1999-06-01 | Method for producing a 3D image |
DE69942745T DE69942745D1 (en) | 1998-06-19 | 1999-06-01 | METHOD FOR GENERATING A THREE-DIMENSIONAL IMAGE |
AT99941901T ATE480784T1 (en) | 1998-06-19 | 1999-06-01 | METHOD FOR GENERATING A THREE-DIMENSIONAL IMAGE |
DK99941901.3T DK1097393T3 (en) | 1998-06-19 | 1999-06-01 | Method for producing a three-dimensional image of an underwater object, such as a shipwreck or the seabed, by emitting, receiving and processing sound waves |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
NO19982891 | 1998-06-19 | ||
NO982891A NO307014B1 (en) | 1998-06-19 | 1998-06-19 | Procedure for generating a 3D image |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1999066343A1 true WO1999066343A1 (en) | 1999-12-23 |
Family
ID=19902178
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/NO1999/000176 WO1999066343A1 (en) | 1998-06-19 | 1999-06-01 | Method for producing a 3d image |
Country Status (8)
Country | Link |
---|---|
US (1) | US6438071B1 (en) |
EP (1) | EP1097393B1 (en) |
AT (1) | ATE480784T1 (en) |
AU (1) | AU760693B2 (en) |
DE (1) | DE69942745D1 (en) |
DK (1) | DK1097393T3 (en) |
NO (1) | NO307014B1 (en) |
WO (1) | WO1999066343A1 (en) |
Families Citing this family (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6700833B2 (en) | 2001-09-17 | 2004-03-02 | Bae Systems Information And Electronic Systems Integration Inc | Acoustical imaging interferometer for detection of buried underwater objects |
US6829197B2 (en) * | 2001-09-17 | 2004-12-07 | Bae Systems Information And Electronic Systems Integration Inc | Acoustical imaging interferometer for detection of buried underwater objects |
US6814701B1 (en) * | 2003-04-16 | 2004-11-09 | Aloka Co., Ltd. | Method and apparatus for ultrasound diagnostic imaging |
JP2007535195A (en) * | 2003-07-11 | 2007-11-29 | ブルービュー テクノロジーズ インコーポレイテッド | Method and system for implementing frequency-steered acoustic arrays for 2D and 3D images |
US20050276508A1 (en) * | 2004-06-15 | 2005-12-15 | Lockheed Martin Corporation | Methods and systems for reducing optical noise |
US8989431B1 (en) | 2007-07-11 | 2015-03-24 | Ricoh Co., Ltd. | Ad hoc paper-based networking with mixed media reality |
US7702673B2 (en) | 2004-10-01 | 2010-04-20 | Ricoh Co., Ltd. | System and methods for creation and use of a mixed media environment |
US8156116B2 (en) | 2006-07-31 | 2012-04-10 | Ricoh Co., Ltd | Dynamic presentation of targeted information in a mixed media reality recognition system |
US9063952B2 (en) * | 2006-07-31 | 2015-06-23 | Ricoh Co., Ltd. | Mixed media reality recognition with image tracking |
US8059486B2 (en) * | 2008-04-16 | 2011-11-15 | Coda Octopus Group | Method of rendering volume representation of sonar images |
US7898902B2 (en) * | 2008-06-13 | 2011-03-01 | Codaoctopus Group | Method of representation of sonar images |
EP2352312B1 (en) * | 2009-12-03 | 2013-07-31 | Oticon A/S | A method for dynamic suppression of surrounding acoustic noise when listening to electrical inputs |
US9058331B2 (en) | 2011-07-27 | 2015-06-16 | Ricoh Co., Ltd. | Generating a conversation in a social network based on visual search results |
JP6205722B2 (en) * | 2013-01-07 | 2017-10-04 | 日本電気株式会社 | Sonar image processing apparatus, sonar image processing method, sonar image processing program, and recording medium |
GB2533388B (en) | 2014-12-17 | 2021-01-06 | Sezanne Marine Ltd | Aspects of a sonar system |
US9886938B2 (en) | 2015-02-10 | 2018-02-06 | Navico Holding As | Transducer array having a transceiver |
US10114119B2 (en) * | 2015-05-20 | 2018-10-30 | Navico Holding As | Sonar systems and methods using interferometry and/or beamforming for 3D imaging |
US10024957B2 (en) | 2015-09-17 | 2018-07-17 | Navico Holding As | Adaptive beamformer for sonar imaging |
US11846733B2 (en) * | 2015-10-30 | 2023-12-19 | Coda Octopus Group Inc. | Method of stabilizing sonar images |
EP3449281A4 (en) | 2017-07-03 | 2020-01-29 | R2Sonic, LLC | Multi-perspective ensonification system and method |
US10816652B2 (en) | 2018-02-28 | 2020-10-27 | Codaoctopus Group | Method of compressing sonar data |
US11579288B2 (en) | 2018-04-14 | 2023-02-14 | Coda Octopus Group Inc. | Pseudo random frequency sonar ping generation |
US10718865B2 (en) | 2018-05-14 | 2020-07-21 | Coda Octopus Group | Method of compressing beamformed sonar data |
US11061136B2 (en) * | 2019-03-14 | 2021-07-13 | Coda Octopus Group Inc. | Sonar tracking of unknown possible objects |
KR102191007B1 (en) * | 2019-04-19 | 2020-12-14 | 한국과학기술원 | Three dimensional image generating method and apparatus |
CN110185080A (en) * | 2019-07-05 | 2019-08-30 | 中交上海航道局有限公司 | A kind of auxiliary twists the method and device of suction ship sand fetching construction |
US20220026570A1 (en) * | 2019-11-07 | 2022-01-27 | Coda Octopus Group Inc. | Techniques for sonar data processing |
US11789146B2 (en) * | 2019-11-07 | 2023-10-17 | Coda Octopus Group Inc. | Combined method of location of sonar detection device |
US11448755B2 (en) | 2020-09-18 | 2022-09-20 | Coda Octopus Group, Inc. | System and techniques for split-aperture beamforming |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5699318A (en) * | 1996-07-25 | 1997-12-16 | Northrop Grumman Corporation | Topographic composition sonar map |
-
1998
- 1998-06-19 NO NO982891A patent/NO307014B1/en not_active IP Right Cessation
-
1999
- 1999-06-01 DK DK99941901.3T patent/DK1097393T3/en active
- 1999-06-01 AU AU55375/99A patent/AU760693B2/en not_active Expired
- 1999-06-01 US US09/701,262 patent/US6438071B1/en not_active Expired - Lifetime
- 1999-06-01 WO PCT/NO1999/000176 patent/WO1999066343A1/en active IP Right Grant
- 1999-06-01 DE DE69942745T patent/DE69942745D1/en not_active Expired - Lifetime
- 1999-06-01 EP EP99941901A patent/EP1097393B1/en not_active Expired - Lifetime
- 1999-06-01 AT AT99941901T patent/ATE480784T1/en not_active IP Right Cessation
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0026385A1 (en) * | 1979-09-26 | 1981-04-08 | Siemens Aktiengesellschaft | Ultrasonic space surveillance system according to the pulse-echo method |
US5200931A (en) * | 1991-06-18 | 1993-04-06 | Alliant Techsystems Inc. | Volumetric and terrain imaging sonar |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2647141A2 (en) * | 2010-10-25 | 2013-10-09 | Lockheed Martin Corporation | Building a three dimensional model of an underwater structure |
CN103534608A (en) * | 2010-10-25 | 2014-01-22 | 洛克希德马丁公司 | Building a three dimensional model of an underwater structure |
EP2647141A4 (en) * | 2010-10-25 | 2014-12-10 | Lockheed Corp | Building a three dimensional model of an underwater structure |
US8929176B2 (en) | 2010-10-25 | 2015-01-06 | Lockheed Martin Corporation | Building a three-dimensional model of an underwater structure |
Also Published As
Publication number | Publication date |
---|---|
NO982891L (en) | 1999-12-20 |
AU5537599A (en) | 2000-01-05 |
DE69942745D1 (en) | 2010-10-21 |
DK1097393T3 (en) | 2010-11-08 |
EP1097393B1 (en) | 2010-09-08 |
NO307014B1 (en) | 2000-01-24 |
ATE480784T1 (en) | 2010-09-15 |
EP1097393A1 (en) | 2001-05-09 |
NO982891D0 (en) | 1998-06-19 |
US6438071B1 (en) | 2002-08-20 |
AU760693B2 (en) | 2003-05-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6438071B1 (en) | Method for producing a 3D image | |
US6873570B2 (en) | High resolution bathymetric sonar system and measuring method for measuring the physiognomy of the seabed | |
US5231609A (en) | Multiplatform sonar system and method for underwater surveillance | |
US4532617A (en) | System for locating a towed marine object | |
US4992990A (en) | Method for determining the position of seismic streamers in a reflection seismic measuring system | |
AU2010297524B2 (en) | Method and device for measuring a contour of the ground | |
US4815045A (en) | Seabed surveying apparatus for superimposed mapping of topographic and contour-line data | |
US5530680A (en) | Feature location and display apparatus | |
US20100067330A1 (en) | Ship mounted underwater sonar system | |
US20020126577A1 (en) | Multibeam synthetic aperture sonar | |
US6285628B1 (en) | Swept transit beam bathymetric sonar | |
US7639565B2 (en) | Point source localization sonar system and method | |
Châtillon et al. | SAMI: A low-frequency prototype for mapping and imaging of the seabed by means of synthetic aperture | |
US4970698A (en) | Self-calibrating sonar system | |
JP2002168952A (en) | Method of reconstituting submarine three-dimensional structure | |
CN113534161B (en) | Beam mirror image focusing method for remotely positioning underwater sound source | |
JPH04501316A (en) | sonar exploration system | |
JPH04357487A (en) | Side looking sonar | |
JPH0385476A (en) | Sea bottom searching apparatus | |
Sathishkumar et al. | Echo sounder for seafloor object detection and classification | |
US5402393A (en) | Non-invasive acoustic velocimetric apparatus and method | |
CA2091430A1 (en) | Method of determining depth values for the bottom profile of a body of water | |
RU2736231C1 (en) | Method for determining sound velocity distribution | |
JP2022138365A (en) | Echo-sounding device | |
Andrews et al. | Swathmap: Long range sidescan sonar mapping of the deep seafloor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZA ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW SD SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 55375/99 Country of ref document: AU |
|
WWE | Wipo information: entry into national phase |
Ref document number: 09701262 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1999941901 Country of ref document: EP |
|
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
WWP | Wipo information: published in national office |
Ref document number: 1999941901 Country of ref document: EP |
|
WWG | Wipo information: grant in national office |
Ref document number: 55375/99 Country of ref document: AU |