CN104808210B - Fusion imaging device and method for a sonar and binocular vision imaging system - Google Patents

Fusion imaging device and method for a sonar and binocular vision imaging system

Info

Publication number
CN104808210B
Authority
CN
China
Prior art keywords
sonar
fusion
binocular
image
binocular vision
Prior art date
Legal status
Active
Application number
CN201510180927.2A
Other languages
Chinese (zh)
Other versions
CN104808210A (en)
Inventor
Xu Yuan
Huang Weixin
Zhang Zhiqiang
Wang Yazhou
He Fan
Current Assignee
Shenzhen University
Original Assignee
Shenzhen University
Priority date
Filing date
Publication date
Application filed by Shenzhen University
Priority to CN201510180927.2A
Publication of CN104808210A
Application granted
Publication of CN104808210B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88 Sonar systems specially adapted for specific applications
    • G01S15/89 Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8902 Side-looking sonar
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/271 Image signal generators wherein the generated image signals comprise depth maps or disparity maps

Abstract

The present invention relates to a fusion imaging device and method for a sonar and binocular vision imaging system. The device comprises a rigid support, a sonar imaging system, a binocular vision imaging system, and a fusion imaging processing system that is connected to both imaging systems and fuses their image signals. The sonar imaging system emits sonar to detect a target object; when a target object is found, the rigid support is moved towards it. When the distance between the rigid support and the target object reaches a set distance, the binocular vision imaging system is started, and image signals are acquired simultaneously by the sonar imaging system and the binocular vision imaging system. The fusion imaging processing system fuses the acquired image signals to form a fused image. The invention can effectively detect the specific contour, surface information and depth information of an underwater object, solves the problem that objects under water, and in deep water in particular, are difficult to perceive, and provides a strong solution for related underwater and deep-water work.

Description

Fusion imaging device and method for a sonar and binocular vision imaging system
Technical field
The present invention relates to underwater imaging systems, and more particularly to a fusion imaging device and method for a sonar and binocular vision imaging system.
Background technology
Perceiving objects under water, and in deep water in particular, has always been a difficult problem. Conventional sonar detection of an object can only scan the information of a single plane: it yields only the approximate position of the object, not its specific contour or surface information, it is disturbed in the depth direction by the height of the scanned object, and its scanning speed is slow.
An image captured with an ordinary camera, on the other hand, contains no depth information about the captured object, so the distance between the object and the camera cannot be perceived. A purely passive binocular vision system can provide a real-time display, but because of its imaging principle it inevitably introduces imaging errors.
Summary of the invention
The technical problem to be solved by the present invention is to overcome the above defects of the prior art by providing a fusion imaging device and method for a sonar and binocular vision imaging system.
The technical solution adopted by the present invention to solve this problem is to construct a fusion imaging device for a sonar and binocular vision imaging system.
The fusion imaging device of the sonar and binocular vision imaging system of the present invention comprises a rigid support, a sonar imaging system, a binocular vision imaging system, and a fusion imaging processing system that is connected to the sonar imaging system and the binocular vision imaging system and fuses their image signals. The sonar imaging system comprises a sonar sensing probe fixedly mounted on the rigid support and a signal processing module communicatively connected to the sonar sensing probe. The binocular vision imaging system comprises a binocular camera fixedly mounted on the rigid support and a binocular signal processing module communicatively connected to the binocular camera.
Preferably, the sonar sensing probe is a single-beam sonar sensing probe.
Preferably, the rigid support comprises a waterproof chamber in which the binocular camera is installed; the waterproof chamber is provided with a transparent panel corresponding to the taking lenses of the binocular camera.
Preferably, the fusion imaging processing system comprises an FPGA processing module, a parsing and drawing module, and a fusion processing module;
the FPGA processing module is connected to the signal processing module and the binocular signal processing module, merges the data collected by both, and sends the merged data to the parsing and drawing module;
the parsing and drawing module is communicatively connected to the FPGA processing module, parses the merged data after receiving it, and draws the sonar image and the binocular image respectively;
the fusion processing module is connected to the parsing and drawing module, traverses each region of the binocular image according to the height, width, abscissa and depth information provided by the sonar image, and fuses them to obtain a fused image.
Preferably, the binocular camera and the sonar sensing probe are fixedly mounted on the rigid support at a set distance from each other.
In the fusion imaging device of the sonar and binocular vision imaging system of the present invention, the fusion imaging method comprises the following steps:
S1: the sonar imaging system emits sonar to detect a target object, and when a target object is found, the rigid support is moved towards the target object;
Preferably, in step S1, the sonar imaging system emits a single-beam sonar signal to detect the target object.
S2: when the distance between the rigid support and the target object reaches a set distance, the binocular vision imaging system is started, and image signals are acquired simultaneously by the sonar imaging system and the binocular vision imaging system;
Preferably, step S2 comprises the following steps:
S2-1: the sonar sensing probe of the sonar imaging system collects the sonar signal reflected by the target object and sends it to the signal processing module for processing, forming sonar image data;
S2-2: the binocular camera of the binocular vision imaging system collects the optoelectronic signal of the target object and sends it to the binocular signal processing module for processing, forming binocular image data.
S3: the fusion imaging processing system fuses the image signals acquired in step S2 to form a fused image.
Preferably, step S3 comprises the following steps:
S3-1: the FPGA processing module of the fusion imaging processing system receives the sonar image data and the binocular image data, merges them, and outputs the result to the parsing and drawing module;
S3-2: the parsing and drawing module parses the merged data after receiving it, draws the sonar image and the binocular image respectively, and outputs them to the fusion processing module;
S3-3: the fusion processing module traverses each region of the binocular image according to the height, width, abscissa and depth information provided by the sonar image, and fuses them to obtain a fused image.
Preferably, in step S3-3, the sonar image and the binocular image are adjusted to a substantially overlapping state according to the distance and position relationship between the binocular camera and the sonar sensing probe, using the center position and contour information of the target object.
Implementing the present invention has the following beneficial effects: by fusing the sonar imaging system and the binocular vision imaging system, the specific contour, surface information and depth information of an underwater object can be effectively detected, which solves the problem that objects under water, and in deep water in particular, are difficult to perceive, and provides a strong solution for related underwater and deep-water work.
Brief description of the drawings
The invention will be further described below with reference to the drawings and embodiments, in which:
Fig. 1 is a schematic structural diagram of the fusion imaging device of the sonar and binocular vision imaging system of the present invention;
Fig. 2-1 is a top view of the acoustic beam on the detection plane in the single-beam imaging principle of the present invention;
Fig. 2-2 is a front view of the acoustic beam on the detection plane in the single-beam imaging principle of the present invention;
Fig. 2-3 is a schematic diagram, in the single-beam imaging principle of the present invention, of the sonar waiting to receive the returned beams after emitting a beam and drawing point clouds at different distances in the image according to the order in which they are received;
Fig. 2-4 is a schematic diagram, in the single-beam imaging principle of the present invention, assuming that a rod is present on the vertical plane detected by the sonar;
Fig. 2-5 is a schematic diagram of the sonar image obtained when the acoustic wave hits the rod, in the single-beam imaging principle of the present invention;
Fig. 2-6 is a schematic diagram, in the single-beam imaging principle of the present invention, showing that because points A and B may be located differently, their reflections in depth also differ;
Fig. 3 is a workflow diagram of the fusion imaging device of the sonar and binocular vision imaging system of the present invention;
Fig. 4 is a flow chart of the algorithm for fusing the sonar image and the binocular vision image of the present invention.
Reference numerals list:
1. rigid support; 2. sonar imaging system; 3. binocular vision imaging system; 4. fusion imaging processing system; 5. waterproof chamber; 6. water surface.
Detailed description of the embodiments
In order to give a clearer understanding of the technical features, objects and effects of the present invention, the embodiments of the present invention are described in detail below with reference to the accompanying drawings.
As shown in Fig. 1, an embodiment of the fusion imaging device of the sonar and binocular vision imaging system of the present invention comprises a rigid support 1, a sonar imaging system 2, a binocular vision imaging system 3 and a fusion imaging processing system 4. By fusing the sonar imaging system 2 and the binocular vision imaging system 3, the device can effectively detect the specific contour, surface information and depth information of an underwater object, which solves the problem that objects under water, and in deep water in particular, are difficult to perceive, and provides a strong solution for related underwater and deep-water work.
The rigid support 1, as the supporting member of the whole device, can be made in various shapes as needed. As shown in Fig. 1, in this embodiment the rigid support 1 is frame-shaped, so that it can conveniently be carried by a ship or other carrier. The sonar imaging system 2 and the binocular vision imaging system 3 are fixedly mounted on the rigid support 1, so that they move with the ship or other carrier to search for the target object and perform fusion imaging.
Because the device needs to operate under water, the rigid support 1 is provided with a waterproof chamber 5 in which the binocular vision imaging system 3 is installed. Furthermore, the waterproof chamber 5 is provided with a transparent panel corresponding to the taking lenses of the binocular camera of the binocular vision imaging system 3, so that the visual information of the target object can be captured as an optoelectronic signal through the panel.
Furthermore, the binocular vision imaging system 3 and the sonar imaging system 2 are fixedly mounted on the rigid support 1 at a set distance from each other, in preparation for the subsequent image fusion. It should be understood that the distance between the binocular vision imaging system 3 and the sonar imaging system 2 can be configured according to actual needs: it can be fixed, or it can be made adjustable as required.
The sonar imaging system 2 comprises a sonar sensing probe and a signal processing module. The sonar sensing probe emits sonar to detect the target object, and the signal processing module receives the reflected sonar signal and analyzes it to obtain sonar image data. In this embodiment, the sonar sensing probe and the signal processing module are an integrated unit fixedly mounted at the bottom of the rigid support 1. It should be understood that the sonar sensing probe and the sonar signal processing module may also be separate units: the sonar sensing probe is fixedly mounted at the bottom of the rigid support 1, while the signal processing module may be mounted on the ship or other carrier and connected to the sonar sensing probe by wired or wireless communication to receive the sonar signal sensed by the probe.
In this embodiment, the sonar sensing probe is a single-beam sonar sensing probe; a single-beam sonar emits one acoustic beam in one direction at a time, as shown in Fig. 2.
As can be seen from Fig. 2-1, the angle of the acoustic beam on the detection plane is very small, so that it approximates a line. As can be seen from Fig. 2-2, on the vertical plane perpendicular to the detection plane the acoustic beam is fan-shaped.
After emitting a beam, the sonar waits to receive the returned beams and draws point clouds at different distances in the image according to the order in which they are received, as shown in Fig. 2-3.
This gives rise to a phenomenon. Assume that a rod is present on the vertical plane detected by the sonar, as shown in Fig. 2-4. The acoustic wave is returned after hitting points A and B, and because A is nearer, its return arrives earlier, so that the sonar image obtained for the rod is as shown in Fig. 2-5. In other words, although a single-beam sonar scan is a two-dimensional image, the depth points along the radial direction retain height information about the object. Because points A and B may be located differently, as shown in Fig. 2-6, their reflections in depth also differ, so this height information is an uncertain value; it can only be used to roughly estimate the range within which the height of the target object lies, in preparation for fusion with the binocular vision imaging system image. In addition, since the beam angle in the direction of the detection plane is very small, the width of the target object is an accurate value and can be fused directly with the binocular vision imaging system image.
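The mapping from echo arrival time to the near/far point cloud of Fig. 2-3, and the coarse nature of the height information discussed above, can be illustrated with a short sketch. The following Python fragment is only a minimal illustration of the principle, assuming a recorded list of round-trip echo times per ping; the sound speed and the example echo times are placeholder values, not parameters taken from the patent.

```python
import numpy as np

SOUND_SPEED_WATER = 1500.0  # m/s, nominal value assumed for illustration

def echoes_to_ranges(echo_times_s):
    """Convert round-trip echo arrival times into slant ranges (metres).

    Each ping yields returns ordered by arrival time; earlier returns
    (e.g. point A on the rod) map to shorter ranges, later ones (point B)
    to longer ranges, producing the near/far point cloud of Fig. 2-3.
    """
    return 0.5 * SOUND_SPEED_WATER * np.asarray(echo_times_s, dtype=float)

def rough_height_estimate(slant_ranges):
    """Coarse proxy for the object's height from one ping.

    Because the beam is fan-shaped in the vertical plane (Fig. 2-2), the
    spread between the earliest and latest echo only brackets the object's
    height rather than measuring it exactly (the 'uncertain height'
    discussed above), so it is used later only to bound the search region
    in the binocular image.
    """
    return float(np.max(slant_ranges) - np.min(slant_ranges))

# Example: two echoes arriving 1.33 ms and 1.40 ms after the ping
ranges = echoes_to_ranges([1.33e-3, 1.40e-3])
print(ranges, rough_height_estimate(ranges))
```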
The binocular vision imaging system 3 comprises a binocular camera and a binocular signal processing module. The binocular camera detects the target object, and the binocular signal processing module receives the detected binocular signals and analyzes them to obtain binocular image data. In this embodiment, the binocular camera and the binocular signal processing module are an integrated unit which is sealed with waterproof material and fixedly mounted in the waterproof chamber 5 of the rigid support 1; the waterproof chamber 5 is provided with a transparent panel corresponding to the taking lenses of the binocular camera, so that the visual information of the target object can be captured as an optoelectronic signal. It should be understood that the binocular camera and the binocular signal processing module may also be separate units: the binocular camera is fixedly mounted in the waterproof chamber 5 of the rigid support 1, while the binocular signal processing module may be mounted on the ship or other carrier and connected to the binocular camera by wired or wireless communication to receive the binocular signals sensed by the camera. In this embodiment, the binocular vision system is a purely passive binocular vision system; if it is replaced by a binocular system implemented in another way, most of the functions of the device can still be achieved.
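For reference, the depth information that the binocular subsystem contributes to the fusion comes from stereo disparity. The fragment below is a minimal depth-from-disparity sketch for a rectified stereo pair; the focal length and baseline values are illustrative placeholders, not parameters specified in the patent.

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_px=800.0, baseline_m=0.12):
    """depth = f * B / d for a rectified stereo pair; zero disparity maps to inf.

    focal_px and baseline_m are assumed example values for a calibrated rig,
    not figures taken from the patent.
    """
    disparity_px = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(disparity_px, np.inf)
    valid = disparity_px > 0
    depth[valid] = focal_px * baseline_m / disparity_px[valid]
    return depth

print(disparity_to_depth([32.0, 8.0, 0.0]))  # nearer, farther, invalid pixel
```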
The fusion imaging processing system 4 is connected to the sonar imaging system 2 and the binocular vision imaging system 3 for fusing the image signals, and comprises an FPGA processing module, a parsing and drawing module, and a fusion processing module. In this embodiment, the fusion imaging processing system 4 is arranged above the water surface and communicates with the sonar imaging system 2 and the binocular vision imaging system 3 through cables. It should be understood that the fusion imaging processing system 4 may also communicate with the sonar imaging system 2 and the binocular vision imaging system 3 through wireless signals, and that the fusion imaging processing system 4 may also be arranged under water and send the fused image to a display device above the water surface through a wired or wireless signal.
The FPGA processing module is connected to the signal processing module and the binocular signal processing module, merges the data collected by both, writes the merged data into DDR memory, and sends the data over the network to the parsing and drawing module through an ARM V2 interface. The parsing and drawing module may be arranged on a PC; it is communicatively connected to the FPGA processing module, parses the merged data after receiving it, and draws the sonar image and the binocular image respectively. The fusion processing module is connected to the parsing and drawing module, traverses each region of the binocular image according to the height, width, abscissa and depth information provided by the sonar image, and fuses them to obtain a fused image.
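How the merged stream can be split back into sonar and binocular data on the PC side can be illustrated with a hypothetical framing scheme. The patent does not specify the wire format, so the tag/length layout below is purely an assumption made for illustration, not the actual protocol used between the FPGA and the PC.

```python
import struct

SONAR, BINOCULAR = 0x01, 0x02  # illustrative source tags

def pack_frame(source: int, payload: bytes) -> bytes:
    # 1-byte source tag + 4-byte big-endian payload length + payload bytes
    return struct.pack(">BI", source, len(payload)) + payload

def parse_stream(stream: bytes):
    """Yield (source, payload) pairs from a concatenation of frames."""
    offset = 0
    while offset < len(stream):
        source, length = struct.unpack_from(">BI", stream, offset)
        offset += 5
        yield source, stream[offset:offset + length]
        offset += length

merged = pack_frame(SONAR, b"\x10\x22") + pack_frame(BINOCULAR, b"\xff" * 4)
for src, data in parse_stream(merged):
    print("sonar" if src == SONAR else "binocular", len(data))
```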
As shown in Fig. 3, the fusion imaging method of the fusion imaging device of the sonar and binocular vision imaging system comprises the following steps:
S1: the sonar imaging system emits sonar to detect a target object, and when a target object is found, the rigid support is moved towards the target object.
Specifically, the sonar imaging system continuously emits a sonar signal (including but not limited to a single-beam sonar signal) and determines whether a target object has been found by monitoring whether a returned sonar signal is received. When a target object is found and it is still relatively far away, the ship or other carrier is driven so that the rigid support 1 moves towards the target object; if no target object is detected, detection continues.
S2: when the distance between the rigid support and the target object reaches the set distance, the binocular vision imaging system is started, and image signals are acquired simultaneously by the sonar imaging system and the binocular vision imaging system.
Specifically, when the distance between the rigid support and the target object reaches the set distance, the binocular vision imaging system is started. The sonar sensing probe of the sonar imaging system collects the sonar signal reflected by the target object and sends it to the signal processing module for processing, forming sonar image data; the binocular camera of the binocular vision imaging system collects the optoelectronic signal of the target object and sends it to the binocular signal processing module for processing, forming binocular image data. Image signals are acquired simultaneously by the sonar imaging system and the binocular vision imaging system until the scanning of the whole target object is complete.
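A schematic acquisition loop for step S2 is sketched below, assuming the caller supplies callables for sonar ranging and for capturing one frame from each subsystem. The function names, set distance and scan count are illustrative placeholders, not an interface defined by the patent.

```python
def acquire_until_done(read_sonar_range, capture_sonar, capture_binocular,
                       start_distance_m=5.0, scans_needed=100):
    """Start binocular capture once within the set distance, then collect
    sonar and binocular frames together until the scan is complete."""
    sonar_frames, binocular_frames = [], []
    binocular_started = False
    while len(sonar_frames) < scans_needed:
        if not binocular_started and read_sonar_range() <= start_distance_m:
            binocular_started = True  # set distance reached: start the stereo rig
        sonar_frames.append(capture_sonar())
        if binocular_started:
            binocular_frames.append(capture_binocular())
    return sonar_frames, binocular_frames
```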
S3: the fusion imaging processing system fuses the image signals acquired in step S2 to form a fused image.
Specifically, this comprises the following steps:
S3-1: the FPGA processing module of the fusion imaging processing system receives the sonar image data and the binocular image data, merges them, and outputs the result to the parsing and drawing module. Specifically, the two subsystems acquire images separately; the FPGA processing merges the collected data into one stream, writes it into DDR memory, and transmits the data over the network to the PC through an ARM V2 interface.
S3-2: the parsing and drawing module parses the merged data after receiving it, draws the sonar image and the binocular image respectively, and outputs them to the fusion processing module. Specifically, after receiving the data over the network, the PC parses the data separately, draws the images acquired by the two subsystems, and outputs them to the fusion processing module.
S3-3: the fusion processing module traverses each region of the binocular image according to the height, width, abscissa and depth information provided by the sonar image, and fuses them to obtain the fused image.
Specifically, as shown in Fig. 4, this comprises the following steps (a code sketch of these steps is given after the list):
S3-3-1: the coordinates and fields of view of the two images are adjusted so that the target object is in an overlapping state.
Specifically, according to the distance and position relationship between the binocular camera and the sonar sensing probe, the coordinates and fields of view of the sonar image and the binocular image are adjusted using the center position and contour information of the target object, so that the target object is in a substantially overlapping state.
S3-3-2: the approximate height, width, depth and abscissa of the object are obtained from the sonar image.
Specifically, the single-beam sonar imaging system acquires a two-dimensional image which, being influenced by the height of the scanned object, retains part of the object's height information; the single-beam sonar imaging system image is therefore used to provide the binocular vision optoelectronic system with the object's abscissa, width, depth and approximate height information.
S3-3-3: a preliminary image is first obtained by depth matching, and noise points are then filtered out using the width and abscissa.
S3-3-4: the four pieces of information provided by the sonar image are used to traverse the binocular image region by region to obtain the position of the object.
Specifically, the binocular vision optoelectronic system uses the four pieces of information provided by the single-beam sonar imaging system to obtain, by traversal and comparison, the approximate location of the object in the binocular image.
S3-3-5: the image information outside the position of the object is discarded, the position is filtered, noise points are further discarded, and the final fused image is obtained.
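The Python fragment below sketches steps S3-3-1 to S3-3-5 under simplifying assumptions: the two images are taken to be already aligned per step S3-3-1, the binocular subsystem is taken to have already produced a per-pixel depth map, the sonar image is taken to have been reduced to the four target parameters (abscissa, width, depth, rough height), and the depth tolerance and pixel scales are placeholders. It is an illustration of the flow of Fig. 4, not the patent's actual implementation.

```python
import numpy as np

def fuse(binocular_rgb, depth_map,
         sonar_x_px, sonar_width_px, sonar_depth_m, sonar_height_px,
         depth_tol_m=0.5):
    """Mask the binocular image down to the sonar-indicated target region."""
    h, w = depth_map.shape

    # S3-3-3: preliminary mask from depth matching, then constrain it with
    # the accurate abscissa/width that the sonar image provides.
    mask = np.abs(depth_map - sonar_depth_m) <= depth_tol_m
    x0 = max(0, sonar_x_px - sonar_width_px // 2)
    x1 = min(w, sonar_x_px + sonar_width_px // 2)
    column_mask = np.zeros_like(mask)
    column_mask[:, x0:x1] = True
    mask &= column_mask

    # S3-3-4: traverse candidate row bands (the rough height from the sonar
    # only bounds the search) and keep the band with the most depth matches.
    row_hits = mask.sum(axis=1)
    best_top = int(np.argmax(np.convolve(row_hits, np.ones(sonar_height_px),
                                         mode="valid")))
    band = np.zeros_like(mask)
    band[best_top:best_top + sonar_height_px, :] = True
    mask &= band

    # S3-3-5: discard everything outside the located region; further noise-point
    # filtering (e.g. morphological cleanup) would follow here and is omitted.
    fused = np.where(mask[..., None], binocular_rgb, 0)
    return fused, mask
```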
In the present invention, by fusing the sonar imaging system and the binocular vision imaging system, the specific contour, surface information and depth information of an underwater object can be effectively detected, which solves the problem that objects under water, and in deep water in particular, are difficult to perceive, and provides a strong solution for related underwater and deep-water work.
It should be understood that the above embodiments only express preferred embodiments of the present invention; although their description is specific and detailed, they should not therefore be construed as limiting the scope of the claims of the present invention. It should be pointed out that, for a person of ordinary skill in the art, the above technical features may be freely combined and several modifications and improvements may be made without departing from the concept of the invention, and these all fall within the scope of protection of the present invention. Therefore, all equivalent transformations and modifications made within the scope of the claims of the present invention shall fall within the coverage of the claims of the present invention.

Claims (9)

1. A fusion imaging device for a sonar and binocular vision imaging system, characterised by comprising a rigid support, a sonar imaging system, a binocular vision imaging system, and a fusion imaging processing system that is connected to the sonar imaging system and the binocular vision imaging system and fuses their image signals; the sonar imaging system comprises a sonar sensing probe fixedly mounted on the rigid support and a signal processing module communicatively connected to the sonar sensing probe; the binocular vision imaging system comprises a binocular camera fixedly mounted on the rigid support and a binocular signal processing module communicatively connected to the binocular camera;
the fusion imaging processing system comprises an FPGA processing module, a parsing and drawing module, and a fusion processing module;
the FPGA processing module is connected to the signal processing module and the binocular signal processing module, merges the data collected by both, and sends the merged data to the parsing and drawing module;
the parsing and drawing module is communicatively connected to the FPGA processing module, parses the merged data after receiving it, and draws the sonar image and the binocular image respectively;
the fusion processing module is connected to the parsing and drawing module, traverses each region of the binocular image according to the height, width, abscissa and depth information provided by the sonar image, and fuses them to obtain a fused image.
2. The fusion imaging device of a sonar and binocular vision imaging system according to claim 1, characterised in that the sonar sensing probe is a single-beam sonar sensing probe.
3. The fusion imaging device of a sonar and binocular vision imaging system according to claim 1, characterised in that the rigid support comprises a waterproof chamber in which the binocular camera is installed; the waterproof chamber is provided with a transparent panel corresponding to the taking lenses of the binocular camera.
4. The fusion imaging device of a sonar and binocular vision imaging system according to claim 1, characterised in that the binocular camera and the sonar sensing probe are fixedly mounted on the rigid support at a set distance from each other.
5. A fusion imaging method using the fusion imaging device of a sonar and binocular vision imaging system according to any one of claims 1-4, characterised by comprising the following steps:
S1: the sonar imaging system emits sonar to detect a target object, and when a target object is found, the rigid support is moved towards the target object;
S2: when the distance between the rigid support and the target object reaches a set distance, the binocular vision imaging system is started, and image signals are acquired simultaneously by the sonar imaging system and the binocular vision imaging system;
S3: the fusion imaging processing system fuses the image signals acquired in step S2 to form a fused image.
6. The method according to claim 5, characterised in that in step S1 the sonar imaging system emits a single-beam sonar signal to detect the target object.
7. The method according to claim 5, characterised in that step S2 comprises the following steps:
S2-1: the sonar sensing probe of the sonar imaging system collects the sonar signal reflected by the target object and sends it to the signal processing module for processing, forming sonar image data;
S2-2: the binocular camera of the binocular vision imaging system collects the optoelectronic signal of the target object and sends it to the binocular signal processing module for processing, forming binocular image data.
8. The method according to claim 5, characterised in that step S3 comprises the following steps:
S3-1: the FPGA processing module of the fusion imaging processing system receives the sonar image data and the binocular image data, merges them, and outputs the result to the parsing and drawing module;
S3-2: the parsing and drawing module parses the merged data after receiving it, draws the sonar image and the binocular image respectively, and outputs them to the fusion processing module;
S3-3: the fusion processing module traverses each region of the binocular image according to the height, width, abscissa and depth information provided by the sonar image, and fuses them to obtain a fused image.
9. The method according to claim 8, characterised in that in step S3-3 the sonar image and the binocular image are adjusted to an overlapping state according to the distance and position relationship between the binocular camera and the sonar sensing probe, using the center position and contour information of the target object.
CN201510180927.2A 2015-04-16 2015-04-16 Fusion imaging device and method for a sonar and binocular vision imaging system Active CN104808210B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510180927.2A CN104808210B (en) 2015-04-16 2015-04-16 Fusion imaging device and method for a sonar and binocular vision imaging system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510180927.2A CN104808210B (en) 2015-04-16 2015-04-16 Fusion imaging device and method for a sonar and binocular vision imaging system

Publications (2)

Publication Number Publication Date
CN104808210A CN104808210A (en) 2015-07-29
CN104808210B 2017-07-18

Family

ID=53693197

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510180927.2A Active CN104808210B (en) 2015-04-16 2015-04-16 Fusion imaging device and method for a sonar and binocular vision imaging system

Country Status (1)

Country Link
CN (1) CN104808210B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105222760A * 2015-10-22 2016-01-06 一飞智控(天津)科技有限公司 Autonomous obstacle detection system and method for an unmanned aerial vehicle based on binocular vision
CN106908778A * 2017-04-18 2017-06-30 上海达华测绘有限公司 Detection system and detection method
CN108492323B * 2018-01-18 2022-01-28 天津大学 Underwater moving object detection and identification method fusing machine vision and hearing
CN109143247B * 2018-07-19 2020-10-02 河海大学常州校区 Three-eye underwater detection method for acousto-optic imaging
CN109298430B * 2018-08-08 2020-10-27 西安交通大学 Underwater composite bionic detection device and detection target fusion identification method
CN109443545A * 2018-11-28 2019-03-08 深圳市乾行达科技有限公司 Fault location system and method
CN109788163A * 2019-03-26 2019-05-21 南京砺剑光电技术研究院有限公司 Fusion imaging device combining a two-dimensional sonar with laser range-gated imaging equipment
CN109884642B * 2019-03-26 2022-12-13 南京砺剑光电技术研究院有限公司 Fusion imaging method using multi-beam sonar and laser-assisted illumination imaging equipment
CN112113506A * 2020-08-31 2020-12-22 天津蓝鳍海洋工程有限公司 Underwater moving object measuring device and method based on deep learning

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN204740344U (en) * 2015-04-16 2015-11-04 深圳大学 Sonar and binocular vision imaging system's integration image device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005033629A2 (en) * 2003-09-19 2005-04-14 University Of Miami Multi-camera inspection of underwater structures
US8213740B1 (en) * 2009-05-18 2012-07-03 The United States Of America, As Represented By The Secretary Of The Navy Coherent image correlation
CN102042835B (en) * 2010-11-05 2012-10-24 中国海洋大学 Autonomous underwater vehicle combined navigation system
CN204228171U * 2014-11-19 2015-03-25 山东华盾科技股份有限公司 Underwater robot guiding device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN204740344U (en) * 2015-04-16 2015-11-04 深圳大学 Sonar and binocular vision imaging system's integration image device

Also Published As

Publication number Publication date
CN104808210A (en) 2015-07-29


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
EXSB Decision made by sipo to initiate substantive examination
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information
Inventor after: Xu Yuan; Huang Weixin; Zhang Zhiqiang; Wang Yazhou; He Fan
Inventor before: Xu Yuan; Zhang Zhiqiang; Huang Weixin; Wang Yazhou; He Fan
COR Change of bibliographic data
GR01 Patent grant