KR101580956B1 - Sonar image emulator and method for sonar image forecast using the same - Google Patents


Info

Publication number
KR101580956B1
Authority
KR
South Korea
Prior art keywords
information
intersection
view box
unit
sonar image
Prior art date
Application number
KR1020140076600A
Other languages
Korean (ko)
Inventor
유선철
구정회
강동중
김정현
Original Assignee
포항공과대학교 산학협력단 (POSTECH Academy-Industry Foundation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 포항공과대학교 산학협력단 (POSTECH Academy-Industry Foundation)
Priority to KR1020140076600A
Application granted granted Critical
Publication of KR101580956B1


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/88 Sonar systems specially adapted for specific applications
    • G01S 15/89 Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S 15/8906 Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S 15/8993 Three dimensional imaging systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/52 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S 7/52017 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S 7/52053 Display arrangements
    • G01S 7/52057 Cathode ray tube displays
    • G01S 7/52068 Stereoscopic displays; Three-dimensional displays; Pseudo 3D displays
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/005 General purpose rendering architectures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2200/00 Indexing scheme for image data processing or generation, in general
    • G06T 2200/04 Indexing scheme for image data processing or generation, in general involving 3D image data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10132 Ultrasound image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20021 Dividing image into blocks, subimages or windows

Abstract

Sonar image emulator and sonar image prediction method using the same.
The sonar image emulator includes an ultrasonic model generating unit for generating an ultrasonic model including a plurality of straight lines starting from an ultrasonic source, an object model generating unit for generating a target object as a three-dimensional solid object, a dividing unit for dividing the object into a plurality of slices, a modeling unit for obtaining intersection information between the plurality of slices and the plurality of straight lines, and a rendering unit for rendering a sonar image of the object using the intersection information.

Description

SONAR IMAGE EMULATOR AND METHOD FOR SONAR IMAGE FORECAST USING THE SAME

The present invention relates to a sonar image emulator and a sonar image predicting method using the same.

Sonar (SOund NAvigation and Ranging) is an acoustic detection technique that determines the direction and distance of an underwater target by means of sound waves.

In general, sonar is divided into passive sonar and active sonar. Passive sonar merely receives the sound waves radiated by the target and infers the target's bearing and distance from them. Active sonar, by contrast, emits a sound wave into the water and receives the echo reflected from an underwater object, thereby determining the object's bearing and its distance from the source.

An imaging sonar is an object recognition system that applies this operating principle to survey the surrounding underwater environment, and it is widely used for underwater exploration.

Because an imaging sonar acquires an image of an object using sound waves, the resulting sonar image looks quite different from an optical image of the same object.

A sonar image also contains significant noise inherent to acoustic sensing. It is therefore difficult to obtain a high-quality sonar image of an object detected in the water.

Conventionally, when a specific object is to be detected in the water, the user visually inspects the sonar image obtained through the sonar to identify the target. In this case, detection accuracy depends on the skill of the user.

An object of the present invention is to provide a sonar image emulator that predicts the sonar image of a detection target and thereby enables high-quality sonar images to be acquired, and a sonar image prediction method using the same.

According to an aspect of the present invention, there is provided a sonar image emulator including an ultrasound model generating unit for generating an ultrasound model including a plurality of straight lines starting from an ultrasound source, an object model generating unit for generating a target object as a three-dimensional solid object, a division unit for dividing the object into a plurality of slices, a modeling unit for obtaining intersection information between the plurality of slices and the plurality of straight lines, and a rendering unit for rendering a sonar image of the object using the intersection information.

According to another aspect of the present invention, there is provided a sonar image prediction method for a sonar image emulator, including the steps of generating an ultrasonic model including a plurality of straight lines starting from an ultrasonic source, generating a target object as a three-dimensional solid object, dividing the object into a plurality of slices, obtaining intersection information between the plurality of slices and the plurality of straight lines, and rendering the sonar image of the object using the intersection information.

Further, a recording medium on which a program for executing the sonar image prediction method of the present invention is recorded may be included.

According to embodiments of the present invention, it is possible to predict a high quality sonar image for a target object.

In addition, it is possible to acquire a high-quality sonar image by correcting the sonar image actually acquired in water based on the high-quality sonar image predicted through the sonar image emulator.

In addition, when a specific object is to be detected using the sonar system, the sonar image to be detected may be predicted in advance, and the predicted sonar image may be provided to the user, so that the user can easily and accurately identify the detection target.

FIG. 1 is a structural diagram of a sonar image emulator according to an embodiment of the present invention.
FIG. 2 is a view for explaining a method of dividing an object using a view box in a sonar image emulator according to an embodiment of the present invention.
FIG. 3 is a diagram for explaining a method of acquiring intersection information in a sonar image emulator according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating a method of predicting a sonar image of a sonar image emulator according to an embodiment of the present invention.

Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings, so that those skilled in the art to which the present invention pertains can readily practice them. The present invention may be embodied in many different forms and is not limited to the embodiments described herein.

In order to clearly illustrate the embodiments of the present invention, portions that are not related to the description are omitted, and the same or similar components are denoted by the same reference numerals throughout the specification.

Throughout the specification, when a part is said to be "connected" to another part, this includes not only being "directly connected" but also being "electrically connected" with another element in between. Also, when a part is said to "comprise" a component, this means that it may further include other components, rather than excluding them, unless specifically stated otherwise.

Hereinafter, a sonar image emulator according to an embodiment of the present invention and a sonar image prediction method using the same will be described with reference to the drawings.

FIG. 1 is a block diagram illustrating a sonar image emulator according to an embodiment of the present invention. FIG. 2 is a diagram for explaining a method of dividing an object through a view box in a sonar image emulator according to an embodiment of the present invention. FIG. 3 is a diagram for explaining a method of acquiring intersection information in a sonar image emulator according to an embodiment of the present invention.

Referring to FIG. 1, a sonar image emulator 100 according to an exemplary embodiment of the present invention includes an ultrasound model generating unit 110, an object model generating unit 120, a dividing unit 130, a modeling unit 140, and a rendering unit 150. In addition, the sonar image emulator 100 may further include a user input unit 160, a display 170, and the like. The components shown in FIG. 1 are not all essential, so the sonar image emulator 100 according to an embodiment of the present invention may be implemented with more or fewer components.

The ultrasound model generating unit 110 generates an ultrasound model based on the intrinsic characteristics of the ultrasound waves.

The ultrasound model generating unit 110 may generate an ultrasound model including a plurality of straight lines starting from a point corresponding to the ultrasound source. Each straight line constituting the ultrasound model extends in a different direction, with the point corresponding to the ultrasound source as the origin. The ultrasound model generating unit 110 can determine the position of the source of the ultrasound model and the direction of each of the straight lines constituting the model based on the control input received through the user input unit 160.
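
For illustration, the following non-limiting sketch shows how such a ray-based ultrasound model could be constructed. It assumes Python with NumPy; the function name, beam counts, and field-of-view values are hypothetical choices, not values from the disclosure:

```python
import numpy as np

def make_ultrasound_model(n_azimuth=96, n_elevation=16,
                          az_fov=np.radians(30.0), el_fov=np.radians(15.0)):
    """Illustrative sketch: build unit direction vectors for a fan of straight
    lines (rays) leaving the ultrasound source, which sits at the origin of
    the virtual three-dimensional space."""
    az = np.linspace(-az_fov / 2, az_fov / 2, n_azimuth)    # azimuth per ray
    el = np.linspace(-el_fov / 2, el_fov / 2, n_elevation)  # elevation per ray
    theta, phi = np.meshgrid(az, el, indexing="ij")
    # Each ray is a unit vector; the x axis is taken as the acoustic axis.
    dirs = np.stack([np.cos(phi) * np.cos(theta),
                     np.cos(phi) * np.sin(theta),
                     np.sin(phi)], axis=-1)
    return dirs.reshape(-1, 3), theta.ravel(), phi.ravel()
```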

The object model generation unit 120 generates a 3D object corresponding to an arbitrary object. To this end, the object model generation unit 120 may receive shape information, position information, attitude information, and the like of an object necessary for creating a stereoscopic object through the user input unit 160. Hereinafter, for convenience of description, the stereoscopic object generated by the object model generation unit 120 is referred to as a 'target object'.

When the shape information of an object is input through the user input unit 160, the object model generation unit 120 can generate the target object by passing the shape information to a graphics library. Here, the graphics library may be OpenGL (Open Graphics Library) or the like. The object may be formed as a hollow shell whose interior is empty.

The object model generation unit 120 can set the position of the object in the virtual three-dimensional space based on the position information of the object. Likewise, when the attitude information of the object is input, the object model generation unit 120 can set the attitude of the object in the virtual three-dimensional space based on that information.

In the virtual three-dimensional space, the origin of the coordinate system serving as the reference for positioning the object may correspond to the position of the ultrasonic source of the ultrasonic model generated by the ultrasonic model generation unit 110. The object model generation unit 120 can set the distance between the object and the ultrasonic source based on the position information of the object input through the user input unit 160. Likewise, based on the attitude information input through the user input unit 160, the object model generation unit 120 can set the angle that the object forms with the ultrasonic source, that is, the angle at which the ultrasonic waves radiated from the source strike the object.
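
A minimal sketch of this placement step follows, assuming the object is given as mesh vertices and the attitude as yaw/pitch/roll angles (both representational assumptions; the disclosure does not fix a format):

```python
import numpy as np

def place_object(vertices, position, yaw, pitch, roll):
    """Illustrative sketch: map object-frame vertices into the virtual
    three-dimensional space whose origin is the ultrasonic source, using the
    position and attitude information supplied through the user input unit."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw about z
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about y
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about x
    R = Rz @ Ry @ Rx                                        # object attitude
    # Rotate into the source-centered frame, then translate to the position,
    # which also fixes the object-to-source distance and incidence angle.
    return np.asarray(vertices) @ R.T + np.asarray(position)
```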

The partitioning unit 130 divides the stereoscopic object generated by the object model generation unit 120, that is, the object, into a plurality of slices.

The partitioning unit 130 can divide the object into a plurality of slices using a view box. The view box is a virtual three-dimensional structure that encloses a part of the object within the virtual three-dimensional space in which the target object is placed, and it may take the shape of a box, a cylinder, or the like.

Referring to FIG. 2, the view box 7 encloses a part of the object 5, and the partitioning unit 130 acquires the part enclosed in the view box 7 as one slice 5a. To divide the entire object 5 into a plurality of slices, the partitioning unit 130 moves the view box 7 until it has passed through the whole of the object 5, acquiring the part enclosed in the view box 7 as a slice at each step. In this process, the partitioning unit 130 can move the view box 7 away from or toward the ultrasonic source S, that is, the origin of the coordinate system. In addition, the partitioning unit 130 may move the view box by the thickness of the view box at each step, so that the object is divided into non-overlapping slices each as thick as the view box.

The width of the view box can be made greater than the maximum width of the object, so that every cross-section of the object fits inside the view box. In addition, the thickness of the view box can be made no smaller than the coordinate resolution of the partitioning unit 130, that is, the minimum distance that the partitioning unit 130 can distinguish within the virtual three-dimensional space.

As the thickness of the view box decreases, each slice of the object becomes thinner. The thinner the slices, the more accurately the modeling unit 140, described later, can locate intersections on the surface of the object, and the higher the quality of the sonar image obtained from that intersection information. Conversely, the thicker the slices, the less computation is required to calculate the intersection information. The thickness of the view box can therefore be tuned to the system performance of the sonar image emulator 100.
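
The slicing mechanics can be sketched as follows, assuming an axis-aligned view box and an object represented by surface sample points (a simplification of the disclosed hollow solid; the names are hypothetical):

```python
import numpy as np

def slice_with_view_box(vertices, box_thickness, axis=0):
    """Illustrative sketch: step a view box of fixed thickness away from the
    source along one axis and collect, at each step, the surface points it
    encloses. Stepping by exactly the box thickness yields slices that
    neither overlap nor leave gaps."""
    vertices = np.asarray(vertices)
    lo = vertices[:, axis].min()
    hi = vertices[:, axis].max() + 1e-9   # include points on the far face
    slices = []
    near = lo
    while near < hi:                      # until the box has passed the object
        far = near + box_thickness
        inside = (vertices[:, axis] >= near) & (vertices[:, axis] < far)
        slices.append(vertices[inside])
        near = far                        # move by the box thickness
    return slices
```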

The angle that the view box forms with the ultrasonic source can be set by the user. That is, the partitioning unit 130 can set the angle formed by the view box with the ultrasonic source based on the control input received through the user input unit 160.

The partitioning unit 130 can sequentially project the slices of the object onto the display 170 using the view box. Since the target object generated by the object model generation unit 120 is a hollow three-dimensional object, the cross-section cut by the view box appears on the display 170 as an edge region of a certain thickness formed by the cut surface of the object.

The modeling unit 140 calculates intersection information between the ultrasonic model and each slice obtained by the dividing unit 130.

When the ultrasound model is generated, the ultrasound model generating unit 110 maps the ultrasound source and the position and orientation of each straight line constituting the model to the coordinate system of the virtual three-dimensional space. For example, the ultrasound model generating unit 110 may represent each straight line constituting the ultrasound model as a vector starting from the origin, with the ultrasound source placed at the origin.

When the object is generated, the object model generation unit 120 maps the object to the coordinate system of the same virtual three-dimensional space to which the ultrasonic model is mapped. That is, the object is placed in the virtual three-dimensional space whose origin is the ultrasonic source.

The partitioning unit 130 may divide the object into a plurality of slices by mapping the view box to the coordinate system of the virtual three-dimensional space to which the ultrasonic model and the object are mapped.

As described above, since the ultrasonic model, the object, and the view box are all mapped to the same three-dimensional space, the modeling unit 140 can calculate the intersection information between the ultrasonic model and each slice.

Referring to FIG. 3, the modeling unit 140 extracts the intersection P1 between a straight line S1 constituting the ultrasonic model and the edge region of each slice 5a, and obtains intersection information for P1 including the distance from the ultrasonic source S to the intersection and the angle the intersection forms with respect to the source.

The modeling unit 140 can calculate the distance information (d) between each slice 5a and the ultrasonic source S from the position of the view box corresponding to each slice 5a and the angle the view box forms with the ultrasonic source S, together with the coordinate information of the edge region corresponding to the surface of the object in each slice 5a. It can also detect which of the straight lines constituting the ultrasonic model cross the edge region of the slice 5a and obtain the direction of each intersection P1 from the orientation information (θ, φ) of the crossing straight line, thereby completing the intersection information.
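
A non-limiting sketch of this intersection step, assuming each slice is reduced to surface sample points and the slice plane is known from the view-box position (the tolerance and all names are hypothetical):

```python
import numpy as np

def intersections_for_slice(dirs, thetas, phis, slice_points, plane_x, tol=0.01):
    """Illustrative sketch: intersect each straight line of the ultrasonic
    model with the plane of one slice (x = plane_x) and keep the lines whose
    hit point falls on the slice's edge region, i.e. the object surface.
    Returns (distance d, theta, phi) per intersection point P1."""
    hits = []
    if len(slice_points) == 0:
        return hits
    for d_vec, th, ph in zip(dirs, thetas, phis):
        if d_vec[0] <= 1e-9:              # this ray never reaches the plane
            continue
        t = plane_x / d_vec[0]            # ray parameter at the slice plane
        p = t * d_vec                     # hit point; the source is the origin
        # Edge test: the hit point must lie close to some surface sample.
        if np.min(np.linalg.norm(slice_points - p, axis=1)) < tol:
            hits.append((t, th, ph))      # |p| = t because d_vec has unit norm
    return hits
```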

Referring again to FIG. 1, the rendering unit 150 merges the intersection information that the modeling unit 140 calculated between the ultrasonic model and the plurality of slices constituting the object, and renders the sonar image of the object. The generated sonar image is displayed on the screen through the display 170. It can also be stored in an internal memory (not shown) or an external memory (not shown).
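
The merging step could be sketched as accumulating the per-slice intersections into a range/bearing image, a common forward-looking-sonar layout (the resolution and field-of-view parameters below are assumptions, not values from the disclosure):

```python
import numpy as np

def render_sonar_image(all_hits, n_range=512, n_beam=96,
                       max_range=10.0, az_fov=np.radians(30.0)):
    """Illustrative sketch: merge the (distance, theta, phi) intersection
    information of every slice into a single range/bearing intensity image."""
    img = np.zeros((n_range, n_beam))
    for d, th, _ph in all_hits:
        r_bin = int(d / max_range * (n_range - 1))
        b_bin = int((th + az_fov / 2) / az_fov * (n_beam - 1))
        if 0 <= r_bin < n_range and 0 <= b_bin < n_beam:
            img[r_bin, b_bin] += 1.0      # accumulate echo returns per cell
    return img / max(img.max(), 1e-9)     # normalize intensities to [0, 1]
```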

FIG. 4 is a flowchart illustrating a method of generating a sonar image in a sonar image emulator according to an embodiment of the present invention.

Referring to FIG. 4, when the operation of the sonar image emulator 100 starts, the ultrasound model generating unit 110 generates an ultrasound model, and the object model generation unit 120 generates a hollow target object based on the control input received through the user input unit 160 (S100).

The ultrasound model generated in step S100 and the object may be mapped to a virtual three-dimensional space.

The partitioning unit 130 creates a view box within the virtual three-dimensional space in which the ultrasonic model and the object are placed (S110). The partitioning unit 130 then acquires the part of the object enclosed in the view box as one slice (S120). Here, the partitioning unit 130 sets the position and angle of the view box so that a part of the object is enclosed in it.

When one slice is obtained in step S120, the partitioning unit 130 can project the slice on the screen through the display 170. On the projected screen, the surface cut from the object by the view box forms the edge area of the slice.

When the slice is obtained by the division unit 130, the modeling unit 140 acquires the intersection information between the obtained slice and the ultrasonic model (S130).

In step S130, the modeling unit 140 extracts the intersections between each straight line constituting the ultrasonic model and the edge area of the slice, which corresponds to the surface of the object. It then calculates the distance and angle between each extracted intersection and the ultrasonic source to generate the intersection information.

When the intersection information acquisition for one slice is completed, the partitioning unit 130 moves the view box in a specific direction (S140).

In step S140, the partitioning unit 130 moves the view box by its own thickness at each step, so that consecutive slices are acquired without overlapping or missing regions between them.

The sonar image emulator 100 repeats steps S120 through S140 until the view box has passed through the entire object (S150). In this way, the sonar image emulator 100 divides the object into a plurality of slices and acquires intersection information with the ultrasonic model for all of the slices.
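
Putting the previous sketches together, a hypothetical end-to-end driver following steps S100 through S160 of FIG. 4 might look as follows (the random point cloud is a stand-in for a real object model):

```python
import numpy as np

# S100: generate the ultrasonic model and a stand-in target object.
dirs, thetas, phis = make_ultrasound_model()
rng = np.random.default_rng(0)
object_points = rng.uniform(-0.5, 0.5, size=(2000, 3))   # toy surface samples
verts = place_object(object_points, position=(5.0, 0.0, 0.0),
                     yaw=0.3, pitch=0.0, roll=0.0)

# S110-S150: move the view box through the object, slice by slice,
# collecting intersection information for every slice.
box_t = 0.05
x0 = verts[:, 0].min()
hits = []
for i, sl in enumerate(slice_with_view_box(verts, box_t)):
    if len(sl):
        hits += intersections_for_slice(dirs, thetas, phis, sl,
                                        plane_x=x0 + (i + 0.5) * box_t)

# S160: merge the intersection information into the predicted sonar image.
sonar = render_sonar_image(hits)
```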

The rendering unit 150 merges the intersection information of each slice divided from the object to generate a sonar image (S160).

The sonar image produced by the sonar image emulator 100 according to an embodiment of the present invention, or by the sonar image generation method using the same, can be used to identify an object to be detected in a sonar system or to obtain a high-quality sonar image.


The sonar image prediction method according to an embodiment of the present invention can be implemented in software. When implemented in software, the constituent means of the present invention are code segments that perform the necessary tasks. The program or the code segments may be stored on a processor-readable medium or transmitted as a computer data signal combined with a carrier wave over a transmission medium or a communication network.

A computer-readable recording medium includes any kind of recording device in which data readable by a computer system is stored. Examples of such recording devices include ROM, RAM, CD-ROM, DVD-ROM, DVD-RAM, magnetic tape, floppy disks, hard disks, and optical data storage devices. The computer-readable recording medium may also be distributed over network-connected computer systems so that the computer-readable code is stored and executed in a distributed manner.

The foregoing description of the present invention is illustrative and explanatory only and is not to be construed as limiting the scope of the invention, which is defined by the appended claims. Those skilled in the art can readily substitute the components described herein, may omit some of them without degrading performance, or may add components to improve performance. Those skilled in the art may also change the order of the method steps described herein depending on the process environment or equipment. Therefore, the scope of the present invention should be determined by the appended claims and their equivalents, not by the embodiments described.

100: Sonar image emulator 110: Ultrasonic model generating unit
120: object model generation unit 130: dividing unit
140: modeling unit 150: rendering unit
160: user input unit 170: display

Claims (20)

A sonar image emulator comprising:
a user input unit;
an ultrasonic model generating unit for generating an ultrasonic model including a plurality of straight lines extending in different directions from the origin in a virtual three-dimensional space;
an object model generating unit for generating a three-dimensional solid object corresponding to a target object in the three-dimensional space based on shape information of the object input through the user input unit;
a dividing unit for dividing the three-dimensional solid object into a plurality of slices;
a modeling unit for obtaining intersection information between the plurality of slices and each straight line included in the ultrasonic model; and
a rendering unit for rendering a sonar image of the object using the intersection information.
The sonar image emulator according to claim 1,
Wherein the object model generation unit generates the three-dimensional solid object in a hollow form.
3. The sonar image emulator of claim 2,
Wherein an edge area of each of the plurality of slices corresponds to a surface of the object.
The sonar image emulator of claim 3,
Wherein the modeling unit obtains an intersection where each straight line included in the ultrasonic model crosses the edge region,
Wherein the intersection information includes distance information between the intersection and the origin, and angle information formed by the intersection with respect to the origin.
(Claim deleted)
5. The sonar image emulator of claim 4,
And the modeling unit obtains the angle information from the azimuth information of the straight line intersecting the intersection among the plurality of straight lines.
5. The sonar image emulator of claim 4,
Wherein the dividing unit divides the three-dimensional solid object into the plurality of slices using a view box.
8. The sonar image emulator of claim 7,
Wherein the dividing unit sequentially moves the view box in a predetermined direction and acquires a part of the three-dimensional solid object accommodated in the view box as one slice.
9. The sonar image emulator of claim 8,
Wherein the dividing unit sequentially moves the view box by a thickness of the view box.
9. The sonar image emulator of claim 8,
Wherein the modeling unit calculates the distance between each slice and the origin based on the position of the view box and calculates the distance information based on the distance between each slice and the origin.
A sonar image prediction method of a sonar image emulator, the method comprising:
generating an ultrasonic model including a plurality of straight lines extending in different directions from the origin, with the ultrasonic source as the origin, within a virtual three-dimensional space;
receiving shape information of a target object;
generating a three-dimensional solid object corresponding to the object within the three-dimensional space based on the shape information;
dividing the three-dimensional solid object into a plurality of slices;
obtaining intersection information between the plurality of slices and each straight line included in the ultrasonic model; and
rendering the sonar image of the object using the intersection information.
12. The method of claim 11,
Wherein the three-dimensional object is hollow.
13. The method of claim 12,
Wherein an edge area of each of the plurality of slices corresponds to a surface of the object.
14. The method of claim 13,
Wherein the acquiring comprises:
Detecting an intersection where each straight line included in the ultrasonic model crosses the edge region,
Calculating distance information between the intersection and the origin, and
And obtaining angle information formed by the intersection with respect to the origin,
Wherein the intersection information includes the distance information and the angle information.
(Claim deleted)
15. The method of claim 14,
Wherein the obtaining of the angle information comprises:
And obtaining the angle information from the azimuth information of the straight line intersecting the intersection among the plurality of straight lines.
15. The method of claim 14,
Wherein the dividing step comprises:
And dividing the three-dimensional solid object into the plurality of slices using a view box.
18. The method of claim 17,
Wherein the dividing step comprises:
Acquiring a part of the three-dimensional solid object accommodated in the view box as one slice,
Moving the view box by a thickness of the view box, and
Repeating the acquiring of one slice and the moving of the view box until the view box has passed through the entire three-dimensional solid object.
19. The method of claim 18,
The step of calculating the distance information includes:
Calculating a distance between each slice and the origin based on the position of the view box, and
And calculating the distance information based on the distance between each slice and the origin.
A recording medium on which a program for executing the method of any one of claims 11 to 14 and 16 to 19 is recorded.
KR1020140076600A 2014-06-23 2014-06-23 Sonar image emulator and method for sonar image forecast using the same KR101580956B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020140076600A KR101580956B1 (en) 2014-06-23 2014-06-23 Sonar image emulator and method for sonar image forecast using the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020140076600A KR101580956B1 (en) 2014-06-23 2014-06-23 Sonar image emulator and method for sonar image forecast using the same

Publications (1)

Publication Number Publication Date
KR101580956B1 2015-12-30

Family

ID=55088087

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020140076600A KR101580956B1 (en) 2014-06-23 2014-06-23 Sonar image emulator and method for sonar image forecast using the same

Country Status (1)

Country Link
KR (1) KR101580956B1 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100077841A (en) * 2008-12-29 2010-07-08 한국과학기술연구원 Method for determining intersection between the objects using image of the configuration of the objects
JP2012005563A (en) * 2010-06-23 2012-01-12 Hitachi Aloka Medical Ltd Ultrasonic data processor
JP2013119035A (en) * 2011-12-08 2013-06-17 General Electric Co <Ge> Ultrasonic image formation system and method
JP2014087671A (en) * 2013-12-09 2014-05-15 Hitachi Medical Corp Ultrasonic diagnostic apparatus

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190138446A (en) * 2018-06-05 2019-12-13 국방과학연구소 Edge Enhancement Method and Apparatus based on Curvelet Transform for Object Recognition at Sonar Image
KR102096532B1 (en) * 2018-06-05 2020-04-02 국방과학연구소 Edge Enhancement Method and Apparatus based on Curvelet Transform for Object Recognition at Sonar Image


Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20181017

Year of fee payment: 4