CN110297235B - Radar and sonar data comprehensive processing method - Google Patents

Radar and sonar data comprehensive processing method

Info

Publication number
CN110297235B
CN110297235B
Authority
CN
China
Prior art keywords
data
radar
sonar
display
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910557542.1A
Other languages
Chinese (zh)
Other versions
CN110297235A (en)
Inventor
冯影
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zhongke Haixun Digital Technology Co ltd
Original Assignee
Beijing Zhongke Haixun Digital Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zhongke Haixun Digital Technology Co ltd filed Critical Beijing Zhongke Haixun Digital Technology Co ltd
Priority to CN201910557542.1A priority Critical patent/CN110297235B/en
Publication of CN110297235A publication Critical patent/CN110297235A/en
Application granted granted Critical
Publication of CN110297235B publication Critical patent/CN110297235B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/862Combination of radar systems with sonar systems
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract

The invention exploits the fact that radar and sonar detect targets in different regions: it collects and processes radar and sonar data and displays them in the same interface, so that the targets detected by the two different devices can be compared intuitively and clearly, helping operators judge targets in the navigation area more quickly and accurately and improving target-analysis efficiency. The method comprises the following steps: step one, receiving the video data transmitted from the radar and sonar lower computers separately over the UDP communication protocol; step two, preprocessing the radar and sonar data by allocating a dedicated memory region for each and storing the received data into it; step three, comprehensive data display, fusing the radar picture and the sonar picture onto the same screen with a graphics development tool; and step four, target detection, judging the target type from what the integrated interface shows.

Description

Radar and sonar data comprehensive processing method
Technical Field
The invention belongs to the fields of navigation, offshore target detection, underwater target positioning, and the like, and relates to a method for the integrated processing and display of radar and sonar data.
Background
A marine radar mainly uses electromagnetic waves to detect and locate sea-surface and airborne targets, while a sonar mainly uses underwater acoustic waves to detect and locate surface and underwater targets. At present, the display-and-control consoles of radar and sonar equipment are independent, so radar and sonar data cannot be compared intuitively and clearly, and it cannot be judged quickly and accurately whether a target lies on the water or under it; target analysis is therefore slow. To judge and locate targets more accurately and rapidly, the invention provides a method for the integrated display of radar and sonar data.
Disclosure of Invention
1. Technical problem to be solved by the invention
The technical problem the invention aims to solve is to provide a method for the integrated display of radar and sonar data. The method exploits the fact that radar and sonar detect targets in different regions, collects and processes the radar and sonar data, and displays both under the same interface, so that the targets detected by the two different devices can be compared intuitively and clearly. This helps operators judge targets in the navigation area more quickly and accurately and improves target-analysis efficiency.
2. Technical solution adopted by the invention
To solve the above technical problem, the invention is realized as follows. A method for comprehensively processing radar and sonar data comprises the following steps:
1) Acquiring radar and sonar data. The video data transmitted from the radar and sonar lower computers are received separately over the UDP communication protocol.
Specifically, each radar data packet contains 2320 samples taken at different ranges along one bearing, and each sonar data packet contains 256 sonar samples covering a 180-degree sector acquired at one instant.
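For illustration only, the packet payloads just described could be modeled as follows; this is a sketch under assumptions, since the struct names, the index fields, and the 8-bit sample width are not specified by the patent.

```cpp
#include <array>
#include <cstdint>

// Hypothetical payload layouts matching the sizes stated above; the field
// names, the index fields, and the 8-bit sample width are assumptions.
struct RadarPacket {
    uint16_t azimuthIndex;              // which of the 4096 bearings this sweep belongs to
    std::array<uint8_t, 2320> samples;  // 2320 intensity samples along range for one bearing
};

struct SonarPacket {
    uint16_t timeIndex;                 // acquisition instant (one of up to 496 time units)
    std::array<uint8_t, 256> beams;     // 256 intensity samples covering the 180-degree sector
};
```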
2) Preprocessing the radar and sonar data. A dedicated memory region is allocated for the radar data and another for the sonar data, and the received data are stored into them.
Preferably, the radar divides the 360-degree azimuth into 4096 sectors and the received radar data are stored sequentially into a memory block of size 4096×2320; the longest sonar display time corresponds to 496 time units, and the received sonar data are stored sequentially into a memory block of size 496×256.
3) Comprehensive data display.
The radar picture and the sonar picture are fused and displayed on the same screen using the OpenGL graphics development tool.
Specifically, two fusion display schemes are employed: surround and superimposed. In the surround display, the radar picture is the center and the sonar display wraps around the outside of the radar scanning area, matched by azimuth; in the superimposed display, the radar picture is again the center and the sonar display is shown inside the radar scanning area as a base map or background, matched by azimuth.
4) Target detection. The target type is judged mainly from what the integrated interface shows. If the target is a surface target, both the radar and the sonar detect it, and the high azimuth accuracy of the radar, matched by bearing, compensates for the sonar's weakness; if the target is underwater, only the sonar detects it and the radar shows no echo on the same bearing, so the target can be determined to be underwater; similarly, if the target is airborne, only the radar detects it and the sonar shows no echo on the same bearing, so the target can be determined to be airborne.
The originality of the invention:
(1) The idea of an integrated radar-sonar data display is proposed.
(2) Two methods for the integrated radar-sonar display are designed: surround and superimposed.
The invention is characterized in that:
(1) According to the invention, radar and sonar data are displayed on the same interface, so that the target conditions detected by the two devices can be intuitively and clearly compared.
(2) The integrated display method is divided into modules by function; its structure is clear, each part can be designed and modified independently, and it is easy to develop, port, and implement.
3. Beneficial effects of the invention
With this technical scheme, the integrated radar-sonar display-and-control device fuses the radar and sonar displays, allows their data to be compared clearly, locates the type of a target quickly and accurately, reduces misjudgment, and improves overall awareness of the surrounding sea-area situation.
Drawings
FIG. 1 is a flow chart of the present invention
FIG. 2 is a schematic diagram of the surround display in the display method of the present invention:
1: menu for sonar operation and control;
2: radar range and range-ring selection menus;
3: sonar display area in the surround integrated display mode;
4: right-side display area for radar cursor information and own-ship information;
5: right-side table page displaying the integrated radar and sonar target data;
6: radar display area in the surround integrated display mode;
7: current time and radar control menu;
8: heading line;
FIG. 3 is a schematic diagram of the superimposed display in the display method of the present invention:
9: left-side interface-switching menu;
10: sonar display data in the superimposed integrated display mode;
11: radar display data in the superimposed integrated display mode;
FIG. 4 shows a radar-to-sonar surround ratio of 1:1
FIG. 5 shows a surround ratio of 2:1
FIG. 6 shows a surround ratio of 3:1
Detailed Description
A radar and sonar data comprehensive processing method comprises the following steps:
Step one, acquiring radar and sonar data: the radar signal comes from the radar processor; it is a video signal obtained by signal and data processing of the raw radar signal and is transmitted to the display-and-control terminal over the UDP communication protocol. The sonar data come from the sonar processor, are likewise a video signal, and are also transmitted to the display-and-control terminal over UDP. The display-and-control terminal sets up the UDP connections with sockets, and the radar data and sonar data are received on different network ports.
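As a minimal sketch of this step, the display-and-control terminal could open one UDP socket per sensor; the port numbers and the POSIX socket API shown here are illustrative assumptions, since the patent only states that sockets and separate network ports are used.

```cpp
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <cstddef>
#include <cstdint>
#include <vector>

// Open a UDP socket bound to the given local port (error handling omitted).
int openUdpPort(uint16_t port) {
    int fd = socket(AF_INET, SOCK_DGRAM, 0);
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(port);
    bind(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr));
    return fd;
}

// Receive one datagram (one radar sweep or one sonar ping) from a socket.
std::vector<uint8_t> receiveDatagram(int fd) {
    std::vector<uint8_t> buf(65536);  // upper bound on a UDP payload
    ssize_t n = recvfrom(fd, buf.data(), buf.size(), 0, nullptr, nullptr);
    buf.resize(n > 0 ? static_cast<std::size_t>(n) : 0);
    return buf;
}

// Usage sketch: radar and sonar arrive on different ports, e.g.
//   int radarFd = openUdpPort(5001);   // port numbers are placeholders
//   int sonarFd = openUdpPort(5002);
```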
Step two, preprocessing the radar and sonar data: the radar processor sends data to the display-and-control terminal at intervals determined by the radar's rotation speed; the data are stored in an azimuth-range format, each azimuth-range cell holding one radar signal intensity. One radar scan circle corresponds to 4096 azimuth directions, and each azimuth direction contains 2320 radar samples, i.e. the range dimension is divided into 2320 range cells. The display terminal therefore allocates a memory block of size 4096 x 2320 for the radar data and stores the received radar data into it sequentially by azimuth. The sonar processor sends data to the display-and-control terminal at fixed time intervals; the data are stored in a time-azimuth format, each time-azimuth cell holding one sonar signal intensity. The sonar display duration differs between display modes; it is longest in the superimposed mode, corresponding to 496 time units, and one time unit receives 256 sonar samples. The display terminal therefore allocates a memory block of size 496 x 256 for the sonar data and stores the received sonar data into it sequentially by time.
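A sketch of the two memory blocks and the sequential storage described above follows; the class and function names, the 8-bit intensity type, and the modulo wrap on the sonar time index are assumptions.

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

// Sketch of the preprocessing buffers: 4096 x 2320 radar intensities indexed
// by azimuth, and 496 x 256 sonar intensities indexed by time unit.
constexpr int kRadarAzimuths  = 4096;
constexpr int kRadarRangeBins = 2320;
constexpr int kSonarTimeUnits = 496;
constexpr int kSonarBeams     = 256;

struct SensorBuffers {
    std::vector<uint8_t> radar = std::vector<uint8_t>(kRadarAzimuths * kRadarRangeBins);
    std::vector<uint8_t> sonar = std::vector<uint8_t>(kSonarTimeUnits * kSonarBeams);

    // Store one radar sweep at its azimuth row.
    void storeRadarSweep(int azimuthIndex, const uint8_t* samples) {
        std::copy(samples, samples + kRadarRangeBins,
                  radar.begin() + static_cast<std::ptrdiff_t>(azimuthIndex) * kRadarRangeBins);
    }

    // Store one sonar ping at its time row; later pings overwrite the oldest rows.
    void storeSonarPing(int timeIndex, const uint8_t* beams) {
        std::copy(beams, beams + kSonarBeams,
                  sonar.begin() + static_cast<std::ptrdiff_t>(timeIndex % kSonarTimeUnits) * kSonarBeams);
    }
};
```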
Step three, comprehensive data display: the display modes include a surround mode (shown in FIG. 2) and a superimposed mode (shown in FIG. 3). For practical use, separate radar-only and sonar-only display interfaces are also retained, and the picture to be displayed is selected through a sliding menu on the left side of the screen, indicated by 9 in FIG. 3.
Step four, target detection: the target-detection module judges the target type mainly from what the integrated interface shows. If the target is a surface target, both the radar and the sonar detect it and both produce echo signals on that bearing; since the azimuth accuracy of the radar is higher than that of the sonar, the bearing and range of the target can be read accurately from the radar display area. If the target is underwater, only the sonar detects it and the radar shows no echo on the same bearing, so the target can be determined to be underwater; similarly, if the target is airborne, only the radar detects it and the sonar shows no echo on the same bearing, so the target can be determined to be airborne.
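The decision rule of this step reduces to a small truth table over "radar echo present" and "sonar echo present" on the same bearing; the sketch below is only an illustration of that rule, with hypothetical names.

```cpp
// Target classification from echo presence at one bearing (illustrative names).
enum class TargetType { Surface, Underwater, Air, None };

TargetType classifyTarget(bool radarEcho, bool sonarEcho) {
    if (radarEcho && sonarEcho)  return TargetType::Surface;     // both sensors see it
    if (sonarEcho)               return TargetType::Underwater;  // sonar only
    if (radarEcho)               return TargetType::Air;         // radar only
    return TargetType::None;                                     // no echo on this bearing
}
```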
Specifically, the two drawing modes of step three are as follows:
Surround mode: the radar picture is the center. Because a passive sonar cannot measure target range accurately, the sonar scanning area is matched to the radar picture by azimuth and wraps around the outside of the radar scanning area. The radar data are drawn in a circular P-display area (indicated by 6 in FIG. 2), and the sonar data are drawn in a semicircular sector area at the periphery of the radar picture (indicated by 3 in FIG. 2), symmetric left and right about the heading line (indicated by 8 in FIG. 2) as the axis. The sizes of the radar and sonar display areas are determined by the surround ratio set on the operating interface, where the surround ratio is the ratio of the radius of the radar P-display circle to the radius of the sonar semicircular sector within the integrated display circle; for example, the radar-to-sonar surround ratio is 1:1 in FIG. 4, 2:1 in FIG. 5, and 3:1 in FIG. 6.
(1) The radar picture drawing method comprises the following steps:
1) Determine the radius of the radar P-display. Let the radius of the integrated display circle be R, in pixels. The radius R1 of the radar P-display area, also in pixels, is determined from the radar-sonar surround ratio; for example, if the surround ratio is set to 2:1, then R1 = 2/3 R.
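The radii follow directly from the surround ratio; a small sketch of the arithmetic, assuming the ratio is expressed as n:1 with integer n:

```cpp
// For a surround ratio of n:1, the radar P-display gets n parts of the
// integrated display radius R (in pixels) and the sonar sector gets 1 part.
int radarRadiusR1(int R, int n) { return R * n / (n + 1); }         // e.g. n = 2 gives R1 = 2/3 R
int sonarRadiusR2(int R, int n) { return R - radarRadiusR1(R, n); } // R2 = R / (n + 1)
```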
2) Determine the radar data length. The radar offers several selectable range scales: 1, 3, 6, 12, 24, and 48 nautical miles. Because the distance represented by two adjacent screen pixels differs between range scales, radar data DataRadar of different lengths must be acquired according to the selected range scale; the length is denoted L1.
3) Process the radar data into display data. Because the radar data length differs between range scales, the data must be rescaled to match the display-area length: DataRadar is rescaled from length L1 to R1 by linear interpolation, and the rescaled data are saved in a radar data buffer.
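A sketch of the linear-interpolation rescaling, assuming 8-bit intensity samples; the function name is illustrative, and the same helper can be reused for the sonar data later on.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Rescale one sweep from src.size() samples to dstLen samples by linear
// interpolation (used here to stretch a radar sweep of length L1 to R1).
std::vector<uint8_t> resampleLinear(const std::vector<uint8_t>& src, std::size_t dstLen) {
    std::vector<uint8_t> dst(dstLen);
    if (src.empty() || dstLen == 0) return dst;
    for (std::size_t i = 0; i < dstLen; ++i) {
        double pos = (dstLen == 1) ? 0.0
                                   : static_cast<double>(i) * (src.size() - 1) / (dstLen - 1);
        std::size_t i0 = static_cast<std::size_t>(pos);
        std::size_t i1 = (i0 + 1 < src.size()) ? i0 + 1 : i0;
        double frac = pos - static_cast<double>(i0);
        dst[i] = static_cast<uint8_t>(src[i0] * (1.0 - frac) + src[i1] * frac + 0.5);
    }
    return dst;
}

// Usage sketch: auto displayData = resampleLinear(dataRadar, R1);
```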
4) Draw the radar P-display picture. The data on each scan line are drawn in turn by azimuth, in the radar P-display (PPI) manner. The data in the radar data buffer are drawn to the screen with OpenGL drawing functions; because the display works in rectangular coordinates while the radar echo data are in polar coordinates, a coordinate conversion is performed:
x = x0 + ρcosθ
y = y0 + ρsinθ
where (x0, y0) are the coordinates of the P-display center point, θ is the azimuth angle, and ρ is the distance from the center point. The radar sample that feeds the P-display pixel (x, y) is the one in row 4096×θ/360, column ρ of the radar data buffer.
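A sketch of the coordinate conversion and buffer indexing just described; the helper names and the integer rounding are assumptions, and the angle convention simply follows the formulas above.

```cpp
#include <cmath>
#include <utility>

constexpr double kPi = 3.14159265358979323846;

struct PixelPos { int x; int y; };

// Screen position of the sample at azimuth thetaDeg and display radius rho,
// with (x0, y0) the center of the P-display in pixels.
PixelPos polarToScreen(int x0, int y0, double rho, double thetaDeg) {
    double t = thetaDeg * kPi / 180.0;
    return { static_cast<int>(std::lround(x0 + rho * std::cos(t))),
             static_cast<int>(std::lround(y0 + rho * std::sin(t))) };
}

// Row and column of the radar data buffer feeding that pixel
// (4096 azimuth rows over 360 degrees, rho already rescaled to R1).
std::pair<int, int> radarBufferIndex(double thetaDeg, double rho) {
    int row = static_cast<int>(4096.0 * thetaDeg / 360.0) % 4096;
    int col = static_cast<int>(rho);
    return {row, col};
}
```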
5) Update the data. Data reading and display run in different threads: the reading thread writes the radar data received over UDP into the corresponding memory space by azimuth, while the display thread continuously reads data from that memory, processes them as described above, and draws them to the display screen.
(2) The sonar image drawing method comprises the following steps:
1) Determine the radius of the sonar semicircular sector. Let the radius of the integrated display circle be R, in pixels. The radius length R2 of the sonar semicircular sector, also in pixels, is determined from the radar-sonar surround ratio; for example, if the surround ratio is set to 2:1, then R2 = 1/3 R.
2) Process the sonar data into display data. To simplify later data access by the display thread, the received raw sonar data are processed and stored in a sonar data buffer. The buffer size is 496 x L2, where L2 is the data length corresponding to the outermost arc of the sonar semicircular sector, given by:
L2=R×π
since the number of original sonar data received at the same time is 256, interpolation processing is required to be performed to amplify the original sonar data to the L2 size, and then the original sonar data is stored in a sonar data buffer.
3) Draw the sonar image. The data in the sonar data buffer are drawn into the sonar display area with OpenGL functions. Because the sonar display area is an annular sector, the data length differs between the inner and outer arcs: the outermost arc corresponds to a data length of L2 and the innermost arc to (R - R2) × π; in general, the arc at radius Ri corresponds to a data length of
Li = Ri × π
therefore, according to the displayed position, the data in the sonar data buffer area is sampled to different degrees, so that the length of the data is consistent with the data length of the display area, and finally the semicircular arc is sequentially filled according to the corresponding direction by using the function in OpenGL.
4) Update the sonar data. The reading thread stores the received sonar pings into the memory space sequentially by time; when the space is full, the write pointer wraps back to its first address and the cycle repeats. Meanwhile, the display thread continuously reads data from the memory and, after the processing described above, draws the sonar data to the display screen.
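A sketch of the circular write described here; the class name and the single-writer assumption are illustrative, and synchronization with the display thread is omitted.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Circular store for sonar pings: 496 rows of 256 beams; the write position
// wraps back to the first row once the block is full.
class SonarRingBuffer {
public:
    static constexpr int kRows  = 496;
    static constexpr int kBeams = 256;

    void push(const uint8_t* ping) {                              // one 256-beam ping
        std::copy(ping, ping + kBeams, data_.begin() + writeRow_ * kBeams);
        writeRow_ = (writeRow_ + 1) % kRows;                      // wrap to the head when full
    }

    const uint8_t* row(int i) const { return data_.data() + i * kBeams; }

private:
    std::vector<uint8_t> data_ = std::vector<uint8_t>(kRows * kBeams);
    int writeRow_ = 0;
};
```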
Superimposed mode: taking the radar scanning area (indicated by 11 in FIG. 3) as the base, the sonar scanning picture (indicated by 10 in FIG. 3) is matched to the radar picture by azimuth and displayed underneath it as a background or base map, so that the azimuth correspondence between radar and sonar targets can be seen more intuitively through the superposition.
(1) Radar picture drawing: the radar P-display area is the whole integrated display circle, the P-display radius is R, and the drawing method is the same as for the surround-mode radar picture.
(2) Sonar image drawing: the sonar display area is the whole integrated display circle and the radius of the sonar semicircular sector is R. The drawing method is the same as in the surround mode, except that the display region is extended from an annular sector to a full semicircle, with the sonar data filled inwards arc by arc all the way to the center. Since the superimposed sonar image area coincides with the radar image area, the sonar data are overlaid on the radar image as a background or base map.
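One plausible rendering order for the superimposed mode is sketched below: the sonar base map is drawn first over the whole circle, then the radar picture is blended on top. The alpha blending and the renderSonarDisc()/renderRadarPPI() names are assumptions; the patent only specifies that the sonar acts as a background or base map.

```cpp
#include <GL/gl.h>

// Draw one superimposed frame: sonar first as the base map, radar on top.
void renderSuperimposedFrame() {
    glClear(GL_COLOR_BUFFER_BIT);
    // renderSonarDisc();   // hypothetical: sonar data over the whole display circle
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    // renderRadarPPI();    // hypothetical: radar echoes blended over the sonar base map
    glDisable(GL_BLEND);
}
```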

Claims (4)

1. A radar and sonar data comprehensive processing method comprises the following steps:
step one, video data transmitted from a radar and a sonar lower computer are respectively received through UDP communication protocols;
step two, preprocessing radar and sonar data, respectively opening up a certain memory space for the radar and the sonar data, and storing received data into a storage space;
thirdly, comprehensively displaying data, and fusing a radar picture and a sonar picture to be displayed under the same screen by using a graphic development tool; wherein, in the third step, two fusion display schemes are adopted: surrounding type and superposition type, wherein the surrounding type display uses a radar picture as a center, and the sonar display correspondingly surrounds the outside of a radar scanning area through azimuth; the superposition type display is displayed in a radar scanning area in a base map or background mode through azimuth correspondence by taking a radar picture as a center; and in the third step, radar data are drawn in a P-type display circular area in a surrounding type display mode, sonar data are drawn in a semicircular sector area at the periphery of a radar picture, the radar and sonar data are symmetrically arranged left and right by taking a heading line as an axis, the size of the area for displaying the radar and the sonar data is determined according to the surrounding ratio set by an operation interface, and the radar and the sonar data are drawn specifically according to the following method:
(1) The radar picture drawing method comprises the following steps:
1) Determining the radius of the radar P display, wherein the radius of the comprehensive display circular area is R, the unit is a pixel, and the radius length R1 of the radar P display area is determined according to the radar sonar surrounding ratio, and the unit is a pixel;
2) Determining the length of radar data, and acquiring radar data DataRadar with different lengths according to the range of the radar, wherein the length is marked as L1;
3) Processing radar data into display data, scaling the radar data to be consistent with the data length of a display area, scaling the radar data DataRadar from L1 to R1, scaling the radar data by adopting a linear interpolation scaling method, and storing the scaled data into a radar data buffer area;
4) Drawing a radar P display picture, sequentially drawing data on each scanning line according to the azimuth direction in the radar P-display mode, carrying out coordinate transformation on the data in a radar data buffer area, and drawing the data on a screen;
5) Updating data, reading the data and displaying the data as different threads, updating the radar data received by UDP into a corresponding memory space according to the azimuth by a data reading part, continuously reading the data from the memory by a display part, performing related processing and then drawing the data into a display screen;
(2) The sonar image drawing method comprises the following steps:
1) Determining the radius of a sonar semicircular sector, wherein the radius of a comprehensive display circular area is R, the unit is a pixel, and determining the radius length R2 of the sonar semicircular sector according to the radar sonar surrounding ratio, and the unit is a pixel;
2) Processing the sonar data into display data, processing the received original sonar data, and storing the processed sonar data into a sonar data buffer area, wherein the size of the buffer area is 496 x L2, L2 is the data length corresponding to the arc length of the outermost periphery of the semicircular sector of the sonar, and the value is as follows:
L2=R×π
interpolation processing is carried out on the received original sonar data, so that the received original sonar data is amplified to the L2 size and then stored in a sonar data buffer area;
3) Drawing a sonar image, drawing data in a sonar data buffer zone into a sonar display zone, wherein the data length of the inner ring and the outer ring of the sonar display zone is inconsistent, the data length corresponding to the outermost peripheral data is L2, the data length corresponding to the innermost ring data is (R-R2) x pi, and the corresponding data length at the radius Ri is:
Li=Ri×π
according to the displayed position, sampling the data in the sonar data buffer area to different degrees to ensure that the length of the data is consistent with the length of the data in the display area, and finally filling the semicircular arc according to the corresponding direction;
4) Updating sonar data, reading and displaying the data as different threads, sequentially storing the received sonar signals into a memory space by a data reading part according to time, pointing a pointer to a memory space head address after the sonar signals are stored fully, circularly reciprocating, simultaneously continuously reading the data from the memory by a display part, and drawing the sonar data into a display screen after relevant processing;
and step four, target detection, namely judging the target type according to the display condition of the comprehensive interface.
2. The method for comprehensively processing radar and sonar data according to claim 1, wherein in the second step, the radar divides the 360-degree direction into 4096 parts, the received radar data are sequentially stored in a memory space with a size of 4096 x 2320, the longest display time of the sonar corresponds to 496 time units, and the received sonar data are sequentially stored in a memory with a size of 496 x 256.
3. A method for integrated radar and sonar data processing as defined in claim 2, wherein in said step one, each radar data packet contains 2320 data sampled at different distances in a direction, and each sonar data packet contains 256 sonar data within a 180 degree range collected at a moment.
4. The method for comprehensively processing radar and sonar data as defined in claim 3, wherein the method for judging the target type in the fourth step is as follows: if the target is a water surface target, the radar sonar can be detected, and the defect of the sonar is overcome by the high precision of the radar through the azimuth correspondence; if the target is an underwater target, only the sonar can detect the target, and the radar has no target echo in the same direction, so that the target can be determined to be the underwater target; if the target is an aerial target, only the radar can detect the target, and the sonar has no target echo in the same direction, so that the target can be determined to be the aerial target.
CN201910557542.1A 2019-06-25 2019-06-25 Radar and sonar data comprehensive processing method Active CN110297235B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910557542.1A CN110297235B (en) 2019-06-25 2019-06-25 Radar and sonar data comprehensive processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910557542.1A CN110297235B (en) 2019-06-25 2019-06-25 Radar and sonar data comprehensive processing method

Publications (2)

Publication Number Publication Date
CN110297235A CN110297235A (en) 2019-10-01
CN110297235B true CN110297235B (en) 2023-08-15

Family

ID=68028771

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910557542.1A Active CN110297235B (en) 2019-06-25 2019-06-25 Radar and sonar data comprehensive processing method

Country Status (1)

Country Link
CN (1) CN110297235B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111490832B (en) * 2020-06-03 2024-01-30 天津大学 Underwater sound communication device and system


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9201142B2 (en) * 2013-03-14 2015-12-01 Navico Holding As Sonar and radar display

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS55109975A (en) * 1979-02-15 1980-08-23 Mitsubishi Heavy Ind Ltd Radar unit for ship with depth sounder display part
CN105353354A (en) * 2015-11-11 2016-02-24 陕西长岭电子科技有限责任公司 Double-B display method of landing guidance radar
CN105654133A (en) * 2015-12-31 2016-06-08 中船重工(昆明)灵湖科技发展有限公司 Multi-source data-based ship trajectory fusion system and realization method thereof
CN109714567A (en) * 2018-11-08 2019-05-03 中国船舶重工集团公司七五0试验场 A kind of real-time construction method of three-dimensional virtual scene based on infrared viewing device and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Implementation of a multi-sonar and radar data fusion system; 吴晓潭 et al.; 《网络新媒体技术》, Vol. 3, No. 4, pp. 57-59 *

Also Published As

Publication number Publication date
CN110297235A (en) 2019-10-01

Similar Documents

Publication Publication Date Title
JP6150418B2 (en) Information display device, fish finder and information display method
US7541973B2 (en) Radar apparatus for combining and displaying data from a plurality of radar antennas
JP2012233743A (en) Information display device
CN105137397B (en) One kind is based on ARM platform marine navigation radar echo high-resolution display methods
EP4082890A1 (en) Administrative server in ship navigation assistance system, ship navigation assistance method, and ship navigation assistance program
CN101592729B (en) Device and method for partial enlarged display of radar PPI images based on target details
JP2011059018A (en) Image processor, radar device for mounting the same, method of processing image and image processing program
CN112526490B (en) Underwater small target sonar detection system and method based on computer vision
JP2008051745A (en) Mobile locating program, storage medium recording the same, mobile locating device, and mobile locating method
CN103592650A (en) Three-dimensional sonar imaging system based on graph processor and three-dimensional image method thereof
CN110297235B (en) Radar and sonar data comprehensive processing method
JP2016184295A (en) Display control method, display control program, and information processing apparatus
JP5398099B2 (en) Radar detector
JP5800386B2 (en) Map display device, map display method, and program
CN113567932A (en) Radar display and control device
CN111551930B (en) Radar image display method based on layered display
JP2010286359A (en) Signal processor, radar device including the signal processor, and method of detecting pixel omission of the signal processor
JP4649695B2 (en) Infrared detector
JP2001296348A (en) Video display for three dimensional radar
JP4931429B2 (en) Mobile track display device
WO2023226593A1 (en) Picture display method, system and apparatus, device, and storage medium
JPH03251782A (en) Automatic radar plotting apparatus
JP3603206B2 (en) Display method of radar device
JP2022066630A (en) Vessel monitoring system, distance measuring method, distance measuring program, display control method and display control program
KR0142683B1 (en) Radar apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant