WO2022185430A1 - Radar system and object detection method - Google Patents

Radar system and object detection method

Info

Publication number
WO2022185430A1
Authority
WO
WIPO (PCT)
Prior art keywords
radar
control device
shape
devices
controls
Prior art date
Application number
PCT/JP2021/008092
Other languages
English (en)
Japanese (ja)
Inventor
洋介 佐藤
亮喜 原本
Original Assignee
株式会社日立国際電気
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立国際電気
Priority to PCT/JP2021/008092 priority Critical patent/WO2022185430A1/fr
Priority to JP2023503585A priority patent/JP7449443B2/ja
Publication of WO2022185430A1 publication Critical patent/WO2022185430A1/fr

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00: Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles

Definitions

  • The present invention relates to a radar system that detects objects existing in a monitored area with a radar device.
  • FMCW (Frequency Modulated Continuous Wave) radar devices having the structure shown in FIG. 1 are known as radar devices that use microwaves, millimeter-wave bands, and the like.
  • The radar apparatus 100 in FIG. 1 amplifies a frequency-modulated radar signal from the FMCW transmission source 101 with the transmission power amplifier 103 and emits it from the transmission antenna 104. The object T reflects this radar transmission wave.
  • The reflected wave from the object T is received by the receiving antenna 105 of the radar device 100, amplified by the receiving power amplifier 106, and then mixed by the mixer 107 with the transmission radar signal component from the power divider 102 to be converted into an IF signal.
  • The IF signal output from the mixer 107 is A/D-converted and signal-processed by the signal processing section 108.
  • Through this signal processing, the reflected received power (reflected wave power) from the object T, the distance to the object T, the azimuth of the object T, and, when the object T is moving, its speed (relative speed with respect to the radar device 100) are obtained as the detection result.
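  • As an illustration of the range-estimation step in the signal processing section 108, the following minimal sketch (not part of the original disclosure) applies a range FFT to the IF (beat) signal of a single linear chirp; the bandwidth, sweep time, sampling rate, and threshold handling are assumed values rather than parameters taken from the publication.

```python
import numpy as np

C = 3.0e8           # speed of light [m/s]
BANDWIDTH = 150e6   # swept bandwidth B [Hz] (assumed)
CHIRP_TIME = 1e-3   # sweep duration T [s] (assumed)
FS = 1.0e6          # IF sampling rate [Hz] (assumed)

def range_profile(if_samples: np.ndarray):
    """Return (ranges [m], power spectrum) for one chirp of IF samples."""
    n = len(if_samples)
    spectrum = np.fft.rfft(if_samples * np.hanning(n))
    power = np.abs(spectrum) ** 2
    beat_freq = np.fft.rfftfreq(n, d=1.0 / FS)
    # For a linear FMCW sweep, beat frequency f_b maps to range R = c * T * f_b / (2 * B).
    ranges = C * CHIRP_TIME * beat_freq / (2.0 * BANDWIDTH)
    return ranges, power

def strongest_reflection(if_samples: np.ndarray, threshold: float):
    """Return (range [m], reflected power) of the strongest bin above threshold, or None."""
    ranges, power = range_profile(if_samples)
    k = int(np.argmax(power))
    return (float(ranges[k]), float(power[k])) if power[k] > threshold else None
```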
  • Patent Document 1 discloses an invention in which a millimeter-wave radar is installed on a moving body and the distance to a target position is measured based on the distance between a first reflector and a second reflector installed near the target position and on the reception results of the reflected waves from these reflectors.
  • Radar devices are used to detect objects on road surfaces, such as roads and runways, which are the areas to be monitored. There are usually no reflective objects such as fallen or abandoned objects on roads or runways. The radar device therefore continues to transmit radar waves toward a detection range that contains no reflecting object, and only when some reflecting object appears within the detection range is a received wave (reflected wave) obtained and the object detected.
  • In such applications, a camera device is used in combination with the radar device. That is, when the existence of an object is recognized from the power of the reflected wave received by the radar device, the angle of view and focus of the camera device are adjusted to the location of the object and an image is captured. As a result, a photographed image of the object to be removed can be obtained, making it easier to recognize the detailed shape of the object.
  • When removing the object, the worker can use the information on the location of the object obtained from the radar device and the information on the shape of the object obtained from the camera device.
  • However, within the angle of view captured by the camera device, only shape information of the object viewed from one direction can be obtained, and it is difficult to faithfully capture the actual shape.
  • The present invention has been made in view of the conventional circumstances described above, and an object of the present invention is to provide a radar system capable of more accurately capturing the shape of an object detected by a radar device.
  • To achieve this object, the present invention configures the radar system as follows. That is, the radar system according to the present invention is a radar system that detects an object existing in a monitoring target area with a radar device, and is characterized by comprising a plurality of object shape acquisition devices, installed at mutually different positions, each of which acquires shape information, as viewed from its own position, of the object detected by the radar device, and a display device that displays an object shape image based on the plurality of pieces of shape information acquired by the plurality of object shape acquisition devices.
  • Here, the plurality of object shape acquisition devices may include camera devices. The plurality of object shape acquisition devices may also include radar devices. Note that the plurality of object shape acquisition devices may consist only of a plurality of camera devices, only of a plurality of radar devices, or of a combination of one or more camera devices and one or more radar devices.
  • In one aspect, the radar system further includes a first radar control device that controls the operation of a first radar device, a second radar control device that controls the operation of a second radar device, and a synchronization processing device that transmits a synchronization signal to the first radar control device and the second radar control device, wherein the first radar control device controls the operation of the first radar device at a timing according to the synchronization signal received from the synchronization processing device, and the second radar control device controls the operation of the second radar device at a timing according to the synchronization signal received from the synchronization processing device.
  • In another aspect, each of the first radar control device and the second radar control device has a GNSS receiver that outputs time information obtained from GNSS satellites; the first radar control device controls the operation of the first radar device at a timing according to the time information output from its GNSS receiver, and the second radar control device controls the operation of the second radar device at a timing according to the time information output from its GNSS receiver.
  • FIG. 1 is a diagram showing the structure of an FMCW radar device.
  • FIG. 2 is a diagram showing an overview of a radar system according to an embodiment of the present invention.
  • FIG. 3 is a diagram showing how an object shape is specified using one sensor device.
  • FIG. 4 is a diagram showing an example in which a detected object is photographed by the camera device 120 of FIG. 3.
  • FIG. 5 is a diagram showing an installation example of a plurality of sensor devices.
  • FIG. 6 is a diagram showing how a plurality of sensor devices are used to specify the shape of an object.
  • FIG. 7 is a diagram showing an example in which a detected object is photographed by the camera device 120A of FIG. 6.
  • FIG. 8 is a diagram showing an example in which a detected object is photographed by the camera device 120B of FIG. 6.
  • FIG. 9 is a diagram showing an example in which a detected object is photographed by the camera device 120C of FIG. 6.
  • FIG. 10 is a diagram showing an example in which a detected object is photographed by the camera device 120D of FIG. 6.
  • FIG. 11 is a diagram showing an example of the processing flow when an object is detected by the radar system according to the embodiment.
  • FIG. 12 is a diagram showing the coordinate system of a radar device rotating in the horizontal direction and a detected object.
  • FIG. 13 is a diagram showing a first configuration example for synchronizing a plurality of radar control devices.
  • FIG. 14 is a diagram showing a second configuration example for synchronizing a plurality of radar control devices.
  • FIG. 2 shows an overview of a radar system according to one embodiment of the invention.
  • The radar system of this example includes a sensor device 200, comprising a radar device 100 and a camera device 120 installed toward a predetermined detection range R, and a radar control device 300 and a display device 400 installed in a control room, a monitoring room, or the like.
  • In FIG. 2, only one sensor device 200 is shown for simplicity of explanation, but as shown in FIG. 5, a plurality of sensor devices 200 are actually installed.
  • The radar device 100 receives the reflected wave of the radar transmission wave transmitted to the detection range R, and outputs the radar detection result obtained by signal processing to the radar control device 300. Based on the radar detection result output from the radar device 100, the radar control device 300 causes the display device 400 to display the detection information of the object X (fallen or abandoned object) existing within the detection range R.
  • The detection range R of the radar device 100 is a predetermined section on a road surface such as a road or runway, which is the area to be monitored, and the antenna angle of the radar device 100 is set so as to include the detection range R.
  • The distance from the radar device 100 to the object X can be calculated by subjecting the reflected wave from the object X to signal processing. When the antenna of the radar device 100 is mechanically rotated, the angle (azimuth) of the object X with respect to the radar device 100 can be specified based on the rotation angle information of the antenna. Similarly, when the radar device 100 performs electronic beam scanning, the angle (azimuth) of the object X with respect to the radar device 100 can be specified based on the beam scanning angle information.
  • The distance and angle information obtained by the radar device 100 is transmitted to the camera device 120 via the radar control device 300.
  • The camera device 120 turns according to the angle information obtained by the radar device 100, adjusts its focus according to the distance information obtained by the radar device 100, and photographs the object X existing within its angle of view.
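  • Because the camera device 120 shares the installation origin of the radar device 100, this pointing step reduces to reusing the radar azimuth as a pan angle and mapping the radar distance to a focus setting. The sketch below (not part of the original disclosure) assumes hypothetical discrete focus presets.

```python
FOCUS_PRESETS_M = [5.0, 10.0, 20.0, 50.0, 100.0]  # hypothetical focus presets [m]

def aim_camera(distance_m: float, azimuth_deg: float) -> tuple[float, float]:
    """Derive a pan angle and a focus distance from the radar's distance/angle output."""
    pan_deg = azimuth_deg % 360.0                  # camera and radar share one origin
    focus_m = min(FOCUS_PRESETS_M, key=lambda f: abs(f - distance_m))
    return pan_deg, focus_m
```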
  • As shown in FIG. 3, when there is only one set of the radar device 100 and the camera device 120 (that is, only one sensor device 200), the object X can only be photographed from one direction. The problem in this case is described below.
  • FIG. 4 shows an example of photographing the object X in the arrangement shown in FIG. 3.
  • The upper part of FIG. 4 shows the actual shape of the object X viewed from above, and the lower part of FIG. 4 shows the object shape identified from the image captured by the camera device 120. As this illustrates, there may be a difference between the shape of the object identified from the captured image and the actual shape of the object, due to factors such as the angle of view of the camera device 120, the distance to the object X, and the lighting conditions at the time of photography.
  • In this case, the object shape information transmitted to the worker is a "rod-shaped object", and the worker searches for a "rod-shaped object" at the site.
  • If the shape of the collected object differs from the shape given in the prior information, it may lead to a lack of confidence in the collection result.
  • FIG. 5 shows an installation example of a plurality of sensor devices 200.
  • As shown in FIG. 5, a plurality of sensor devices 200 are installed along both sides of the road surface to be monitored, parallel to the road surface.
  • Each sensor device 200 is equipped with a radar device 100 and a camera device 120.
  • These multiple sensor devices 200 are connected by optical fiber cables or the like to a radar control device 300 installed in a control room, a monitoring room, or the like. That is, the radar control device 300 controls all the sensor devices 200 and manages the information obtained from them.
  • The sensor device 200A includes a radar device 100A and a camera device 120A, and the same applies to the sensor devices 200B, 200C, and 200D.
  • The radar device 100A detects the reflected power from the object X and recognizes the existence of the object X.
  • The radar device 100A can obtain the distance and angle to the object X with respect to its own installation origin.
  • The camera device 120A, which has the same installation origin as the radar device 100A, adjusts its angle of view and focus to the location of the object X based on the distance and angle information acquired by the radar device 100A, and photographs the object.
  • FIG. 7 shows an example in which the object X is photographed by the camera device 120A of FIG. 6.
  • The upper part of FIG. 7 shows the actual shape of the object X viewed from above, and the lower part of FIG. 7 shows the object shape obtained from the image captured by the camera device 120A.
  • As described above, the radar control device 300 controls all the sensor devices 200 and manages the information obtained from them, so the position information of the object X detected by the radar device 100A can be shared across the entire system. Therefore, based on the position information of the object X, the other sensor devices 200B, 200C, and 200D near the object X can adjust the radio wave irradiation angles of the radar devices 100B, 100C, and 100D and the angle of view and focus of the camera devices 120B, 120C, and 120D to the location of the object X.
  • FIG. 8 shows an example in which the object X is photographed by the camera device 120B of FIG. 6.
  • The upper part of FIG. 8 shows the actual shape of the object X viewed from above, and the lower part of FIG. 8 shows the object shape obtained from the image captured by the camera device 120B.
  • FIG. 9 shows an example in which the object X is photographed by the camera device 120C of FIG. 6.
  • The upper part of FIG. 9 shows the actual shape of the object X viewed from above, and the lower part of FIG. 9 shows the object shape obtained from the image captured by the camera device 120C.
  • FIG. 10 shows an example in which the object X is photographed by the camera device 120D of FIG. 6.
  • The upper part of FIG. 10 shows the actual shape of the object X viewed from above, and the lower part of FIG. 10 shows the object shape obtained from the image captured by the camera device 120D.
  • The radar control device 300 performs image processing using the plurality of captured images of the object X to generate an object shape image representing the shape of the object X. Various known techniques can be used to generate the object shape image.
  • For example, the angles at which the partial images of the object X from the respective camera devices are pasted together may be determined in advance, and the four partial images may be combined at those angles to generate the object shape image.
  • In this example, four sensor devices 200 are used to generate the object shape image, but the number of sensor devices 200 is not limited to this, and an arbitrary number of two or more sensor devices 200 may be used. By using images obtained by photographing the object X from more directions, it is possible to generate an object shape image that is closer to the actual shape.
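  • The publication leaves the image-processing technique open ("various known techniques"), so the sketch below is only the simplest possible composition: a 2x2 montage of the four directional views under hypothetical file names. A real implementation would register and blend the partial images at the predetermined pasting angles.

```python
from PIL import Image

# Hypothetical image files, one per camera device surrounding the object.
VIEW_FILES = {"120A": "view_120A.png", "120B": "view_120B.png",
              "120C": "view_120C.png", "120D": "view_120D.png"}

def build_shape_image(view_files: dict, tile_px: int = 480) -> Image.Image:
    """Paste the four directional views onto a 2x2 canvas as a simple object shape image."""
    canvas = Image.new("RGB", (2 * tile_px, 2 * tile_px))
    slots = [(0, 0), (tile_px, 0), (0, tile_px), (tile_px, tile_px)]
    for slot, path in zip(slots, view_files.values()):
        view = Image.open(path).resize((tile_px, tile_px))
        canvas.paste(view, slot)
    return canvas
```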
  • FIG. 11 shows an example of the processing flow when an object is detected by the radar system of this example. It is assumed that the radar control device 300 stores in advance the position information of the sensor devices 200 in the system (that is, the position information of the radar devices 100 and the camera devices 120). A plurality of radar devices 100 monitor the road surface, which is the monitoring target area, under the control of the radar control device 300. When an object on the road surface is detected by any radar device 100 (step S101), the radar detection result is transmitted from that radar device 100 to the radar control device 300.
  • When the radar control device 300 receives the radar detection result, it calculates the position information of the detected object based on the distance and angle of the detected object included in the radar detection result (that is, the relative distance and angle with respect to the radar device 100) and the position information of the radar device 100 that transmitted the radar detection result. The radar control device 300 also identifies several camera devices 120 near the detected object based on the calculated position information of the detected object and the position information of each radar device 100 (step S102). For example, four camera devices 120 are identified in order of proximity to the position of the detected object. In the case of the camera arrangement shown in FIGS. 5 and 6, the four camera devices 120 surrounding the detected object are identified. Note that this method is only an example, and the camera devices 120 may be identified by another method. For example, several camera devices 120 may be assigned in advance to each block into which the monitored area is divided, and the camera devices 120 assigned to the block containing the detected object may be identified.
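  • Steps S101 and S102 amount to a coordinate conversion followed by a nearest-neighbour selection. The sketch below (not part of the original disclosure) uses hypothetical installation coordinates and assumes the azimuth is expressed in the common coordinate system; a real system would also apply each radar's mounting orientation.

```python
import math

# Hypothetical installation coordinates [m] of the co-located radar/camera pairs.
SENSOR_POSITIONS = {"200A": (0.0, 0.0), "200B": (60.0, 0.0),
                    "200C": (0.0, 30.0), "200D": (60.0, 30.0)}

def object_position(sensor_id: str, distance_m: float, azimuth_deg: float):
    """Convert a relative (distance, azimuth) detection into system coordinates."""
    sx, sy = SENSOR_POSITIONS[sensor_id]
    a = math.radians(azimuth_deg)
    return sx + distance_m * math.cos(a), sy + distance_m * math.sin(a)

def nearest_cameras(obj_xy, count: int = 4):
    """Pick the `count` sensor devices closest to the detected object (step S102)."""
    ox, oy = obj_xy
    return sorted(SENSOR_POSITIONS,
                  key=lambda s: math.hypot(SENSOR_POSITIONS[s][0] - ox,
                                           SENSOR_POSITIONS[s][1] - oy))[:count]
```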
  • The radar control device 300 then controls each of the identified camera devices 120 to adjust its angle of view and focus, and causes the object to be photographed (step S103). The images captured by these camera devices 120 are transmitted to the radar control device 300.
  • The radar control device 300 performs image processing using the plurality of captured images received from the identified camera devices 120, and generates an object shape image representing the shape of the detected object (step S104).
  • Finally, the radar control device 300 transmits the generated object shape image to the display device 400 and causes the display device 400 to display it (step S105).
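  • Tying steps S101 to S105 together, a hypothetical handler in the radar control device 300 could look as follows; it reuses the helpers from the previous sketch, and the methods point_and_shoot, generate_shape_image, and display are assumed names rather than interfaces defined in the publication.

```python
def on_detection(sensor_id: str, distance_m: float, azimuth_deg: float, control) -> None:
    """Hypothetical end-to-end handler for the flow of FIG. 11."""
    obj_xy = object_position(sensor_id, distance_m, azimuth_deg)        # from step S101
    cameras = nearest_cameras(obj_xy, count=4)                          # step S102
    images = [control.point_and_shoot(cam, obj_xy) for cam in cameras]  # step S103
    shape_image = control.generate_shape_image(images)                  # step S104
    control.display(shape_image)                                        # step S105
```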
  • FIG. 12 shows the coordinate system of a radar device 100 rotating in the horizontal direction and an object X existing within it. Taking the position of the radar device 100 as the origin, the position of the object X can be determined by the distance and angle to the object X. If the resolution in the distance direction and the angle direction is set sufficiently small relative to the size of the object X, the radar device 100 can acquire reflected wave power from a plurality of locations according to the shape of the object X. In the example of FIG. 12, reflected wave power is obtained from the grid points marked with black circles, which correspond to the region occupied by the object X. Therefore, by analyzing the reception results of the reflected wave power, it is possible to identify the size of the object as seen from the radar device 100, and also to specify its approximate shape.
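  • The size estimation described for FIG. 12 can be sketched as thresholding a range-azimuth power map and converting the occupied grid cells to Cartesian coordinates; the resolutions and threshold below are assumed values, not figures from the publication.

```python
import numpy as np

RANGE_RES_M = 0.1     # range resolution (assumed small relative to the object)
ANGLE_RES_DEG = 0.2   # angular resolution (assumed)

def object_extent(power_map: np.ndarray, threshold: float, start_range_m: float = 0.0):
    """Rough x/y extent [m] of the detected object as seen from the radar (cf. FIG. 12)."""
    cells = np.argwhere(power_map > threshold)   # (range_bin, angle_bin) above threshold
    if cells.size == 0:
        return None
    r = start_range_m + cells[:, 0] * RANGE_RES_M
    a = np.radians(cells[:, 1] * ANGLE_RES_DEG)
    x, y = r * np.cos(a), r * np.sin(a)          # polar grid points -> Cartesian coordinates
    return float(x.max() - x.min()), float(y.max() - y.min())
```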
  • By combining the sizes identified from a plurality of directions, it becomes possible to generate an object shape image that is closer to the actual shape.
  • The four radar devices 100A, 100B, 100C, and 100D identify the object size viewed from their respective directions. Although one radar device 100 may be used to identify the object size, the object size can be identified more accurately by using a larger number of radar devices 100. As a result, the difference between the object shape image transmitted to the worker and the actual object shape can be reduced, and the search time can be shortened.
  • As shown in FIG. 12, the radar device 100 can also identify the approximate shape of the detected object X. Therefore, it is also possible to generate an object shape image using only the radar devices 100, without using the camera devices 120. In this case, the accuracy of the object shape image is lower than when the camera devices 120 are used, but the system can be simplified and the cost can be reduced.
  • Alternatively, an object shape image may be generated by combining the object shape obtained using the camera devices 120 and the object shape obtained using the radar devices 100.
  • In the configuration described above, one radar control device 300 controls the radio wave emission direction and manages the radio wave emission/stop timing of the radar devices 100 under its control.
  • However, a radar system that monitors a wide area requires a large number of radar devices 100, and it may not be possible to secure optical fiber cables for connecting all the radar devices 100 to a single radar control device 300. In such a case, a plurality of radar control devices 300 are used, and they must be synchronized with one another.
  • FIG. 13 shows a first configuration example in which a plurality of radar control devices 300 are synchronized.
  • In this configuration, the synchronization processing device 500 transmits synchronization signals to the radar control device 300A and the radar control device 300B.
  • The radar control device 300A controls the radar devices 100 connected to it at a preset timing according to the synchronization signal received from the synchronization processing device 500.
  • Similarly, the radar control device 300B controls the radar devices 100 connected to it at a preset timing according to the synchronization signal received from the synchronization processing device 500. That is, the radar control device 300A and the radar control device 300B control the radar devices 100 under their control while synchronized with each other. At this time, the radio wave irradiation directions and the radio wave emission/stop timings are controlled so that the radio waves from each radar device 100 do not enter the other radar devices 100. This prevents interference between radar devices 100 connected to different radar control devices 300.
  • FIG. 14 shows a second configuration example in which a plurality of radar control devices 300 are synchronized.
  • The system shown in FIG. 14 includes a radar control device 300A that controls a plurality of radar devices 100 installed on one side of the road surface, and a radar control device 300B that controls a plurality of radar devices 100 installed on the other side of the road surface.
  • The radar control device 300A and the radar control device 300B each include a GNSS (Global Navigation Satellite System) receiver 520.
  • The GNSS receiver 520 has a function of outputting accurate time information obtained from GNSS satellites.
  • The radar control device 300A controls the radar devices 100 connected to it at a preset timing according to the time information output from its GNSS receiver 520. Similarly, the radar control device 300B controls the radar devices 100 connected to it at a preset timing according to the time information output from its GNSS receiver 520. That is, the radar control device 300A and the radar control device 300B control the radar devices 100 under their control while synchronized with each other. At this time, the radio wave irradiation directions and the radio wave emission/stop timings are controlled so that the radio waves from each radar device 100 do not enter the other radar devices 100. This prevents interference between radar devices 100 connected to different radar control devices 300.
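  • In both configurations the radar control devices only need a common time reference, delivered either by the synchronization signal of the synchronization processing device 500 (FIG. 13) or by the GNSS receivers 520 (FIG. 14). One simple interference-avoidance policy on top of such a reference is time-division emission, sketched below; this scheme and the slot length are assumptions, since the publication only states that the irradiation directions and emission/stop timings are controlled so that the radars do not interfere.

```python
import time

SLOT_MS = 10.0  # assumed emission slot length per radar device

def may_emit(common_epoch_s: float, device_index: int, device_count: int) -> bool:
    """True while it is this radar device's turn to emit, counted from the shared time base."""
    elapsed_ms = (time.time() - common_epoch_s) * 1000.0
    return int(elapsed_ms // SLOT_MS) % device_count == device_index
```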
  • As described above, the radar system of this example includes a plurality of sensor devices 200, installed at mutually different positions, each of which acquires shape information, as viewed from its own position, of an object detected by the radar device 100 on the road surface that is the area to be monitored, and a display device 400 that displays an object shape image based on the plurality of pieces of shape information acquired by the plurality of sensor devices 200. With this configuration, the shape of an object detected by the radar device 100 can be captured more accurately.
  • The shape information used to generate the object shape image can be acquired by the radar device 100 and/or the camera device 120 of each sensor device 200. That is, the shape information used to generate the object shape image may be an image captured by the camera device 120, an object shape specified by the radar device 100, or a combination thereof.
  • In addition, when a plurality of radar control devices 300 are used, the radar system of this example has a mechanism for synchronizing them by means of the synchronization processing device 500 or the GNSS receivers 520. Therefore, according to the radar system of this example, interference between radar devices 100 connected to different radar control devices 300 can be prevented.
  • Although the present invention has been described above based on one embodiment, the present invention is not limited to the radar system described here, and it goes without saying that it can be widely applied to other radar systems.
  • The present invention can also be provided as, for example, a method including the technical procedures related to the above processing, a program for causing a processor to execute the above processing, or a storage medium storing such a program in a computer-readable manner.
  • The present invention can be used in a radar system that detects objects existing in a monitored area with a radar device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

Provided is a radar system capable of more accurately capturing the shape of an object detected by a radar device. The radar system comprises a radar device (100) that detects an object on a road surface constituting an area to be monitored, a plurality of sensor devices (200) which are installed at mutually different positions and which acquire shape information of the object detected by the radar device (100) as viewed from their own positions, and a display device (400) that displays an object shape image based on a plurality of pieces of the shape information acquired by the plurality of sensor devices (200).
PCT/JP2021/008092 2021-03-03 2021-03-03 Système radar et procédé de détection d'objet WO2022185430A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2021/008092 WO2022185430A1 (fr) 2021-03-03 2021-03-03 Système radar et procédé de détection d'objet
JP2023503585A JP7449443B2 (ja) 2021-03-03 2021-03-03 レーダーシステム及び物体検知方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/008092 WO2022185430A1 (fr) 2021-03-03 2021-03-03 Système radar et procédé de détection d'objet

Publications (1)

Publication Number Publication Date
WO2022185430A1 true WO2022185430A1 (fr) 2022-09-09

Family

ID=83153964

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/008092 WO2022185430A1 (fr) 2021-03-03 2021-03-03 Système radar et procédé de détection d'objet

Country Status (2)

Country Link
JP (1) JP7449443B2 (fr)
WO (1) WO2022185430A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS58189570A (ja) * 1982-04-28 1983-11-05 Oki Electric Ind Co Ltd Radar interference elimination system
JP2002222487A (ja) * 2001-01-26 2002-08-09 Mitsubishi Electric Corp Road monitoring system
JP2010256133A (ja) * 2009-04-23 2010-11-11 Toyota Motor Corp Interference prevention radar device
JP2012099014A (ja) * 2010-11-04 2012-05-24 Saxa Inc Passing vehicle monitoring system and passing vehicle monitoring device
WO2020188697A1 (fr) * 2019-03-18 2020-09-24 株式会社日立国際電気 Monitoring system and monitoring method

Also Published As

Publication number Publication date
JP7449443B2 (ja) 2024-03-13
JPWO2022185430A1 (fr) 2022-09-09

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21929006; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2023503585; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 21929006; Country of ref document: EP; Kind code of ref document: A1)