WO2022185430A1 - Radar system and object detection method - Google Patents


Info

Publication number
WO2022185430A1
WO2022185430A1 · PCT/JP2021/008092 · JP2021008092W
Authority
WO
WIPO (PCT)
Prior art keywords
radar
control device
shape
devices
controls
Prior art date
Application number
PCT/JP2021/008092
Other languages
French (fr)
Japanese (ja)
Inventor
洋介 佐藤
亮喜 原本
Original Assignee
株式会社日立国際電気 (Hitachi Kokusai Electric Inc.)
Application filed by 株式会社日立国際電気 (Hitachi Kokusai Electric Inc.)
Priority to JP2023503585A (granted as JP7449443B2)
Priority to PCT/JP2021/008092
Publication of WO2022185430A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles

Definitions

  • The present invention relates to a radar system that detects objects existing in a monitored area with a radar device.
  • FMCW (Frequency Modulated Continuous-Wave) radar devices having a structure as shown in FIG. 1 are known as radar devices using microwaves, millimeter wave bands, and the like.
  • The radar apparatus 100 in FIG. 1 amplifies a frequency-modulated radar signal from the FMCW transmission source 101 with the transmission power amplifier 103 and emits it from the transmission antenna 104. When an object T (reflecting object) exists within the detection range of the radar apparatus 100, the object T reflects the radar transmission wave.
  • The reflected wave from the object T is received by the receiving antenna 105 of the radar apparatus 100, amplified by the receiving power amplifier 106, and then mixed by the mixer 107 with the transmission radar signal component from the power divider 102 to be converted into an IF signal.
  • The IF signal output from the mixer 107 is A/D-converted and signal-processed by the signal processing section 108.
  • As a result, a radar detection result is obtained, including the received power reflected by the object T (reflected-wave power), the distance to the object T, the azimuth of the object T, and, when the object T is moving, its speed (relative speed with respect to the radar apparatus 100).
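The processing chain above can be illustrated numerically. For a linear FMCW chirp of bandwidth B swept over duration T, the IF signal out of the mixer beats at f_b = 2RB/(cT) for a target at range R, so the signal processing section can recover distance from an FFT peak. The following sketch uses illustrative parameter values (bandwidth, sweep time, sampling rate) that are not specified in this publication:

```python
import numpy as np

# Illustrative FMCW chirp parameters (assumptions, not from the patent)
c = 3e8            # speed of light [m/s]
B = 150e6          # sweep bandwidth [Hz]
T = 1e-3           # sweep duration [s]
fs = 2e6           # IF sampling rate [Hz]
R_true = 75.0      # actual target range [m]

# The mixer output (IF signal) beats at f_b = 2*R*B/(c*T)
f_beat = 2 * R_true * B / (c * T)
t = np.arange(int(fs * T)) / fs
if_signal = np.cos(2 * np.pi * f_beat * t)

# Signal processing section: FFT of the IF signal, peak bin -> range
spectrum = np.abs(np.fft.rfft(if_signal))
freqs = np.fft.rfftfreq(len(if_signal), 1 / fs)
f_est = freqs[np.argmax(spectrum)]
R_est = f_est * c * T / (2 * B)
print(f"estimated range: {R_est:.1f} m")  # close to 75 m
```

The same spectrum also carries the reflected-wave power (peak magnitude), and Doppler processing over successive sweeps would yield the relative speed mentioned above.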
  • For example, Patent Document 1 discloses an invention in which a millimeter-wave radar is installed on a moving body and the distance to a target position is measured based on the distance between a first reflector and a second reflector respectively installed near the target position and on the reception results of the reflected waves from these reflectors.
  • Radar devices are used to detect objects on road surfaces, such as roads and runways, in the area to be monitored. There are usually no reflecting objects, such as fallen or abandoned objects, on roads or runways. Therefore, the radar device continues to transmit radar waves toward a detection range in which no reflecting object exists; a received wave (reflected wave) is obtained only when some reflecting object appears within the detection range, and that object is detected.
  • When an object detected by the radar device must be removed or recovered, information on the object's shape makes the search for it easier. For this purpose, a camera device is used, for example, in combination with the radar device. That is, when the existence of an object is recognized from the power of the reflected wave received by the radar device, the angle of view and focus of the camera device are adjusted to the location of the object and an image is captured. A photographed image of the object to be removed is thereby obtained, making it easier to recognize the detailed shape of the object.
  • When searching for and removing the detected object, the worker can use the information on the object's location obtained from the radar device and the information on the object's shape obtained from the camera device.
  • However, depending on the angle of view of the camera device, only shape information of the object viewed from one direction can be obtained, and it is difficult to capture the actual shape faithfully. As a result, a difference arises between the object shape information obtained from the camera device and the actual object shape, causing confusion when searching for the object.
  • The present invention has been made in view of the conventional circumstances described above, and an object of the present invention is to provide a radar system capable of more accurately capturing the shape of an object detected by a radar device.
  • To achieve the above object, the present invention configures the radar system as follows. That is, the radar system according to the present invention is a radar system that detects an object existing in a monitoring target area with a radar device, and is characterized by comprising a plurality of object shape acquisition devices, installed at mutually different positions, each of which acquires shape information of the object detected by the radar device as viewed from its own position, and a display device that displays an object shape image based on the plural pieces of shape information acquired by the plurality of object shape acquisition devices.
  • Here, the plurality of object shape acquisition devices may include camera devices, and may include radar devices. The plurality of object shape acquisition devices may consist of only camera devices, only radar devices, or a combination of one or more camera devices and one or more radar devices.
  • The radar system may further include a first radar control device that controls the operation of a first radar device, a second radar control device that controls the operation of a second radar device, and a synchronization processing device that transmits a synchronization signal to the first radar control device and the second radar control device, wherein the first radar control device controls the operation of the first radar device at a timing according to the synchronization signal received from the synchronization processing device, and the second radar control device controls the operation of the second radar device at a timing according to the synchronization signal received from the synchronization processing device.
  • Alternatively, each of the first radar control device and the second radar control device may have a GNSS receiver that outputs time information obtained from GNSS satellites, with the first radar control device controlling the operation of the first radar device at a timing according to the time information output from its GNSS receiver, and the second radar control device controlling the operation of the second radar device at a timing according to the time information output from its GNSS receiver.
  • FIG. 2 is a diagram showing an overview of a radar system according to one embodiment of the present invention.
  • FIG. 3 is a diagram showing how an object shape is specified using one sensor device.
  • FIG. 4 is a diagram showing an example in which a detected object is photographed by the camera device 120 of FIG. 3.
  • FIG. 5 is a diagram showing an installation example of a plurality of sensor devices.
  • FIG. 6 is a diagram showing how a plurality of sensor devices are used to specify the shape of an object.
  • FIG. 7 is a diagram showing an example in which a detected object is photographed by the camera device 120A of FIG. 6.
  • FIG. 8 is a diagram showing an example in which a detected object is photographed by the camera device 120B of FIG. 6.
  • FIG. 9 is a diagram showing an example in which a detected object is photographed by the camera device 120C of FIG. 6.
  • FIG. 10 is a diagram showing an example in which a detected object is photographed by the camera device 120D of FIG. 6.
  • FIG. 11 is a diagram showing an example of the processing flow when an object is detected by the radar system according to one embodiment of the present invention.
  • FIG. 12 is a diagram showing an example of the coordinate system of a radar device rotating in the horizontal direction and a detected object.
  • FIG. 13 is a diagram showing a first configuration example for synchronizing a plurality of radar control devices.
  • FIG. 14 is a diagram showing a second configuration example for synchronizing a plurality of radar control devices.
  • FIG. 2 shows an overview of a radar system according to one embodiment of the invention.
  • The radar system of this example includes a sensor device 200, comprising a radar device 100 and a camera device 120 installed toward a predetermined detection range R, and a radar control device 300 and a display device 400 installed in a control room, a monitoring room, or the like.
  • In FIG. 2, only one sensor device 200 is shown for simplicity of explanation, but in practice a plurality of sensor devices 200 are installed, as shown in FIG. 5.
  • The radar device 100 receives the reflected wave of the radar transmission wave transmitted toward the detection range R and outputs the radar detection result obtained by signal processing to the radar control device 300. Based on the radar detection result output from the radar device 100, the radar control device 300 causes the display device 400 to display detection information for an object X (a fallen or abandoned object) existing within the detection range R.
  • The detection range R of the radar device 100 is a predetermined section on a road surface, such as a road or runway, in the area to be monitored, and the antenna angle of the radar device 100 is set so as to cover the detection range R.
  • The distance from the radar device 100 to the object X can be calculated by subjecting the reflected wave from the object X to signal processing. Further, when the antenna of the radar device 100 is mechanically rotated, the angle (azimuth) of the object X with respect to the radar device 100 can be specified based on the rotation angle information of the antenna.
  • Likewise, when the radar device 100 performs electronic beam scanning, the angle (azimuth) of the object X with respect to the radar device 100 can be specified based on the beam scanning angle information.
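The position determination described in the items above reduces to a polar-to-Cartesian conversion at the radar's installation origin. A minimal sketch follows; the coordinate and azimuth conventions here are assumptions for illustration and are not fixed by this publication:

```python
import math

def object_position(radar_x, radar_y, distance, azimuth_deg):
    """Convert a radar's range/azimuth measurement of an object into
    system-wide coordinates, given the radar's installation origin.
    Azimuth is assumed to be measured clockwise from the +y axis."""
    az = math.radians(azimuth_deg)
    return (radar_x + distance * math.sin(az),
            radar_y + distance * math.cos(az))

# A radar installed at (10, 0) seeing an object 50 m away at azimuth 0 deg
x, y = object_position(10.0, 0.0, 50.0, 0.0)
print(x, y)  # 10.0 50.0
```

With each radar device's installation origin stored by the radar control device, the same conversion places every detection into one shared coordinate system.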
  • The distance and angle information obtained by the radar device 100 is transmitted to the camera device 120 via the radar control device 300.
  • The camera device 120 turns according to the angle information obtained by the radar device 100, adjusts its focus according to the distance information obtained by the radar device 100, and photographs the object X existing within its angle of view.
  • As shown in FIG. 3, when there is only one set of the radar device 100 and the camera device 120 (that is, only one sensor device 200), the object X can be photographed from only one direction. The problems in this case are described below.
  • FIG. 4 shows an example of photographing the object X in the arrangement shown in FIG. 3.
  • The upper part of FIG. 4 shows the shape of the actual object X viewed from above, and the lower part of FIG. 4 shows the object shape identified from the image captured by the camera device 120.
  • In this way, a difference may arise between the shape of the object identified from the captured image and the actual shape of the object, due to factors such as the angle of view of the camera device 120, the distance to the object X, and the lighting conditions during photography.
  • In this case, the object shape information transmitted to the worker is "rod-shaped object," and the worker searches for a rod-shaped object at the site. If the shape of the recovered object differs from the shape given in the prior information, this may undermine confidence in the recovery result.
  • FIG. 5 shows an installation example of a plurality of sensor devices 200. In this example, the sensor devices 200 are installed on both sides of the road surface to be monitored, parallel to the road surface. Each sensor device 200 is equipped with a radar device 100 and a camera device 120.
  • These sensor devices 200 are connected by optical fiber cables or the like to a radar control device 300 installed in a control room, a monitoring room, or the like. That is, the radar control device 300 controls all the sensor devices 200 and manages the information obtained from them.
  • The sensor device 200A includes a radar device 100A and a camera device 120A, and the same applies to the sensor devices 200B, 200C, and 200D.
  • When an object X appears on the road surface, the radar device 100A detects the reflected power from the object X and recognizes its existence. At this time, the radar device 100A can obtain the distance and angle to the object X with respect to its own installation origin.
  • The camera device 120A, which shares the installation origin of the radar device 100A, adjusts its angle of view and focus to the location of the object X based on the distance and angle information acquired by the radar device 100A, and photographs the object.
  • FIG. 7 shows an example in which the object X is photographed by the camera device 120A of FIG. 6.
  • The upper part of FIG. 7 shows the shape of the actual object X viewed from above, and the lower part of FIG. 7 shows the object shape identified from the image captured by the camera device 120A.
  • As described above, the radar control device 300 controls all the sensor devices 200 and manages the information obtained from them, so the position information of the object X detected by the radar device 100A can be shared throughout the entire system. Therefore, based on the position information of the object X, the other sensor devices 200B, 200C, and 200D near the object X can adjust the radio wave irradiation angles of the radar devices 100B, 100C, and 100D and the angles of view and focus of the camera devices 120B, 120C, and 120D to the location of the object X.
  • FIG. 8 shows an example in which the object X is photographed by the camera device 120B of FIG. 6. The upper part of FIG. 8 shows the shape of the actual object X viewed from above, and the lower part of FIG. 8 shows the object shape identified from the image captured by the camera device 120B.
  • FIG. 9 shows an example in which the object X is photographed by the camera device 120C of FIG. 6. The upper part of FIG. 9 shows the shape of the actual object X viewed from above, and the lower part of FIG. 9 shows the object shape identified from the image captured by the camera device 120C.
  • FIG. 10 shows an example in which the object X is photographed by the camera device 120D of FIG. 6. The upper part of FIG. 10 shows the shape of the actual object X viewed from above, and the lower part of FIG. 10 shows the object shape identified from the image captured by the camera device 120D.
  • The radar control device 300 performs image processing using the plurality of captured images of the object X to generate an object shape image representing the shape of the object X. Various known techniques can be used to generate the object shape image.
  • For example, the angle at which the partial images of the object X from each camera device are stitched together may be determined in advance, and the four partial images stitched together at that angle.
  • In this example, four sensor devices 200 are used to generate the object shape image, but the number is not limited to four; any number of sensor devices 200, two or more, may be used. By using images obtained by photographing the object X from more directions, an object shape image closer to the actual shape can be generated.
  • FIG. 11 shows an example of the processing flow when an object is detected by the radar system of this example. It is assumed that the radar control device 300 stores in advance the position information of the sensor devices 200 in the system (that is, the position information of the radar devices 100 and the camera devices 120). A plurality of radar devices 100 monitor the road surface, which is the monitoring target area, under the control of the radar control device 300. When an object on the road surface is detected by any radar device 100 (step S101), the radar detection result is transmitted from that radar device 100 to the radar control device 300.
  • When the radar control device 300 receives the radar detection result, it calculates the position information of the detected object from the distance and angle of the detected object included in the radar detection result (that is, the relative distance and angle with respect to the radar device 100) and from the position information of the radar device 100 that transmitted the radar detection result. The radar control device 300 then identifies several camera devices 120 near the detected object based on the calculated position information of the detected object and the position information of each radar device 100 (step S102). For example, four camera devices 120 are identified in order of proximity to the position of the detected object. In the camera arrangement shown in FIGS. 5 and 6, the four camera devices 120 surrounding the detected object are identified. Note that this method is only an example, and the camera devices 120 may be identified by other methods. For example, several camera devices 120 may be assigned in advance to each of the blocks into which the monitored area is divided, and the camera devices 120 assigned to the block containing the detected object may be identified.
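One simple way to realize the camera selection of step S102 is to rank the stored camera positions by distance to the calculated object position and take the nearest k. The helper below is a hypothetical sketch; the function name, layout, and coordinates are illustrative assumptions, not taken from this publication:

```python
def nearest_cameras(object_pos, camera_positions, k=4):
    """Return the ids of the k camera devices closest to the
    detected object's calculated position (cf. step S102)."""
    def dist2(p):
        return (p[0] - object_pos[0]) ** 2 + (p[1] - object_pos[1]) ** 2
    ranked = sorted(camera_positions.items(), key=lambda kv: dist2(kv[1]))
    return [cam_id for cam_id, _ in ranked[:k]]

# Hypothetical layout mirroring FIGS. 5/6: cameras on both road sides
cameras = {"120A": (0, 0), "120B": (0, 20), "120C": (30, 0),
           "120D": (30, 20), "120E": (60, 0), "120F": (60, 20)}
print(nearest_cameras((12, 8), cameras))  # ['120A', '120B', '120C', '120D']
```

The block-based alternative mentioned above would replace the distance ranking with a precomputed lookup from block id to camera ids.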
  • Next, the radar control device 300 controls each of the identified camera devices 120 to adjust its angle of view and focus and photograph the object (step S103). The images captured by these camera devices 120 are transmitted to the radar control device 300.
  • The radar control device 300 performs image processing using the plurality of captured images received from the identified camera devices 120 and generates an object shape image representing the shape of the detected object (step S104).
  • Finally, the radar control device 300 transmits the generated object shape image to the display device 400, which displays it (step S105).
  • FIG. 12 shows the coordinate system of a radar device 100 rotating in the horizontal direction and an object X existing in it. Taking the position of the radar device 100 as the origin, the position of the object X can be determined from the distance and angle to the object X. If the resolution in the distance direction and the angle direction is set sufficiently small relative to the size of the object X, the radar device 100 can acquire reflected wave power from a plurality of locations according to the shape of the object X. In the example of FIG. 12, reflected wave power is obtained from the grid points marked with black circles, which correspond to the region of the object X. Therefore, by analyzing the reception results of the reflected wave power, the size of the object as seen from the radar device 100 can be identified, and an approximate object shape can also be specified.
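The analysis just described can be sketched as follows: keep the range/azimuth grid cells whose reflected-wave power exceeds a threshold, convert them to Cartesian coordinates, and take a bounding box as the apparent extent of the object. The threshold, grid values, and function name are illustrative assumptions, not taken from this publication:

```python
import math

def object_extent(detections, power_threshold=0.5):
    """Estimate the size of the object seen from the radar (cf. FIG. 12):
    keep (range, azimuth) grid points whose reflected-wave power exceeds
    a threshold, convert to x/y, and return the bounding-box size."""
    pts = [(r * math.sin(math.radians(a)), r * math.cos(math.radians(a)))
           for r, a, p in detections if p >= power_threshold]
    if not pts:
        return None
    xs, ys = zip(*pts)
    return (max(xs) - min(xs), max(ys) - min(ys))  # width, depth [m]

# Illustrative grid returns: (range m, azimuth deg, normalized power)
grid = [(50.0, -1.0, 0.9), (50.0, 0.0, 1.0), (50.0, 1.0, 0.8),
        (50.5, 0.0, 0.7), (51.0, 0.0, 0.2)]   # last cell is noise
w, d = object_extent(grid)
```

With 1-degree azimuth cells at 50 m, the cross-range extent here comes out to roughly 1.7 m, showing why the resolution must be small relative to the object for the shape to be resolved.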
  • By combining such analysis results from a plurality of radar devices 100, it becomes possible to generate an object shape image that is closer to the actual shape.
  • For example, the four radar devices 100A, 100B, 100C, and 100D identify the object size viewed from their respective directions. Although a single radar device 100 can be used to identify the object size, using a larger number of radar devices 100 makes it possible to identify the object size more accurately. As a result, the difference between the object shape image transmitted to the worker and the actual object shape can be reduced, shortening the search time.
  • As shown in FIG. 12, the radar device 100 can also identify the approximate shape of the detected object X. Therefore, an object shape image can also be generated using only radar devices 100, without using camera devices 120. In this case, the accuracy of the object shape image is lower than when camera devices 120 are used, but the system can be simplified and its cost reduced.
  • Alternatively, an object shape image may be generated by combining the object shape obtained using the camera devices 120 and the object shape obtained using the radar devices 100.
  • In the system described above, one radar control device 300 manages the radio wave emission directions and the radio wave emission/stop timings of the radar devices 100 under its control.
  • However, a radar system that monitors a wide area requires a large number of radar devices 100, and it may not be possible to secure optical fiber cables for connecting all the radar devices 100 to a single radar control device 300. In such cases, a plurality of radar control devices 300 are used, and they must operate in synchronization with one another.
  • FIG. 13 shows a first configuration example in which a plurality of radar control devices 300 are synchronized.
  • In this configuration, a synchronization processing device 500 transmits a synchronization signal to the radar control device 300A and the radar control device 300B.
  • The radar control device 300A controls the radar devices 100 connected to it at preset timings according to the synchronization signal received from the synchronization processing device 500.
  • Likewise, the radar control device 300B controls the radar devices 100 connected to it at preset timings according to the synchronization signal received from the synchronization processing device 500. That is, the radar control devices 300A and 300B control the radar devices 100 under their control in synchronization with each other. At this time, the radio wave irradiation directions and the radio wave emission/stop timings are controlled so that the radio waves from each radar device 100 do not enter the other radar devices 100. This prevents interference between radar devices 100 connected to different radar control devices 300.
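One way to realize "preset timings according to the synchronization signal" is a TDMA-style slot table: each radar control device derives from the common synchronization pulse (or, in the second configuration, GNSS time) the slots in which its radar devices may emit, so that mutually visible radars never transmit simultaneously. The slot layout, durations, and device identifiers below are illustrative assumptions, not taken from this publication:

```python
# TDMA-style emission schedule derived from a common synchronization signal.
# Each sync period is divided into slots; mutually interfering radars are
# assigned disjoint slots. The slot map itself would be preset per system.
SLOT_COUNT = 4
SLOT_MAP = {           # radar id -> slot in which it may emit
    "100A-1": 0, "100A-2": 2,   # radars under control device 300A
    "100B-1": 1, "100B-2": 3,   # radars under control device 300B
}

def may_emit(radar_id, time_since_sync, slot_duration=0.025):
    """True if this radar is inside its own emission slot, measured
    from the last synchronization pulse (or GNSS-derived epoch)."""
    slot = int(time_since_sync / slot_duration) % SLOT_COUNT
    return slot == SLOT_MAP[radar_id]

print(may_emit("100A-1", 0.010))  # True  (slot 0)
print(may_emit("100B-1", 0.010))  # False (its slot is 1)
```

Because both control devices index slots from the same reference instant, no shared cabling between them is needed beyond the synchronization signal itself.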
  • FIG. 14 shows a second configuration example in which a plurality of radar control devices 300 are synchronized.
  • The system shown in FIG. 14 includes a radar control device 300A that controls a plurality of radar devices 100 installed on one side of the road surface, and a radar control device 300B that controls a plurality of radar devices 100 installed on the other side.
  • In this configuration, each of the radar control devices 300A and 300B includes a GNSS (Global Navigation Satellite System) receiver 520. The GNSS receiver 520 has a function of outputting accurate time information obtained from GNSS satellites.
  • The radar control device 300A controls the radar devices 100 connected to it at preset timings according to the time information output from its GNSS receiver 520, and the radar control device 300B likewise controls the radar devices 100 connected to it at preset timings according to the time information output from its GNSS receiver 520. That is, the radar control devices 300A and 300B control the radar devices 100 under their control in synchronization with each other. At this time, the radio wave irradiation directions and the radio wave emission/stop timings are controlled so that the radio waves from each radar device 100 do not enter the other radar devices 100. This prevents interference between radar devices 100 connected to different radar control devices 300.
  • As described above, the radar system of this example includes a plurality of sensor devices 200, installed at positions different from one another and from the radar device 100 that detects an object on the road surface to be monitored, each of which acquires shape information of the detected object as viewed from its own position, and a display device 400 that displays an object shape image based on the plural pieces of shape information acquired by the plurality of sensor devices 200.
  • Here, the shape information used to generate the object shape image can be acquired by the radar device 100 and/or the camera device 120 of each sensor device 200. That is, the shape information may be an image captured by the camera device 120, an object shape specified by the radar device 100, or a combination thereof.
  • In addition, when a plurality of radar control devices 300 are used, the radar system of this example has a mechanism for synchronizing them by means of the synchronization processing device 500 or the GNSS receivers 520. Therefore, according to the radar system of this example, interference between radar devices 100 connected to different radar control devices 300 can be prevented.
  • While the present invention has been described above based on one embodiment, the present invention is not limited to the radar system described here, and it goes without saying that it can be widely applied to other radar systems.
  • The present invention can also be provided as, for example, a method including the technical procedures related to the above processing, a program for causing a processor to execute the above processing, or a storage medium storing such a program in a computer-readable manner.
  • The present invention can be used in a radar system that detects objects existing in a monitored area with a radar device.

Abstract

Provided is a radar system which can more accurately capture the shape of an object detected by a radar device. The radar system comprises a radar device 100 that detects an object on a road surface that is the area to be monitored, a plurality of sensor devices 200 that are installed at mutually different positions and that acquire shape information, as viewed from their own positions, of the object detected by the radar device 100, and a display device 400 that displays an object shape image based on the plurality of pieces of shape information acquired by the plurality of sensor devices 200.

Description

Radar system and object detection method
The present invention relates to a radar system that detects objects existing in a monitored area with a radar device.
As a radar device using microwaves, millimeter-wave bands, or the like, an FMCW (Frequency Modulated Continuous-Wave) radar device having a structure as shown in FIG. 1 is known.
The radar device 100 in FIG. 1 amplifies a frequency-modulated radar signal from the FMCW transmission source 101 with the transmission power amplifier 103 and emits it from the transmission antenna 104. When an object T (reflecting object) exists within the detection range of the radar device 100, the radar transmission wave is reflected by the object T. The reflected wave from the object T is received by the receiving antenna 105 of the radar device 100, amplified by the receiving power amplifier 106, and then mixed by the mixer 107 with the transmission radar signal component from the power divider 102 to be converted into an IF signal. The IF signal output from the mixer 107 is A/D-converted and signal-processed by the signal processing section 108. As a result, a radar detection result is obtained, including the received power reflected by the object T (reflected-wave power), the distance to the object T, the azimuth of the object T, and, when the object T is moving, its speed (relative speed with respect to the radar device 100).
Various inventions have been proposed for such radar devices.
For example, Patent Document 1 discloses an invention in which a millimeter-wave radar is installed on a moving body and the distance to a target position is measured based on the distance between a first reflector and a second reflector respectively installed near the target position and on the reception results of the reflected waves from these reflectors.
WO 2018/235397
One use of radar devices is to detect objects present on road surfaces, such as roads and runways, in the area to be monitored. There are usually no reflecting objects, such as fallen or abandoned objects, on roads or runways. Therefore, the radar device continues to transmit radar waves toward a detection range in which no reflecting object exists; a received wave (reflected wave) is obtained only when some reflecting object appears within the detection range, and that object is detected.
When an object detected by the radar device must be removed or recovered, information on the object's shape makes the search for it easier. For this purpose, a camera device is used, for example, in combination with the radar device. That is, when the existence of an object is recognized from the power of the reflected wave received by the radar device, the angle of view and focus of the camera device are adjusted to the location of the object and an image is captured. A photographed image of the object to be removed is thereby obtained, making it easier to recognize the detailed shape of the object.
When searching for and removing the detected object, the worker can use the information on the object's location obtained from the radar device and the information on the object's shape obtained from the camera device. However, depending on the angle of view of the camera device, only shape information of the object viewed from one direction can be obtained, and it is difficult to capture the actual shape faithfully. As a result, a difference arises between the object shape information obtained from the camera device and the actual object shape, causing confusion when searching for the object.
The present invention has been made in view of the conventional circumstances described above, and an object of the present invention is to provide a radar system capable of more accurately capturing the shape of an object detected by a radar device.
 To achieve the above object, the present invention configures a radar system as follows.
 That is, a radar system according to the present invention detects an object existing in a monitored area with a radar device, and comprises: a plurality of object shape acquisition devices, installed at mutually different positions, each of which acquires shape information of the object detected by the radar device as seen from its own position; and a display device that displays an object shape image based on the plurality of pieces of shape information acquired by the plurality of object shape acquisition devices.
 Here, the plurality of object shape acquisition devices may include camera devices, and may include radar devices. They may consist solely of a plurality of camera devices, solely of a plurality of radar devices, or of a combination of one or more camera devices and one or more radar devices.
 The radar system according to the present invention may further comprise a first radar control device that controls the operation of a first radar device, a second radar control device that controls the operation of a second radar device, and a synchronization processing device that transmits a synchronization signal to the first and second radar control devices. The first radar control device controls the operation of the first radar device at a timing according to the synchronization signal received from the synchronization processing device, and the second radar control device controls the operation of the second radar device at a timing according to the same synchronization signal.
 Alternatively, in place of the synchronization processing device, each of the first and second radar control devices may have a GNSS receiver that outputs time information obtained from GNSS satellites. The first radar control device then controls the operation of the first radar device at a timing according to the time information output from its GNSS receiver, and the second radar control device controls the operation of the second radar device at a timing according to the time information output from its GNSS receiver.
 According to the present invention, it is possible to provide a radar system that can capture the shape of an object detected by a radar device more accurately.
FIG. 1 is a diagram showing a configuration example of a radar device.
FIG. 2 is a diagram showing an overview of a radar system according to an embodiment of the present invention.
FIG. 3 is a diagram showing how an object shape is identified using one sensor device.
FIG. 4 is a diagram showing an example in which a detected object is photographed by the camera device 120 of FIG. 3.
FIG. 5 is a diagram showing an installation example of a plurality of sensor devices.
FIG. 6 is a diagram showing how an object shape is identified using a plurality of sensor devices.
FIG. 7 is a diagram showing an example in which a detected object is photographed by the camera device 120A of FIG. 6.
FIG. 8 is a diagram showing an example in which a detected object is photographed by the camera device 120B of FIG. 6.
FIG. 9 is a diagram showing an example in which a detected object is photographed by the camera device 120C of FIG. 6.
FIG. 10 is a diagram showing an example in which a detected object is photographed by the camera device 120D of FIG. 6.
FIG. 11 is a diagram showing an example of the processing flow when an object is detected by the radar system according to an embodiment of the present invention.
FIG. 12 is a diagram showing an example of the coordinate system of a horizontally rotating radar device and a detected object.
FIG. 13 is a diagram showing a first configuration example for synchronizing a plurality of radar control devices.
FIG. 14 is a diagram showing a second configuration example for synchronizing a plurality of radar control devices.
 A radar system according to an embodiment of the present invention will be described with reference to the drawings.
 FIG. 2 shows an overview of a radar system according to an embodiment of the present invention. The radar system of this example comprises a sensor device 200, which includes a radar device 100 and a camera device 120 installed facing a predetermined detection range R, and a radar control device 300 and a display device 400 installed in a control room, monitoring room, or the like. Although FIG. 2 shows only one sensor device 200 for simplicity of explanation, in practice a plurality of sensor devices 200, each monitoring a different detection range R, are arranged as shown in FIG. 5.
 The radar device 100 receives the reflected waves of the radar transmission waves it transmits toward the detection range R, and outputs the radar detection result obtained by signal processing to the radar control device 300. Based on the radar detection result output from the radar device 100, the radar control device 300 causes the display device 400 to display detection information on an object X (a fallen or abandoned object) existing within the detection range R.
 The detection range R of the radar device 100 is a predetermined section of a road surface, such as a road or runway, in the monitored area, and the antenna angle of the radar device 100 is set so as to cover the detection range R. The distance from the radar device 100 to the object X can be calculated by signal-processing the wave reflected from the object X. When the antenna of the radar device 100 rotates mechanically, the angle (azimuth) of the object X relative to the radar device 100 can be determined from the rotation angle of the antenna; alternatively, when the beam of the radar transmission wave emitted from the antenna is scanned electronically, the angle can be determined from the beam scanning angle.
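Since the component list cites an FMCW transmission source (101) and mixer (107), the distance calculation mentioned above can be sketched with the standard linear-FMCW range equation. This is a minimal illustration, not the patent's own signal processing; the sweep parameters are assumed values.

```python
# Minimal sketch (not from the patent): range estimation for an FMCW radar
# such as the one outlined by components 101-108. For a linear chirp of
# bandwidth B swept over time T, a beat frequency f_b corresponds to
# range R = c * f_b * T / (2 * B).
C = 299_792_458.0  # speed of light [m/s]

def fmcw_range(beat_freq_hz: float, sweep_bw_hz: float, sweep_time_s: float) -> float:
    """Range to the reflector implied by the measured beat frequency."""
    return C * beat_freq_hz * sweep_time_s / (2.0 * sweep_bw_hz)

# Illustrative numbers: a 200 MHz sweep over 1 ms gives roughly 0.75 m
# of range per kHz of beat frequency, so a 100 kHz beat is about 75 m.
r = fmcw_range(100e3, 200e6, 1e-3)
```

The angle (azimuth) would come separately from the antenna rotation or beam-scan angle, as described above.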
 The distance and angle information obtained by the radar device 100 is transmitted to the camera device 120 via the radar control device 300. The camera device 120 pans according to the angle information, adjusts its focus according to the distance information, and photographs the object X within its angle of view. As shown in FIG. 3, when there is only one pair of radar device 100 and camera device 120 (that is, only one sensor device 200), the object X can be photographed from only one direction. The problem in this case is explained below.
 FIG. 4 shows an example of photographing the object X in the arrangement of FIG. 3. The upper part of FIG. 4 shows the actual shape of the object X viewed from above (that is, in plan view), and the lower part shows the shape obtained by photographing the object X from the direction of the arrow. As this illustrates, factors such as the angle of view of the camera device 120, the distance to the object X, and the lighting conditions at the time of shooting can produce a discrepancy between the shape identified from the captured image and the actual shape of the object.
 For example, even if the actual object has a certain area as shown in the upper part of FIG. 4, it may be recognized as a rod-shaped object as shown in the lower part if only a side-on image is available. In that case, the shape information transmitted to the worker (the person searching for the object) is "a rod-shaped object", and the worker searches the site for a rod-shaped object. Because of the difference from the actual shape, the search for the object to be recovered takes longer than necessary. Moreover, if the shape of the recovered object differs from the shape given in advance, the worker cannot be confident that the correct object has been recovered.
 Therefore, in the radar system of this example, a plurality of sensor devices 200, each including a radar device 100 and a camera device 120, are used in order to reduce the discrepancy between the shape identified from captured images and the actual shape. FIG. 5 shows an installation example: a plurality of sensor devices 200 are installed along both sides of the road surface to be monitored, parallel to it. Each sensor device 200 carries a radar device 100 and a camera device 120. These sensor devices 200 are connected by optical fiber cables or the like to a radar control device 300 installed in a control room, monitoring room, or the like; that is, the radar control device 300 controls all of the sensor devices 200 and manages the information obtained from them.
 Now suppose that a single object is present on the monitored road surface; FIG. 6 shows this situation viewed from above. In the example of FIG. 6, four sensor devices 200A, 200B, 200C, and 200D are installed around the object X. The sensor device 200A comprises a radar device 100A and a camera device 120A, and the same applies to the sensor devices 200B, 200C, and 200D.
 Suppose first that the radar device 100A detects the power reflected from the object X and recognizes its presence. The radar device 100A can obtain the distance and angle to the object X relative to its own installation origin. The camera device 120A, which shares its installation origin with the radar device 100A, then adjusts its angle of view and focus to the location of the object X based on that distance and angle information, and takes a photograph.
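Converting the distance/angle pair reported relative to one radar's installation origin into a position in a shared ground coordinate system is what allows the other sensor devices to aim at the same spot later. A minimal sketch of that conversion follows; the east/north axes and the degrees-clockwise-from-north azimuth convention are assumptions for illustration, not conventions stated in the patent.

```python
# Hypothetical helper: absolute object position from a radar's own
# position plus the (distance, azimuth) it measured.
import math

def object_position(radar_xy, distance_m, azimuth_deg):
    """Return (x, y) of the detected object in ground coordinates,
    assuming azimuth is measured clockwise from the +y (north) axis."""
    rad = math.radians(azimuth_deg)
    return (radar_xy[0] + distance_m * math.sin(rad),   # east offset
            radar_xy[1] + distance_m * math.cos(rad))   # north offset

# A radar at (10, 0) seeing a target 50 m away due north places it at (10, 50).
x, y = object_position((10.0, 0.0), 50.0, 0.0)
```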
 FIG. 7 shows an example of the object X photographed by the camera device 120A of FIG. 6. The upper part of FIG. 7 shows the actual shape of the object X viewed from above, and the lower part shows the shape obtained when the camera device 120A photographs the object X from the direction of the arrow.
 In the radar system of this example, the radar control device 300 controls all of the sensor devices 200 and manages the information obtained from them, so the position information of the object X detected by the radar device 100A can be shared across the entire system. The other sensor devices 200B, 200C, and 200D near the object X can therefore use this position information to adjust the radio wave irradiation angles of their radar devices 100B, 100C, and 100D, and the angles of view and focus of their camera devices 120B, 120C, and 120D, to match the location of the object X.
 FIG. 8 shows an example of the object X photographed by the camera device 120B of FIG. 6. The upper part of FIG. 8 shows the actual shape of the object X viewed from above, and the lower part shows the shape obtained when the camera device 120B photographs the object X from the direction of the arrow.
 FIG. 9 shows an example of the object X photographed by the camera device 120C of FIG. 6. The upper part of FIG. 9 shows the actual shape of the object X viewed from above, and the lower part shows the shape obtained when the camera device 120C photographs the object X from the direction of the arrow.
 FIG. 10 shows an example of the object X photographed by the camera device 120D of FIG. 6. The upper part of FIG. 10 shows the actual shape of the object X viewed from above, and the lower part shows the shape obtained when the camera device 120D photographs the object X from the direction of the arrow.
 By photographing the object X from a plurality of directions (four in this example) in this way, four images of it in different aspects can be obtained. The radar control device 300 performs image processing on the plurality of captured images of the object X to generate an object shape image representing its shape. Various known techniques can be used to generate the object shape image; for example, the technique described in Japanese Patent Application No. 4-151821 (Japanese Patent Laid-Open No. 5-342368) may be used. In addition, since the camera positions are fixed in this example, the angles at which the partial images of the object X from the respective camera devices are stitched together may be determined in advance, and the four partial images stitched at those angles.
 This makes it possible to generate an object shape image closer to the actual shape than an image captured by a single camera device. Although four sensor devices 200 are used to generate the object shape image in this example, the number is not limited to four; any number of two or more sensor devices 200 may be used. Using images of the object X taken from more directions allows an object shape image even closer to the actual shape to be generated.
 FIG. 11 shows an example of the processing flow when an object is detected by the radar system of this example. The radar control device 300 is assumed to store in advance the position information of the sensor devices 200 in the system (that is, the position information of the radar devices 100 and the camera devices 120). The plurality of radar devices 100 monitor the road surface, which is the monitored area, under the control of the radar control device 300. When any radar device 100 detects an object on the road surface (step S101), that radar device 100 transmits its radar detection result to the radar control device 300.
 Upon receiving the radar detection result, the radar control device 300 calculates the position of the detected object from the distance and angle contained in the result (that is, the distance and angle relative to the radar device 100) and the position of the radar device 100 that sent the result. The radar control device 300 then identifies several camera devices 120 in the vicinity of the detected object based on the calculated position of the object and the position information of each radar device 100 (step S102); for example, the four camera devices 120 closest to the object are identified. With a camera arrangement like that of FIG. 5 or FIG. 6, the four camera devices 120 surrounding the detected object are identified. This method is only an example, and the camera devices 120 may be identified in other ways; for instance, several camera devices 120 may be assigned in advance to each block into which the monitored area is divided, and the camera devices 120 assigned to the block containing the detected object may be identified.
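The nearest-camera selection of step S102 can be sketched as a distance sort over the stored camera positions. This is a hypothetical helper under assumed device IDs and coordinates, not code from the patent.

```python
# Minimal sketch of step S102: pick the camera devices closest to the
# calculated object position. Camera IDs and positions are illustrative.
import math

def nearest_cameras(object_xy, cameras: dict, count: int = 4) -> list:
    """Return the IDs of the `count` cameras nearest to the object,
    given a mapping of camera ID -> (x, y) position."""
    def dist(cam_id):
        cx, cy = cameras[cam_id]
        return math.hypot(cx - object_xy[0], cy - object_xy[1])
    return sorted(cameras, key=dist)[:count]

cams = {"120A": (0, 0), "120B": (0, 20), "120C": (40, 0),
        "120D": (40, 20), "120E": (100, 0)}
chosen = nearest_cameras((20, 10), cams)  # the four cameras surrounding X
```

The block-based assignment mentioned as an alternative would replace the sort with a precomputed block-to-cameras lookup table.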
 The radar control device 300 controls each of the identified camera devices 120 to adjust its angle of view and focus and photograph the detected object (step S103). The images captured by these camera devices 120 are transmitted to the radar control device 300, which performs image processing on them to generate an object shape image representing the shape of the detected object (step S104). The radar control device 300 then transmits the generated object shape image to the display device 400 and causes it to be displayed (step S105).
 Although the above description shows image processing applied to images obtained by a plurality of camera devices 120, information on the reflected wave power obtained by the radar devices 100 can also be used to generate the object shape image. This is explained below with reference to FIG. 12.
 FIG. 12 shows the coordinate system of a radar device 100 rotating in the horizontal direction and an object X within it. Taking the position of the radar device 100 as the origin, the position of the object X is determined by its distance and angle. If the resolutions in the distance and angle directions are set sufficiently small relative to the size of the object X, the radar device 100 can acquire reflected wave power from multiple points according to the shape of the object X; in the example of FIG. 12, reflected wave power is acquired from the grid points (black circles) corresponding to the region of the object X. By analyzing these reflected power measurements, the size of the object as seen from the radar device 100 can be determined, and its approximate shape can also be identified.
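One way to turn the grid of detections in FIG. 12 into a size estimate is to threshold the per-cell reflected power and measure the spread of the detected cells in range and cross-range. The sketch below assumes this representation; the threshold, cell values, and conversion are illustrative, not taken from the patent.

```python
# Minimal sketch: apparent object extent from (range, azimuth) cells whose
# reflected power exceeds a detection threshold, as in FIG. 12.
import math

def apparent_extent(cells, power, threshold):
    """cells: list of (range_m, azimuth_deg); power: matching list of
    reflected power values. Returns (radial extent, cross-range extent) in
    metres over the cells at or above the threshold."""
    hits = [(r, a) for (r, a), p in zip(cells, power) if p >= threshold]
    if not hits:
        return (0.0, 0.0)
    ranges = [r for r, _ in hits]
    # Project to Cartesian to measure width across the line of sight.
    xs = [r * math.sin(math.radians(a)) for r, a in hits]
    return (max(ranges) - min(ranges), max(xs) - min(xs))

cells = [(50, -2), (50, 0), (50, 2), (51, 0), (52, 0)]
power = [1.0, 1.0, 1.0, 1.0, 0.1]          # last cell is below threshold
depth, width = apparent_extent(cells, power, 0.5)
```

Combining such per-radar extents from several directions, as with the four radar devices of FIG. 6, narrows down the object's footprint.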
 Therefore, by using not only the object shape image obtained with the camera devices 120 but also the object size information obtained with the radar devices 100, an object shape image even closer to the actual shape can be generated. In the example of FIG. 6, the four radar devices 100A, 100B, 100C, and 100D each determine the object size as seen from their respective directions. A single radar device 100 may be used for this purpose, but using more radar devices 100 makes it possible to determine the object size more accurately. This reduces the difference between the object shape image transmitted to the worker and the actual shape, shortening the search time.
 As shown in FIG. 12, the radar device 100 can also identify the approximate shape of the detected object X. It is therefore possible to generate the object shape image using only the radar devices 100, without the camera devices 120. In this case the accuracy of the object shape image is lower than when the camera devices 120 are used, but the system can be made simpler and cheaper. Of course, the object shape obtained with the camera devices 120 and that obtained with the radar devices 100 may also be combined to generate the object shape image.
 Since the radar system of this example uses a plurality of radar devices 100, a single radar control device 300 manages the radio wave emission direction and the emission start/stop timing of each radar device 100 in order to prevent the radar devices 100 from interfering with one another. A radar system monitoring a wide area requires a large number of radar devices 100, and it may not be possible to secure optical fiber cables for connecting all of them to a single radar control device 300.
 In this case another radar control device 300 must be installed, but simply adding one does not prevent interference between radar devices 100 connected to different radar control devices 300. The radar system of this example therefore provides a mechanism for synchronizing the plurality of radar control devices 300 with one another. This mechanism is described below.
 FIG. 13 shows a first configuration example for synchronizing a plurality of radar control devices 300. The system of FIG. 13 comprises a radar control device 300A that controls the radar devices 100 installed on one side of the road surface, a radar control device 300B that controls the radar devices 100 installed on the other side, and a synchronization processing device 500 connected to both.
 The synchronization processing device 500 transmits a synchronization signal to the radar control devices 300A and 300B. The radar control device 300A controls the radar devices 100 connected to it at preset timings according to the synchronization signal received from the synchronization processing device 500, and the radar control device 300B likewise controls its radar devices 100 at preset timings according to the same synchronization signal. That is, the radar control devices 300A and 300B control their subordinate radar devices 100 in synchronization with each other, controlling the radio wave irradiation direction and the emission start/stop timing of each radar device 100 so that its radio waves do not enter the other radar devices 100. This prevents interference between radar devices 100 connected to different radar control devices 300.
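One concrete realization of the "preset timings" would be a time-slot schedule counted from the shared synchronization signal, with each radar assigned its own emission slot. The patent does not specify the scheme, so the slot length and radar IDs below are purely illustrative assumptions.

```python
# Hypothetical sketch: time-division emission schedule derived from a
# shared synchronization tick. Each radar emits only in its own slot,
# so radars under different control devices never transmit at once.

def emission_slot(radar_ids, slot_ms: int, tick_ms: int) -> str:
    """Return the ID of the radar allowed to emit at time `tick_ms`,
    measured from the synchronization signal, with round-robin slots."""
    slot_index = (tick_ms // slot_ms) % len(radar_ids)
    return radar_ids[slot_index]

radars = ["100A", "100B", "100C", "100D"]   # split between 300A and 300B
active_at_0 = emission_slot(radars, 10, 0)    # first 10 ms slot
active_at_25 = emission_slot(radars, 10, 25)  # third slot
```

Because both control devices derive the same slot index from the same tick, no message exchange between them is needed at emission time.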
 FIG. 14 shows a second configuration example for synchronizing a plurality of radar control devices 300. The system of FIG. 14 comprises a radar control device 300A that controls the radar devices 100 installed on one side of the road surface and a radar control device 300B that controls the radar devices 100 installed on the other side. The radar control device 300A is equipped with a GNSS (Global Navigation Satellite System) receiver 520, as is the radar control device 300B. The GNSS receiver 520 has the function of outputting accurate time information obtained from GNSS satellites.
 The radar control device 300A controls the radar devices 100 connected to it at preset timings according to the time information output from its GNSS receiver 520, and the radar control device 300B likewise controls its radar devices 100 at preset timings according to the time information output from its GNSS receiver 520. That is, the radar control devices 300A and 300B control their subordinate radar devices 100 in synchronization with each other, controlling the radio wave irradiation direction and the emission start/stop timing of each radar device 100 so that its radio waves do not enter the other radar devices 100. This prevents interference between radar devices 100 connected to different radar control devices 300.
 As described above, the radar system of this example comprises radar devices 100 that detect objects existing on the road surface, which is the monitored area; a plurality of sensor devices 200, installed at mutually different positions, each of which acquires shape information of an object detected by a radar device 100 as seen from its own position; and a display device 400 that displays an object shape image based on the plurality of pieces of shape information acquired by the sensor devices 200. By processing multiple pieces of shape information viewing the detected object from different directions in this way, an object shape image closer to the actual shape can be displayed. The radar system of this example therefore makes it possible to capture the shape of an object detected by a radar device more accurately.
 The shape information used to generate the object shape image can be acquired by the radar device 100 and/or the camera device 120 of a sensor device 200; that is, it may be an image captured by the camera device 120, an object shape identified by the radar device 100, or a combination of the two.
 Furthermore, when a plurality of radar control devices 300 are used, the radar system of this example synchronizes them by means of the synchronization processing device 500 or the GNSS receivers 520. The radar system of this example can therefore prevent interference between radar devices 100 connected to different radar control devices 300.
 Although the present invention has been described above based on one embodiment, the present invention is not limited to the radar system described here, and it goes without saying that it can be applied widely to other radar systems.
 The present invention may also be provided, for example, as a method including the technical procedures of the above processing, as a program for causing a processor to execute the above processing, or as a storage medium storing such a program in a computer-readable form.
 The scope of the present invention is not limited to the illustrated and described exemplary embodiments; it also includes all embodiments that provide effects equivalent to those the invention is intended to achieve. Moreover, the scope of the invention may be defined by any desired combination of particular features among all the disclosed features.
 The present invention can be used in a radar system that detects, with a radar device, objects present in a monitored area.
 100: radar device, 101: FMCW transmission source, 102: power divider, 103: transmission power amplifier, 104: transmission antenna, 105: reception antenna, 106: reception power amplifier, 107: mixer, 108: signal processing unit, 120: camera device, 200: sensor device, 300: radar control device, 400: display device, 500: synchronization processing device, 520: GNSS receiver
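The reference numerals describe a classic FMCW chain: transmission source 101, mixer 107, and signal processing unit 108. In such a chain the beat frequency at the mixer output encodes target range; a minimal sketch of the standard conversion follows (the parameter values are illustrative, not taken from the patent):

```python
def fmcw_range(f_beat_hz, bandwidth_hz, chirp_s, c=3.0e8):
    """Range from beat frequency for a linear FMCW chirp:
    R = c * f_beat * T_chirp / (2 * B), where B is the sweep
    bandwidth and T_chirp the sweep duration."""
    return c * f_beat_hz * chirp_s / (2.0 * bandwidth_hz)

# A 100 kHz beat with a 1 GHz sweep over 1 ms corresponds to 15 m:
print(fmcw_range(100e3, 1e9, 1e-3))  # -> 15.0
```

This is the relation the signal processing unit 108 would apply, in some form, to turn mixer output spectra into detected-object distances.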

Claims (6)

  1.  A radar system that detects, with a radar device, objects present in a monitored area, the radar system comprising:
     a plurality of object shape acquisition devices installed at mutually different positions, each acquiring shape information of an object detected by the radar device as viewed from its own position; and
     a display device that displays an object shape image based on the plurality of pieces of shape information acquired by the plurality of object shape acquisition devices.
  2.  The radar system according to claim 1, wherein the plurality of object shape acquisition devices include a camera device.
  3.  The radar system according to claim 1, wherein the plurality of object shape acquisition devices include a radar device.
  4.  The radar system according to claim 1, further comprising:
     a first radar control device that controls operation of a first radar device;
     a second radar control device that controls operation of a second radar device; and
     a synchronization processing device that transmits a synchronization signal to the first radar control device and the second radar control device,
     wherein the first radar control device controls operation of the first radar device at a timing according to the synchronization signal received from the synchronization processing device, and
     the second radar control device controls operation of the second radar device at a timing according to the synchronization signal received from the synchronization processing device.
  5.  The radar system according to claim 1, further comprising:
     a first radar control device that controls operation of a first radar device; and
     a second radar control device that controls operation of a second radar device,
     wherein the first radar control device and the second radar control device each have a GNSS receiver that outputs time information obtained from GNSS satellites,
     the first radar control device controls operation of the first radar device at a timing according to the time information output from its GNSS receiver, and
     the second radar control device controls operation of the second radar device at a timing according to the time information output from its GNSS receiver.
  6.  An object detection method for detecting, with a radar device, objects present in a monitored area, the method comprising:
     acquiring a plurality of pieces of shape information of an object detected by the radar device as viewed from a plurality of object shape acquisition devices installed at mutually different positions; and
     displaying an object shape image based on the plurality of pieces of shape information on a display device.
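As an illustrative sketch of the GNSS-disciplined timing of claim 5 (the fixed offset/cycle scheme is an assumption, not specified by the patent), each controller can derive its transmit window directly from GNSS time, so controllers that never exchange messages still stay disjoint:

```python
import datetime

def in_my_window(utc, my_offset_ms, cycle_ms=100, slot_ms=50):
    """Decide whether to start a chirp now, given GNSS-disciplined UTC.

    Each radar control device is assigned a fixed offset within a
    repeating cycle; since all GNSS receivers output the same time,
    the windows of differently offset controllers never overlap.
    """
    ms_into_cycle = (utc.second * 1000 + utc.microsecond // 1000) % cycle_ms
    return my_offset_ms <= ms_into_cycle < my_offset_ms + slot_ms

# 30 ms into the cycle: controller at offset 0 may chirp, offset 50 may not.
t = datetime.datetime(2021, 3, 3, 12, 0, 0, 30000)
print(in_my_window(t, 0), in_my_window(t, 50))  # -> True False
```

The synchronization-signal variant of claim 4 works the same way, except the shared clock is distributed by the synchronization processing device instead of being received from satellites.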
PCT/JP2021/008092 2021-03-03 2021-03-03 Radar system and object detection method WO2022185430A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2023503585A JP7449443B2 (en) 2021-03-03 2021-03-03 Radar system and object detection method
PCT/JP2021/008092 WO2022185430A1 (en) 2021-03-03 2021-03-03 Radar system and object detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/008092 WO2022185430A1 (en) 2021-03-03 2021-03-03 Radar system and object detection method

Publications (1)

Publication Number Publication Date
WO2022185430A1 2022-09-09

Family

ID=83153964

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/008092 WO2022185430A1 (en) 2021-03-03 2021-03-03 Radar system and object detection method

Country Status (2)

Country Link
JP (1) JP7449443B2 (en)
WO (1) WO2022185430A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS58189570A (en) * 1982-04-28 1983-11-05 Oki Electric Ind Co Ltd Interference eliminating system of radar
JP2002222487A (en) * 2001-01-26 2002-08-09 Mitsubishi Electric Corp Road monitoring system
JP2010256133A (en) * 2009-04-23 2010-11-11 Toyota Motor Corp Interference prevention radar device
JP2012099014A (en) * 2010-11-04 2012-05-24 Saxa Inc Passing vehicle monitoring system and passing vehicle monitoring device
WO2020188697A1 (en) * 2019-03-18 2020-09-24 株式会社日立国際電気 Monitoring system and monitoring method


Also Published As

Publication number Publication date
JP7449443B2 (en) 2024-03-13
JPWO2022185430A1 (en) 2022-09-09

Similar Documents

Publication Publication Date Title
EP1441318B1 (en) Security system
US7450251B2 (en) Fanned laser beam metrology system
US8362946B2 (en) Millimeter wave surface imaging radar system
CN101408618B (en) Wide light beam illumination three-dimensional gating imaging system of airborne laser radar
US20070176822A1 (en) Target detection apparatus and system
JP4741365B2 (en) Object detection sensor
RU2444754C1 (en) Method for detection and spatial localisation of air objects
RU2522982C2 (en) All-around looking radar
RU2004102190A (en) METHOD FOR IMPROVING RADAR RESOLUTION, SYSTEM FOR ITS IMPLEMENTATION AND METHOD FOR REMOTE IDENTIFICATION OF THE SYSTEM OF SMALL-SIZED OBJECTS
JP6233606B2 (en) Target identification laser observation system
CN104678389A (en) Continuous wave one-dimensional phase scanning miss distance vector detection method and device
JP2015166732A (en) Enhanced imaging system
WO2022185430A1 (en) Radar system and object detection method
WO2002021641A2 (en) Passive radar system cuing active radar system
CN109188395A (en) A kind of full polarized fringe pipe laser imaging radar device
Espeter et al. Progress of hybrid bistatic SAR: Synchronization experiments and first imaging results
JP6923799B2 (en) Radar device
CN111505654A (en) Object position detection method and laser radar
US6204800B1 (en) Method for monitoring the earth surface
US5625452A (en) Passive detection of source of known spectral emission
CN108363057B (en) Synthetic aperture radar detection method, synthetic aperture radar detection device and storage medium
JP6679371B2 (en) Object detection device
JP2013171003A (en) Target outline identification device and indication image generator
RU2656361C1 (en) Method of mobile object positioning
RU2704348C1 (en) Method of determining an object, which inspects a spacecraft in passive mode

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (ref document number: 21929006; country: EP; kind code: A1)
ENP Entry into the national phase (ref document number: 2023503585; country: JP; kind code: A)
NENP Non-entry into the national phase (ref country code: DE)
122 Ep: pct application non-entry in european phase (ref document number: 21929006; country: EP; kind code: A1)