WO2019008716A1 - Non-visible measurement device and non-visible measurement method - Google Patents

Non-visible measurement device and non-visible measurement method

Info

Publication number
WO2019008716A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
range
information
electromagnetic wave
visible
Prior art date
Application number
PCT/JP2017/024748
Other languages
French (fr)
Japanese (ja)
Inventor
白水 信弘
利樹 石井
Original Assignee
マクセル株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by マクセル株式会社
Priority to PCT/JP2017/024748
Publication of WO2019008716A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02: Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06: Systems determining position data of a target
    • G01S13/46: Indirect determination of position data
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88: Radar or analogous systems specially adapted for specific applications
    • G01S13/93: Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems

Definitions

  • An object of the present invention is to provide a technique capable of detecting, with high accuracy, a non-visible object that is in a blind spot created by a wall or the like.
  • the analysis unit includes a shape recognition unit, a range analysis unit, a signal analysis unit, and a position analysis unit.
  • the shape recognition unit generates three-dimensional shape information from the image information acquired by the image input unit.
  • the range analysis unit calculates, from the shape information generated by the shape recognition unit, the reflection direction in which the electromagnetic wave irradiated by the radar irradiation unit is reflected and spread by the reflection surface, and calculates, from the calculated reflection direction, range information indicating the irradiation range of the electromagnetic wave.
  • Embodiment 1 Hereinafter, the embodiment will be described in detail.
  • FIG. 1 is an explanatory view showing an example of the configuration of the non-visible position measurement apparatus according to the first embodiment.
  • the non-visible measurement device 110 inputs, via the image input unit 103, image information of the obstacle 115 and the reflective surface 114 that hide the target object 111 to be measured in the measurement range and the periphery thereof, and outputs the image information to the analysis unit 104.
  • the radar irradiation unit 101 irradiates, for example, an electromagnetic wave as the irradiation wave 112, reflects the electromagnetic wave on the reflection surface 114, and irradiates the target object 111 to be measured.
  • the electromagnetic wave reflected by the object 111 reaches the radar detection unit 102 as a reflected wave 113 via the reflection surface 114.
  • based on the image information from the image input unit 103, the analysis unit 104 extracts the size and position of a structure that serves as a reflection surface, such as a wall or mirror around the object, irradiates the reflection surface with an electromagnetic wave, and performs position analysis, thereby measuring the object outside the visible range.
  • the analysis unit 104 includes a shape recognition unit 105, a range analysis unit 106, a position analysis unit 107, and a signal analysis unit 108.
  • the shape recognition unit 105 has an image recognition function and creates three-dimensional shape information using the image information from the image input unit 103, that is, the above-described camera image, infrared camera image, stereo image, map image, or TOF image.
  • the range analysis unit 106 calculates range information by computing, based on the shape information output from the shape recognition unit 105, the angle over which the irradiation wave is reflected and spread at the reflection surface, using electromagnetic field analysis, ray tracing, or the like.
  • the range information includes the irradiation angle range θt over which the irradiation wave hits the reflection surface, the arrival range of the irradiation wave spread by the reflection surface, and the arrival angle range θr of the reflected wave estimated to arrive at the radar detection unit 102 from the reflection surface.
  • the collision warning signal may be output by predicting the risk of collision from the movement speed of the object and the movement speed of the outside-of-visible measurement device 110.
  • the radar irradiation unit 101 and the radar detection unit 102 may be independent or separated, or may be an integral module.
  • FIG. 2 is an explanatory view showing an example of a front view when the non-visible measurement device of FIG. 1 is mounted on a car.
  • the shape recognition unit 105 detects the reflection surface 114 or the curve mirror 206 from the image information acquired by the forward monitoring camera, that is, the image input unit 103, and the range analysis unit 106 restricts the analysis range of the radar detection signal to the range arriving from the reflection surface.
  • the head-up display 205 or monitor displays the position of the object 111 such as a pedestrian based on the information output from the position analysis unit 107. As a result, the driver is alerted.
  • in addition to the reflection surface 114, the front right wall in FIG. 2 may be used as the reflection surface 114a, or a curve mirror 206 installed at an intersection or a curve with poor visibility may be used.
  • FIG. 3 is an explanatory view showing an example of detection of an object by the outside-of-visible measurement device of FIG.
  • the outside-of-visible measurement device 110 is installed, for example, at the center of the front of the car.
  • although the example in which the non-visible measurement device 110 is installed at the center of the front of the car is shown here, only the radar irradiation unit 101 and the radar detection unit 102 may instead be installed at the front center of the car.
  • the car is in a state of traveling and approaching an intersection.
  • the object 111 which is a pedestrian, is located on the roadside of the road on the left side of the intersection, which is closer to the car, with respect to the traveling direction of the car, and proceeds from the roadside to the intersection. Further, the object 111 which is a pedestrian is in a state of being hidden by the obstacle 115.
  • when the object 111, which is a pedestrian, is at a position about 1.6 m from the intersection, the propagation path of the irradiation wave 112 emitted from the radar irradiation unit 101 and of the reflected wave 113 reaching the radar detection unit 102 via the reflection surface 114 is about 21.2 m from the non-visible measurement device 110 to the reflection surface 114 and about 5.1 m from the reflection surface 114 to the object 111.
  • the signal intensity of the reflected wave from the pedestrian, that is, the object 111 in the arrangement example shown in FIG. 3 was calculated.
  • FIG. 4 is an example of the level diagram used for calculating the signal strength of the reflected wave of FIG.
  • the solid line is the level diagram calculated for the example of FIG. 3, and the dotted line is the level diagram of the vehicle 100 m ahead within the visible range calculated by the standard specification value.
  • the output 10 dBm of the radar irradiation unit 101 is amplified to 33 dBm by the antenna gain 23 dBi.
  • the attenuation factor of the propagation path is 63.5 dB from the non-visible measurement device 110 to the object 111 which is a pedestrian.
  • the scattering cross section representing the reflectance at the object to be measured 111 was set to standard -15 dBsm.
  • the power of the reflected wave reflected by the object 111 is ⁇ 51.9 dBm.
  • the propagation attenuation of the reflected wave is also 63.5 dB and the attenuation due to reflection is 6.4 dB as in the case of the irradiation wave.
  • the attenuation factor during rainfall is known to be -20 dB / km, and the attenuation factor is 1 dB for 52.5 m of the entire path.
  • the reflected wave reaches the radar detection unit 102 through the 16 dBi gain of the receiving antenna.
  • with the above attenuation and reflection losses, the power of the radio wave reaching the radar detection unit 102 is -117 dBm; with respect to the minimum detection sensitivity of -120 dBm of the radar detection unit 102, the margin is 3 dB, so detection is possible. In the case of a bicycle or an automobile, which has a larger scattering cross section than a pedestrian such as the object 111, the margin is even larger.
  • if the speed of the car is about 30 km/h, the distance d = 15 m from the car to the intersection is a distance at which the car can stop, including the distance covered during the 0.75 seconds it takes the driver to perceive the warning and step on the brake.
  • d = 15 m is also a distance at which the car can stop at a speed of about 50 km/h if a stop signal is transmitted to the ECU controlling the automatic brake function and the brake is applied immediately.
  • the reflecting surface 114 is on the left front side of the center line in the vehicle width direction of the vehicle, that is, in the same direction as the position of the object 111. Therefore, the electromagnetic wave is emitted from the radar irradiation unit 101 to the left front side of the center line in the vehicle width direction of the vehicle as illustrated.
  • the irradiation range of the electromagnetic wave in this case is taken as a first range.
  • the first range is a range having a positive horizontal angle, for example, a horizontal angle of about 90 degrees.
  • with respect to the center of the entire irradiation range of the radar irradiation unit 101, in other words, the center line in the vehicle width direction of the car, the horizontal angle in the left direction is defined as a positive horizontal angle and the horizontal angle in the right direction as a negative horizontal angle.
  • the detection range becomes wide.
  • while the reflecting surface 114 is on the left front side of the center line in the vehicle width direction of the car as described above, in the example shown in FIG. 5 the reflecting surface 114a is a wall on the right front side of that center line, that is, on the opposite-lane side rather than on the car's own lane side.
  • the electromagnetic wave is emitted from the radar irradiation unit 101 on the right front side of the center line in the vehicle width direction of the vehicle.
  • the irradiation range of the electromagnetic wave in this case is taken as a second range.
  • the second range is a range having the above-described negative horizontal angle, for example, a horizontal angle of about -90 °.
  • the object may be detected using paths from a plurality of reflecting surfaces.
  • FIG. 6 is an explanatory view showing an example of detection of an object by the outside-of-visible measurement device of FIG.
  • FIG. 6 shows an example in which the object 111 is measured using electromagnetic wave paths via two reflecting surfaces.
  • one is the path of the electromagnetic wave via the reflecting surface 114 as in FIG. 3; the reflecting surface 114 is on the left front side of the center line in the vehicle width direction of the car.
  • the other is the path of the electromagnetic wave via the reflecting surface 114a as in FIG. 5.
  • the path of the electromagnetic wave via the reflecting surface 114 is indicated by the irradiation wave 112 and the reflected wave 113, and the path via the reflecting surface 114a is indicated by the irradiation wave 112a and the reflected wave 113a.
  • the position of the object 111 is measured a plurality of times from a plurality of paths of the electromagnetic wave by the two reflecting surfaces 114 and 114a. Then, the position of the target object 111 having the highest possibility may be calculated and output from the plurality of measured position information.
  • the measurement accuracy of the position can be further improved by performing measurement a plurality of times.
  • FIG. 7 is an explanatory view showing another example of detection of an object by the outside-of-visible measurement device of FIG.
  • while FIG. 3, FIG. 5, and FIG. 6 described above show examples in which the non-visible measurement device 110 is installed at the center of the front of the car, FIG. 7 shows an example in which two non-visible measurement devices are mounted on the car.
  • two non-visible measurement devices 110 and 110a are mounted, for example, at both front ends of the car.
  • the connection configuration of the outside-of-visible measurement device 110a is the same as that of the outside-of-visible measurement device 110 in FIG.
  • the non-visible measurement devices 110 and 110a are mounted in the vicinity of the headlights of a car.
  • the radar irradiation unit 101 and the radar detection unit 102 may be installed at both front end portions of the vehicle.
  • by mounting the non-visible measurement devices 110 and 110a in the vicinity of the headlights of the car in this manner, measurement can be performed over a wider range.
  • the irradiation wave 112 and the reflected wave 113 can be a path shown in FIG. 7 to allow the electromagnetic wave to reach the object 111 at the intersection.
  • when the non-visible measurement device 110a mounted on the left side is used, a wall surface on the right front side of the center line in the vehicle width direction of the car serves as the reflection surface 114a, as in FIG. 5, and the irradiation wave 112a and the reflected wave 113a follow the path shown in FIG. 7.
  • since the non-visible measurement device 110a is installed offset to the left with respect to the center of the car, the paths of the irradiation wave 112a and the reflected wave 113a differ from those in FIG. 5, and the electromagnetic wave reaches a position farther beyond the intersection. As a result, an object 111a farther from the intersection can be detected.
  • two radar irradiation units 101 and two radar detection units 102 may be provided, and the remaining analysis units 104 may be common.
  • the image input unit 103 outputs the acquired image information (step S101).
  • the shape recognition unit 105 recognizes the shape and position of the reflective surface from the image information output from the image input unit 103, and outputs the recognition result as shape information (step S102).
  • the range analysis unit 106 calculates the irradiation range and the detection range of the electromagnetic wave based on the shape information output from the shape recognition unit 105, and outputs the calculation result as the range information (step S103).
  • the position analysis unit 107 integrates the shape information and the reflection position and outputs the position information of the object (step S107). The position analysis unit 107 determines whether or not the measurement has been completed (step S108). If the measurement has not been completed, the process returns to the process of step S101 to perform measurement again.
  • in the process of step S106, if signal analysis were performed over the entire field of view, the analysis time would become long and the analysis might not be completed in time to warn of the danger of collision. Therefore, in the process of step S105, the range analysis unit 106 limits the analysis range to, for example, only the part having the reflection surface, which shortens the measurement time. Analysis can thereby be performed within a practical time.
  • the object 111 hidden by the obstacle 115 can be easily detected.
  • since the reflection position on the object can be identified with high accuracy in a shorter time, the position of the non-visible object 111 can be detected at high speed and with a low calculation load.
  • the first embodiment shows an example of using the non-visible measurement device 110 in a car to support safe driving, for example to prevent collisions at intersections with poor visibility.
  • the application of the non-visible measurement device 110 is not limited to driving safety support.
  • other applications include, for example, monitoring the surroundings of vehicles and robots, safety support for pedestrians and bicycles, perimeter monitoring sensors for virtual reality (VR), augmented reality (AR), and mixed reality (MR), monitoring devices mounted on drones, and shape recognition.
  • the image information from the image input unit 103, the shape information calculated by the shape recognition unit 105, and the range information calculated by the range analysis unit 106 may be accumulated and held as analysis information and reused in the next measurement.
  • the analysis information is accumulated, for example, in a storage unit (not shown) of the non-visible measurement device 110 or the like.
  • by reading out previously acquired analysis information from the storage unit described above and performing the range analysis and signal analysis with it, the processing time of the signal analysis unit 108 can be shortened and the calculation load can be reduced.
  • the danger position information can be stored and shared with other non-visible measurement devices to warn of the danger without analysis. Thereby, the safety can be improved.
  • it is thus possible to provide a non-visible measurement device that detects the position of an object outside the field of view at high speed and with a low calculation load, using a low-cost radar.
  • FIG. 9 is an explanatory drawing showing an example of the configuration of the outside-of-visible measurement device according to the second embodiment.
  • the map input unit 109 acquires map information from an internal storage device or recording medium (not shown), an external server connected by communication, or the like, based on current position information acquired from a satellite positioning system (GPS) or the like, and outputs the map information to the shape recognition unit 105 of the analysis unit 104.
  • the shape recognition unit 105 can estimate the existing position of the reflective surface by using the map information acquired by the map input unit 109.
  • the positional accuracy of the reflective surface is improved, and the analysis time of the image information can be shortened.
  • the map information acquired by the map input unit 109 may be information including a three-dimensional structure of a building, in which case the calculation time of shape recognition can be further shortened.
  • FIG. 10 is an explanatory drawing showing an example of the configuration of the outside-of-visible measurement apparatus according to the third embodiment.
  • the non-visible measurement device 110 shown in FIG. 10 is different from the non-visible measurement device 110 of FIG. 1 of the first embodiment in the irradiation technique by the radar irradiation unit 101.
  • the connection configuration is the same as the outside-of-visible measurement device 110 in FIG.
  • the radar irradiation unit 101 irradiates a diffracted wave as the irradiation wave 601.
  • the radar detection unit 102 also detects a diffracted wave as the reflected wave 602.
  • the irradiation wave 601, which is a diffracted wave irradiated from the radar irradiation unit 101, is diffracted by the wall surface of the obstacle 115 and the like, turns around, and reaches the object 111.
  • the reflected wave 602 of the diffracted wave reflected by the object of measurement 111 is diffracted again by the wall surface of the obstacle 115 or the like to wrap around and reach the radar detection unit 102. At this time, the paths of the irradiation wave and the reflected wave may not be straight.
  • the radar irradiation unit 101 can limit the irradiation range of the electromagnetic wave based on the irradiation range information output from the range analysis unit 106. By limiting the irradiation range of the radar irradiation unit 101, the measurement time can be shortened and the amount of analysis calculation can be reduced.
  • the signal analysis unit 108 determines whether or not the irradiation range of the electromagnetic wave covers the range information (step S206). When the range is covered, signal analysis is performed (step S207). When it is not covered, the process returns to step S204, where the irradiation range is changed and the electromagnetic wave is irradiated again.
  • the processing time can be shortened, and low-cost non-visible measurement can be performed at high speed.
  • FIG. 13 is a flowchart showing an example of measurement processing in the non-visible measurement device according to the fifth embodiment.
  • FIG. 13 shows another example of the process shown in FIG. 12 of the fourth embodiment. Further, the connection configuration of the outside-of-visible measurement device 110 is the same as the outside-of-visible measurement device 110 of FIG.
  • FIG. 13 includes step S304 (electromagnetic wave irradiation), step S305 (electromagnetic wave detection), and step S306 (range end determination).
  • steps S301 to S303 are respectively the first to third steps, step S304 is the fourth step, step S305 is the fifth step, and step S306 is the ninth step.
  • steps S301 to S303 are the same as step S201 (image acquisition), step S202 (shape recognition), and step S203 (range analysis) in FIG. 12.
  • in the fifth embodiment, the electromagnetic wave irradiation in step S304, the electromagnetic wave detection in step S305, and the range end determination in step S306 are performed simultaneously.
  • the range serving as the reference of the determination in the process of step S306 is the range set in advance by the process of step S303 in the previous measurement.
  • FIG. 14 is an explanatory drawing showing an example of the configuration of the outside-of-visible measurement device according to the sixth embodiment.
  • the speed input unit 141 acquires the moving speed of the mounting device on which the outside-of-visible measurement device 110 is mounted. For example, when the non-visible measurement device 110 is mounted on a car, the car becomes a mounting device.
  • the mounting device will be described as a car.
  • the speed input unit 141 acquires the moving speed from a speed measuring device or a satellite positioning system provided in the car.
  • the speed input unit 141 may be configured to include a speed measuring device.
  • the analysis unit 104 sets a measurement range, an analysis range, an irradiation range of the radar, a detection range, and the like based on the moving speed acquired from the speed input unit 141.
  • the position analysis unit 107 calculates the distance from the car to the object from the moving speed of the car and the moving direction of the object and the moving speed, and outputs a collision warning signal when the distance is less than a certain value.
  • the distance from the car to the object 111 is the distance from the car to the intersection of the moving direction of the car and the moving direction of the object 111.
  • in FIG. 15, step S401 (speed acquisition) is added, and the processing of steps S402 to S410 after step S401 is the same as the processing of steps S201 to S209 of FIG. 12.
  • the speed input unit 141 acquires the moving speed from a speed measuring device of a car that is a mounted device, a satellite positioning system, or the like.
  • the acquired moving speed is used in the range analysis of step S404, the range end determination of step S407, the signal analysis range of step S408, and the position analysis of step S409 to set a narrow area during high-speed traveling and a wide nearby area during low-speed traveling.
  • the moving direction and moving speed of the object 111 are calculated from the difference between the currently measured reflection position and the reflection position measured the previous time; in the position analysis of step S409, the distance from the car to the object 111 at the intersection of the moving direction of the car and the moving direction of the object is calculated from the moving speed and direction of the car and of the object 111, and a collision warning signal is output when the distance is less than a certain value.
  • a collision warning signal between the mounting apparatus and the object 111 can be output with high accuracy.
  • FIG. 16 is an explanatory drawing showing an example of the configuration of the image input unit 103 included in the outside-of-visible measurement device 110 according to the seventh embodiment.
  • the non-visible measurement device 110 is assumed to be the same as that shown in FIG. 1 of the first embodiment.
  • the image input unit 103 includes a light irradiation unit 161, a light detection unit 162, and a control calculation unit 163, as shown in FIG.
  • the light irradiator 161 irradiates, for example, light of a pulse waveform to the object and the periphery thereof in accordance with the light control signal of the control calculator 163.
  • the light is reflected by an obstacle or a wall or a structure serving as the reflective surface 114, and the reflected light is incident on the light detection unit 162.
  • the light detection unit 162 detects the light of the reflected pulse waveform.
  • the control calculation unit 163 calculates the distance L from the image input unit 103 to the reflecting surface 114 by the TOF (Time Of Flight) method, using the time difference ΔT between the irradiated pulse and the light detection signal of the reflected pulse and the speed of light (a worked sketch of this calculation is given at the end of this list).
  • the light to be irradiated may be a continuous wave, and in this case, the control calculation unit 163 may calculate the distance L using a phase difference or a frequency difference.
  • the control calculation unit 163 scans the irradiation direction so as to cover the object and the range around it, and maps the value of the distance L to a two-dimensional image for each irradiation direction, thereby generating image information including distance information.
  • the analysis unit 104 can reduce the calculation load of shape recognition and range analysis by using the image information including the distance information, and as a result, it is possible to realize speeding up of the non-visible measurement.
  • it goes without saying that the present invention is not limited to the above-described embodiments and can be variously modified without departing from the gist of the invention.
  • the present invention is not limited to the above-described embodiment, but includes various modifications.
  • the above-described embodiments are described in detail to explain the present invention in an easy-to-understand manner, and are not necessarily limited to those having all the described configurations.
  • part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
  • the image input unit is an imaging device for capturing a video, and outputs the captured image data as image information.
  • the non-visible measurement device according to (1) includes a plurality of radar irradiation units and a plurality of radar detection units.
  • the analysis unit has a shape recognition unit that generates three-dimensional shape information from the image information acquired by the image input unit, and a range analysis unit that calculates, from the shape information generated by the shape recognition unit, range information indicating the irradiation range of the electromagnetic wave emitted by the radar irradiation unit.
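For the TOF-based image input unit of the seventh embodiment referenced above, the following is a minimal sketch of the distance calculation, together with the phase-difference variant mentioned for continuous-wave light. The function names and sample values are illustrative assumptions, not taken from the patent.

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def tof_distance_m(delta_t_s):
    """Pulse TOF: the light travels out and back, so L = c * dT / 2."""
    return C * delta_t_s / 2.0

def cw_phase_distance_m(phase_rad, modulation_freq_hz):
    """Continuous-wave variant: unambiguous range c / (2 * f_mod),
    with L = c * phase / (4 * pi * f_mod)."""
    return C * phase_rad / (4.0 * math.pi * modulation_freq_hz)

# Example: a reflecting surface about 21 m away returns the pulse after ~140 ns.
print(f"{tof_distance_m(140e-9):.1f} m")
# Example: 10 MHz modulation; a quarter-cycle phase lag corresponds to ~3.75 m.
print(f"{cw_phase_distance_m(math.pi / 2, 10e6):.2f} m")
```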

Abstract

The present invention detects, with high accuracy, a non-visible object that is in a blind spot due to a wall or similar. A non-visible measurement device 110 has a radar emission unit 101, a radar detection unit 102, an image input unit 103, and an analysis unit 104. The radar emission unit 101 emits electromagnetic waves to an object 111. The radar detection unit 102 detects electromagnetic waves that were reflected by the object 111, and outputs a detection signal for the detected electromagnetic waves. The image input unit 103 acquires topographical image information. The analysis unit 104 analyzes the detection signal that was outputted by the radar detection unit 102 and the image information that was acquired by the image input unit 103, and calculates location information for the object 111. The analysis unit 104 calculates the location information for a non-visible object 111 that is not included in the image information by extracting, on the basis of the image information that was acquired by the image input unit 103, the location of a reflection surface that reflects electromagnetic waves, emitting electromagnetic waves to the extracted reflection surface, and performing a location analysis.

Description

Non-visible measurement device and non-visible measurement method
The present invention relates to a non-visible measurement device and a non-visible measurement method, and more particularly to a technique effective for measuring an object located in a non-visible region.
In recent years, in order to reduce automobile accidents and improve safety, and with a view to realizing automated driving, technologies have been developed for measuring objects outside the visible range, such as vehicles and pedestrians entering an intersection with poor visibility or pedestrians in blind spots.
As a technique for this kind of non-visible measurement, there is known a technique that outputs a radar wave to detect reflecting objects, extracts a target vehicle to be monitored from the detected objects, and extracts a blind-spot object present at a position that is made a blind spot by the extracted target vehicle (see, for example, Patent Document 1).
Patent Document 2 describes that vehicles each irradiate the road surface with laser light indicating the presence of the own vehicle using a laser light projector and recognize the laser light reflected on the road surface with an infrared camera; when the recognized laser light is determined by laser light determination means to be laser light emitted by a vehicle other than the own vehicle, the moving direction and moving speed of that other vehicle are detected based on at least one of the movement trajectory, shape, and lighting state of the recognized laser light.
Patent Document 1: JP 2006-349456 A
Patent Document 2: JP 2004-102889 A
However, the technique of Patent Document 1 described above extracts a blind-spot object by propagating a radar wave under the floor or through the windows of a vehicle so that it reaches an object in the blind spot created by that vehicle. It is therefore effective for grasping the situation in blind spots caused by right-turning, stopped, or preceding vehicles, but it cannot detect an object that is in a blind spot created by a wall or building near an intersection.
In the technique of Patent Document 2, it is a precondition that vehicles other than the own vehicle are equipped with the laser light projection means; when they are not, a vehicle in a blind spot cannot be detected.
An object of the present invention is to provide a technique capable of detecting, with high accuracy, a non-visible object that is in a blind spot created by a wall or the like.
The above and other objects and novel features of the present invention will become apparent from the description of this specification and the accompanying drawings.
An outline of representative ones of the inventions disclosed in the present application is briefly described as follows.
A representative non-visible measurement device includes a radar irradiation unit, a radar detection unit, an image input unit, and an analysis unit. The radar irradiation unit irradiates an object with an electromagnetic wave. The radar detection unit detects the electromagnetic wave reflected by the object and outputs a detection signal of the detected electromagnetic wave. The image input unit acquires image information of the terrain. The analysis unit analyzes the detection signal output by the radar detection unit and the image information acquired by the image input unit to calculate position information of the object.
The analysis unit extracts, based on the image information acquired by the image input unit, the position of a reflection surface that reflects the electromagnetic wave, irradiates the extracted reflection surface with the electromagnetic wave, and performs position analysis, thereby calculating the position information of a non-visible object that is not included in the image information.
In particular, the analysis unit includes a shape recognition unit, a range analysis unit, a signal analysis unit, and a position analysis unit. The shape recognition unit generates three-dimensional shape information from the image information acquired by the image input unit. The range analysis unit calculates, from the shape information generated by the shape recognition unit, the reflection direction in which the electromagnetic wave irradiated by the radar irradiation unit is reflected and spread by the reflection surface, and calculates, from the calculated reflection direction, range information indicating the irradiation range of the electromagnetic wave.
The signal analysis unit calculates the reflection position of the object based on the detection signal output by the radar detection unit and the range information calculated by the range analysis unit. The position analysis unit calculates the position information of the object based on the shape information generated by the shape recognition unit and the reflection position calculated by the signal analysis unit.
The effect obtained by a representative one of the inventions disclosed in the present application is briefly described as follows.
A highly reliable non-visible measurement device can be provided.
FIG. 1 is an explanatory diagram showing an example of the configuration of the non-visible position measurement device according to the first embodiment.
FIG. 2 is an explanatory diagram showing an example of the forward view when the non-visible measurement device of FIG. 1 is mounted on a car.
FIG. 3 is an explanatory diagram showing an example of detection of an object by the non-visible measurement device of FIG. 1.
FIG. 4 is an example of the level diagram used for calculating the signal strength of the reflected wave of FIG. 3.
FIG. 5 is an explanatory diagram showing another example of detection of an object by the non-visible measurement device of FIG. 3.
FIG. 6 is an explanatory diagram showing an example of detection of an object by the non-visible measurement device of FIG. 3.
FIG. 7 is an explanatory diagram showing another example of detection of an object by the non-visible measurement device of FIG. 1.
FIG. 8 is a flowchart showing an example of measurement processing by the non-visible measurement device of FIG. 1.
FIG. 9 is an explanatory diagram showing an example of the configuration of the non-visible measurement device according to the second embodiment.
FIG. 10 is an explanatory diagram showing an example of the configuration of the non-visible measurement device according to the third embodiment.
FIG. 11 is an explanatory diagram showing an example of the configuration of the non-visible measurement device according to the fourth embodiment.
FIG. 12 is a flowchart showing an example of non-visible measurement processing by the non-visible measurement device of FIG. 11.
FIG. 13 is a flowchart showing an example of measurement processing in the non-visible measurement device according to the fifth embodiment.
FIG. 14 is an explanatory diagram showing an example of the configuration of the non-visible measurement device according to the sixth embodiment.
FIG. 15 is a flowchart showing an example of measurement processing using the non-visible measurement device of FIG. 14.
FIG. 16 is an explanatory diagram showing an example of the configuration of the image input unit of the non-visible measurement device according to the seventh embodiment.
Throughout the drawings for describing the embodiments, the same members are in principle denoted by the same reference numerals, and repeated description thereof is omitted.
(Embodiment 1)
Hereinafter, the embodiment will be described in detail.
<Configuration Example of Non-Visible Measurement Device>
FIG. 1 is an explanatory diagram showing an example of the configuration of the non-visible position measurement device according to the first embodiment.
The non-visible measurement device 110 receives, via the image input unit 103, image information of the obstacle 115 and the reflection surface 114 that hide the object 111 to be measured within the measurement range, and of their surroundings, and outputs the image information to the analysis unit 104.
The non-visible measurement device 110 includes a radar irradiation unit 101, a radar detection unit 102, an image input unit 103, and an analysis unit 104. The image input unit 103 is, for example, a digital camera, in which case the above-described image information is, for example, image data acquired by the digital camera. Alternatively, an infrared camera image, a map image, or a TOF (Time Of Flight) image may be used.
The radar irradiation unit 101 irradiates, for example, an electromagnetic wave as an irradiation wave 112, which is reflected by the reflection surface 114 and irradiates the object 111 to be measured. The electromagnetic wave reflected by the object 111 reaches the radar detection unit 102 as a reflected wave 113 via the reflection surface 114.
The radar detection unit 102 outputs a detection signal representing the intensity of the arriving reflected wave and the phase difference or time difference between the irradiation wave and the reflected wave. The radar irradiation unit 101 irradiates while sweeping over an arbitrary angle range.
The radar detection unit 102 performs detection over an arbitrary angle range for each irradiation angle of the radar irradiation unit 101. This angle range may be changed according to the object to be measured and the surrounding conditions based on the image information. The radar detection unit 102 may also calculate and output the arrival direction of the detection signal from the intensity distribution of the detection signal over the irradiation angles.
Based on the image information from the image input unit 103, the analysis unit 104 extracts the size and position of a structure that serves as a reflection surface, such as a wall or a mirror around the object, irradiates the reflection surface with the electromagnetic wave, and performs position analysis, thereby measuring the object outside the visible range.
Based on the image information from the image input unit 103, the analysis unit 104 extracts the size and position of a structure serving as a reflection surface, such as a wall or a mirror around the object, and outputs it as three-dimensional shape information.
The analysis unit 104 includes a shape recognition unit 105, a range analysis unit 106, a position analysis unit 107, and a signal analysis unit 108. The shape recognition unit 105 has an image recognition function and creates three-dimensional shape information using the image information from the image input unit 103, that is, the above-described camera image, infrared camera image, stereo image, map image, or TOF image.
The range analysis unit 106 calculates range information by computing, based on the shape information output from the shape recognition unit 105, the angle over which the irradiation wave is reflected and spread at the reflection surface, using electromagnetic field analysis, ray tracing, or the like. The range information includes the irradiation angle range θt over which the irradiation wave hits the reflection surface, the arrival range of the irradiation wave spread by the reflection surface, and the arrival angle range θr of the reflected wave estimated to arrive at the radar detection unit 102 from the reflection surface.
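As an illustration of the range analysis described above, the following is a minimal 2-D sketch using an image-source (ray-tracing) construction, assuming a single flat wall segment and a co-located transmitter and receiver; the coordinates, angle convention, and helper names are illustrative assumptions rather than the patent's implementation.

```python
import math

def mirror_point(p, a, b):
    """Mirror point p across the infinite line through wall endpoints a and b."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    fx, fy = ax + t * dx, ay + t * dy          # foot of the perpendicular
    return (2 * fx - px, 2 * fy - py)

def bearing(src, dst):
    """Horizontal angle from src to dst, in degrees (measured from the +x axis)."""
    return math.degrees(math.atan2(dst[1] - src[1], dst[0] - src[0]))

# Illustrative geometry (metres): radar at the origin, wall segment ahead-left.
radar = (0.0, 0.0)
wall_a, wall_b = (-6.0, 21.0), (2.0, 21.0)     # endpoints of the reflecting wall

# Irradiation angle range theta_t: the angles under which the wall is seen from
# the radar; only waves emitted inside this range hit the reflection surface.
theta_t = sorted(bearing(radar, w) for w in (wall_a, wall_b))

# Image-source construction: reflected rays behave as if emitted from the mirror
# image of the radar and pass through the wall segment, so the wedge bounded by
# the rays image -> wall endpoints is the region reachable behind the obstacle.
image = mirror_point(radar, wall_a, wall_b)
reach_wedge = sorted(bearing(image, w) for w in (wall_a, wall_b))

# For a co-located transmitter/receiver, the reflected wave returns over the
# same wall segment, so the expected arrival angle range theta_r equals theta_t.
theta_r = theta_t

print("theta_t [deg]:", [round(a, 1) for a in theta_t])
print("reach wedge from image source [deg]:", [round(a, 1) for a in reach_wedge])
print("theta_r [deg]:", [round(a, 1) for a in theta_r])
```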
Based on the range information output by the range analysis unit 106, the signal analysis unit 108 calculates, from the magnitude of the reflected wave in the detection signal from the radar detection unit 102 and the phase difference or time difference between the irradiation wave and the reflected wave, the reflection position contained within the range information, and thereby identifies the position at which the electromagnetic wave was reflected by the object.
In this way, by limiting the range over which the reflection position is analyzed to within the range information, calculation for unnecessary regions is reduced and irrelevant reflection positions are not computed, so the reflection position on the object can be identified with high accuracy in a shorter time.
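The following is a minimal sketch of how the detection signal could be filtered against the range information as just described, assuming each detection carries an arrival angle, a round-trip time, and a received power; the data structure and thresholds are illustrative assumptions.

```python
from dataclasses import dataclass

C = 299_792_458.0  # speed of light [m/s]

@dataclass
class Detection:
    arrival_angle_deg: float   # direction the reflected wave comes from
    round_trip_time_s: float   # delay between irradiation and detection
    intensity_dbm: float       # received power

def candidate_reflections(detections, theta_r_deg, wall_distance_m,
                          min_power_dbm=-120.0):
    """Keep only detections consistent with the range information: arrival angle
    inside theta_r, path longer than the radar-to-wall leg, and power above the
    minimum detection sensitivity."""
    lo, hi = theta_r_deg
    out = []
    for d in detections:
        path_m = C * d.round_trip_time_s / 2.0   # one-way path length
        if (lo <= d.arrival_angle_deg <= hi
                and path_m > wall_distance_m
                and d.intensity_dbm >= min_power_dbm):
            # distance from the reflection surface to the reflection point
            out.append((d.arrival_angle_deg, path_m - wall_distance_m))
    return out

# Example: one plausible echo and one clutter return outside theta_r.
dets = [Detection(95.0, 26.3 * 2 / C, -117.0),
        Detection(40.0, 12.0 * 2 / C, -90.0)]
print(candidate_reflections(dets, theta_r_deg=(84.6, 106.0), wall_distance_m=21.2))
```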
The position analysis unit 107 associates the reflection position output from the signal analysis unit 108 with the shape information and outputs the combined position information of the object. The output destination of the position information is, for example, the head-up display (HUD) 205 shown in FIG. 2, a car navigation system having a monitor, or the ECU (Electronic Control Unit) that controls the automatic brake function of a collision damage mitigation brake.
The position analysis unit 107 may also compare the position with the previously measured position information of the object as necessary, determine whether the object is stationary or moving, and output the result. Alternatively, it may determine and output an attribute of the object, for example pedestrian, bicycle, or automobile, from the reflection intensity of the object.
Furthermore, a collision warning signal may be output by predicting the risk of collision from the moving speed of the object and the moving speed of the non-visible measurement device 110. The radar irradiation unit 101 and the radar detection unit 102 may be separate, independent components or may be configured as an integrated module.
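The following is a minimal sketch of how such a collision-risk prediction could be formed from the two moving speeds, assuming straight-line motion and illustrative thresholds; it is not the patent's specific criterion.

```python
def collision_warning(dist_to_crossing_m, vehicle_speed_mps,
                      object_dist_to_crossing_m, object_speed_mps,
                      reaction_time_s=0.75, time_margin_s=1.5):
    """Warn if the vehicle and the object are predicted to reach the crossing
    point of their straight-line paths at nearly the same time, and the vehicle
    is too close to stop comfortably."""
    if vehicle_speed_mps <= 0 or object_speed_mps <= 0:
        return False
    t_vehicle = dist_to_crossing_m / vehicle_speed_mps
    t_object = object_dist_to_crossing_m / object_speed_mps
    paths_conflict = abs(t_vehicle - t_object) < time_margin_s
    too_close = t_vehicle < reaction_time_s + time_margin_s
    return paths_conflict and too_close

# Example with the figures used in the first embodiment: car 15 m from the
# intersection at 30 km/h, pedestrian 1.6 m away walking at 5 km/h.
print(collision_warning(15.0, 30 / 3.6, 1.6, 5 / 3.6))
```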
The modulation scheme of the radar is not limited and may be any of an FC pulse scheme, an FMCW (Frequency Modulated Continuous Wave) scheme known as a frequency modulation scheme, a two-frequency CW (Continuous Wave) scheme, or the like. Using the FC pulse scheme improves the range resolution, whereas a frequency modulation scheme improves the SN ratio and can detect objects at longer distances or with smaller scattering cross sections.
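For reference, the following is a minimal sketch of the textbook FMCW range relation mentioned above; the chirp parameters are illustrative and this is not a description of the patent's signal processing.

```python
C = 299_792_458.0  # speed of light [m/s]

def fmcw_range(beat_frequency_hz, sweep_bandwidth_hz, sweep_time_s):
    """Textbook FMCW relation: the beat frequency between the transmitted and
    received chirps is proportional to the round-trip delay,
    f_b = (2 * R / c) * (B / T), so R = f_b * c * T / (2 * B)."""
    return beat_frequency_hz * C * sweep_time_s / (2.0 * sweep_bandwidth_hz)

# Illustrative chirp: 4 GHz sweep in 1 ms; a 26.3 m path gives f_b of about 0.7 MHz.
B, T = 4e9, 1e-3
f_b = (2 * 26.3 / C) * (B / T)
print(f"beat frequency: {f_b / 1e6:.2f} MHz -> range: {fmcw_range(f_b, B, T):.1f} m")
```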
<Detection Example 1 of an Object>
FIG. 2 is an explanatory diagram showing an example of the forward view when the non-visible measurement device of FIG. 1 is mounted on a car.
FIG. 2 shows an example in which the non-visible measurement device 110 detects, as the object 111 to be measured, a pedestrian who is blocked by the wall of the obstacle 115 and is therefore in a blind spot, that is, outside the visible range. The object is not limited to a pedestrian and may be a bicycle, an automobile, or the like.
The shape recognition unit 105 detects the reflection surface 114 or the curve mirror 206 from the image information acquired by the forward monitoring camera, that is, the image input unit 103, and the range analysis unit 106 restricts the analysis range of the radar detection signal to the range arriving from the reflection surface.
The radar detection unit 102 detects the electromagnetic wave reflected by the object 111, and the signal analysis unit 108 estimates the reflection position of the reflected wave.
The position analysis unit 107 then integrates the estimated reflection position with the information on the reflection surface, determines whether the reflection position corresponds to an object such as a pedestrian or bicycle or to an object such as a wall, and outputs the result to, for example, a monitor or head-up display (HUD) 205 mounted on the car. The monitor or head-up display (HUD) 205 serves as a display system.
The head-up display 205 or the monitor displays the position of the object 111, such as a pedestrian, based on the information output from the position analysis unit 107, thereby alerting the driver.
Alternatively, the possibility of a collision at the intersection may be predicted from the speed of the object 111 and the speed of the own vehicle, and the danger of collision may be displayed on the head-up display 205 or the like as a warning, a warning sound may be generated, or the brakes may be applied by the ECU that controls the above-described automatic brake function to stop the car.
As structures usable as reflection surfaces, in addition to the reflection surface 114, the front right wall in FIG. 2 may be used as the reflection surface 114a, or a curve mirror 206 installed at an intersection or a curve with poor visibility may be used.
FIG. 3 is an explanatory diagram showing an example of detection of an object by the non-visible measurement device of FIG. 1.
FIG. 3 shows a top view of the car entering the intersection in FIG. 2 and the object 111 to be measured, such as a pedestrian.
The non-visible measurement device 110 is installed, for example, at the center of the front of the car. Although the example in which the non-visible measurement device 110 is installed at the center of the front of the car is shown here, only the radar irradiation unit 101 and the radar detection unit 102 may instead be installed at the front center of the car.
In FIG. 3, the car is traveling and approaching the intersection. The object 111, which is a pedestrian, is located on the roadside strip, closer to the car, of the road branching to the left at the intersection with respect to the traveling direction of the car, and is walking from the roadside strip toward the intersection. The object 111 is hidden by the obstacle 115.
As an example of non-visible measurement, the width w of the road on which the car is traveling is about 6 m, and the distance d from the car to the intersection is about 15 m. The car is, for example, about 2 m from the road shoulder, and the object 111, which is a pedestrian, is about 1 m from the edge of the road.
When the object 111, which is a pedestrian, is at a position about 1.6 m from the intersection, the propagation path of the irradiation wave 112 emitted from the radar irradiation unit 101 and of the reflected wave 113 reaching the radar detection unit 102 via the reflection surface 114 is about 21.2 m from the non-visible measurement device 110 to the reflection surface 114 and about 5.1 m from the reflection surface 114 to the object 111.
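The following is a minimal sketch that reproduces these path lengths with an image-source construction. The coordinate frame, and the assumption that the reflecting wall runs along the far side of the left road, are choices made here so that the stated dimensions come out consistent; they are not taken from the patent drawings.

```python
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Assumed layout (metres): radar at the origin, car heading +y, road width 6 m,
# car 2 m from the left road edge, intersection 15 m ahead, the left cross road
# also 6 m wide, and the reflecting wall 114 along its far side (y = 21).
radar = (0.0, 0.0)
wall_y = 15.0 + 6.0
pedestrian = (-(2.0 + 1.6), 15.0 + 1.0)   # 1.6 m from the intersection, 1 m into the left road

# Image-source construction: mirror the pedestrian across the wall line; the
# reflected path radar -> wall -> pedestrian is then the straight segment from
# the radar to the mirrored pedestrian, split at the wall.
mirrored = (pedestrian[0], 2 * wall_y - pedestrian[1])
total = dist(radar, mirrored)
t = (wall_y - radar[1]) / (mirrored[1] - radar[1])        # fraction of the segment at the wall
reflection_point = (radar[0] + t * (mirrored[0] - radar[0]), wall_y)

leg1 = dist(radar, reflection_point)       # device to reflection surface
leg2 = total - leg1                        # reflection surface to pedestrian
# Prints about 21.2 m and 5.0 m, consistent with the approximate 21.2 m / 5.1 m in the text.
print(f"device -> wall: {leg1:.1f} m, wall -> pedestrian: {leg2:.1f} m")
```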
 この図3に示した配置例での歩行者、すなわち対象物111からの反射波の信号強度を算出した。 The signal intensity of the reflected wave from the pedestrian, that is, the object 111 in the arrangement example shown in FIG. 3 was calculated.
 図4は、図3の反射波の信号強度の算出に用いたレベルダイヤの一例である。 FIG. 4 is an example of the level diagram used for calculating the signal strength of the reflected wave of FIG.
 レーダ装置の標準的な仕様値としては、例えばRecommendation ITU-R M.2057-0,“ Systems characteristics of automotive radars operating in the frequency band 76-81 GHz for intelligent transport systems applications",International Telecommunication Union Radiocommunications Sector,2014 Feb.のTable 1に記載されたRadar Bを参照し、周波数は77GHz-81GHz、検出可能距離100m、分解能7.5cm、変調方式FMCW、周波数帯域4GHzとした。 As standard specification values of the radar apparatus, for example, Recommendation ITU-R M. 2057-0, “System characteristics of automotive radars operating in the frequency band 76-81 GHz for intelligent transport systems applications”, International Telecommunication Union Radiocommunications Sector, Referring to Radar B described in Table 1 of 2014 Feb., the frequency was 77 GHz to 81 GHz, the detectable distance was 100 m, the resolution was 7.5 cm, the modulation method FMCW, and the frequency band 4 GHz.
 図4において、実線は図3の例について算出したレベルダイヤであり、点線は標準的な仕様値により算出した可視内の100m前方の車両のレベルダイヤである。 In FIG. 4, the solid line is the level diagram calculated for the example of FIG. 3, and the dotted line is the level diagram of the vehicle 100 m ahead within the visible range calculated by the standard specification value.
 The 10 dBm output of the radar irradiation unit 101 is boosted to 33 dBm by the 23 dBi antenna gain. The propagation loss from the non-visible measurement device 110 to the pedestrian, object 111, is 63.5 dB.
 The reflectance of the reflecting surface 114 is taken as 23%, since the relative permittivity of concrete is about 8, giving a reflection loss of 6.4 dB.
 The scattering cross section representing the reflectivity of the object 111 to be measured is set to a standard -15 dBsm. The power of the wave reflected by the object 111 is then -51.9 dBm. The reflected wave is assumed to undergo the same 63.5 dB propagation loss and 6.4 dB reflection loss as the irradiation wave.
 In addition, the attenuation in rain is known to be about -20 dB/km, so an attenuation of 1 dB is assumed for the 52.5 m total path. The reflected wave reaches the radar detection unit 102 through the 16 dBi gain of the receiving antenna.
 After these losses and reflections, the power of the radio wave reaching the radar detection unit 102 is -117 dBm; against the minimum detection sensitivity of -120 dBm of the radar detection unit 102, this leaves a margin of 3 dB, so the object is detectable. If the object is a bicycle or an automobile, which has a larger scattering cross section than a pedestrian, the margin is larger still.
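 As a rough illustration of the level-diagram bookkeeping above, the following Python sketch sums the stated gains and losses in decibels and compares the result with the receiver's minimum detection sensitivity. The numeric values are taken from the description where they are stated; the handling of the scattering cross section is simplified and the description's -117 dBm and 3 dB figures evidently include terms this simple sum does not, so the sketch illustrates the bookkeeping rather than reproducing those exact numbers.

    # Illustrative link-budget sketch (values from the description above; the
    # scattering-cross-section treatment is simplified, so this does not claim
    # to reproduce the exact -117 dBm figure in the text).
    def detection_margin_db(tx_power_dbm, tx_gain_dbi, rx_gain_dbi,
                            path_loss_db, wall_loss_db, rcs_dbsm,
                            rain_loss_db, sensitivity_dbm):
        # Outbound leg: transmitter -> wall -> object
        at_object = tx_power_dbm + tx_gain_dbi - path_loss_db - wall_loss_db + rcs_dbsm
        # Return leg: object -> wall -> receiver, plus rain loss over the whole path
        received = at_object - path_loss_db - wall_loss_db - rain_loss_db + rx_gain_dbi
        return received - sensitivity_dbm

    margin = detection_margin_db(tx_power_dbm=10, tx_gain_dbi=23, rx_gain_dbi=16,
                                 path_loss_db=63.5, wall_loss_db=6.4, rcs_dbsm=-15,
                                 rain_loss_db=1.0, sensitivity_dbm=-120)
    print(f"margin = {margin:.1f} dB")  # a positive margin means the target is detectable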
 If the speed of the automobile is about 30 km/h, the distance d = 15 m from the automobile to the intersection is sufficient to stop, including the free-running distance covered during the 0.75 s it takes the driver to perceive the warning and step on the brake.
 Furthermore, d = 15 m is also a stoppable distance at about 50 km/h if the brake is applied immediately after the non-visible measurement device 110 sends a stop signal, as an alert, to the ECU that controls the automatic braking function described above.
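 The stopping distances quoted above can be checked with the usual reaction-plus-braking model. The 0.75 s reaction time and the speeds come from the text, but the deceleration values in the sketch below are assumptions for illustration only; they are not stated in the description.

    # Stopping distance = reaction (free-running) distance + braking distance.
    # The decelerations of 6.0 and 6.4 m/s^2 are assumed values for illustration.
    def stopping_distance_m(speed_kmh, reaction_s, decel_mps2):
        v = speed_kmh / 3.6                      # convert km/h to m/s
        return v * reaction_s + v * v / (2.0 * decel_mps2)

    print(stopping_distance_m(30, 0.75, 6.0))    # driver warning case: roughly 12 m
    print(stopping_distance_m(50, 0.0, 6.4))     # immediate automatic braking case: roughly 15 m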
 If no braking is applied in the arrangement of FIG. 3, a pedestrian walking toward the intersection at about 5 km/h or more without noticing the automobile is likely to be struck. Therefore, by mounting the non-visible measurement device 110 on an automobile, collisions can be prevented and serious accidents mitigated by issuing a warning or applying the brake at intersections where pedestrians and others hidden by obstacles may enter.
 In FIG. 3, the reflecting surface 114 is on the left front side of the automobile's widthwise center line, that is, in the same direction as the object 111. The radar irradiation unit 101 therefore emits the electromagnetic wave toward the left front side of the center line, as illustrated. The irradiation range of the electromagnetic wave in this case is referred to as the first range.
 The first range is a range of positive horizontal angles, for example a horizontal angle of about 90°. Here, taking the center of the full irradiation range of the radar irradiation unit 101, in other words the automobile's widthwise center line, as the reference, horizontal angles to the left are defined as positive and horizontal angles to the right as negative.
 In this case, the detection range is wide when detecting an object 111, such as a pedestrian, on the side of the crossing road nearer to the automobile.
 〈Object Detection Example 2〉
 FIG. 5 is an explanatory view showing another example of detection of an object by the non-visible measurement device of FIG. 3.
 FIG. 5 differs from FIG. 3 in the position of the pedestrian. In FIG. 5, the object 111a is located on the roadside strip farther from the automobile on the road branching to the left of the intersection relative to the automobile's direction of travel.
 In addition, whereas in FIG. 3 the reflecting surface 114 is on the left front side of the automobile's widthwise center line as described above, in the example of FIG. 5 the reflecting surface 114a is on the right front side of the center line, that is, a wall on the oncoming-lane side rather than on the side of the automobile's own lane.
 As shown in FIG. 5, the irradiation wave 112 therefore travels from the radar irradiation unit 101 to the wall on the right side of the automobile's direction of travel, which serves as the reflecting surface 114a, and on to the object 111a located on the roadside strip farther from the automobile on the road branching to the left of the intersection. Likewise, the reflected wave 113 reflected by the object 111a reaches the radar detection unit 102 via the reflecting surface 114a.
 The radar irradiation unit 101 thus emits the electromagnetic wave toward the right front side of the automobile's widthwise center line. The irradiation range of the electromagnetic wave in this case is referred to as the second range. The second range is a range of the negative horizontal angles defined above, for example a horizontal angle of about -90°.
 In the example of FIG. 5, the object 111a can be detected even when there is no reflecting surface 114 on the left front side of the automobile's widthwise center line as in FIG. 3. This widens the detection range for objects farther from the intersection, specifically objects located farther to the left than the object 111a in FIG. 5.
 An object may also be detected using paths via a plurality of reflecting surfaces.
 〈Object Detection Example 3〉
 FIG. 6 is an explanatory view showing an example of detection of an object by the non-visible measurement device of FIG. 3.
 FIG. 6 shows an example in which the object 111 is measured via two electromagnetic wave paths. One is the path via the reflecting surface 114, as in FIG. 3, which lies on the left front side of the automobile's widthwise center line. The other is the path via the reflecting surface 114a, as in FIG. 5, which lies on the right front side of the center line.
 In FIG. 6, the path via the reflecting surface 114 is indicated by the irradiation wave 112 and the reflected wave 113, and the path via the reflecting surface 114a is indicated by the irradiation wave 112a and the reflected wave 113a.
 In this case, the position of the object 111 is measured multiple times over the plural electromagnetic wave paths formed by the two reflecting surfaces 114 and 114a. The most likely position of the object 111 may then be calculated from the plural measured positions and output. Measuring multiple times in this way further improves the accuracy of the position measurement.
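 One simple way to combine the position estimates obtained via the two reflecting surfaces is to average them, optionally weighting each estimate by a confidence value for its path, for example the received signal strength. The weighting heuristic and the coordinate values in the sketch below are assumptions for illustration; the description does not specify how the most likely position is selected.

    # Weighted fusion of position estimates from multiple reflection paths.
    # Weighting by received signal strength is an assumed heuristic, not part of the text.
    def fuse_positions(estimates):
        # estimates: list of ((x, y), weight) tuples, one per reflection path
        total_w = sum(w for _, w in estimates)
        x = sum(p[0] * w for p, w in estimates) / total_w
        y = sum(p[1] * w for p, w in estimates) / total_w
        return x, y

    # Hypothetical example: two estimates of object 111 via surfaces 114 and 114a
    fused = fuse_positions([((14.8, 3.1), 0.7), ((15.1, 2.9), 0.3)])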
 〈Object Detection Example 4〉
 FIG. 7 is an explanatory view showing another example of detection of an object by the non-visible measurement device of FIG. 1.
 While FIGS. 3, 5, and 6 above show the non-visible measurement device 110 installed at the center of the front of the automobile, FIG. 7 shows an example in which two non-visible measurement devices are mounted on the automobile.
 As shown in FIG. 7, two non-visible measurement devices 110 and 110a are mounted, for example, at the two front corners of the automobile. The connection configuration of the non-visible measurement device 110a is the same as that of the non-visible measurement device 110 in FIG. 1, so its description is omitted.
 In the example of FIG. 7, the non-visible measurement devices 110 and 110a are mounted near the automobile's headlights. In FIG. 7 as well, only the radar irradiation unit 101 and the radar detection unit 102 may instead be installed at the two front corners.
 Mounting the non-visible measurement devices 110 and 110a near the headlights in this way allows measurement over a wider area.
 For example, when the non-visible measurement device 110 mounted on the right side is used, a wall or similar surface on the left front side of the automobile's widthwise center line serves as the reflecting surface 114, as in FIG. 3. The irradiation wave 112 and the reflected wave 113 then follow the paths shown in FIG. 7, allowing the electromagnetic wave to reach the object 111 at the intersection.
 When the non-visible measurement device 110a mounted on the left side is used, a wall or similar surface on the right front side of the center line serves as the reflecting surface 114a, as in FIG. 5. The irradiation wave 112a and the reflected wave 113a then follow the paths shown in FIG. 7, allowing the electromagnetic wave to reach the object 111a at the intersection.
 In this case, because the non-visible measurement device 110a is mounted offset to the left of the automobile's center, the paths of the irradiation wave 112a and the reflected wave 113a differ from those in FIG. 5. The electromagnetic wave therefore reaches positions farther from the intersection than in FIG. 5, so an object 111a farther from the intersection can be detected.
 Instead of mounting two non-visible measurement devices, two radar irradiation units 101 and two radar detection units 102 may be provided while sharing a single analysis unit 104.
 〈Detection Processing Example〉
 Next, the measurement technique used by the non-visible measurement device 110 will be described.
 FIG. 8 is a flowchart showing an example of the measurement processing performed by the non-visible measurement device 110 of FIG. 1.
 First, when the non-visible measurement device 110 starts up, the image input unit 103 outputs the acquired image information (step S101). The shape recognition unit 105 recognizes the shape and position of the reflecting surface from the image information output by the image input unit 103 and outputs the recognition result as shape information (step S102).
 The range analysis unit 106 then calculates the irradiation range and detection range of the electromagnetic wave based on the shape information output by the shape recognition unit 105 and outputs the calculation result as range information (step S103).
 The radar irradiation unit 101 irradiates the electromagnetic wave (step S104), and the radar detection unit 102 detects the electromagnetic wave reflected by the object and other surfaces (step S105). The signal analysis unit 108 analyzes the detection signal output by the radar detection unit 102, restricting the analysis to the range given by the range information, and outputs the resulting reflection position (step S106).
 The position analysis unit 107 combines the shape information and the reflection position and outputs the position information of the object (step S107). The position analysis unit 107 then determines whether the measurement is finished (step S108); if not, the processing returns to step S101 and measurement is performed again.
 If the signal analysis in step S106 covered the entire field of view, the analysis time would grow large and the warning of a collision risk might come too late. By having the range analysis unit 106 limit the analysis range, for example, to only the portions where a reflecting surface exists, the measurement time can be shortened and the analysis can be completed within a practical time.
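 The measurement loop of FIG. 8, with the signal analysis restricted to the calculated range information, might be organized as in the following sketch. All of the attribute and function names are placeholders standing for the processing blocks of FIG. 1; they are not an actual API of the device.

    # Skeleton of the measurement loop in FIG. 8 (steps S101 to S108).
    # Every name below is a placeholder for the corresponding unit in FIG. 1.
    def measurement_loop(device):
        while not device.finished():                        # S108: end-of-measurement check
            image = device.image_input.acquire()            # S101: image acquisition
            shape = device.shape_recognition.run(image)     # S102: reflecting-surface shape/position
            ranges = device.range_analysis.run(shape)       # S103: irradiation/detection ranges
            device.radar.irradiate()                        # S104: electromagnetic wave irradiation
            signal = device.radar.detect()                  # S105: detection of reflected wave
            # S106: analysis restricted to the calculated range to keep it fast
            reflection = device.signal_analysis.run(signal, limit_to=ranges)
            position = device.position_analysis.run(shape, reflection)   # S107
            device.output(position)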
 In this way, the object 111 hidden by the obstacle 115 can be detected easily. Because the reflection position on the object can be identified accurately in a shorter time, the position of the non-visible object 111 can be detected quickly and with a low computational load.
 As a result, a highly reliable non-visible measurement device can be provided at low cost.
 In the first embodiment, the non-visible measurement device 110 is mounted on an automobile and used for driving safety support, preventing collisions at intersections with poor visibility, but its applications are not limited to driving safety support.
 For example, by applying the non-visible measurement device 110 to situation monitoring for autonomous vehicles and robots, safety support for pedestrians and cyclists, surroundings-monitoring sensors for virtual reality (VR), augmented reality (AR), and mixed reality (MR), monitoring equipment mounted on drones, shape recognition and gesture recognition devices, surveillance camera systems, and so on, objects hidden behind obstacles can be made visible.
 The image information from the image input unit 103, the shape information calculated by the shape recognition unit 105, and the range information calculated by the range analysis unit 106 may also be accumulated and held as analysis information and reused in subsequent measurements. The analysis information is stored, for example, in a storage unit (not shown) of the non-visible measurement device 110.
 For example, when the automobile enters the same intersection again, previously acquired analysis information can be read from the storage unit and used for the range analysis and signal analysis, which speeds up the signal analysis unit 108 and reduces its computational load.
 Alternatively, the shape recognition unit 105 may calculate the shape information by extracting only the difference between the input image information and past image information. If necessary, the analysis information may be accumulated in advance before the position analysis is performed.
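 One simple way to reuse previously obtained analysis information is to cache it keyed by a coarse location, for example a rounded GPS coordinate. The cache structure, key, and grid size below are assumptions for illustration; the description only states that the information is stored and reused.

    # Assumed cache of analysis information keyed by a coarse position.
    analysis_cache = {}

    def coarse_key(lat, lon, grid=1e-4):
        # Round coordinates so that repeated visits to the same intersection hit the cache.
        return (round(lat / grid), round(lon / grid))

    def get_analysis_info(lat, lon, compute_fn):
        key = coarse_key(lat, lon)
        if key not in analysis_cache:
            analysis_cache[key] = compute_fn()   # full shape recognition + range analysis
        return analysis_cache[key]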
 The stored analysis information may be provided to and shared with non-visible measurement devices mounted on other automobiles via a communication system or a storage medium. It may also be accumulated in any server system or cloud service via a communication system and shared with other non-visible measurement devices.
 Sharing the analysis information among multiple non-visible measurement devices in this way allows non-visible measurement to be performed faster and with a lower computational load.
 Furthermore, when a collision risk is detected, the corresponding dangerous-position information can be stored and shared with other non-visible measurement devices so that the danger can be flagged without re-analysis, which improves safety.
 Analyzing the dangerous-position information gathered by many non-visible measurement devices makes it possible to identify high-risk intersections and curves, and this can be used as input for traffic safety measures such as road improvements.
 As described above, a non-visible measurement device that uses a low-cost radar to detect the position of a non-visible object quickly and with a low computational load can be provided.
 (Embodiment 2)
 〈Configuration Example and Operation of the Non-Visible Measurement Device〉
 FIG. 9 is an explanatory view showing an example of the configuration of the non-visible measurement device according to the second embodiment.
 The non-visible measurement device 110 of FIG. 9 differs from the non-visible measurement device 110 of FIG. 1 of the first embodiment in that a map input unit 109 is newly provided. The rest of the connection configuration is the same as in FIG. 1, so its description is omitted.
 Based on current position information obtained from a satellite positioning system (GPS) or the like, the map input unit 109 acquires map information from an internal storage device or recording medium (not shown), or from an external server connected via communication, and outputs it to the shape recognition unit 105 of the analysis unit 104.
 Using the map information acquired by the map input unit 109, the shape recognition unit 105 can estimate where reflecting surfaces exist. In addition, the positional accuracy of the reflecting surfaces improves and the analysis time for the image information can be shortened. The map information acquired by the map input unit 109 may include the three-dimensional structure of buildings, in which case the computation time for shape recognition can be shortened further.
 Meanwhile, the image input unit 103 acquires and outputs image information of the forward view in real time. This makes it possible to identify vehicles ahead and obstacles that are not in the map information, and to estimate the positions of reflecting surfaces.
 Therefore, performing shape recognition by combining the map information with the image information enables highly accurate analysis in a short analysis time.
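 One way to picture this combination is to take candidate reflecting surfaces from the map's building outlines and then keep only those that the current camera image confirms as unobstructed. The helper names below are placeholders for that idea; neither the map format nor the confirmation test is specified in the description.

    # Placeholder sketch: map-derived wall candidates filtered by the live image.
    def candidate_reflectors(map_info, image_info, is_visible):
        # map_info.walls(): wall segments near the current position (assumed helper)
        # is_visible(wall, image_info): True if the wall is not blocked, e.g. by a parked car
        return [wall for wall in map_info.walls() if is_visible(wall, image_info)]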
 As described above, using the map information acquired by the map input unit 109 makes it possible to provide a faster, lower-cost non-visible measurement device 110.
 (Embodiment 3)
 〈Configuration Example and Operation of the Non-Visible Measurement Device〉
 FIG. 10 is an explanatory view showing an example of the configuration of the non-visible measurement device according to the third embodiment.
 The non-visible measurement device 110 shown in FIG. 10 differs from the non-visible measurement device 110 of FIG. 1 of the first embodiment in the irradiation technique of the radar irradiation unit 101. The connection configuration is the same as that of the non-visible measurement device 110 of FIG. 1, so its description is omitted.
 In this case, the radar irradiation unit 101 emits a diffracted wave as the irradiation wave 601, and the radar detection unit 102 detects a diffracted wave as the reflected wave 602. The irradiation wave 601 emitted from the radar irradiation unit 101 is diffracted around the wall surface of the obstacle 115 or the like and reaches the object 111.
 For example, in the arrangement of FIG. 3, if the frequency of the electromagnetic wave emitted by the radar irradiation unit 101 is 2 GHz, the wall forming the obstacle 115 is within the Fresnel zone radius of 1.4 m in height, and the attenuation due to diffraction is 15 dB or less.
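 For reference, the first Fresnel zone radius at a point dividing the path into segments d1 and d2 is r = sqrt(λ·d1·d2/(d1+d2)). The distances in the sketch below are assumed values for illustration; the description does not state which d1 and d2 were used for its 1.4 m figure.

    import math

    def fresnel_radius_m(freq_hz, d1_m, d2_m):
        # First Fresnel zone radius at a point dividing the path into d1 and d2.
        wavelength = 3.0e8 / freq_hz
        return math.sqrt(wavelength * d1_m * d2_m / (d1_m + d2_m))

    # Assumed example distances around the corner of obstacle 115, at 2 GHz
    print(fresnel_radius_m(2e9, 15.0, 5.0))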
 The reflected wave 602, that is, the diffracted wave reflected by the object 111 being measured, is again diffracted around the wall surface of the obstacle 115 or the like and reaches the radar detection unit 102. In this case, the paths of the irradiation wave and the reflected wave need not be straight lines.
 The attenuation due to diffraction can be calculated from the path length of the propagation route according to the principle of Fresnel diffraction. The shape recognition unit 105 recognizes and outputs the shape of the obstacle that causes the diffraction, and the range analysis unit 106 calculates the range that the irradiation wave reaches by diffraction and the range over which the diffracted reflected wave reaches the radar detection unit 102.
 The signal analysis unit 108 analyzes the detection signal that arrives via diffraction and outputs the reflection position information. The position analysis unit 107 calculates and outputs the position of the object 111 from the reflection position and the shape information.
 By using diffracted waves for the irradiation wave 601 and the reflected wave 602 in this way, the non-visible measurement device 110 shown in FIG. 10 can perform non-visible measurement even at intersections that lack a reflecting surface such as a concrete wall and on curves with poor visibility.
 (Embodiment 4)
 〈Configuration Example of the Non-Visible Measurement Device〉
 FIG. 11 is an explanatory view showing an example of the configuration of the non-visible measurement device according to the fourth embodiment.
 The non-visible measurement device 110 of FIG. 11 differs from FIG. 1 of the first embodiment in that the range information output by the range analysis unit 106 is supplied not only to the signal analysis unit 108 but also to the radar irradiation unit 101. The rest of the connection configuration is the same as in FIG. 1, so its description is omitted.
 The radar irradiation unit 101 can limit the range over which it emits the electromagnetic wave based on the range information output by the range analysis unit 106. Limiting the irradiation range of the radar irradiation unit 101 shortens the measurement time and reduces the amount of analysis computation.
 〈Measurement Processing Example〉
 FIG. 12 is a flowchart showing an example of the non-visible measurement processing performed by the non-visible measurement device 110 of FIG. 11.
 FIG. 12 differs from the flowchart of FIG. 8 of the first embodiment in the processing of steps S204 to S206. The processing of steps S201 to S203 and of steps S207 to S209 is the same as that of steps S101 to S103 and steps S106 to S108 of FIG. 8, so its description is omitted.
 First, based on the range information calculated by the range analysis of step S203, the radar irradiation unit 101 limits the irradiation range of the electromagnetic wave to, for example, the reflecting surface of a wall and irradiates it (step S204). The radar detection unit 102 likewise detects the electromagnetic wave only over that reflecting surface or similar limited range (step S205).
 Next, the signal analysis unit 108 determines whether the irradiation range of the electromagnetic wave has covered the range information (step S206). If it has, the signal analysis is performed (step S207); if not, the processing returns to step S204, the irradiation range is changed, and the electromagnetic wave is irradiated again.
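 The coverage check of steps S204 to S206 can be pictured as stepping the beam over sub-ranges until the calculated range information has been swept, as in the sketch below. All names are placeholders for the units of FIG. 11 and for an assumed decomposition of the range information into sub-ranges.

    # Placeholder sketch of steps S204 to S206: irradiate sub-ranges until the
    # range information from the range analysis unit is fully covered.
    def sweep_until_covered(radar, range_info):
        detections = []
        covered = []
        for sub_range in range_info.sub_ranges():       # e.g. one beam position per wall segment
            radar.irradiate(sub_range)                  # S204: limited irradiation
            detections.append(radar.detect(sub_range))  # S205: limited detection
            covered.append(sub_range)
            if range_info.is_covered_by(covered):       # S206: coverage judgement
                break
        return detections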
 Limiting the irradiation range and the detection range of the electromagnetic wave in this way shortens the processing time and enables fast, low-cost non-visible measurement.
 (Embodiment 5)
 〈Measurement Processing Example〉
 FIG. 13 is a flowchart showing an example of the measurement processing in the non-visible measurement device according to the fifth embodiment.
 The processing of FIG. 13 is another example of the processing shown in FIG. 12 of the fourth embodiment. The connection configuration of the non-visible measurement device 110 is the same as that of the non-visible measurement device 110 of FIG. 11 of the fourth embodiment, so its description is omitted.
 In the measurement processing of FIG. 12 of the fourth embodiment, steps S204 to S206 are performed sequentially after the range analysis of step S203. The measurement processing shown in FIG. 13 is characterized in that step S304 (electromagnetic wave irradiation), step S305 (electromagnetic wave detection), and step S306 (range completion determination), which correspond to steps S204 to S206 of FIG. 12, are performed in parallel with steps S301 to S303.
 Steps S301 to S303 are the first to third steps, respectively, step S304 is the fourth step, step S305 is the fifth step, and step S306 is the ninth step.
 The processing of steps S301 to S303 is the same as step S201 (image acquisition), step S202 (shape recognition), and step S203 (range analysis) of FIG. 12.
 While the processing from the image acquisition of step S301 through the range analysis of step S303 is being performed, the electromagnetic wave irradiation of step S304, the electromagnetic wave detection of step S305, and the range completion determination of step S306 are performed at the same time. The range used as the criterion in step S306 is the range set in step S303 of the previous measurement cycle.
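 A minimal way to run the image-side processing (steps S301 to S303) in parallel with the radar-side processing (steps S304 to S306) is to put them on separate threads, with the radar side using the range obtained in the previous cycle, as sketched below. The threading arrangement and all attribute names are assumptions; the description only requires that the two groups of steps proceed simultaneously.

    import threading

    # Placeholder sketch: image pipeline and radar pipeline of FIG. 13 run concurrently.
    def measurement_cycle(device, previous_range):
        result = {}

        def image_side():                          # S301 to S303
            image = device.image_input.acquire()
            shape = device.shape_recognition.run(image)
            result["range"] = device.range_analysis.run(shape)

        def radar_side():                          # S304 to S306, using last cycle's range
            device.radar.irradiate(previous_range)
            result["signal"] = device.radar.detect(previous_range)

        t1 = threading.Thread(target=image_side)
        t2 = threading.Thread(target=radar_side)
        t1.start(); t2.start()
        t1.join(); t2.join()
        return result["range"], result["signal"]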
 Performing the shape recognition and range analysis simultaneously with the electromagnetic wave irradiation and detection in this way shortens the processing time, enabling faster, lower-cost non-visible measurement.
 (Embodiment 6)
 〈Configuration Example of the Non-Visible Measurement Device〉
 FIG. 14 is an explanatory view showing an example of the configuration of the non-visible measurement device according to the sixth embodiment.
 The non-visible measurement device 110 of FIG. 14 differs from the non-visible measurement device 110 of FIG. 1 of the first embodiment in that a speed input unit 141 is newly provided. The rest of the connection configuration is the same as in FIG. 1, so its description is omitted.
 The speed input unit 141 acquires the moving speed of the host apparatus on which the non-visible measurement device 110 is mounted. For example, when the non-visible measurement device 110 is mounted on an automobile, that automobile is the host apparatus.
 〈Operation Example〉
 In the following, the host apparatus is assumed to be an automobile.
 In the case of an automobile, the speed input unit 141 acquires the moving speed from a speed sensor of the automobile, a satellite positioning system, or the like. Alternatively, the speed input unit 141 itself may include a speed sensor.
 The moving speed obtained by the speed input unit 141 is input to the analysis unit 104. Based on this moving speed, the analysis unit 104 sets the measurement range, the analysis range, the radar irradiation range, the detection range, and so on.
 For example, when the automobile is traveling fast on an expressway, only a narrow region around distant obstacles is measured. When it is traveling slowly, for example through a residential area with many intersections, obstacles near the automobile can be measured over a wide region.
 In addition, the possibility of a collision if the automobile maintains its speed can be predicted from the automobile's speed acquired from the speed input unit 141 and the moving direction of the object 111 calculated by the signal analysis unit 108 from the detection signal.
 Here, the signal analysis unit 108 records, over multiple measurements, the reflection position of the object 111 obtained from the detection signal of the radar detection unit 102, and calculates the moving direction and moving speed of the reflection position.
 The position analysis unit 107 calculates the distance from the automobile to the object from the automobile's moving speed and the object's moving direction and moving speed, and outputs a collision warning signal when this distance falls below a certain value. The distance from the automobile to the object 111 is the distance from the automobile to the point where the automobile's direction of travel and the object 111's direction of movement intersect.
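 The distance check described above can be sketched as follows, assuming straight-line motion for both the vehicle and the object and a fixed threshold value; both assumptions are simplifications for illustration, since the description only states that a warning is output when the distance to the crossing point falls below a set value.

    # Simplified sketch of the warning logic: find where the vehicle's heading and the
    # object's heading cross, and warn if the vehicle's distance to that point is at or
    # below a set value. Straight-line motion and the 15 m threshold are assumptions.
    def crossing_point(p_vehicle, v_vehicle, p_object, v_object):
        # Solve p_vehicle + s*v_vehicle == p_object + t*v_object for s (2D geometry).
        (x1, y1), (dx1, dy1) = p_vehicle, v_vehicle
        (x2, y2), (dx2, dy2) = p_object, v_object
        det = dx1 * dy2 - dy1 * dx2
        if abs(det) < 1e-9:
            return None                       # parallel paths: no crossing point
        s = ((x2 - x1) * dy2 - (y2 - y1) * dx2) / det
        return (x1 + s * dx1, y1 + s * dy1)

    def warn_if_close(p_vehicle, v_vehicle, p_object, v_object, threshold_m=15.0):
        cp = crossing_point(p_vehicle, v_vehicle, p_object, v_object)
        if cp is None:
            return False
        dist = ((cp[0] - p_vehicle[0]) ** 2 + (cp[1] - p_vehicle[1]) ** 2) ** 0.5
        return dist <= threshold_m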
 〈Measurement Processing Example〉
 FIG. 15 is a flowchart showing an example of measurement processing using the non-visible measurement device 110 of FIG. 14.
 The processing of FIG. 15 differs from the measurement processing of FIG. 12 of the fourth embodiment in that a new step S401 (speed acquisition) is added. The processing of steps S402 to S410 that follows step S401 is the same as that of steps S201 to S209 of FIG. 12, so its description is omitted.
 In FIG. 15, the speed input unit 141 acquires the moving speed from the speed sensor of the host automobile, a satellite positioning system, or the like. The acquired moving speed is used in the range analysis of step S404, the range completion determination of step S407, the signal analysis range of step S408, and the position analysis range of step S409 to set a narrow, distant region when traveling fast and a wide, nearby region when traveling slowly.
 In the signal analysis of step S408, the moving direction and moving speed of the object 111 are calculated from the difference between the currently measured reflection position and the previously measured reflection position. In the position analysis of step S409, the distance from the automobile to the object 111 at the point where the automobile's direction of travel and the object's direction of movement intersect is calculated from the automobile's moving speed and the object 111's moving direction and moving speed, and a collision warning signal is output when this distance falls below a certain value.
 This makes it possible to detect objects of high importance to the host apparatus while shortening the measurement time, and to output a collision warning signal between the host apparatus and the object 111 with high accuracy.
 (Embodiment 7)
 〈Configuration Example and Operation of the Image Input Unit〉
 FIG. 16 is an explanatory view showing an example of the configuration of the image input unit 103 of the non-visible measurement device 110 according to the seventh embodiment.
 The non-visible measurement device 110 itself is the same as in FIG. 1 of the first embodiment.
 As shown in FIG. 16, the image input unit 103 includes a light irradiation unit 161, a light detection unit 162, and a control calculation unit 163. In accordance with a light control signal from the control calculation unit 163, the light irradiation unit 161 irradiates the object and its surroundings with, for example, pulsed light.
 障害物や反射面114となる壁や建造物で光は反射し、反射した光は光検出部162に入射される。光検出部162は、反射したパルス波形の光を検出する。制御演算部163は、パルス波の場合、照射したパルス波と反射したパルス波の光検出信号との時間差ΔTと光速度から、TOF(Time Of Flight)法を用いて画像入力部103から反射面114までの距離Lを算出する。 The light is reflected by an obstacle or a wall or a structure serving as the reflective surface 114, and the reflected light is incident on the light detection unit 162. The light detection unit 162 detects the light of the reflected pulse waveform. In the case of a pulse wave, the control calculation unit 163 uses the TOF (Time Of Flight) method from the time difference ΔT between the irradiated pulse wave and the light detection signal of the reflected pulse wave and the light velocity, and the reflecting surface from the image input unit 103 The distance L to 114 is calculated.
 照射する光は連続波でもよく、その場合、制御演算部163は、位相差や周波数差を用いて距離Lを算出してもよい。制御演算部163は、対象物およびその周辺の範囲を網羅するように照射方向をスキャンして、距離Lの値を照射方向ごとに2次元画像にマッピングすることにより、距離情報を含んだ画像情報を出力する。 The light to be irradiated may be a continuous wave, and in this case, the control calculation unit 163 may calculate the distance L using a phase difference or a frequency difference. The control calculation unit 163 scans the irradiation direction so as to cover the object and the range around it, and maps the value of the distance L to a two-dimensional image for each irradiation direction, thereby generating image information including distance information. Output
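 For the pulsed case, the round-trip relation is L = c·ΔT/2. A minimal sketch of that relation, using an example delay chosen only for illustration:

    # Time-of-flight distance for a pulsed measurement: the light travels to the
    # reflecting surface and back, so the one-way distance is c * dt / 2.
    def tof_distance_m(delta_t_s, c=299_792_458.0):
        return c * delta_t_s / 2.0

    print(tof_distance_m(100e-9))   # a 100 ns round trip corresponds to about 15 m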
 By using this image information containing distance information, the analysis unit 104 can reduce the computational load of the shape recognition and range analysis, and consequently speed up the non-visible measurement.
 As described above, providing the image input unit 103 that outputs image information containing distance information makes it possible to provide a fast, low-cost non-visible measurement device 110.
 The invention made by the present inventors has been described above in detail based on the embodiments, but the present invention is not limited to these embodiments and can be modified in various ways without departing from its gist.
 The present invention is not limited to the embodiments described above and includes various modifications. For example, the embodiments have been described in detail in order to explain the present invention in an easy-to-understand manner, and the invention is not necessarily limited to configurations having all of the described elements.
 Part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment. For part of the configuration of each embodiment, other configurations can be added, deleted, or substituted.
 〈Supplementary Notes〉
 Some of the content described in the embodiments is restated below.
 (1) A non-visible measurement device comprising: a radar irradiation unit that irradiates an object to be measured with an electromagnetic wave; a radar detection unit that detects the electromagnetic wave reflected by the object and outputs a detection signal of the detected electromagnetic wave; an image input unit that acquires image information; and an analysis unit that analyzes the detection signal output by the radar detection unit and the image information acquired by the image input unit to calculate position information of the object, wherein the position information calculated by the analysis unit is position information of a non-visible object not included in the image information.
 (2) In the non-visible measurement device of (1), the image input unit is an imaging device that captures video and outputs the captured image data as the image information.
 (3) The non-visible measurement device of (1) includes a plurality of radar irradiation units and a plurality of radar detection units.
 (4) In the non-visible measurement device of (1), the analysis unit includes a shape recognition unit that generates three-dimensional shape information from the image information acquired by the image input unit, a range analysis unit that calculates, from the shape information generated by the shape recognition unit, the reflection direction in which the electromagnetic wave irradiated by the radar irradiation unit is reflected and spread by a reflecting surface and calculates, from the calculated reflection direction, range information indicating the irradiation range of the electromagnetic wave, a signal analysis unit that calculates the reflection position of the object based on the detection signal output by the radar detection unit and the range information calculated by the range analysis unit, and a position analysis unit that calculates the position information of the object based on the shape information generated by the shape recognition unit and the reflection position calculated by the signal analysis unit, wherein, when calculating the reflection position of the object, the signal analysis unit calculates a plurality of reflection positions based on a plurality of detection signals and the range information calculated by the range analysis unit, and the position analysis unit calculates the position information of the object based on the shape information generated by the shape recognition unit and the plurality of reflection positions calculated by the signal analysis unit.
 DESCRIPTION OF REFERENCE NUMERALS
 101 radar irradiation unit
 102 radar detection unit
 103 image input unit
 104 analysis unit
 105 shape recognition unit
 106 range analysis unit
 107 position analysis unit
 108 signal analysis unit
 109 map input unit
 110 non-visible measurement device
 110a non-visible measurement device
 141 speed input unit
 161 light irradiation unit
 162 light detection unit
 163 control calculation unit
 205 head-up display

Claims (15)

  1.  A non-visible measurement device comprising:
     a radar irradiation unit that irradiates an object with an electromagnetic wave;
     a radar detection unit that detects the electromagnetic wave reflected by the object and outputs a detection signal of the detected electromagnetic wave;
     an image input unit that acquires image information of terrain; and
     an analysis unit that analyzes the detection signal output by the radar detection unit and the image information acquired by the image input unit to calculate position information of the object,
     wherein the analysis unit extracts, based on the image information acquired by the image input unit, a position of a reflecting surface that reflects the electromagnetic wave, and performs position analysis by irradiating the extracted reflecting surface with the electromagnetic wave, thereby calculating the position information of a non-visible object that is not included in the image information.
  2.  The non-visible measurement device according to claim 1,
     wherein the analysis unit includes:
     a shape recognition unit that generates three-dimensional shape information from the image information acquired by the image input unit;
     a range analysis unit that calculates, from the shape information generated by the shape recognition unit, a reflection direction in which the electromagnetic wave irradiated by the radar irradiation unit is reflected and spread by a reflecting surface, and calculates, from the calculated reflection direction, range information indicating an irradiation range of the electromagnetic wave;
     a signal analysis unit that calculates a reflection position of the object based on the detection signal output by the radar detection unit and the range information calculated by the range analysis unit; and
     a position analysis unit that calculates the position information of the object based on the shape information generated by the shape recognition unit and the reflection position calculated by the signal analysis unit.
  3.  The non-visible measurement device according to claim 1,
     wherein the image input unit includes:
     a light irradiation unit that emits light;
     a light detection unit that detects reflected light of the light emitted by the light irradiation unit; and
     a control calculation unit that generates the image information based on a light detection signal detected by the light detection unit,
     wherein the control calculation unit calculates a reflection distance, which is a distance from the image input unit to a reflection position of the light emitted by the light irradiation unit, and maps the calculated reflection distance onto a two-dimensional image, thereby outputting the image information having distance information.
  4.  The non-visible measurement device according to claim 1,
     wherein the radar detection unit detects a diffracted wave of the electromagnetic wave reflected by the object and outputs a detection signal of the detected diffracted wave, and
     the analysis unit includes:
     a shape recognition unit that generates three-dimensional shape information from the image information acquired by the image input unit;
     a range analysis unit that calculates, from the shape information generated by the shape recognition unit, a diffraction direction of the electromagnetic wave irradiated by the radar irradiation unit, and calculates, from the calculated diffraction direction, range information indicating an irradiation range of the diffracted wave;
     a signal analysis unit that calculates and outputs a reflection position of the object based on the detection signal output by the radar detection unit and the range information calculated by the range analysis unit; and
     a position analysis unit that calculates the position information of the object based on the shape information generated by the shape recognition unit and the reflection position calculated by the signal analysis unit.
  5.  The non-visible measurement device according to claim 2, further comprising a map input unit that acquires map information,
     wherein the shape recognition unit generates the shape information based on the map information acquired by the map input unit and the image information acquired by the image input unit.
  6.  The non-visible measurement device according to claim 1,
     wherein the electromagnetic wave irradiated by the radar irradiation unit is a pulse-modulated signal.
  7.  The non-visible measurement device according to claim 1,
     wherein the electromagnetic wave irradiated by the radar irradiation unit is a frequency-modulated signal.
  8.  The non-visible measurement device according to claim 1,
     wherein the analysis unit includes:
     a shape recognition unit that generates three-dimensional shape information from the image information acquired by the image input unit;
     a range analysis unit that calculates, from the shape information generated by the shape recognition unit, a reflection direction in which the electromagnetic wave irradiated by the radar irradiation unit is reflected and spread by a reflecting surface, and calculates, from the calculated reflection direction, range information indicating an irradiation range of the electromagnetic wave;
     a signal analysis unit that calculates and outputs a reflection position of the object based on the detection signal output by the radar detection unit and the range information calculated by the range analysis unit; and
     a position analysis unit that calculates the position information of the object based on the shape information generated by the shape recognition unit and the reflection position calculated by the signal analysis unit,
     wherein the signal analysis unit calculates, from the detection signal, the reflection position of the electromagnetic wave on the object within the range indicated by the range information, and
     the radar irradiation unit sets an irradiation direction of the electromagnetic wave based on the range information output by the range analysis unit.
  9.  The non-visible measurement device according to claim 8,
     wherein the range information calculated by the range analysis unit is limited to either a first range or a second range,
     the first range being a range having a positive horizontal angle with respect to the irradiation center of the radar irradiation unit, and
     the second range being a range having a negative horizontal angle with respect to the irradiation center of the radar irradiation unit.
  10.  The non-visible measurement device according to claim 2, further comprising a speed input unit that acquires the moving speed of the non-visible measurement device, wherein the position analysis unit calculates the distance to the object from the moving speed acquired by the speed input unit and from the moving direction and moving speed of the object, and outputs an alert when the calculated distance is equal to or less than a set value.
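Claim 10 states only that the position analysis unit combines the device's own moving speed with the object's moving direction and speed, computes a distance to the object, and outputs an alert when that distance is at or below a set value. The sketch below is one possible reading, using a constant-velocity extrapolation over an assumed horizon; the function names, the horizon, and the threshold are illustrative, not taken from the description.

```python
import math

def predicted_gap(own_speed_mps: float,
                  obj_pos_xy: tuple,
                  obj_heading_deg: float,
                  obj_speed_mps: float,
                  horizon_s: float = 1.0) -> float:
    """Distance between own vehicle and object after `horizon_s` seconds,
    assuming both keep constant velocity and the own vehicle moves along +x."""
    ox, oy = obj_pos_xy
    ox += obj_speed_mps * math.cos(math.radians(obj_heading_deg)) * horizon_s
    oy += obj_speed_mps * math.sin(math.radians(obj_heading_deg)) * horizon_s
    own_x = own_speed_mps * horizon_s
    return math.hypot(ox - own_x, oy)

def should_alert(distance_m: float, set_value_m: float) -> bool:
    """Claim 10: alert when the calculated distance is at or below the set value."""
    return distance_m <= set_value_m

gap = predicted_gap(own_speed_mps=10.0, obj_pos_xy=(15.0, 4.0),
                    obj_heading_deg=-90.0, obj_speed_mps=3.0)
print(round(gap, 2), should_alert(gap, set_value_m=6.0))   # e.g. 5.1 True
```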
  11.  The non-visible measurement device according to claim 10, wherein the alert output by the position analysis unit is a control signal that causes an electronic control unit, which governs the automatic braking function of a collision damage mitigation brake of an automobile, to operate the automatic braking function.
  12.  The non-visible measurement device according to claim 10, wherein the alert output by the position analysis unit is a control signal that causes a display system of an automobile to display the position information.
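Claims 11 and 12 differ only in where the alert goes: a control signal that makes the electronic control unit operate the automatic braking function of the collision damage mitigation brake, or a control signal that makes the vehicle's display system show the position information. A hedged dispatch sketch under that assumption; the message fields are invented for illustration and do not correspond to any real ECU or display protocol.

```python
from dataclasses import dataclass

@dataclass
class Alert:
    object_position: tuple     # (x, y) in metres in the vehicle frame (assumed)
    distance_m: float

def to_brake_ecu(alert: Alert) -> dict:
    """Claim 11: control signal asking the ECU to operate the automatic braking
    function of the collision damage mitigation brake (fields are illustrative)."""
    return {"target": "brake_ecu", "command": "engage_automatic_brake",
            "distance_m": alert.distance_m}

def to_display(alert: Alert) -> dict:
    """Claim 12: control signal having the display system show the position information."""
    return {"target": "display", "command": "show_position",
            "position": alert.object_position}

alert = Alert(object_position=(12.0, 3.5), distance_m=4.2)
print(to_brake_ecu(alert))
print(to_display(alert))
```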
  13.  A non-visible measurement method for measuring an object with a non-visible measurement device that includes a radar irradiation unit that irradiates the object with an electromagnetic wave, a radar detection unit that detects the electromagnetic wave reflected by the object and outputs a detection signal of the detected electromagnetic wave, an image input unit that acquires image information, and an analysis unit that analyzes the detection signal output by the radar detection unit and the image information acquired by the image input unit to extract position information of the object, the method comprising:
     a first step in which the image input unit acquires and outputs the image information;
     a second step in which the analysis unit recognizes a shape from the acquired image information and generates shape information;
     a third step in which the analysis unit calculates the irradiation range and the detection range of the electromagnetic wave based on the shape information and outputs the calculation result as range information;
     a fourth step in which the radar irradiation unit irradiates the electromagnetic wave;
     a fifth step in which the radar detection unit detects the electromagnetic wave reflected by the object and outputs the detection signal;
     a seventh step in which the analysis unit calculates a reflection position based on the detection signal and the range information; and
     an eighth step in which the analysis unit calculates the position of the object based on the calculated reflection position and the shape information.
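Claim 13 recites the method as numbered steps, and the claim's own numbering jumps from the fifth to the seventh step. The sketch below merely strings the recited steps together in that order; every step is passed in as a trivial stand-in callable, so nothing here comes from the patent's own processing.

```python
def run_method(acquire_image, recognize_shape, compute_range,
               irradiate, detect, reflection_position, object_position):
    """Claim 13's recited steps in order (the claim numbering has no sixth step)."""
    image = acquire_image()                               # first step
    shape = recognize_shape(image)                        # second step
    rng = compute_range(shape)                            # third step
    irradiate(rng)                                        # fourth step
    detection = detect()                                  # fifth step
    reflection = reflection_position(detection, rng)      # seventh step
    return object_position(reflection, shape)             # eighth step

# Smoke test with trivial stand-ins for every step:
print(run_method(
    acquire_image=lambda: "image",
    recognize_shape=lambda img: "shape information",
    compute_range=lambda shape: (0.0, 45.0),
    irradiate=lambda rng: None,
    detect=lambda: "detection signal",
    reflection_position=lambda det, rng: (8.0, 3.0),
    object_position=lambda refl, shape: (10.0, 4.0),
))
```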
  14.  The non-visible measurement method according to claim 13, further comprising a ninth step in which the analysis unit determines whether the irradiation range of the electromagnetic wave irradiated by the radar irradiation unit covers the range information, wherein the analysis unit calculates the reflection position based on the detection signal and the range information when it determines that the irradiation range of the electromagnetic wave covers the range information, and changes the irradiation range of the electromagnetic wave irradiated by the radar irradiation unit when it determines that the irradiation range does not cover the range information.
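Claim 14 adds a ninth step: check whether the currently irradiated range covers the calculated range information, compute the reflection position only if it does, and otherwise change the irradiation range. A small sketch of that decision, with the ranges modelled as horizontal-angle intervals (an assumption):

```python
def covers(irradiated: tuple, required: tuple) -> bool:
    """Ninth step: does the irradiated angle interval contain the required one?"""
    return irradiated[0] <= required[0] and required[1] <= irradiated[1]

def ninth_step(irradiated: tuple, required: tuple) -> str:
    if covers(irradiated, required):
        return "covered: compute reflection position from detection signal and range information"
    # Otherwise the radar irradiation unit is made to change its irradiation range.
    return "not covered: change the irradiation range and irradiate again"

print(ninth_step(irradiated=(-10.0, 60.0), required=(0.0, 45.0)))
print(ninth_step(irradiated=(0.0, 30.0), required=(0.0, 45.0)))
```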
  15.  The non-visible measurement method according to claim 14, wherein the processing of the first step through the third step is executed in parallel with the processing of the fourth step, the fifth step, and the ninth step.
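Claim 15 only requires that the image-side processing (first through third steps) run in parallel with the radar-side processing (fourth, fifth, and ninth steps). One way to mimic that in Python, purely as an illustration, is two worker threads that exchange the latest range information through a small queue:

```python
import queue
import threading
import time

range_q = queue.Queue(maxsize=1)   # latest range information from the image side

def image_side():
    """First to third steps: acquire image, recognize shape, publish range information."""
    for _ in range(3):
        time.sleep(0.01)                     # stand-in for camera + shape-recognition time
        range_info = (0.0, 45.0)             # assumed angle interval
        try:
            range_q.get_nowait()             # drop a stale value so only the latest remains
        except queue.Empty:
            pass
        range_q.put(range_info)

def radar_side():
    """Fourth, fifth, and ninth steps: irradiate, detect, check range coverage."""
    for _ in range(3):
        time.sleep(0.01)                     # stand-in for one radar sweep
        try:
            required = range_q.get_nowait()
            print("sweep finished, latest range information:", required)
        except queue.Empty:
            print("sweep finished, no new range information yet")

threads = [threading.Thread(target=image_side), threading.Thread(target=radar_side)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```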
PCT/JP2017/024748 2017-07-06 2017-07-06 Non-visible measurement device and non-visible measurement method WO2019008716A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/024748 WO2019008716A1 (en) 2017-07-06 2017-07-06 Non-visible measurement device and non-visible measurement method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/024748 WO2019008716A1 (en) 2017-07-06 2017-07-06 Non-visible measurement device and non-visible measurement method

Publications (1)

Publication Number Publication Date
WO2019008716A1 true WO2019008716A1 (en) 2019-01-10

Family

ID=64949780

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/024748 WO2019008716A1 (en) 2017-07-06 2017-07-06 Non-visible measurement device and non-visible measurement method

Country Status (1)

Country Link
WO (1) WO2019008716A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004301649A (en) * 2003-03-31 2004-10-28 Kitakyushu Foundation For The Advancement Of Industry Science & Technology Radar system for detecting vehicle or the like
WO2006123628A1 (en) * 2005-05-17 2006-11-23 Murata Manufacturing Co., Ltd. Radar and radar system
JP2015230566A (en) * 2014-06-04 2015-12-21 トヨタ自動車株式会社 Driving support device
JP2016110629A (en) * 2014-11-27 2016-06-20 パナソニックIpマネジメント株式会社 Object detection device and road reflecting mirror
JP2017097581A (en) * 2015-11-24 2017-06-01 マツダ株式会社 Object detection device
JP2017162178A (en) * 2016-03-09 2017-09-14 パナソニックIpマネジメント株式会社 Determination device, determination method and determination program

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190329768A1 (en) * 2017-01-12 2019-10-31 Mobileye Vision Technologies Ltd. Navigation Based on Detected Size of Occlusion Zones
US11738741B2 (en) * 2017-01-12 2023-08-29 Mobileye Vision Technologies Ltd. Navigation based on detected occlusion overlapping a road entrance
WO2020054108A1 (en) * 2018-09-14 2020-03-19 オムロン株式会社 Sensing device, mobile body system, and sensing method
JP2020046755A (en) * 2018-09-14 2020-03-26 オムロン株式会社 Detection device, mobile body system, and detection method
JP7063208B2 (en) 2018-09-14 2022-05-09 オムロン株式会社 Detection device, mobile system, and detection method
WO2020070908A1 (en) * 2018-10-05 2020-04-09 オムロン株式会社 Detection device, moving body system, and detection method
JP2020060864A (en) * 2018-10-05 2020-04-16 オムロン株式会社 Detection device, movable body system, and detection method
JP2020060863A (en) * 2018-10-05 2020-04-16 オムロン株式会社 Detection device, movable body system, and detection method
WO2020070909A1 (en) * 2018-10-05 2020-04-09 オムロン株式会社 Sensing device, moving body system, and sensing method
JP7067400B2 (en) 2018-10-05 2022-05-16 オムロン株式会社 Detection device, mobile system, and detection method
JP7070307B2 (en) 2018-10-05 2022-05-18 オムロン株式会社 Detection device, mobile system, and detection method
CN113994404A (en) * 2019-06-21 2022-01-28 松下电器产业株式会社 Monitoring device and monitoring method
CN113994404B (en) * 2019-06-21 2023-11-07 松下控股株式会社 Monitoring device and monitoring method
WO2021010083A1 (en) * 2019-07-18 2021-01-21 ソニー株式会社 Information processing device, information processing method, and information processing program
JP2021018121A (en) * 2019-07-18 2021-02-15 古河電気工業株式会社 Radar device, target detection method of radar device, and target detection system
JP7328043B2 (en) 2019-07-18 2023-08-16 古河電気工業株式会社 RADAR DEVICE, TARGET DETECTION METHOD FOR RADAR DEVICE, AND TARGET DETECTION SYSTEM
CN113138660A (en) * 2020-01-17 2021-07-20 北京小米移动软件有限公司 Information acquisition method and device, mobile terminal and storage medium
CN113763383A (en) * 2021-11-09 2021-12-07 常州微亿智造科技有限公司 Method and device for measuring elongation of steel bar
CN114674256B (en) * 2022-04-02 2023-08-22 四川豪智融科技有限公司 Method for judging target rotation angle based on radar polarization direction
CN114674256A (en) * 2022-04-02 2022-06-28 四川豪智融科技有限公司 Method for judging target rotation angle based on radar polarization direction
WO2023199873A1 (en) * 2022-04-14 2023-10-19 三菱電機株式会社 Target poistion estimation device, radar device, and target position estimation method

Similar Documents

Publication Publication Date Title
WO2019008716A1 (en) Non-visible measurement device and non-visible measurement method
US9759812B2 (en) System and methods for intersection positioning
US10377376B2 (en) Vehicle with environmental context analysis
EP3273423B1 (en) Device and method for a vehicle for recognizing a pedestrian
US11011063B2 (en) Distributed data collection and processing among vehicle convoy members
KR20200102004A (en) Apparatus, system and method for preventing collision
US11608055B2 (en) Enhanced autonomous systems with sound sensor arrays
US10823844B2 (en) Method and apparatus for analysis of a vehicle environment, and vehicle equipped with such a device
JP2009086788A (en) Vehicle surrounding monitoring device
US10906542B2 (en) Vehicle detection system which classifies valid or invalid vehicles
CN109631782B (en) System and method for measuring bridge clearance
US10783384B2 (en) Object detection using shadows
WO2015009218A1 (en) Determination of lane position
JP2015161968A (en) Traffic lane identification apparatus, traffic lane change support apparatus, traffic lane identification method
EP4102251A1 (en) Determination of atmospheric visibility in autonomous vehicle applications
WO2020039840A1 (en) Radar processing device
KR102017958B1 (en) Augmented reality head up display system for railway train
GB2556427A (en) Vehicle with environmental context analysis
JP6414539B2 (en) Object detection device
KR102084946B1 (en) Method and apparatus for generating an alarm notification according to the passage height of object located in a moving path the vehicle
JP7380904B2 (en) Information processing device, information processing method, and program
KR102185743B1 (en) Method and apparatus for determining the existence of object located in front of a vehicle
Fuerstenberg et al. Advanced intersection safety-The EC project INTERSAFE
JP2022060075A (en) Drive support device
US11914679B2 (en) Multispectral object-detection with thermal imaging

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17917118

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17917118

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP