WO2020070908A1 - Detection device, moving body system, and detection method - Google Patents

Detection device, moving body system, and detection method

Info

Publication number
WO2020070908A1
Authority
WO
WIPO (PCT)
Prior art keywords
blind spot
detection
control unit
risk
detection device
Prior art date
Application number
PCT/JP2019/009926
Other languages
English (en)
Japanese (ja)
Inventor
圭記 松浦
宜崇 鶴亀
直毅 吉武
Original Assignee
オムロン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オムロン株式会社
Publication of WO2020070908A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems

Definitions

  • the present disclosure relates to a detection device that detects a nearby object from a moving object, a moving object system including the detection device, and a detection method.
  • A technology has been proposed that is mounted on a mobile object such as an automobile or an AGV (automatic guided vehicle) and monitors the periphery of the mobile object (for example, Patent Documents 1 and 2).
  • Patent Document 1 discloses a driving assistance device including an obstacle recognition device that recognizes an obstacle in front of a host vehicle.
  • The obstacle recognition device of Patent Document 1 detects a blind spot area with respect to the host vehicle in order to recognize an obstacle that appears from a blind spot of the host vehicle.
  • The driving assistance device estimates the degree of danger of the obstacle based on the detection result of the obstacle recognition device; when the degree of danger is lower than a predetermined value, it issues a warning or the like to guide the driver to avoid the obstacle, and when the degree of danger is higher than the predetermined value, it controls the host vehicle to travel automatically.
  • Patent Document 2 discloses a vehicle environment estimating apparatus for accurately estimating a traveling environment around a host vehicle.
  • The vehicle environment estimating device of Patent Document 2 detects the behavior of another vehicle in the vicinity of the host vehicle, and estimates the presence of a further vehicle traveling in a blind spot area of the host vehicle based on that behavior.
  • The estimation result of such a blind spot area is used, for example, to predict the vehicle speed of another vehicle preceding the host vehicle, and thereby to control the driving of the host vehicle.
  • In Patent Document 1, various controls for driving assistance are performed by determining whether the degree of risk estimated based on the detection result of the blind spot area is higher or lower than a predetermined value.
  • However, this criterion for determining the degree of danger is a fixed value that does not depend on the surrounding environment outside the blind spot. In the related art, therefore, even in a surrounding-environment situation where an object in the blind spot is unlikely to hinder the traveling of the host vehicle, the degree of danger may be determined erroneously, leading to excessive driving control.
  • An object of the present disclosure is to provide a detection device, a detection method, and a mobile body system that can suppress erroneous determination of an excessive degree of risk when detecting an object in a blind spot in the surrounding environment of a mobile body and determining the degree of risk.
  • a detection device is a device that detects an object in a surrounding environment including a path of a moving object.
  • the detection device includes a detection unit and a control unit.
  • the detection unit detects distance information indicating a distance from the moving body to a surrounding environment.
  • the control unit controls the detection unit.
  • the control unit detects a blind spot area indicating a blind spot in the surrounding environment based on the detection result of the detection unit, and determines the degree of risk related to the blind spot area based on the detection result of the blind spot area.
  • the control unit relaxes the risk criterion when an object is present at a junction where the detected blind spot area and the path of the moving body merge.
  • a mobile body system includes the above-described detection device, and a control device mounted on the mobile body and performing an operation according to a result of the determination of the degree of risk by the detection device.
  • the detection method is a method of detecting an object in a surrounding environment including a path of a moving object.
  • The method includes a step in which the detection unit detects distance information indicating a distance from the moving body to the surrounding environment.
  • The method includes a step in which the control unit detects a blind spot area indicating a blind spot in the surrounding environment based on the detection result of the detection unit, and determines the degree of risk related to the blind spot area based on the detection result of the blind spot area.
  • The method includes a step in which, when an object is present at a merging point where the blind spot area and the path of the moving body merge, the control unit relaxes the criterion for determining the degree of risk.
  • According to the detection device, the mobile body system, and the detection method of the present disclosure, it is possible to suppress erroneous determination of an excessive degree of risk when detecting an object in a blind spot in the surrounding environment of a mobile body and determining the degree of risk.
  • FIG. 1 is a block diagram illustrating a configuration of a mobile system according to a first embodiment of the present disclosure.
  • FIG. 4 is a diagram for explaining the operation of the detection device according to the first embodiment.
  • FIG. 9 illustrates a case where a blind spot object is present in the experiment of FIG. 8.
  • FIG. 4 is a flowchart illustrating a blind spot object detection process performed by the detection device
  • A flowchart illustrating the risk determination process performed by the detection device.
  • A diagram for explaining the risk determination process performed by the detection device.
  • A diagram for explaining a modification of the operation of the detection device.
  • Flowchart for explaining a modification of the operation of the detection device
  • FIG. 1 is a diagram for describing an application example of the detection device 1 according to the present disclosure.
  • FIG. 2 is a diagram illustrating a case where an object is present at a merging point in this application example.
  • the detection device 1 is applicable to, for example, in-vehicle use, and constitutes a mobile body system in a mobile body such as an automobile.
  • FIG. 1 illustrates a running state of a vehicle 2 on which the detection device 1 is mounted.
  • The mobile body system according to this application example uses the detection device 1 to monitor, for example, the surrounding environment that changes around the traveling host vehicle 2.
  • the surrounding environment includes, for example, structures such as buildings and electric poles existing around the vehicle 2 and various objects such as moving objects such as pedestrians and other vehicles.
  • the range that can be monitored from the host vehicle 2 is blocked by the wall 31 of the structure near the intersection 3, and a blind spot occurs.
  • the blind spot indicates a location that cannot be directly viewed geometrically from a moving object such as the host vehicle 2 in accordance with the surrounding environment.
  • another vehicle 4 approaching the intersection 3 from a side road exists in a blind spot region R1 which is a blind spot from the own vehicle 2.
  • The vehicle 4 emerging from the blind spot and the host vehicle 2 may collide with each other at the intersection 3.
  • The detection device 1 of the present embodiment detects an object (hereinafter sometimes referred to as a "blind spot object") existing in the blind spot region R1, such as the vehicle 4, and determines the degree of risk based on the detection result of the blind spot object 4.
  • The degree of risk relates to, for example, the possibility that the host vehicle 2 and the blind spot object 4 may collide.
  • Based on the determination result of the degree of danger, the detection device 1 can perform various kinds of driving support or driving control, such as issuing a warning to avoid a collision at the intersection.
  • FIG. 2 illustrates a case where a preceding vehicle 5 is present in the same surrounding environment as in FIG. 1.
  • The preceding vehicle 5 travels ahead of the host vehicle 2 on the path along which the host vehicle 2 travels, and is located at the intersection 3.
  • the side road including the blind spot area R1 joins the course of the host vehicle 2, and the position of the preceding vehicle 5 is right beside the blind spot area R1.
  • the intersection 3 is an example of a merging point
  • the preceding vehicle 5 is an example of an object existing at the same point.
  • The preceding vehicle 5 described above is visually recognized by, for example, the driver of the vehicle 4 in the blind spot area R1, and thereby alerts the blind spot object (vehicle) 4. The risk of an encounter collision between the host vehicle 2 and the blind spot object 4, or the possibility that the blind spot object 4 becomes an obstacle to the traveling of the host vehicle, is therefore considered lower than in the case where there is no preceding vehicle 5 (see FIG. 1). In such a case, using the same criterion for determining the degree of danger as when there is no preceding vehicle 5 may cause an excessive warning or the like, that is, the above control may be performed even in a situation where a warning is not actually required.
  • Therefore, the detection device 1 of the present embodiment relaxes the criterion for determining the degree of risk when an object such as the preceding vehicle 5 is located at the intersection 3 where the blind spot region R1 and the path of the host vehicle 2 meet. Thereby, erroneous determination of the degree of danger can be suppressed according to the situation of the surrounding environment of the host vehicle 2, an excessive warning or the like can be avoided, and the host vehicle 2 can travel smoothly.
  • FIG. 3 is a block diagram illustrating the configuration of the present system.
  • the detection device 1 of the present embodiment includes a radar 11, a camera 12, and a control unit 13. Further, for example, the detection device 1 includes a storage unit 14, a navigation device 15, and an in-vehicle sensor 16.
  • the vehicle control device 20 includes various in-vehicle devices mounted on the host vehicle 2, and is used for, for example, driving assistance or automatic driving.
  • the radar 11 and the camera 12 are each an example of a detection unit that detects distance information.
  • the radar 11 includes, for example, a transmitter 11a, a receiver 11b, and a radar control circuit 11c.
  • the radar 11 is installed on, for example, a front grill or a windshield of the host vehicle 2 so as to transmit and receive signals forward (see FIG. 1) in the traveling direction of the host vehicle 2.
  • the transmitter 11a includes, for example, an antenna having a variable directivity (a phased array antenna or the like), a transmission circuit that causes the antenna to externally transmit the physical signal Sa, and the like.
  • the physical signal Sa includes, for example, at least one of a millimeter wave, a microwave, a radio wave, and a terahertz wave.
  • the receiver 11b includes, for example, an antenna having variable directivity, and a receiving circuit that receives the wave signal Sb from outside using the antenna.
  • the wave signal Sb is set in the same wavelength band as the physical signal Sa so as to include the reflected wave of the physical signal Sa.
  • the transmitter 11a and the receiver 11b may use a common antenna, for example, or may be configured integrally.
  • the radar control circuit 11c controls transmission and reception of signals by the transmitter 11a and the receiver 11b.
  • the radar control circuit 11c starts transmission and reception of signals by the radar 11 and controls the direction in which the physical signal Sa is emitted from the transmitter 11a, for example, by a control signal from the control unit 13. Further, the radar control circuit 11c causes the transmitter 11a to emit a physical signal Sa to the surrounding environment, and detects a wave signal Sb indicating a reflected wave of the physical signal Sa in the reception result of the receiver 11b.
  • the radar 11 operates according to a modulation method such as a CW (continuous wave) method or a pulse method, and measures the distance, azimuth, speed, and the like of an external object.
  • the CW method includes a two-wave CW method, an FM-CW method, a spread spectrum method, and the like.
  • the pulse method may be a pulse Doppler method, or pulse compression of a chirp signal or pulse compression of a PN sequence may be used.
  • the radar 11 uses, for example, coherent phase information control.
  • the radar 11 may use an incoherent method.
  • the camera 12 is installed at a position where, for example, the range in which the physical signal Sa can be radiated from the radar 11 in the own vehicle 2 can be imaged.
  • the camera 12 is installed on a windshield or the like of the host vehicle 2 toward the front of the host vehicle 2 (see FIG. 1), for example.
  • the blind spot in the detection device 1 may be based on the installation position of the camera 12 as a geometric reference or based on the installation position of the radar 11.
  • the camera 12 captures an external image from the installation position and generates a captured image.
  • the camera 12 outputs image data indicating the captured image to the control unit 13.
  • the camera 12 is, for example, an RGB-D camera, a stereo camera, or a range image sensor.
  • the camera 12 is an example of a distance measuring unit (or a monitoring unit) in the present embodiment.
  • the control unit 13 includes a CPU, a RAM, a ROM, and the like, and controls each component according to information processing.
  • the control unit 13 is configured by, for example, an ECU (electronic control unit).
  • the control unit 13 expands the program stored in the storage unit 14 on the RAM, and interprets and executes the program expanded on the RAM by the CPU.
  • the control unit 13 implements a blind spot estimation unit 131, a blind spot object measurement unit 132, and a risk determination unit 133. Each of the units 131 to 133 will be described later.
  • the storage unit 14 stores programs executed by the control unit 13, various data, and the like.
  • the storage unit 14 stores structure information D1 described below.
  • the storage unit 14 includes, for example, a hard disk drive or a solid state drive. Further, the RAM and the ROM may be included in the storage unit 14.
  • the above-mentioned programs and the like may be stored in a portable storage medium.
  • The storage medium stores information such as the program by electrical, magnetic, optical, mechanical, or chemical action so that a computer or other machine can read the recorded information.
  • the detection device 1 may acquire a program or the like from the storage medium.
  • the navigation device 15 is an example of a distance measuring unit (monitoring unit) including a memory for storing map information and a GPS receiver, for example.
  • the in-vehicle sensors 16 are various sensors mounted on the host vehicle 2, and include, for example, a vehicle speed sensor, an acceleration sensor, a gyro sensor, and the like.
  • the on-vehicle sensor 16 detects the speed, acceleration, angular velocity, and the like of the vehicle 2.
  • the detection device 1 is not limited to the above configuration.
  • the detection device 1 may not include the navigation device 15 and the vehicle-mounted sensor 16.
  • the control unit 13 of the detection device 1 may be configured by a plurality of hardware resources that execute the units 131 to 133 separately.
  • the control unit 13 may be configured by various semiconductor integrated circuits such as a CPU, an MPU, a GPU, a microcomputer, a DSP, an FPGA, and an ASIC.
  • the vehicle control device 20 is an example of a control device of the mobile system according to the present embodiment.
  • the vehicle control device 20 includes, for example, a vehicle drive unit 21 and an alarm 22.
  • the vehicle drive unit 21 is configured by, for example, an ECU, and controls driving of each unit of the host vehicle 2.
  • the vehicle drive unit 21 controls the brake of the own vehicle 2 to realize automatic braking.
  • The alarm 22 notifies the user of various kinds of information by images or sounds.
  • the alarm 22 is a display device such as a liquid crystal panel or an organic EL panel mounted on the vehicle 2.
  • the alarm 22 may be an audio output device that outputs an alarm or the like by audio.
  • the mobile system operates the detection device 1 so as to monitor the surrounding environment, for example, while the own vehicle 2 is operating.
  • the vehicle control device 20 of the present system performs various controls for driving support of the own vehicle 2 or automatic driving based on the detection result by the detection device 1.
  • the detection device 1 of the present embodiment captures an image around the own vehicle 2 with the camera 12, for example, and monitors the surrounding environment of the own vehicle 2.
  • the blind spot estimation unit 131 of the detection device 1 sequentially detects, for example, the presence or absence of an area where a blind spot is estimated in the current surrounding environment, based on distance information indicating various distances in the monitoring result.
  • The blind spot object measurement unit 132 uses the radar 11 to measure the internal state of the blind spot region R1. Since the physical signal Sa radiated from the radar 11 of the host vehicle 2 has wave properties, it can undergo multiple reflection or diffraction, reach the blind spot object 4 in the blind spot area R1, and then propagate back to the host vehicle 2.
  • the detection method of the present embodiment detects the blind spot object 4 by utilizing the wave propagating as described above.
  • the risk determining unit 133 of the present embodiment determines the risk of the blind spot object 4 that can be included in the blind spot area R1 based on the measurement result of the blind spot object measuring unit 132.
  • the risk determination unit 133 can dynamically change a threshold value that is a reference for determining the risk.
  • the degree of danger indicates, for example, the degree to which a possibility of a collision between the blind spot object 4 and the host vehicle 2 is considered.
  • When the detection device 1 determines a degree of danger that requires a warning, the present system notifies the driver or the like via the alarm 22, or performs vehicle control such as automatic braking by the vehicle drive unit 21 to increase safety. Details of the operation of the detection device 1 in the present system will be described below.
  • FIG. 4 is a flowchart for explaining the operation of the detection device 1 according to the present embodiment. Each process shown in the flowchart of FIG. 4 is executed by the control unit 13 of the detection device 1. This flowchart is started at a predetermined cycle while the vehicle 2 is operating, for example.
  • the control unit 13 acquires one or more frames of captured images from the camera 12 (S1).
  • the control unit 13 may acquire a distance image as a captured image, or may generate a distance image based on the acquired captured image.
  • the distance image is an example of distance information indicating various distances for monitoring the surrounding environment.
  • Next, the control unit 13 performs image analysis on the acquired captured image (S2) and generates structure information D1 relating to the current surrounding environment of the host vehicle 2.
  • the structure information D1 is information indicating various object structures in the surrounding environment, and includes, for example, distances to various structures.
  • the control unit 13 also operates as the blind spot estimation unit 131 in step S2, and performs image analysis for detecting a blind spot in the acquired captured image.
  • FIG. 5 illustrates an image to be analyzed in step S2.
  • FIG. 5 is, for example, an image taken from the host vehicle 2 as a distance image (S1), and shows walls 31 and 32 of a plurality of structures near the intersection 3.
  • a blind spot region R1 is present behind the wall 31 due to the shielding of the wall 31 near the host vehicle 2.
  • a wall 32 on the back side of the blind spot area R1 faces the host vehicle 2.
  • the wall 31 is referred to as a “shielding wall”, and the wall 32 is referred to as an “opposing wall”.
  • a boundary between the blind spot region R1 and the outside is formed between the shielding wall 31 and the opposing wall 32 (see FIG. 1).
  • In step S2, the control unit 13 extracts the distance values of the various walls 31 and 32 in the distance image for each pixel as the structure information D1 and stores the extracted values in the storage unit 14.
  • In the case of FIG. 5, the distance value along the direction d1 changes continuously from the host vehicle 2 side along the shielding wall 31, and then changes discontinuously from the end of the shielding wall 31 to the opposing wall 32.
  • the control unit 13 analyzes the change in the distance value as described above, and can estimate the existence of the blind spot region R1.
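  • As a rough illustration of this idea (our sketch, not code from the patent), the following scans one row of per-pixel distance values along the direction d1 and flags a discontinuous jump, such as the jump from the end of the shielding wall 31 to the opposing wall 32, as a candidate blind-spot boundary; the threshold value and the sample row are arbitrary assumptions.

```python
# Minimal sketch of the distance-discontinuity check described above.
# The jump threshold and example data are illustrative assumptions.

from typing import List, Optional

def find_blind_spot_edge(distances_m: List[float], jump_threshold_m: float = 3.0) -> Optional[int]:
    """Return the pixel index where the distance jumps discontinuously, or None."""
    for i in range(1, len(distances_m)):
        if distances_m[i] - distances_m[i - 1] > jump_threshold_m:
            return i  # a blind spot region is assumed to start behind this edge
    return None

# Example row: the distance grows smoothly along the shielding wall,
# then jumps to the opposing wall behind the blind spot.
row = [5.0, 5.4, 5.9, 6.3, 6.8, 15.2, 15.3, 15.4]
print(find_blind_spot_edge(row))  # -> 5, i.e. a blind spot is estimated behind the wall end
```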
  • The control unit 13, as the blind spot estimation unit 131, determines whether or not the blind spot region R1 is detected in the current surrounding environment of the host vehicle 2 according to the estimation result of the image analysis, for example (S3).
  • When the control unit 13 determines that the blind spot area R1 has not been detected (NO in S3), the processing of steps S1 to S3 is repeated periodically, for example.
  • When the control unit 13 determines that the blind spot area R1 has been detected (YES in S3), it executes the processing as the blind spot object measurement unit 132 (S4 to S6).
  • a processing example of the blind spot object measurement unit 132 that measures the blind spot object 4 in the blind spot region R1 by utilizing the multiple reflected waves in the wave signal Sb of the radar 11 will be described below.
  • the control unit 13 as the blind spot object measurement unit 132 controls the radar 11 so as to emit the physical signal Sa toward the blind spot area R1 (S4).
  • FIGS. 6A and 6B illustrate the propagation path of the physical signal Sa in step S4 when there is no blind spot object 4 and when there is a blind spot object 4, respectively.
  • In step S4, the control unit 13 causes the radar 11 to emit the physical signal Sa toward the opposing wall 32 near the boundary of the blind spot region R1, based on the analysis result of step S2.
  • As shown in FIG. 6A, the physical signal Sa from the radar 11 of the host vehicle 2 is repeatedly reflected between the opposing wall 32 and the wall 33 on the opposite side of the side road, passing through the blind spot region R1, and propagates as a multiple reflected wave. In the example of FIG. 6A, since there is no blind spot object 4, the multiple reflected wave does not return toward the host vehicle 2.
  • In contrast, as shown in FIG. 6B, when the blind spot object 4 is present, the physical signal Sa from the radar 11 is reflected not only by the walls 32 and 33 but also by the blind spot object 4, and can become a multiple reflected wave Rb1 that returns toward the host vehicle 2. Therefore, the wave signal Sb received by the radar 11 includes a signal component of the multiple reflected wave Rb1 carrying information on the blind spot object 4.
  • In step S4, the radar 11 emits the physical signal Sa, receives the wave signal Sb, and performs various measurements based on the reflected wave of the physical signal Sa.
  • the control unit 13 acquires a measurement result from the radar 11 (S5).
  • Next, the control unit 13 performs a blind spot object detection process based on the measurement result of the radar 11 (S6).
  • The signal component of the multiple reflected wave Rb1 (FIG. 6B) carries information corresponding to the speed of the blind spot object 4 that reflected it and to the length of the propagation path, in the form of its Doppler shift, phase, and propagation time.
  • In the blind spot object detection process (S6), the speed and position of the blind spot object 4 that reflected the multiple reflected wave Rb1 are detected by analyzing such signal components. Details of the process in step S6 will be described later.
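  • As a hedged illustration of the quantities involved (not the patent's implementation), the following sketch converts an assumed round-trip propagation time into a propagation-path length and an assumed Doppler shift into the speed of the reflecting blind spot object; the carrier frequency and numeric values are example assumptions.

```python
# Illustrative sketch only: how propagation time and Doppler shift map to
# path length and relative speed for a reflected radar signal.

C = 3.0e8  # speed of light [m/s]

def path_length_from_delay(round_trip_s: float) -> float:
    """One-way propagation-path length of the reflected signal."""
    return C * round_trip_s / 2.0

def speed_from_doppler(doppler_hz: float, carrier_hz: float) -> float:
    """Relative speed of the reflector along the propagation path (positive = approaching)."""
    return C * doppler_hz / (2.0 * carrier_hz)

# Example: a 0.2 us round trip corresponds to a ~30 m propagation path; the actual
# position of the blind spot object is then obtained by correcting this path length
# for the reflections (see the folding correction described for step S14 below).
print(path_length_from_delay(0.2e-6))    # ≈ 30 m
print(speed_from_doppler(1.03e3, 77e9))  # ≈ 2 m/s, assuming a 77 GHz carrier
```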
  • Next, the control unit 13 operates as the risk determination unit 133 and performs a risk determination process (S7) based on the detection result of the blind spot object 4 (S6).
  • In the risk determination process, the degree of risk is determined based on the detected position, speed, and the like, in order to determine whether a warning is required for the blind spot object 4 approaching the host vehicle 2.
  • In the present embodiment, the threshold value used for this determination is dynamically adjusted so as to relax the criterion for determining the degree of risk when, for example, the preceding vehicle 5 is present (FIG. 2).
  • The degree of risk may also be determined using additional information in step S7. Details of the processing in step S7 will be described later.
  • The control unit 13 outputs various control signals to the vehicle control device 20 according to the determination result of the degree of risk (S7) (S8). For example, when it is determined in step S7 that a warning is required, the control unit 13 generates control signals for causing the alarm 22 to issue the warning and for controlling the vehicle drive unit 21.
  • As described above, the detection device 1 monitors the surroundings of the host vehicle 2 (S1 to S3), and when a blind spot is found (YES in S3), determines the degree of danger of the blind spot object 4 (S7) and performs various operations accordingly (S8).
  • In the above description, the camera 12 is used for monitoring the surroundings, but the navigation device 15 may be used instead.
  • This modification is shown in FIG.
  • the navigation device 15 calculates various distances to the host vehicle 2 in the map information D2 of the surrounding environment of the host vehicle 2, and monitors the current position of the host vehicle 2.
  • The control unit 13 can use the monitoring result of the navigation device 15 as described above for the various processes in FIG. 4.
  • For example, the control unit 13 can acquire the structure information D1 or detect the blind spot region R1 from the monitoring result of the navigation device 15, based on the structure 30 in the map information D2 (S2). Further, the control unit 13 may use the detection result of the in-vehicle sensor 16 as appropriate in the processing of FIG. 4.
  • FIG. 8 is a diagram for describing an experiment of a blind spot object detection process.
  • FIG. 8A shows the structure information D1 of the experiment environment of this experiment.
  • FIG. 8B shows a measurement result of the radar 11 when there is no blind spot object 4.
  • FIG. 9 is a diagram illustrating a case where there is a blind spot object in the experiment of FIG.
  • FIG. 9A shows a measurement result of the radar 11 when the blind spot object 4 is present.
  • FIG. 9B illustrates a propagation path of a multiple reflection wave estimated from the blind spot object 4.
  • In the measurement result of FIG. 9A, a strong peak P4 appeared at a distance about 7 m beyond the opposing wall 32.
  • The azimuth of the peak P4, as seen from the radar 11, points to the far side of the opposing wall 32. From this distance and direction, it can be seen that the peak P4 is mainly a component reflected from the blind spot object 4 via reflection by the opposing wall 32 (see FIG. 9B). That is, it was confirmed that the peak P4, whose wave source is the blind spot object 4, can be detected based on the distance and azimuth of the peak P4 in the measurement result of the radar 11.
  • the presence / absence, position, and the like of the blind spot object 4 can be detected more accurately by using the structural information of the surrounding environment.
  • an example of a blind spot object detection process according to the present embodiment will be described with reference to FIG.
  • FIG. 10 is a flowchart illustrating a blind spot object detection process according to the present embodiment. The process according to the flowchart in FIG. 10 is executed by the control unit 13 operating as the blind spot object measurement unit 132 in step S6 in FIG.
  • First, the control unit 13 extracts and removes environmental components, which indicate reflected waves from the surrounding environment, from the signal of the measurement result of the radar 11 acquired in step S5 of FIG. 4 (S11).
  • the process in step S11 is performed using, for example, the structure information acquired in step S2.
  • For example, each of the peaks P1, P2, and P3 in the example of FIG. 8B is an environmental component indicating a reflected wave from the corresponding wall 31, 32, or 33 in the structure information D1 of the passage (FIG. 8A).
  • The control unit 13 predicts the reflected waves from the various structures with reference to the structure information D1, and subtracts the environmental components of the prediction result from the measurement result of the radar 11 (for example, FIG. 9A) (S11).
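  • The following is a simplified sketch of step S11 (our own illustration, assuming the radar output can be treated as a range profile of intensity versus distance): range bins around reflections predicted from the wall distances in the structure information D1 are suppressed, leaving components, such as the peak P4, that cannot be explained by the visible environment.

```python
# Illustrative sketch of environmental-component removal; the bin size, guard
# band, and example profile are arbitrary assumptions, not values from the patent.

from typing import List

def remove_environment(profile: List[float],
                       bin_size_m: float,
                       wall_distances_m: List[float],
                       guard_bins: int = 2) -> List[float]:
    """Zero out range bins around reflections predicted from known structures."""
    cleaned = list(profile)
    for d in wall_distances_m:
        center = round(d / bin_size_m)
        for b in range(center - guard_bins, center + guard_bins + 1):
            if 0 <= b < len(cleaned):
                cleaned[b] = 0.0
    return cleaned

# Example: walls predicted at 6 m and 12 m are removed; the residual peak near
# 19 m (beyond the opposing wall) remains as a blind-spot candidate.
profile = [0, 0, 0, 0, 0, 0, 9, 0, 0, 0, 0, 0, 8, 0, 0, 0, 0, 0, 0, 4, 0]
print(remove_environment(profile, bin_size_m=1.0, wall_distances_m=[6.0, 12.0]))
```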
  • Next, the control unit 13 performs signal analysis for detecting the blind spot object 4 on the signal components remaining after the environmental components are removed (S12).
  • the signal analysis in step S12 may include various types of analysis such as frequency analysis, analysis on the time axis, spatial distribution, and signal strength.
  • Based on the result of the signal analysis, the control unit 13 determines whether a wave source is observed, for example, beyond the opposing wall 32 in the direction of the blind spot (S13), and thereby detects the presence or absence of the blind spot object 4.
  • For example, the peak P4 corresponds to a wave source located farther away than the opposing wall 32, at a position that is not predicted as an environmental component from the structure of the passage. From this, it can be presumed that the peak P4 is caused by multiple reflection of a wave whose source is inside the blind spot. That is, when a reflected wave is observed at a distance beyond the opposing wall 32 in the direction of the detected blind spot, the control unit 13 can determine that the blind spot object 4 is present (YES in step S13).
  • When the control unit 13 determines that a wave source is observed beyond the opposing wall 32 in the blind spot (YES in S13), it measures various state variables of the blind spot object 4, such as its distance and speed, according to the propagation path estimated to be bent by the multiple reflection (S14).
  • At this time, the control unit 13 uses the information indicating the road width of the blind spot portion (the width of the blind spot area R1) in the structure information D1 and corrects the path length by folding it, for example as illustrated in FIG. 9B, so that a position of the blind spot object 4 closer to its actual position can be calculated.
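  • The following geometric sketch illustrates one way such a folding correction could work (our illustration, assuming a single specular bounce off the opposing wall 32): with one reflection, the apparent wave source seen by the radar is the mirror image of the blind spot object across the wall plane, so reflecting the apparent position back across that plane estimates the actual position. The coordinates and the wall line used here are arbitrary assumptions; the road width from the structure information D1 can bound how far into the side road the corrected position may lie.

```python
# Hedged geometric sketch of a single-bounce folding correction.

from typing import Tuple

Point = Tuple[float, float]

def reflect_across_line(p: Point, a: Point, b: Point) -> Point:
    """Mirror point p across the infinite line through points a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    fx, fy = ax + t * dx, ay + t * dy  # foot of the perpendicular on the wall line
    return (2 * fx - px, 2 * fy - py)

# Apparent source 7 m beyond the opposing wall (wall modeled as the line y = 10
# in vehicle coordinates); reflecting it back across the wall places the
# estimated object 7 m on the near side of the wall.
apparent = (2.0, 17.0)
estimated = reflect_across_line(apparent, a=(-5.0, 10.0), b=(5.0, 10.0))
print(estimated)  # -> (2.0, 3.0)
```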
  • With the above measurement, step S6 in FIG. 4 ends. Thereafter, the control unit 13 executes the risk determination process (S7 in FIG. 4) for the detected blind spot object 4.
  • When the control unit 13 determines that no wave source is observed beyond the opposing wall 32 in the blind spot (NO in S13), it ends this process without performing any measurement. In this case, the control unit 13 may omit the processing from step S7 onward in FIG. 4.
  • According to the above blind spot object detection process, the blind spot object 4 can be detected using the signal components arising from inside the blind spot area R1, based on the multiple-reflection property of the physical signal Sa of the radar 11.
  • In general, the signal component carrying information on the blind spot object 4 is weak and must be detected in the presence of reflected waves from visible objects outside the blind spot, so its detection and estimation are considered difficult.
  • In addition, since the actual distance to the blind spot object 4 differs from the length of the signal propagation path, it may be difficult to estimate the actual distance.
  • By using the structure information D1 of the surrounding environment, it is possible to narrow down the preconditions for analyzing the received wave (S11) and to improve the estimation accuracy (S14).
  • For example, the control unit 13 refers to the distance to the intersection near the blind spot in the structure information D1, and removes the signal components of the received wave obtained within the round-trip propagation time corresponding to the straight-line distance to the intersection.
  • Such a received wave is a directly reflected wave (i.e., a single reflected wave) that does not include information on the blind spot object 4, and can therefore be excluded from the analysis target.
  • the control unit 13 can also separate the reflected wave coming from the blind spot from the reflected wave coming from another angle based on the azimuth angle of the blind spot viewed from the host vehicle 2.
  • step S11 does not necessarily need to use the structure information D1 of the surrounding environment.
  • the control unit 13 may limit the analysis target to a moving object by subtracting the position change of the own vehicle 2 from the signal obtained along the time axis. This processing may be performed in the signal analysis of step S12.
  • For example, the control unit 13 may analyze whether the signal component to be analyzed has a feature that appears due to the behavior of a specific object, such as a Doppler shift caused by reflection from a moving object or a fluctuation peculiar to a human or a bicycle. In addition, the control unit 13 may analyze whether the spatially distributed measurement signal has a distribution characteristic of an automobile, a bicycle, a human, or the like, or whether the reflection intensity indicates a reflection from a metal object of automobile size. These analyses may be combined as appropriate, or, instead of analyzing each feature explicitly, they may be treated as a multidimensional feature using machine learning.
  • In the risk determination process (S7) of the present embodiment, the blind spot object 4 is detected based on the detection result of the blind spot region R1, and the degree of risk of the blind spot object 4 is determined. At this time, the threshold value of the degree of danger is dynamically adjusted according to the situation of the surrounding environment, such as the presence of the preceding vehicle 5 (FIG. 2). Details of the processing in step S7 will be described below with reference to FIGS. 11 and 12.
  • FIG. 11 is a flowchart illustrating an example of a risk determination process.
  • FIG. 12 is a diagram for explaining the risk determination processing. The process according to the flowchart in FIG. 11 is executed by the control unit 13 operating as the risk determination unit 133 in step S7 in FIG.
  • the control unit 13 calculates the risk index D based on the detection result of the blind spot object 4 in step S6 (S21).
  • The risk index D is an index for determining the risk of a collision between the detected blind spot object 4 and the host vehicle 2. For example, as shown in FIG. 12, the speed v1 at which the blind spot object 4 approaches the host vehicle 2 can be set as the risk index D.
  • the control unit 13 determines whether or not an object exists at the junction where the blind spot region R1 joins in the surrounding environment of the vehicle 2 (S22).
  • the process in step S22 is performed based on, for example, the analysis result in step S2 in FIG.
  • the control unit 13 specifies the junction such as the intersection 3 in the distance image or the structure information D1, and determines whether or not the moving object is located at the specified junction, thereby performing the determination in step S22.
  • When the control unit 13 determines that there is no object at the junction with the blind spot (NO in S22), it sets the threshold value Va for determining the degree of danger to the normal level (S23).
  • the normal level of the threshold value Va is set in consideration of, for example, the magnitude of the risk index D for which a warning is required for the blind spot object 4 in a normal state where no attention is specifically given.
  • When the control unit 13 determines that the risk index D exceeds the threshold value Va (YES in S25), it sets, for example, a warning flag to "ON" as the risk determination result (S26).
  • the warning flag is a flag that manages the presence / absence of a warning regarding the blind spot object 4 by “ON / OFF”, and is stored in the storage unit 14.
  • On the other hand, when the risk index D does not exceed the threshold value Va (NO in S25), the control unit 13 sets the warning flag to "OFF" (S27).
  • When the control unit 13 determines that an object is present at the junction with the blind spot (YES in S22), it sets the threshold value Va to a relaxed level instead of the normal level (S24). Since the blind spot object 4 is presumed to be alerted in this case, the relaxed level of the threshold value Va is, for example, a level at which the criterion requiring a warning regarding the blind spot object 4 is relaxed from the normal level.
  • For example, the relaxed level of the threshold value Va is set to a value larger than its normal level.
  • In this case, the control unit 13 determines the degree of risk depending on whether the speed v1 of the blind spot object 4 exceeds the threshold value Va, which is larger than the normal level (S25), and sets the warning flag to "ON" or "OFF" accordingly (S26, S27).
  • After setting the warning flag, the control unit 13 ends the risk determination process (S7 in FIG. 4) and proceeds to step S8.
  • According to the above risk determination process, the risk of the blind spot object 4 approaching the host vehicle 2 or the intersection 3 is determined using the corresponding risk index D, for example as a binary determination expressed by the warning flag.
  • the control unit 13 can cause the alarm 22 to warn or cause the vehicle drive unit 21 to perform specific control (S8 in FIG. 4).
  • When an object such as the preceding vehicle 5 is present at the junction, the threshold value Va is changed from the normal level to the relaxed level (S22 to S24). Accordingly, in consideration of the effect that the blind spot object 4 is alerted by the preceding vehicle 5 or the like, the case where the speed of the blind spot object 4 is higher than the normal level but lower than the relaxed level can be tolerated without issuing a warning. Note that the execution timing of the processing of steps S22 to S24 is not particularly limited; for example, steps S22 to S24 may be performed before step S21.
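  • The following condensed sketch of steps S21 to S27 (our illustration, using the speed-based risk index D = v1 described above) shows how the warning flag follows from the selected threshold; the numeric threshold levels are arbitrary example values, not values from the patent.

```python
# Illustrative sketch of the threshold switching (S22-S24) and comparison (S25-S27).

VA_NORMAL_MPS = 2.0   # assumed normal-level threshold on the approach speed
VA_RELAXED_MPS = 4.0  # assumed relaxed-level threshold (larger = fewer warnings)

def determine_warning(blind_spot_speed_mps: float, object_at_junction: bool) -> bool:
    """Return True when a warning is required (warning flag 'ON')."""
    risk_index = blind_spot_speed_mps                                     # S21
    threshold = VA_RELAXED_MPS if object_at_junction else VA_NORMAL_MPS   # S22-S24
    return risk_index > threshold                                         # S25 -> S26/S27

print(determine_warning(3.0, object_at_junction=False))  # True: warning at the normal level
print(determine_warning(3.0, object_at_junction=True))   # False: tolerated at the relaxed level
```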
  • The risk determination process is not limited to a binary determination.
  • For example, a ternary determination may be performed in which, when it is determined that a warning is unnecessary, it is further determined whether caution is still required.
  • In this case, using a second threshold value Vb, the control unit 13 may further determine whether or not D > Vb when proceeding to "NO" in step S25.
  • The risk index D is not limited to the speed v1 and can be set using various state variables related to the blind spot object 4; for example, the acceleration dv1/dt may be used instead of the speed v1.
  • When the risk index D is set based on the distance to the blind spot object 4, the relaxed level of the threshold value Va is set to, for example, a value smaller than the normal level. Accordingly, in consideration of the effect that the blind spot object 4 is alerted when the preceding vehicle 5 or the like is present, the case where the distance to the blind spot object 4 is smaller than the normal level but still larger than the relaxed level can be tolerated, and a warning or the like can be omitted.
  • the risk index D may be set by a combination of various state variables.
  • An example of such a risk index D is shown in the following equation (1).
  • D = (L1 − v1·Δt) + (L0 − v0·Δt) … (1)
  • L1 is the distance from the reference position P0 to the blind spot object 4 (FIG. 12).
  • The reference position P0 is set to a position where a collision between the blind spot object 4 and the host vehicle 2 is assumed, such as the center of the intersection.
  • Δt is a predetermined time width, set, for example, close to the time expected for the host vehicle 2 to reach the reference position P0.
  • L0 is the distance from the reference position P0 to the host vehicle 2.
  • v0 is the speed of the host vehicle 2 and can be obtained from the in-vehicle sensor 16 or the like.
  • The risk index D in the above equation (1) is the sum of the distance between the blind spot object 4 and the reference position P0 and the distance between the reference position P0 and the host vehicle 2, both estimated after the elapse of the time width Δt (FIG. 12).
  • When using the risk index D of equation (1), the control unit 13 may proceed to "YES" in step S25 when the risk index D falls below the threshold value Va, and proceed to "NO" when it does not.
  • the risk index D may be set as in the following Expression (2) or Expression (2 ′).
  • D L 1 ⁇ v 1 ⁇ t (2)
  • D
  • ⁇ t L 0 / v 0 is set.
  • the time width ⁇ t may be set within an allowable range in consideration of a change in the speed v 0 of the host vehicle 2 or an estimation error of the reference position P0.
  • the control unit 13 determines the degree of risk in the same manner as in the case of Expression (1) using the risk index D of Expression (2) or Expression (2 ′). Can be.
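  • The following sketch (our illustration, not code from the patent) computes the distance-based risk indices of equations (1) and (2′) as reconstructed above; for these indices a smaller D means a closer encounter, so the comparison warns when D falls below the threshold Va, and the relaxed level of Va is the smaller one. The numeric values are example assumptions.

```python
# Illustrative computation of the distance-based risk indices.

def risk_index_eq1(l1_m: float, v1_mps: float, l0_m: float, v0_mps: float, dt_s: float) -> float:
    """Equation (1): remaining distances of the object and the host vehicle after dt."""
    return (l1_m - v1_mps * dt_s) + (l0_m - v0_mps * dt_s)

def risk_index_eq2p(l1_m: float, v1_mps: float, l0_m: float, v0_mps: float) -> float:
    """Equation (2'): equation (2) with dt = L0 / v0 (time for the host to reach P0)."""
    return l1_m - v1_mps * (l0_m / v0_mps)

# Example: object 15 m from P0 at 5 m/s, host vehicle 20 m from P0 at 10 m/s.
d1 = risk_index_eq1(15.0, 5.0, 20.0, 10.0, dt_s=2.0)  # (15 - 10) + (20 - 20) = 5 m
d2 = risk_index_eq2p(15.0, 5.0, 20.0, 10.0)           # 15 - 5 * 2 = 5 m
va_normal, va_relaxed = 8.0, 4.0                      # assumed example threshold levels
print(d1 < va_normal, d1 < va_relaxed)  # True, False: warn only at the normal level
```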
  • The relaxed level and the normal level of the threshold value Va may also be changed dynamically according to, for example, the states of the host vehicle 2 and the blind spot object 4. For example, when L0 is small, when dv0/dt or dv1/dt is large, or when the blind spot object 4 is presumed to be a human, the degree of risk should be determined more strictly. Therefore, when such a case is detected, the control unit 13 may increase the threshold value Va with respect to the risk index D of the above equation (1), for example.
  • the detection device 1 detects an object in a surrounding environment of the own vehicle 2 which is an example of a moving object.
  • the detection device 1 includes a radar 11 and a camera 12 as detection units, and a control unit 13.
  • the detection units 11 and 12 detect distance information indicating the distance from the vehicle 2 to the surrounding environment.
  • the control unit 13 controls the detection units 11 and 12 to analyze the detection result.
  • the control unit 13 detects a blind spot region R1 indicating a blind spot in the surrounding environment based on the detection results of the detection units 11 and 12, and determines a risk degree regarding the blind spot region R1 based on the detection result of the blind spot region R1 ( S7).
  • When an object is present at the junction where the detected blind spot region R1 and the path of the host vehicle 2 merge, the control unit 13 relaxes the criterion for determining the degree of danger (S24).
  • According to the detection device 1 described above, the criterion for determining the degree of risk is relaxed when an object is present at the junction with the blind spot, so that erroneous determination of an excessive degree of risk can be suppressed when detecting an object in a blind spot in the surrounding environment of the host vehicle 2 and determining the degree of risk.
  • In the present embodiment, the control unit 13 detects an object in the blind spot area R1 based on the detection result of the detection unit (S6), and determines the degree of danger according to the detection result of the blind spot object 4 in the blind spot area R1 (S7). Thereby, a degree of danger corresponding to the blind spot object 4 can be determined.
  • In the present embodiment, the radar 11 emits a physical signal Sa having wave properties from the host vehicle 2 to the surrounding environment, and detects distance information according to a reflected wave of the emitted physical signal Sa.
  • the control unit 13 detects the blind spot object 4 based on the wave signal Sb including the component of the wave arriving from the blind spot area R1 in the detection result of the radar 11.
  • Waves to be used are not limited to multiple reflection waves, and may include diffracted waves or transmitted waves.
  • the control unit 13 controls the radar 11 to emit the physical signal Sa toward the detected blind spot region R1 (S4).
  • The physical signal Sa from the radar 11 does not necessarily need to be concentrated on the blind spot area R1.
  • the physical signal Sa may be radiated as appropriate within a range that the radar 11 can detect.
  • the detection device 1 of the present embodiment further includes a storage unit 14 that stores structure information D1 indicating the object structure of the surrounding environment.
  • the control unit 13 detects the blind spot object 4 by analyzing the wave signal including the component of the wave arriving from the blind spot region R1 in the detection result of the radar 11 with reference to the structure information D1 (S6). By using the structure information D1, the detection of the blind spot object 4 can be performed with high accuracy.
  • the control unit 13 generates the structure information D1 based on the detection result of the camera 12, and stores it in the storage unit 14 (S2).
  • the blind spot object 4 can be accurately detected by sequentially generating the structure information D1.
  • In the present embodiment, the control unit 13 calculates a risk index D corresponding to the degree of risk based on the detection result of the blind spot object 4 (S21), and determines the degree of risk by comparing the calculated risk index D with the threshold value Va (S25).
  • When an object is present at the junction, the control unit 13 adjusts the threshold value Va so as to relax the criterion for determining the degree of risk (S22, S24). By adjusting the threshold value Va, erroneous determination of an excessive degree of risk can be easily suppressed.
  • In the present embodiment, the control unit 13 determines whether or not an object exists at the junction based on the detection result of the detection unit (S22), and when it determines that an object exists at the junction (YES in S22), relaxes the criterion for determining the degree of risk (S24). By this determination, erroneous determination of an excessive degree of risk can be appropriately suppressed.
  • the detection unit includes at least one of the camera 12, the radar 11, and the navigation device 15.
  • the distance information can be detected by the various detection units, and the periphery of the vehicle 2 can be monitored.
  • the mobile system includes the detection device 1 and the vehicle control device 20.
  • the vehicle control device 20 is mounted on the host vehicle 2 and executes an operation according to the result of the determination of the degree of danger by the detection device 1.
  • the detection device 1 can suppress an erroneous determination of an excessive risk when detecting an object with respect to a blind spot in a surrounding environment of the own vehicle 2 and determining the risk.
  • the detection method is a method for detecting an object in a surrounding environment including a path of a moving body such as the host vehicle 2.
  • the method includes steps S1 and S2 in which the detection unit detects distance information indicating a distance from the moving body to the surrounding environment.
  • The method includes steps S3 to S6 in which the control unit 13 detects a blind spot region R1 indicating a blind spot in the surrounding environment based on the detection result of the detection unit, and step S7 in which the control unit 13 determines the degree of danger related to the blind spot region R1 based on the detection result of the blind spot region R1.
  • the method includes a step S24 in which, when an object is present at a junction where the blind spot region R1 and the path of the mobile unit merge, the control unit 13 relaxes the risk criterion.
  • a program for causing the control unit 13 to execute the above detection method is provided. According to the detection method of the present embodiment, it is possible to suppress an erroneous determination of an excessive risk when detecting an object with respect to a blind spot in a surrounding environment of a moving object such as the host vehicle 2 and determining a risk.
  • In the first embodiment, multiple reflected waves are used for detecting the blind spot object 4.
  • However, the waves used are not limited to multiple reflected waves; for example, a diffracted wave may be used. This modification will be described with reference to FIG. 13.
  • the physical signal Sa from the radar 11 is diffracted on the shielding wall 31 and reaches the blind spot object 4.
  • the reflected wave from the blind spot object 4 is diffracted on the shielding wall 31 and returns to the host vehicle 2 as a diffracted wave Sb2.
  • In step S4 of FIG. 4, the control unit 13 of the present embodiment controls the wavelength and the azimuth of the physical signal Sa radiated from the radar 11 so that the signal wraps around the shielding wall 31.
  • By utilizing diffraction, the signal can reach even a region that, because of various shields, cannot be reached geometrically by visible light or other waves with high straightness.
  • When the physical signal Sa reaches the blind spot object 4, it is reflected not only along a purely specular path but also toward the direction of the host vehicle 2 that radiated it. Such a reflected wave again undergoes diffraction at the shielding wall 31, so that the radar 11 can receive the diffracted wave Sb2 as a signal component to be analyzed.
  • the signal component of the diffracted wave Sb2 has information on the propagation path to the blind spot object 4 and Doppler information according to the moving speed. Therefore, by analyzing the signal components, the position and velocity of the blind spot object 4 can be measured from the information on the propagation time, phase, and frequency of the signal components, as in the first embodiment. At this time, the propagation path of the diffracted wave Sb2 can also be estimated from the distance to the shielding wall 31 or various types of structural information D1. Further, a propagation path in which multiple reflection and diffraction are combined can be appropriately estimated, and a signal component of such a wave may be analyzed.
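  • As a simple hedged sketch of this propagation-path reasoning (our illustration, assuming the wave bends at the edge of the shielding wall 31), the one-way path is approximately the distance from the radar to the wall edge plus the distance from the edge to the blind spot object, so subtracting the edge distance known from the structure information D1 estimates how far the object lies inside the side road. The numeric values below are example assumptions.

```python
# Illustrative estimate of the blind spot object's distance behind the wall edge
# for a diffraction path (radar -> wall edge -> object -> wall edge -> radar).

C = 3.0e8  # speed of light [m/s]

def distance_beyond_edge(round_trip_s: float, radar_to_edge_m: float) -> float:
    """Estimated distance from the wall edge to the blind spot object."""
    one_way_path_m = C * round_trip_s / 2.0
    return one_way_path_m - radar_to_edge_m

# Example: a 0.1 us round trip (15 m one-way path) with the wall edge 9 m away
# places the object about 6 m into the side road behind the shielding wall.
print(distance_beyond_edge(0.1e-6, radar_to_edge_m=9.0))  # ≈ 6.0
```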
  • FIG. 14 is a flowchart for explaining a modification of the detection device 1.
  • the detection device 1 according to the first embodiment monitors the periphery using the camera 12 (S1 to S3 in FIG. 4).
  • the detection device 1 of the present modified example performs the same peripheral monitoring by the radar 11 as in S1 to S3 of FIG. 4 (S1A to S3A).
  • In step S4A, the control unit 13 performs switching control of, for example, the frequency band of the radar 11, and uses a band that diffracts easily around the blind spot. In this case, signal analysis utilizing the diffracted wave is performed in step S6.
  • In steps S1A to S3A, the resolution of the surroundings monitoring by the radar 11 can be improved by using a band with high straightness.
  • the radar 11, the camera 12, and the navigation device 15 have been described as examples of the detection unit.
  • the detection unit of the present embodiment is not limited to these, and may be, for example, LIDAR.
  • the physical signal Sa emitted from the detection unit may be, for example, infrared light.
  • the detection unit may be a sonar, and may emit an ultrasonic wave as the physical signal Sa. In these cases, the wave signal Sb received by the detection unit is set in the same manner as the corresponding physical signal Sa.
  • the example in which the radar 11 and the camera 12 are installed toward the front of the vehicle 2 has been described, but the installation positions of the radar 11 and the like are not particularly limited.
  • the radar 11 and the like may be arranged toward the rear of the vehicle 2 and, for example, the mobile system may be used for parking assistance.
  • the detection device 1 detects the blind spot object 4 by utilizing the characteristics of the wave based on the physical signal Sa from the detection unit.
  • the method of detecting the blind spot object 4 is not limited to the above method, and various methods may be adopted.
  • the object 4 in the blind spot area R1 may be estimated based on various information. Even in this case, it is possible to perform the risk determination process on the estimation result in the same manner as in each of the above embodiments.
  • In the above embodiments, the detection device 1 detects the blind spot object 4.
  • However, the detection device 1 of the present embodiment may determine the degree of danger related to the blind spot without detecting the blind spot object 4.
  • In this case, the risk determination process may be performed using the detection result of the blind spot region R1, and the risk index D may be calculated appropriately based on various information such as the size, shape, or positional relationship of the detected blind spot region R1. Even in this case, erroneous determination of the degree of risk can be suppressed by relaxing the criterion for determining the degree of risk according to the presence of an object at the junction with the blind spot. For example, even if a blind spot object 4 that was not specifically detected is present, an excessive warning or the like can be avoided because the state of being alerted by the preceding vehicle 5 or the like is reflected in the result.
  • the moving body on which the detection device 1 is mounted is not particularly limited to an automobile, and may be, for example, an AGV.
  • the detection device 1 may monitor the periphery when the AGV automatically travels, and may detect an object in a blind spot.
  • a first aspect according to the present disclosure is a detection device that detects an object in a surrounding environment including a path of a moving object (2).
  • the detection device includes a detection unit (11, 12) and a control unit (13).
  • the detecting unit detects distance information indicating a distance from the moving body to the surrounding environment.
  • the control unit controls the detection unit.
  • the control unit detects a blind spot area indicating a blind spot in the surrounding environment based on the detection result of the detection unit (S3), and determines a degree of risk related to the blind spot area based on the detection result of the blind spot area. (S7).
  • the control unit relaxes the risk criterion when an object is present at a junction where the detected blind spot area and the path of the moving body merge (S24).
  • the control unit detects an object in the blind spot area based on a detection result of the detection unit (S6), and detects the object in the blind spot area.
  • the risk is determined according to the detection result of the object (S7).
  • the detection unit emits a physical signal having wave characteristics from the moving body to the surrounding environment, and responds to a reflected wave of the emitted physical signal. To detect the distance information.
  • the control unit detects an object in the blind spot area based on a wave signal including a wave component arriving from the blind spot area in the detection result of the detection unit.
  • the control unit, when detecting the blind spot area in the surrounding environment, controls the detection unit so as to emit the physical signal toward the detected blind spot area (S4).
  • the detection device further includes a storage unit (14) that stores structure information (D1) indicating an object structure of the surrounding environment.
  • the control unit refers to the structure information, analyzes a wave signal including a wave component arriving from the blind spot area in the detection result of the detection unit, and detects an object in the blind spot area (S6).
  • the control unit calculates a risk index (D) corresponding to the degree of risk based on the detection result of an object in the blind spot area (S21), and compares the calculated risk index with a threshold to determine the degree of risk (S25).
  • the control unit adjusts the threshold so as to relax the criterion for determining the degree of risk (S22, S24); a minimal code sketch of this threshold handling is given after this list.
  • the control unit determines whether an object is present at the junction based on a detection result of the detection unit. When it is determined that an object is present at the junction, the criterion for determining the degree of risk is relaxed.
  • the detection unit includes at least one of a camera, a radar, a LIDAR, and a navigation device.
  • A ninth aspect is a mobile system including the detection device according to any one of the first to eighth aspects, and a control device (20).
  • the control device is mounted on the moving body and performs an operation according to a result of the determination of the degree of risk by the detection device.
  • A tenth aspect is a detection method for detecting an object in a surrounding environment including a path of a moving object (2).
  • the method includes a step of detecting distance information indicating a distance from the moving object to the surrounding environment (S1, S2).
  • the method includes a step in which a control unit (13) detects a blind spot region (R1) indicating a blind spot in the surrounding environment based on a detection result of the detection unit (S3), and a step of determining the degree of risk related to the blind spot region based on the detection result of the blind spot region (S7).
  • the method includes a step (S24) of relaxing the criterion for determining the degree of risk when an object is present at a junction where the blind spot area and the path of the moving body merge.
  • An eleventh aspect is a program for causing a control unit to execute the detection method according to the tenth aspect.
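
The flow summarized by the above aspects (detecting a blind spot region from distance information, inferring an object inside it from arriving wave components, computing a risk index D, and relaxing the threshold when an object is present at the junction) can be illustrated in code. The following is a minimal sketch only, not the implementation disclosed here: every class, function, constant, and formula (BlindSpotRegion, detect_blind_spot, object_in_blind_spot, determine_risk, the jump threshold, and the placeholder expression for D) is a hypothetical illustration of steps S3, S6, S21, S22, S24, and S25.

```python
# Minimal illustrative sketch (hypothetical names, placeholder formulas);
# not the implementation disclosed in the description.
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class BlindSpotRegion:
    """A blind spot region R1 inferred from a gap in the distance profile."""
    start_angle: float  # angular extent of the occluded sector, in radians
    end_angle: float
    distance_m: float   # range to the occluding object (e.g. a parked vehicle)


def detect_blind_spot(samples: List[Tuple[float, float]],
                      jump_threshold_m: float = 5.0) -> Optional[BlindSpotRegion]:
    """Step S3: detect a blind spot from (angle, distance) samples.

    A sudden jump in measured distance between adjacent bearings suggests an
    occluding object whose far side cannot be observed, i.e. a blind spot.
    """
    for (a0, d0), (a1, d1) in zip(samples, samples[1:]):
        if abs(d1 - d0) > jump_threshold_m:
            return BlindSpotRegion(start_angle=min(a0, a1),
                                   end_angle=max(a0, a1),
                                   distance_m=min(d0, d1))
    return None


def object_in_blind_spot(wave_power_from_r1: float, noise_floor: float = 1.0) -> bool:
    """Step S6: infer a blind spot object when the wave component arriving
    from the blind spot region clearly exceeds the noise floor."""
    return wave_power_from_r1 > 2.0 * noise_floor


def determine_risk(region: Optional[BlindSpotRegion],
                   has_blind_spot_object: bool,
                   object_at_junction: bool,
                   base_threshold: float = 0.5,
                   relax_factor: float = 2.0) -> bool:
    """Steps S21 to S25: compute the risk index D, relax the criterion when an
    object is present at the junction, and compare D with the threshold."""
    if region is None:
        return False                           # no blind spot, nothing to judge
    d = 10.0 / max(region.distance_m, 1.0)     # S21: closer blind spot -> larger D
    if has_blind_spot_object:
        d *= 2.0                               # an object inferred inside R1 raises D
    threshold = base_threshold
    if object_at_junction:                     # S22/S24: e.g. a preceding vehicle
        threshold *= relax_factor              # relaxed (higher) threshold
    return d > threshold                       # S25: dangerous when D exceeds it
```

With these placeholder constants, a blind spot detected 10 m ahead with no object inferred inside it yields D = 1.0, which exceeds the base threshold of 0.5 and is judged dangerous; if an object such as the preceding vehicle 5 is also present at the junction, the threshold is relaxed to 1.0 and the same situation is no longer judged dangerous, which illustrates how the relaxed criterion suppresses excessive warnings.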

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention relates to a detection device (1) that detects an object in a surrounding environment including a path of a moving body (2). The detection device is provided with detection units (11, 12) and a control unit (13). The detection units detect distance information indicating the distance from the moving body to the surrounding environment. The control unit controls the detection units. The control unit detects a blind spot region indicating a blind spot in the surrounding environment on the basis of the detection results from the detection units, and determines a degree of risk associated with the blind spot region on the basis of the detection result for the blind spot region (S7). If an object is located at the junction where the path of the moving body merges with the detected blind spot region, the control unit relaxes the criterion for determining the degree of risk (S24).
PCT/JP2019/009926 2018-10-05 2019-03-12 Dispositif de détection, système de corps mobile, et procédé de détection WO2020070908A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018190259A JP7070307B2 (ja) 2018-10-05 2018-10-05 検知装置、移動体システム、及び検知方法
JP2018-190259 2018-10-05

Publications (1)

Publication Number Publication Date
WO2020070908A1 true WO2020070908A1 (fr) 2020-04-09

Family

ID=70055353

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/009926 WO2020070908A1 (fr) 2018-10-05 2019-03-12 Dispositif de détection, système de corps mobile, et procédé de détection

Country Status (2)

Country Link
JP (1) JP7070307B2 (fr)
WO (1) WO2020070908A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006349456A (ja) * 2005-06-15 2006-12-28 Denso Corp 車載レーダ装置、車両制御システム
WO2012033173A1 (fr) * 2010-09-08 2012-03-15 株式会社豊田中央研究所 Dispositif de prédiction pour objets mouvants, dispositif de prédiction pour objets mouvants virtuels, programme, procédé de prédiction pour objets mouvants et procédé de prédiction pour objets mouvants virtuels
JP2013156794A (ja) * 2012-01-30 2013-08-15 Hitachi Consumer Electronics Co Ltd 車両用衝突危険予測装置
JP2015230566A (ja) * 2014-06-04 2015-12-21 トヨタ自動車株式会社 運転支援装置
JP2018101295A (ja) * 2016-12-20 2018-06-28 トヨタ自動車株式会社 物体検知装置
WO2019008716A1 (fr) * 2017-07-06 2019-01-10 マクセル株式会社 Dispositif de mesure de non-visible et procédé de mesure de non-visible
WO2019044185A1 (fr) * 2017-08-28 2019-03-07 株式会社デンソー Dispositif de sortie vidéo, programme de génération de vidéo et support lisible par ordinateur tangible non transitoire

Also Published As

Publication number Publication date
JP2020060863A (ja) 2020-04-16
JP7070307B2 (ja) 2022-05-18

Similar Documents

Publication Publication Date Title
JP7067400B2 (ja) 検知装置、移動体システム、及び検知方法
CN111483457A (zh) 用于避免碰撞的装置、系统和方法
JP6531903B2 (ja) 物体検出装置
JP2014227000A (ja) 車両制御装置、その方法およびそのプログラム
JP6958537B2 (ja) 検知装置、移動体システム、及び検知方法
JP7111181B2 (ja) 検知装置、移動体システム、及び検知方法
JP6668472B2 (ja) 物体分類を有する動力車両の周辺領域をキャプチャーする方法、制御装置、運転者支援システム、及び動力車両
JP7028139B2 (ja) 報知装置及び報知方法
JP2011086139A (ja) 車両の衝突を回避するための装置
JP4863679B2 (ja) 位置測定装置
WO2020070908A1 (fr) Dispositif de détection, système de corps mobile, et procédé de détection
WO2020054108A1 (fr) Dispositif de détection, système de corps mobile, et procédé de détection
JP2014211332A (ja) レーダ装置、レーダ装置の制御方法
KR20130067648A (ko) 차량용 충돌 감지 시스템 및 그 방법
CN114502433A (zh) 控制装置
JP2015054603A (ja) 対象物検知装置
JP2010091490A (ja) 車載用レーダシステム
JP2011220779A (ja) 障害物検出装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19869617

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19869617

Country of ref document: EP

Kind code of ref document: A1