WO2014010546A1 - Failure Determination Apparatus - Google Patents
Failure Determination Apparatus
- Publication number
- WO2014010546A1 (PCT application no. PCT/JP2013/068618)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- moving target
- determined
- vehicle
- camera
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/50—Systems of measurement based on relative movement of target
- G01S13/58—Velocity or trajectory determination systems; Sense-of-movement determination systems
- G01S13/588—Velocity or trajectory determination systems; Sense-of-movement determination systems deriving the velocity value from the range measurement
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/93185—Controlling the brakes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/932—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles using own vehicle data, e.g. ground speed, steering wheel direction
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9325—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles for inter-vehicle distance regulation, e.g. navigating in platoons
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9327—Sensor installation details
- G01S2013/93271—Sensor installation details in the front of the vehicles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2218/00—Aspects of pattern recognition specially adapted for signal processing
- G06F2218/08—Feature extraction
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/165—Anti-collision systems for passive traffic, e.g. including static obstacles, trees
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- The present invention relates to a failure determination apparatus that determines whether an imaging unit is abnormal in an object recognition apparatus including a transmitting/receiving unit and an imaging unit.
- Some object recognition apparatuses use the detection results of both a radar device (transmitting/receiving unit) and a camera (imaging unit) to determine the presence of an object and the type of the object.
- For example, there is a known technique in which the transmission output of the radar apparatus is switched between high and low, objects other than vehicles are extracted by removing the detection result based on the reflected wave received at the low transmission output from the detection result based on the reflected wave received at the high transmission output, and pattern matching is performed on the extracted objects based on an image captured by a camera (see, for example, Patent Document 1).
- However, Patent Document 1 does not address failure of the radar device or the camera.
- When an object is recognized based on the detection results of both a radar device and a camera, an abnormality in either one affects the accuracy of object recognition. It also affects the various controls (for example, alerting control and contact avoidance control) that the host vehicle can perform on an object recognized by the object recognition device. Therefore, when the camera is abnormal, the user must be notified of the abnormality promptly.
- An aspect of the present invention provides a failure determination apparatus that can determine an abnormality of an imaging means at an early stage.
- To solve the above problems, the present invention adopts the following means.
- The failure determination device includes: transmitting/receiving means for transmitting an electromagnetic wave toward a predetermined range around the host vehicle and receiving a reflected wave generated when the electromagnetic wave is reflected by an object present around the host vehicle; imaging means for imaging the predetermined range around the host vehicle; moving target determination means for determining whether the object detected by the transmitting/receiving means is a moving target; object extraction means for extracting a specific object from an image captured by the imaging means; and failure determination means for determining that the imaging means is in an abnormal state when the object determined to be the moving target by the moving target determination means cannot be determined to be the specific object by the object extraction means.
- In addition, when the object determined to be the moving target by the moving target determination means cannot be determined to be the specific object by the object extraction means, the predetermined range around the host vehicle may be illuminated by an illumination device before the imaging means is determined to be in the abnormal state; when the object still cannot be determined to be the specific object by the object extraction means even under illumination by the illumination device, the failure determination means may determine that the imaging means is in the abnormal state.
- When the object detected by the transmitting/receiving means is a moving target, the object is highly likely to be a specific object (for example, a pedestrian or a vehicle). By exploiting this fact and first detecting, with the transmitting/receiving means, an object that is highly likely to be the specific object, and then determining with the imaging means whether the object is the specific object, an abnormal state of the imaging means can be determined at an early stage.
- Furthermore, with the configuration using the illumination device, the specific object is extracted from the image of the imaging means after illumination by the illumination device, and the abnormal state of the imaging means is then determined by the failure determination means. This makes it possible to prevent the imaging means from being erroneously determined to be in an abnormal state when it cannot be determined whether the object is the specific object merely because the surroundings are dark.
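The determination principle above reduces to a single predicate. A minimal sketch in Python (illustrative only; the function and argument names are not from the patent):

```python
def suspect_camera_abnormality(is_moving_target: bool, matched_label) -> bool:
    """Core rule: the imaging means is suspected to be abnormal only when the
    radar reports a moving target (which is highly likely to be a pedestrian
    or a vehicle) that the camera nevertheless fails to classify."""
    return is_moving_target and matched_label is None
```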
- FIG. 5 is a diagram showing an example of a determination number calculation table for calculating the camera failure determination number N in accordance with the moving speed of the object, and FIG. 6 is a diagram showing an example of a determination number calculation table for calculating the camera failure determination number N in accordance with the reflection level.
- As shown in FIG. 1, the failure determination device of the first embodiment of the present invention is incorporated in an object recognition device 1. The object recognition device 1 is mounted on a vehicle that travels by transmitting the driving force of, for example, an internal combustion engine 21 serving as a driving source to the drive wheels of the vehicle via a transmission (T/M) 22 such as an automatic transmission (AT) or a continuously variable transmission (CVT).
- In addition to the object recognition device 1, the vehicle includes a brake actuator 23, a steering actuator 24, a notification device 25, and a headlight 26.
- The object recognition device 1 includes a radar device (transmitting/receiving means) 2, a camera unit (imaging means) 3, a host vehicle state sensor 4, and an electronic control unit 10.
- The radar device 2 transmits an electromagnetic wave such as a laser beam or a millimeter wave forward in the traveling direction of the host vehicle, receives the reflected wave generated when the transmitted electromagnetic wave is reflected by an object outside the host vehicle (for example, a structure, a pedestrian, or another vehicle), mixes the transmitted and received electromagnetic waves to generate a beat signal, and outputs the beat signal to the electronic control unit 10.
- The camera unit 3 includes a camera 3a, such as a CCD or CMOS camera, and an image processing unit 3b.
- The image processing unit 3b performs predetermined image processing, such as filtering and binarization, on the image of the external world ahead of the host vehicle in the traveling direction captured by the camera 3a, generates image data composed of a two-dimensional array of pixels, and outputs the image data to the electronic control unit 10.
- The host vehicle state sensor 4 includes, for example: a vehicle speed sensor that detects the speed (vehicle speed) of the host vehicle as vehicle information; a yaw rate sensor that detects the yaw rate (the rotational angular velocity about the vertical axis through the vehicle's center of gravity); a steering angle sensor that detects the steering angle (the direction and magnitude of the steering input by the driver) or the actual steering angle (turning angle) corresponding to the steering angle; a steering torque sensor that detects the steering torque; a position sensor that detects the current position and traveling direction of the host vehicle based on positioning signals such as GPS (Global Positioning System) signals, position signals transmitted from information transmitters outside the vehicle, and the detection results of appropriate gyro sensors and acceleration sensors; a sensor that detects the amount of depression of the accelerator pedal; and a sensor that detects the depression state of the brake pedal.
- The electronic control unit 10 includes an object detection unit 11, a moving target determination unit (moving target determination means) 12, an object extraction unit (object extraction means) 13, a failure determination unit (failure determination means) 14, and a vehicle control unit 15.
- The object detection unit 11 calculates the position, velocity, reflection level, and the like of the object that reflected the electromagnetic wave based on the beat signal input from the radar device 2, and outputs the calculated information to the moving target determination unit 12.
- The velocity of the object can be calculated from the speed of the host vehicle and the relative velocity with respect to the host vehicle, which is calculated from the position information of the object detected by the radar device 2 at different times.
- The moving target determination unit 12 determines, based on the velocity of the object input from the object detection unit 11, whether the object is a moving object (moving target) or a non-moving object, that is, a stationary object (an object that is not a moving target), and outputs the determination result to the failure determination unit 14 and the vehicle control unit 15. The object detection unit 11 also calculates a predicted position of the object after a predetermined time based on the calculated velocity (or relative velocity) of the object, and outputs this predicted position information to the object extraction unit 13.
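As a rough illustration of the moving-target discrimination and position prediction described above, consider the following sketch (the 0.5 m/s speed threshold and the data layout are assumptions; the patent specifies neither):

```python
from dataclasses import dataclass

SPEED_THRESHOLD_MPS = 0.5  # assumed threshold; the patent gives no value

@dataclass
class RadarObject:
    track_id: int
    position: tuple          # (x, y) in vehicle coordinates [m]
    velocity: tuple          # (vx, vy) ground velocity of the object [m/s]
    reflection_level: float  # intensity of the received reflected wave

def is_moving_target(obj: RadarObject) -> bool:
    """Treat an object whose ground speed exceeds a small threshold as a
    moving target; everything else is regarded as a stationary object."""
    vx, vy = obj.velocity
    return (vx * vx + vy * vy) ** 0.5 > SPEED_THRESHOLD_MPS

def predicted_position(obj: RadarObject, dt: float) -> tuple:
    """Predicted position after dt seconds, used to center the integrated
    range on the image data."""
    x, y = obj.position
    vx, vy = obj.velocity
    return (x + vx * dt, y + vy * dt)
```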
- The object extraction unit 13 receives image data from the camera unit 3 and predicted position information of the object from the object detection unit 11.
- Based on the input predicted position information, the object extraction unit 13 sets an area of a predetermined size (hereinafter referred to as an integrated range) on the image data input from the camera unit 3, for example, centered on the predicted position.
- The object extraction unit 13 extracts an object on the image data by edge extraction based on the luminance values of the pixels included in the set integrated range, performs pattern matching between the extracted object and prestored model images of a human body and a vehicle, and determines whether the extracted object matches the model image of the human body or the vehicle. The object extraction unit 13 then outputs the determination result to the failure determination unit 14 and the vehicle control unit 15.
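A simplified sketch of this extraction step, using OpenCV template matching as a stand-in for the patent's model-image pattern matching (the matching method, the range size, and the 0.7 score threshold are all assumptions):

```python
import cv2

MATCH_THRESHOLD = 0.7  # assumed score; the patent does not specify one

def classify_in_integrated_range(image_gray, center_xy, size, model_images):
    """Crop the integrated range around the predicted position and try to
    match prestored model images (e.g. {'pedestrian': ..., 'vehicle': ...},
    grayscale uint8 arrays). Returns the matched label, or None."""
    cx, cy = int(center_xy[0]), int(center_xy[1])
    half = size // 2
    h, w = image_gray.shape
    x0, y0 = max(cx - half, 0), max(cy - half, 0)
    x1, y1 = min(cx + half, w), min(cy + half, h)
    roi = image_gray[y0:y1, x0:x1]
    for label, model in model_images.items():
        if roi.shape[0] < model.shape[0] or roi.shape[1] < model.shape[1]:
            continue  # integrated range smaller than the model image
        scores = cv2.matchTemplate(roi, model, cv2.TM_CCOEFF_NORMED)
        if scores.max() >= MATCH_THRESHOLD:
            return label
    return None
```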
- The failure determination unit 14 determines whether the camera unit 3 is in an abnormal state based on the determination result input from the moving target determination unit 12 (that is, whether the object is a moving target or a stationary object) and the determination result input from the object extraction unit 13 (that is, whether the object extracted on the image data matches the model image of the human body or the vehicle). In the first embodiment, the "specific object" is a pedestrian or a vehicle. When the failure determination unit 14 determines that the camera unit 3 is in an abnormal state, it outputs a camera failure signal to the notification device 25 and notifies the user, via the notification device 25, of the abnormality of the object recognition device 1 or of the camera unit 3.
- The vehicle control unit 15 determines whether the detected object is a pedestrian or a vehicle based on the determination result input from the moving target determination unit 12 (that is, whether the object is a moving target or a stationary object) and the determination result input from the object extraction unit 13 (that is, whether the object extracted on the image data matches the model image of the human body or the vehicle), and controls the traveling of the host vehicle according to the determination result. For example, when the detected object is determined to be a pedestrian or a vehicle and may come into contact with the host vehicle, the traveling of the host vehicle is controlled so as to avoid the contact.
- More specifically, the vehicle control unit 15 outputs at least one of a control signal for controlling the driving force of the internal combustion engine 21, a control signal for controlling the shifting operation of the transmission 22, a control signal for controlling the deceleration operation of the brake actuator 23, and a control signal for controlling the steering operation of the steering mechanism (not shown) of the host vehicle by the steering actuator 24, and executes deceleration control or steering control of the host vehicle as the contact avoidance operation. In addition, the vehicle control unit 15 controls at least one of the output timing and the output content of the alarm issued by the notification device 25 according to the magnitude of the possibility of contact with the pedestrian or vehicle.
- In the object recognition device 1, when the presence of an object is detected by the radar device 2, an integrated range is set on the image data obtained by the camera unit 3 based on the predicted position of the object, pattern matching is performed on the object extracted in the integrated range, and when a match is found, the object is set as a pedestrian candidate or a vehicle candidate, and the information acquired by the radar device 2 for the object (for example, position information and longitudinal movement speed) is integrated with the information acquired by the camera unit 3 (for example, object type information and lateral movement speed).
- The object recognition device 1 is thus a system that recognizes whether a detected object is a pedestrian or a vehicle based on both the information acquired by the radar device 2 and the information acquired by the camera unit 3, so an abnormality in the camera unit 3 affects the recognition result of the object recognition device 1. Therefore, when the camera unit 3 is abnormal, the abnormal state must be detected at an early stage and the user must be notified.
- Therefore, in this embodiment, when a moving target is detected by the radar device 2 but the object cannot be determined to be a specific object, that is, a pedestrian or a vehicle, in the integrated range on the image data of the camera unit 3, it is determined that the camera unit 3 has an abnormality.
- Abnormal states of the camera unit 3 include, for example, contamination of the lens of the camera 3a, a shift in the imaging range of the camera 3a, and disconnection of the signal line from the camera unit 3 to the electronic control unit 10.
- On the other hand, an object that is not a moving target, that is, a stationary object such as a telephone pole, is not determined to be the specific object (that is, a pedestrian or a vehicle) by the object recognition of the camera unit 3 even when the camera unit 3 is normal, so what the object is cannot be finally determined. Consequently, if stationary objects detected by the radar device 2 were included in the material for the abnormality determination of the camera unit 3, the camera unit would be erroneously determined to be abnormal in such cases. To prevent such erroneous determinations, stationary objects are excluded from the objects detected by the radar device 2 when determining the abnormality of the camera unit 3.
- In step S101, the radar device 2 detects an object present ahead in the traveling direction of the host vehicle, detects the reflection level of the reflected wave, and calculates the position and velocity of the object.
- In step S102, whether the detected object is a moving target is determined based on the velocity of the object calculated in step S101.
- FIG. 3 is a view showing an example of object detection by the radar device 2.
- In FIG. 3, reference symbol V indicates the host vehicle, and reference symbols Xa, Xb, Xc, Xd, and Xe indicate objects detected by the radar device 2.
- Objects Xa, Xb, and Xc are objects determined to be stationary objects, and objects Xd and Xe are objects determined to be moving targets.
- The radar device 2 alone cannot determine what the detected objects are.
- If the determination result in step S102 is "YES", the object detected by the radar device 2 is a moving target, so the process proceeds to step S103, and object extraction is performed within the integrated range set on the image data of the camera unit 3.
- In step S104, pattern matching is performed on the object extracted in step S103, and it is determined whether the object matches a prestored model image of a person or a vehicle.
- If the determination result in step S104 is "YES", the process proceeds to step S105, and the camera failure determination count is not increased (the current value Cn is kept at the previous value Cn-1). The initial value of the camera failure determination count Cn is 0.
- FIG. 4A is a view showing an example of object detection by the camera unit 3 when the camera unit 3 is functioning normally, and shows a case where an object is extracted in the integrated range set for each object in the image data.
- This example shows a case where the objects Xa to Xe detected by the radar device 2 are also detected on the image data of the camera unit 3.
- The objects Xc and Xd are objects determined to be pedestrians by the pattern matching process, the object Xe is an object determined to be a vehicle by the pattern matching process, and the objects Xa and Xb are objects determined to be neither a pedestrian nor a vehicle by the pattern matching process. That is, FIG. 4A shows a case where the moving target Xd detected by the radar device 2 is determined to be a pedestrian by the object detection of the camera unit 3, and the moving target Xe detected by the radar device 2 is determined to be a vehicle by the object detection of the camera unit 3. In this case, the value of the camera failure determination count Cn does not increase. In FIGS. 4A and 4B, reference symbol V indicates the host vehicle.
- If the determination result in step S104 is "NO", that is, the moving target cannot be determined to be a pedestrian or a vehicle, the process proceeds to step S106, and the value obtained by adding "1" to the previous value Cn-1 of the camera failure determination count is set as the current value Cn (Cn = Cn-1 + 1).
- FIG. 4B is a view showing an example of object detection by the camera unit 3 when the camera unit 3 is in an abnormal state.
- Although an object is extracted in each integrated range set for each object in the image data, none of the objects can be determined to be a human body or a vehicle as a result of the pattern matching process. That is, FIG. 4B shows a case where the moving targets Xd and Xe detected by the radar device 2 cannot be determined to be a pedestrian or a vehicle by the object detection of the camera unit 3. In this case, the camera failure determination count is incremented by one.
- Note that the case where the moving targets Xd and Xe detected by the radar device 2 are not extracted as objects in the integrated range on the image data is also included in the cases where they cannot be determined to be a pedestrian or a vehicle by the object detection of the camera unit 3, so the camera failure determination count is incremented by one in this case as well.
- The camera failure determination number N may be a fixed value of 1 or more (for example, any integer such as "1", "5", or "10"), but it may also be varied according to the moving speed of the moving target or the intensity (that is, the reflection level) of the reflected wave reflected by the object in object detection by the radar device 2.
- FIG. 5 is an example of a determination number calculation table for calculating the camera failure determination number N according to the moving speed of the moving target.
- In this table, the camera failure determination number N is set to "1" when the moving speed is equal to or higher than a predetermined value, and N is set to a larger value as the moving speed falls further below the predetermined value.
- When the reflection level in object detection by the radar device 2 is low, it is difficult to separate the object from noise or from a stationary object, and it is correspondingly difficult to be certain that what the radar device 2 has detected is a moving target. Therefore, the camera failure determination number N is set to a larger value as the reflection level becomes lower, making it harder to determine that the camera unit 3 is abnormal and thereby preventing an erroneous determination of camera abnormality.
- FIG. 6 is an example of a number-of-decisions calculation table for calculating the number N of camera failure determinations according to the magnitude of the reflection level.
- In this table, the camera failure determination number N is set to "1" when the reflection level is equal to or higher than a predetermined value, and N is set to a larger value as the reflection level falls further below the predetermined value.
- The camera failure determination number obtained when the moving speed and the reflection level are at preset reference values is used as the reference number N0.
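The shape of the tables in FIGS. 5 and 6 can be approximated as follows (a sketch only: the patent states that N is "1" at or above a predetermined value and grows as the moving speed or reflection level falls, but the linear ramp and the concrete numbers are assumptions):

```python
def failure_determination_number(value: float, threshold: float,
                                 n_max: int, n_ref: int = 1) -> int:
    """Camera failure determination number N as a function of moving speed
    or reflection level: N equals n_ref at or above the threshold and grows
    toward n_max as the value falls below it (assumed linear interpolation)."""
    if value >= threshold:
        return n_ref
    if value <= 0:
        return n_max
    frac = value / threshold  # 0 at standstill/no reflection, 1 at threshold
    return round(n_max - (n_max - n_ref) * frac)

# Example: with a 5 m/s speed threshold and a maximum of 10 determinations,
# a slow 1 m/s target needs about 8 consistent misses before failure is declared.
N = failure_determination_number(1.0, threshold=5.0, n_max=10)
```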
- In step S108, it is determined whether the current value Cn of the camera failure determination count exceeds the camera failure determination number N. If the determination result in step S108 is "NO" (Cn ≤ N), the process proceeds to step S109, and the camera failure determination flag is set to "0". On the other hand, if the determination result in step S108 is "YES" (Cn > N), the process proceeds to step S110, the camera failure determination flag is set to "1", and the execution of this routine is temporarily ended. The camera unit 3 is thereby determined to be in an abnormal state.
- The failure determination process, with the series of steps S101 to S110 as one cycle, is repeatedly performed for each object detected by the radar device 2, and is executed simultaneously and in parallel for the respective objects. When a plurality of moving targets are detected by the radar device 2, the camera unit 3 is determined to be in an abnormal state when the camera failure determination count Cn exceeds the camera failure determination number N in the failure determination process for at least one of the moving targets.
- Alternatively, the camera failure determination counts Cn calculated for the respective moving targets may be summed, and the camera unit 3 may be determined to be in an abnormal state when the total of the summed camera failure determination counts Cn exceeds the camera failure determination number N.
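Putting steps S101 to S110 together, one cycle of the routine for the per-target variant might look like the sketch below, reusing the helpers sketched earlier (`project_to_image` is a hypothetical caller-supplied projection from vehicle coordinates to image pixels):

```python
def failure_determination_cycle(radar_objects, image_gray, model_images,
                                counts, n_of, project_to_image):
    """One cycle over all radar objects.

    counts -- dict mapping a target's track_id to its failure count Cn
    n_of   -- callable returning the determination number N for an object
              (e.g. built from the FIG. 5 / FIG. 6 tables)
    Returns True when the camera unit is judged to be abnormal."""
    abnormal = False
    for obj in radar_objects:
        if not is_moving_target(obj):      # S102: exclude stationary objects
            continue
        center = project_to_image(predicted_position(obj, dt=0.1))
        label = classify_in_integrated_range(image_gray, center, size=64,
                                             model_images=model_images)
        if label is None:                  # S104 "NO": not pedestrian/vehicle
            counts[obj.track_id] = counts.get(obj.track_id, 0) + 1  # S106
        # S104 "YES" (S105): the count is simply left unchanged
        if counts.get(obj.track_id, 0) > n_of(obj):  # S108: Cn > N
            abnormal = True                # S110: camera failure flag = "1"
    return abnormal
```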
- As described above, the failure determination device of the first embodiment exploits the fact that an object detected as a moving target by the radar device 2 is highly likely to be a pedestrian or a vehicle: after the radar device 2 detects a moving target that is highly likely to be a pedestrian or a vehicle, the camera unit 3 determines whether the object is a pedestrian or a vehicle. When the camera unit 3 can determine that the object is a pedestrian or a vehicle, the camera unit 3 can be determined to be normal; when it cannot, the camera unit 3 can be determined to be in an abnormal state. The abnormal state of the camera unit 3 can therefore be determined at an early stage.
- In addition, the camera failure determination number N is set to a larger value as the moving speed becomes slower or as the reflection level becomes lower, which makes it harder to determine that the camera unit 3 is in an abnormal state under those conditions and thereby prevents erroneous determinations.
- Next, failure determination of the camera unit 3 in the failure determination device of the second embodiment of the present invention will be described.
- When the environment around the host vehicle is dark, such as when the host vehicle is traveling at night or through a tunnel, it can be difficult to extract an object from the image data of the camera unit 3 and to perform the pattern matching process.
- In such a situation, determining that the camera unit 3 is abnormal merely because a moving target detected by the radar device 2 could not be determined to be a pedestrian or a vehicle in the integrated range on the image data of the camera unit 3 could result in an erroneous determination.
- Therefore, in the second embodiment, when a moving target is detected by the radar device 2 but cannot be determined to be a pedestrian or a vehicle in the integrated range on the image data of the camera unit 3, the camera failure determination count is not increased immediately; instead, the headlight 26 of the host vehicle is turned on, image data is generated again based on an image captured by the camera 3a, and it is determined again whether the moving target is a pedestrian or a vehicle in the integrated range on that image data. The camera failure determination count is increased only when the moving target still cannot be determined to be a pedestrian or a vehicle.
- The processes of steps S101 to S105 and steps S107 to S110 are the same as the identically numbered steps in the first embodiment, and the flow of these processes is also the same as in the first embodiment, so their description is omitted.
- If the determination result in step S104 is "NO", the process proceeds to step S111, and the headlight 26 is turned on.
- In step S112, an object is extracted, within the integrated range set on the image data, from the image data generated based on the image captured by the camera 3a after the headlight 26 was turned on.
- In step S113, pattern matching is performed on the object extracted in step S112, and it is determined whether the object matches a prestored model image of a person or a vehicle.
- According to the failure determination device of the second embodiment, in addition to the operations and effects of the failure determination device of the first embodiment, the image is captured again by the camera 3a after the headlight 26 is turned on, and whether the moving target is a pedestrian or a vehicle is determined based on that image data. It is therefore possible to prevent the camera unit 3 from being erroneously determined to be in an abnormal state when a pedestrian or vehicle cannot be identified merely because the surrounding environment is dark, as during night travel or travel through a tunnel.
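The retry logic of the second embodiment (steps S111 to S113) can be sketched as follows (`capture_image`, `headlight_on`, and `classify` are hypothetical callables standing in for the camera 3a, the headlight 26, and the object extraction unit 13):

```python
def classify_with_illumination(obj, capture_image, headlight_on,
                               classify, counts):
    """Before counting a miss, turn the headlight on and retry the
    classification once on a freshly captured image."""
    label = classify(capture_image(), obj)           # S103/S104
    if label is None:
        headlight_on()                               # S111: illuminate the range
        label = classify(capture_image(), obj)       # S112/S113: retry
        if label is None:                            # still unidentifiable
            counts[obj.track_id] = counts.get(obj.track_id, 0) + 1  # S106
    return label
```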
- The present invention is not limited to the embodiments described above.
- For example, in the embodiments described above, the camera failure determination count is calculated for each object detected by the radar device 2. Alternatively, the camera failure determination count may be incremented by "1" when at least one moving target on one set of image data cannot be determined to be a pedestrian or a vehicle, and the camera unit 3 may be determined to be abnormal when the integrated value of this count exceeds the camera failure determination number N.
- In the embodiments described above, the specific object is a pedestrian or a vehicle, but an animal such as a dog or a cat may also be added to the specific objects.
- The detection direction of the object is not limited to the forward traveling direction of the host vehicle, and may be rearward in the front-rear direction of the host vehicle or to the side of the host vehicle.
- Control using an object determined to be a specific object by the object recognition device 1 incorporating the failure determination device is not limited to travel control for avoiding contact between the host vehicle and the object; various other controls that the host vehicle can perform with respect to a specific object are possible, such as follow-up traveling control in which the host vehicle follows a preceding vehicle.
Landscapes
- Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Electromagnetism (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Traffic Control Systems (AREA)
- Radar Systems Or Details Thereof (AREA)
- Closed-Circuit Television Systems (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
Description
This application claims priority based on Japanese Patent Application No. 2012-154963 filed on July 10, 2012, the contents of which are incorporated herein by reference.
When an object is recognized based on the two detection results of a radar device and a camera, as in the object recognition device described above, an abnormality in either the radar device or the camera affects the accuracy of object recognition. It furthermore affects the various controls (for example, alerting control and contact avoidance control) that the host vehicle can perform on an object recognized by the object recognition device.
Therefore, when the camera is abnormal, the user must be notified of the abnormality promptly.
(1) A failure determination device according to an aspect of the present invention includes: transmitting/receiving means for transmitting an electromagnetic wave toward a predetermined range around a host vehicle and receiving a reflected wave generated when the electromagnetic wave is reflected by an object present around the host vehicle; imaging means for imaging the predetermined range around the host vehicle; moving target determination means for determining whether the object detected by the transmitting/receiving means is a moving target; object extraction means for extracting a specific object from an image captured by the imaging means; and failure determination means for determining that the imaging means is in an abnormal state when the object determined to be the moving target by the moving target determination means cannot be determined to be the specific object by the object extraction means.
As shown in FIG. 1, the failure determination device of the first embodiment of the present invention is incorporated in an object recognition device 1. The object recognition device 1 is mounted on a vehicle that travels by transmitting the driving force of, for example, an internal combustion engine 21 serving as a driving source to the drive wheels of the vehicle via a transmission (T/M) 22 such as an automatic transmission (AT) or a continuously variable transmission (CVT). In addition to the object recognition device 1, this vehicle includes a brake actuator 23, a steering actuator 24, a notification device 25, and a headlight 26.
The radar device 2 transmits an electromagnetic wave such as a laser beam or a millimeter wave forward in the traveling direction of the host vehicle, receives the reflected wave generated when the transmitted electromagnetic wave is reflected by an object outside the host vehicle (for example, a structure, a pedestrian, or another vehicle), mixes the transmitted and received electromagnetic waves to generate a beat signal, and outputs the beat signal to the electronic control unit 10.
The object detection unit 11 also calculates a predicted position of the object after a predetermined time based on the calculated velocity (or relative velocity) of the object, and outputs this predicted position information to the object extraction unit 13.
When the failure determination unit 14 determines that the camera unit 3 is in an abnormal state, it outputs a camera failure signal to the notification device 25 and notifies the user, via the notification device 25, of the abnormality of the object recognition device 1 or of the camera unit 3.
For example, when the detected object is determined to be a pedestrian or a vehicle and may come into contact with the host vehicle, the traveling of the host vehicle is controlled so as to avoid the contact. More specifically, the vehicle control unit 15 outputs at least one of a control signal for controlling the driving force of the internal combustion engine 21, a control signal for controlling the shifting operation of the transmission 22, a control signal for controlling the deceleration operation of the brake actuator 23, and a control signal for controlling the steering operation of the steering mechanism (not shown) of the host vehicle by the steering actuator 24, and executes deceleration control or steering control of the host vehicle as the contact avoidance operation.
The vehicle control unit 15 also controls at least one of the output timing and the output content of the alarm issued by the notification device 25 according to the magnitude of the possibility of contact with the pedestrian or vehicle.
In this object recognition device 1, when the presence of an object is detected by the radar device 2, an integrated range is set on the image data obtained by the camera unit 3 based on the predicted position of the object, pattern matching is performed on the object extracted in the integrated range, and when a match is found, the object is set as a pedestrian candidate or a vehicle candidate, and the information acquired by the radar device 2 for the object (for example, position information and longitudinal movement speed) is integrated with the information acquired by the camera unit 3 (for example, object type information and lateral movement speed).
The failure determination processing routine shown in the flowchart of FIG. 2 is repeatedly executed by the electronic control unit 10 at fixed time intervals.
First, in step S101, the radar device 2 detects an object present ahead in the traveling direction of the host vehicle, detects the reflection level of the reflected wave, and calculates the position and velocity of the object.
Next, the process proceeds to step S102, and whether the detected object is a moving target is determined based on the velocity of the object calculated in step S101.
Next, the process proceeds to step S104, pattern matching is performed on the object extracted in step S103, and it is determined whether the object matches a prestored model image of a person or a vehicle.
That is, FIG. 4A shows a case where the moving target Xd detected by the radar device 2 is determined to be a pedestrian by the object detection of the camera unit 3, and the moving target Xe detected by the radar device 2 is determined to be a vehicle by the object detection of the camera unit 3. In this case, the value of the camera failure determination count Cn does not increase. In FIGS. 4A and 4B, reference symbol V indicates the host vehicle.
That is, FIG. 4B shows a case where the moving targets Xd and Xe detected by the radar device 2 could not be determined to be a pedestrian or a vehicle by the object detection of the camera unit 3. In this case, the camera failure determination count is incremented by one.
Note that the case where the moving targets Xd and Xe detected by the radar device 2 are not extracted as objects in the integrated range on the image data is also included in the cases where they cannot be determined to be a pedestrian or a vehicle by the object detection of the camera unit 3, so the camera failure determination count is incremented by one in this case as well.
If the determination result in step S108 is "NO" (Cn ≤ N), the process proceeds to step S109, and the camera failure determination flag is set to "0".
On the other hand, if the determination result in step S108 is "YES" (Cn > N), the process proceeds to step S110, the camera failure determination flag is set to "1", and the execution of this routine is temporarily ended. The camera unit 3 is thereby determined to be in an abnormal state.
When the environment around the host vehicle is dark, such as when the host vehicle is traveling at night or through a tunnel, it may be difficult to extract an object from the image data of the camera unit 3 and to perform the pattern matching process. In such a situation, determining that the camera unit 3 is abnormal merely because a moving target detected by the radar device 2 could not be determined to be a pedestrian or a vehicle in the integrated range on the image data of the camera unit 3 could result in an erroneous determination.
The processes of steps S101 to S105 and steps S107 to S110 are the same as the identically numbered steps in the first embodiment, and the flow of these processes is also the same as in the first embodiment, so their description is omitted.
If the determination result in step S104 is "NO", the process proceeds to step S111, and the headlight 26 is turned on.
Next, the process proceeds to step S112, and an object is extracted, within the integrated range set on the image data, from the image data generated based on the image captured by the camera 3a after the headlight 26 was turned on.
Next, the process proceeds to step S113, pattern matching is performed on the object extracted in step S112, and it is determined whether the object matches a prestored model image of a person or a vehicle.
On the other hand, if the determination result in step S113 is "NO", that is, if the moving target cannot be determined to be a pedestrian or a vehicle, the process proceeds to step S106, and the value obtained by adding "1" to the previous value Cn-1 of the camera failure determination count is set as the current value Cn of the camera failure determination count (Cn = Cn-1 + 1).
The process then proceeds from steps S105 and S106 to step S107. Thereafter, the same processing as in the first embodiment is executed.
The present invention is not limited to the embodiments described above.
For example, in the above embodiments, when a plurality of moving targets are detected by the radar device 2, a camera failure determination count is calculated for each object detected by the radar device 2. However, the camera failure determination count may instead be incremented by "1" when at least one moving target on one set of image data cannot be determined to be a pedestrian or a vehicle, and the camera unit 3 may be determined to be abnormal when the integrated value of this camera failure determination count exceeds the camera failure determination number N.
The detection direction of the object is not limited to the forward traveling direction of the host vehicle, and may be rearward in the front-rear direction of the host vehicle or to the side of the host vehicle.
3 Camera unit (imaging means)
12 Moving target determination unit (moving target determination means)
13 Object extraction unit (object extraction means)
14 Failure determination unit (failure determination means)
26 Headlight (illumination device)
Claims (4)
- A failure determination device comprising: transmitting/receiving means for transmitting an electromagnetic wave toward a predetermined range around a host vehicle and receiving a reflected wave generated when the electromagnetic wave is reflected by an object present around the host vehicle; imaging means for imaging the predetermined range around the host vehicle; moving target determination means for determining whether the object detected by the transmitting/receiving means is a moving target; object extraction means for extracting a specific object from an image captured by the imaging means; and failure determination means for determining that the imaging means is in an abnormal state when the object determined to be the moving target by the moving target determination means cannot be determined to be the specific object by the object extraction means.
- The failure determination device according to claim 1, wherein the lower the moving speed of the object determined to be the moving target by the moving target determination means, the less readily the imaging means is determined to be in the abnormal state.
- The failure determination device according to claim 1 or 2, wherein the lower the reflection level of the reflected wave reflected from the object determined to be the moving target by the moving target determination means, the less readily the imaging means is determined to be in the abnormal state.
- The failure determination device according to any one of claims 1 to 3, wherein, when the object determined to be the moving target by the moving target determination means cannot be determined to be the specific object by the object extraction means, the predetermined range around the host vehicle is illuminated by an illumination device before the imaging means is determined to be in the abnormal state, and when the object still cannot be determined to be the specific object by the object extraction means even under illumination by the illumination device, the failure determination means determines that the imaging means is in the abnormal state.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201380004732.1A CN104040606B (zh) | 2012-07-10 | 2013-07-08 | 故障判定装置 |
US14/362,511 US8917321B1 (en) | 2012-07-10 | 2013-07-08 | Failure-determination apparatus |
CA2858309A CA2858309C (en) | 2012-07-10 | 2013-07-08 | Failure-determination apparatus |
EP13816074.2A EP2787497A4 (en) | 2012-07-10 | 2013-07-08 | DEFAULT ASSESSMENT APPARATUS |
JP2013553725A JP5497248B1 (ja) | 2012-07-10 | 2013-07-08 | 故障判定装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-154963 | 2012-07-10 | ||
JP2012154963 | 2012-07-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014010546A1 true WO2014010546A1 (ja) | 2014-01-16 |
Family
ID=49916000
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/068618 WO2014010546A1 (ja) | 2012-07-10 | 2013-07-08 | 故障判定装置 |
Country Status (6)
Country | Link |
---|---|
US (1) | US8917321B1 (ja) |
EP (1) | EP2787497A4 (ja) |
JP (1) | JP5497248B1 (ja) |
CN (1) | CN104040606B (ja) |
CA (1) | CA2858309C (ja) |
WO (1) | WO2014010546A1 (ja) |
Families Citing this family (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5991332B2 (ja) * | 2014-02-05 | 2016-09-14 | トヨタ自動車株式会社 | 衝突回避制御装置 |
CN106663342A (zh) | 2014-02-13 | 2017-05-10 | 充电网公司 | 执行与连接的车辆相关联的行动 |
JP6292097B2 (ja) * | 2014-10-22 | 2018-03-14 | 株式会社デンソー | 側方測距センサ診断装置 |
KR102288399B1 (ko) * | 2014-10-30 | 2021-08-10 | 현대모비스 주식회사 | 차량 경보 장치 |
CN105869367B (zh) * | 2015-01-20 | 2020-09-29 | 深圳辉锐天眼科技有限公司 | 一种基于联网视频监控的故障标定方法 |
KR101673776B1 (ko) * | 2015-06-05 | 2016-11-07 | 현대자동차주식회사 | 자동차용 헤드유닛 및 카메라 유닛의 고장 진단 방법 |
DE102016203472A1 (de) * | 2016-03-03 | 2017-09-07 | Bayerische Motoren Werke Aktiengesellschaft | Verfahren und Verarbeitungseinheit zur Detektion von Objekten auf Basis von asynchronen Sensordaten |
JP6600271B2 (ja) * | 2016-03-31 | 2019-10-30 | 株式会社デンソー | 物体認識装置及び物体認識方法 |
DE102016217264A1 (de) | 2016-09-09 | 2018-03-15 | Honda Motor Co., Ltd. | Fahrzeugfahrt-Unterstützungsvorrichtung |
CN110290997B (zh) * | 2017-02-21 | 2022-06-17 | 日立安斯泰莫株式会社 | 车辆控制装置 |
DE112017007201T5 (de) * | 2017-03-07 | 2019-11-28 | Mitsubishi Electric Corporation | Ausfalldetektionseinrichtung, Ausfalldetektionsverfahren und Ausfalldetektionsprogramm |
WO2019092880A1 (ja) * | 2017-11-13 | 2019-05-16 | 三菱電機株式会社 | 故障検出装置、故障検出方法及び故障検出プログラム |
DE102018009683A1 (de) * | 2018-01-18 | 2019-07-18 | Sew-Eurodrive Gmbh & Co Kg | Verfahren zum Betreiben einer Anlage und Anlage |
CN108508872B (zh) * | 2018-04-18 | 2020-05-05 | 鄂尔多斯市普渡科技有限公司 | 一种无人驾驶汽车信息采集系统的故障检测方法 |
CN108760239B (zh) * | 2018-06-21 | 2020-05-08 | 江苏本能科技有限公司 | 车辆识别装置检测方法及系统 |
WO2020011367A1 (en) * | 2018-07-13 | 2020-01-16 | Abb Schweiz Ag | Camera monitoring method |
US11393204B2 (en) * | 2018-09-18 | 2022-07-19 | Capital One Service, LLC | Monitoring systems and methods |
JP7115180B2 (ja) * | 2018-09-21 | 2022-08-09 | トヨタ自動車株式会社 | 画像処理システムおよび画像処理方法 |
JP2020104547A (ja) * | 2018-12-26 | 2020-07-09 | 株式会社日立製作所 | 外界センサの故障検出装置、及び、外界センサの故障検出方法 |
WO2020142950A1 (zh) * | 2019-01-09 | 2020-07-16 | 深圳市大疆创新科技有限公司 | 测距装置的异常记录方法、测距装置及可移动平台 |
US10831189B2 (en) * | 2019-01-31 | 2020-11-10 | StradVision, Inc. | Learning method and learning device for providing functional safety by warning driver about potential dangerous situation by using explainable AI which verifies detection processes of autonomous driving network, and testing method and testing device using the same |
US11851088B2 (en) * | 2020-03-11 | 2023-12-26 | Baidu Usa Llc | Method for determining capability boundary and associated risk of a safety redundancy autonomous system in real-time |
KR20210136567A (ko) * | 2020-05-08 | 2021-11-17 | 현대자동차주식회사 | 차량의 커뮤니케이션 조명 시스템 |
CN116829927A (zh) * | 2021-02-04 | 2023-09-29 | 美国西门子医学诊断股份有限公司 | 检测机器视觉系统中的缺陷的装置和方法 |
KR20230114796A (ko) * | 2022-01-24 | 2023-08-02 | 현대자동차주식회사 | 자율주차 보조 장치 및 방법 |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09142236A (ja) * | 1995-11-17 | 1997-06-03 | Mitsubishi Electric Corp | 車両の周辺監視方法と周辺監視装置及び周辺監視装置の故障判定方法と周辺監視装置の故障判定装置 |
DE19546506A1 (de) * | 1995-12-13 | 1997-06-19 | Daimler Benz Ag | Fahrzeug-Navigationssystem und Signalverarbeitungsverfahren für ein solches Navigationssystem |
US20040083035A1 (en) * | 1996-09-25 | 2004-04-29 | Ellis Christ G. | Apparatus and method for automatic vision enhancement in a traffic complex |
JP3843502B2 (ja) * | 1996-09-30 | 2006-11-08 | マツダ株式会社 | 車両用動体認識装置 |
DE10149115A1 (de) * | 2001-10-05 | 2003-04-17 | Bosch Gmbh Robert | Objekterfassungsvorrichtung |
JP4738778B2 (ja) * | 2003-10-15 | 2011-08-03 | 富士通テン株式会社 | 画像処理装置、運転支援装置および運転支援システム |
DE102005009702A1 (de) * | 2005-03-03 | 2006-09-07 | Robert Bosch Gmbh | Abstandsmessvorrichtung und Verfahren zur Funktionsprüfung einer Abstandsmessung |
JP4598653B2 (ja) * | 2005-05-13 | 2010-12-15 | 本田技研工業株式会社 | 衝突予知装置 |
JP5146716B2 (ja) * | 2007-03-01 | 2013-02-20 | マツダ株式会社 | 車両用障害物検知装置 |
DE102007018470A1 (de) * | 2007-04-19 | 2008-10-23 | Robert Bosch Gmbh | Fahrerassistenzsystem und Verfahren zur Objektplausibilisierung |
JP5649568B2 (ja) * | 2008-05-21 | 2015-01-07 | アー・デー・ツェー・オートモーティブ・ディスタンス・コントロール・システムズ・ゲゼルシャフト・ミト・ベシュレンクテル・ハフツング | 車両と歩行者との衝突を回避するための運転者援助システム |
JP5150527B2 (ja) * | 2009-02-03 | 2013-02-20 | 株式会社日立製作所 | 車両用衝突回避支援装置 |
US7924146B2 (en) * | 2009-04-02 | 2011-04-12 | GM Global Technology Operations LLC | Daytime pedestrian detection on full-windscreen head-up display |
JP5210233B2 (ja) * | 2009-04-14 | 2013-06-12 | 日立オートモティブシステムズ株式会社 | 車両用外界認識装置及びそれを用いた車両システム |
US20130278441A1 (en) * | 2012-04-24 | 2013-10-24 | Zetta Research and Development, LLC - ForC Series | Vehicle proxying |
-
2013
- 2013-07-08 CA CA2858309A patent/CA2858309C/en not_active Expired - Fee Related
- 2013-07-08 WO PCT/JP2013/068618 patent/WO2014010546A1/ja active Application Filing
- 2013-07-08 CN CN201380004732.1A patent/CN104040606B/zh active Active
- 2013-07-08 EP EP13816074.2A patent/EP2787497A4/en not_active Withdrawn
- 2013-07-08 US US14/362,511 patent/US8917321B1/en active Active
- 2013-07-08 JP JP2013553725A patent/JP5497248B1/ja not_active Expired - Fee Related
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003252147A (ja) * | 2002-02-26 | 2003-09-10 | Toyota Motor Corp | 車両用障害物検出装置 |
JP2005157765A (ja) | 2003-11-26 | 2005-06-16 | Alpine Electronics Inc | 歩行者検出装置 |
JP2006292621A (ja) * | 2005-04-13 | 2006-10-26 | Toyota Motor Corp | 物体検出装置 |
JP2007163258A (ja) * | 2005-12-13 | 2007-06-28 | Alpine Electronics Inc | 車載センサの補正装置および方法 |
JP2009029367A (ja) * | 2007-07-30 | 2009-02-12 | Mazda Motor Corp | 車両の前照灯制御装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2787497A4 |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10725474B2 (en) | 2014-08-07 | 2020-07-28 | Hitachi Automotive Systems, Ltd. | Action planning device having a trajectory generation and determination unit that prevents entry into a failure occurrence range |
EP3179463A4 (en) * | 2014-08-07 | 2018-04-04 | Hitachi Automotive Systems, Ltd. | Action planning device |
CN106537481B (zh) * | 2014-08-07 | 2019-06-04 | 日立汽车系统株式会社 | 行动计划装置 |
US10761536B2 (en) | 2014-08-07 | 2020-09-01 | Hitachi Automotive Systems, Ltd. | Action planning device having a trajectory generation and determination unit |
CN106537481A (zh) * | 2014-08-07 | 2017-03-22 | 日立汽车系统株式会社 | 行动计划装置 |
JP2017159871A (ja) * | 2016-03-11 | 2017-09-14 | トヨタ自動車株式会社 | 車両用照明装置 |
US10421389B2 (en) | 2016-03-11 | 2019-09-24 | Toyota Jidosha Kabushiki Kaisha | Vehicle lighting system |
JP2017216517A (ja) * | 2016-05-30 | 2017-12-07 | アイシン精機株式会社 | 画像表示システム |
WO2017208585A1 (ja) * | 2016-05-30 | 2017-12-07 | アイシン精機株式会社 | 画像表示システム |
WO2018061425A1 (ja) * | 2016-09-29 | 2018-04-05 | パナソニックIpマネジメント株式会社 | センサ故障検出装置およびそのための制御方法 |
CN109643492A (zh) * | 2016-09-29 | 2019-04-16 | 松下知识产权经营株式会社 | 传感器故障检测装置及其控制方法 |
JPWO2018061425A1 (ja) * | 2016-09-29 | 2019-07-18 | パナソニックIpマネジメント株式会社 | センサ故障検出装置およびそのための制御方法 |
JP2019158390A (ja) * | 2018-03-08 | 2019-09-19 | 日立オートモティブシステムズ株式会社 | 信号処理システム、及びその評価システム、並びにその信号処理システムに用いられる信号処理装置 |
WO2019172103A1 (ja) * | 2018-03-08 | 2019-09-12 | 日立オートモティブシステムズ株式会社 | 信号処理システム、及びその評価システム、並びにその信号処理システムに用いられる信号処理装置 |
JP2019205032A (ja) * | 2018-05-22 | 2019-11-28 | クラリオン株式会社 | 車載用故障検出装置、及び故障検出方法 |
JP7137356B2 (ja) | 2018-05-22 | 2022-09-14 | フォルシアクラリオン・エレクトロニクス株式会社 | 車載用故障検出装置、及び故障検出方法 |
WO2020250528A1 (ja) * | 2019-06-14 | 2020-12-17 | マツダ株式会社 | 外部環境認識装置 |
JP2020204822A (ja) * | 2019-06-14 | 2020-12-24 | マツダ株式会社 | 外部環境認識装置 |
JP7298323B2 (ja) | 2019-06-14 | 2023-06-27 | マツダ株式会社 | 外部環境認識装置 |
JP2021149820A (ja) * | 2020-03-23 | 2021-09-27 | 株式会社デンソー | センサ評価装置 |
Also Published As
Publication number | Publication date |
---|---|
US8917321B1 (en) | 2014-12-23 |
CN104040606A (zh) | 2014-09-10 |
US20140368668A1 (en) | 2014-12-18 |
EP2787497A1 (en) | 2014-10-08 |
CA2858309A1 (en) | 2014-01-16 |
JP5497248B1 (ja) | 2014-05-21 |
JPWO2014010546A1 (ja) | 2016-06-23 |
EP2787497A4 (en) | 2015-09-16 |
CN104040606B (zh) | 2015-08-26 |
CA2858309C (en) | 2015-08-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014010546A1 (ja) | 故障判定装置 | |
US10997436B2 (en) | Object detection apparatus and object detection method | |
EP3285084B1 (en) | Vehicle communication system for cloud-hosting sensor-data | |
JP6369390B2 (ja) | 車線合流判定装置 | |
US9731728B2 (en) | Sensor abnormality detection device | |
EP2808700B1 (en) | Drive assist device, and vehicle using drive assist device | |
US20180259636A1 (en) | Monitoring device and monitoring method | |
JP4602277B2 (ja) | 衝突判定装置 | |
JP5651642B2 (ja) | 物体位置検知装置 | |
JP5785578B2 (ja) | 車両周辺監視装置 | |
WO2016031523A1 (ja) | 物体認識装置及び車両制御システム | |
CN110290997B (zh) | 车辆控制装置 | |
JP2013117475A (ja) | 障害物検出装置 | |
JP2013019684A (ja) | 車両周辺監視装置 | |
JP2017058761A (ja) | 運転支援装置及び運転支援プログラム | |
CN113173160B (zh) | 车辆用防碰撞装置及车辆用防碰撞方法 | |
JP2010162975A (ja) | 車両制御システム | |
JP6429360B2 (ja) | 物体検出装置 | |
CN113002535A (zh) | 驾驶员辅助设备和驾驶员辅助方法 | |
JP5806647B2 (ja) | 物体認識装置 | |
US20220314973A1 (en) | Driving support device, driving support method, and storage medium | |
JP2008286566A (ja) | 車載装置 | |
JP2005145396A (ja) | 車両用運転支援装置 | |
JP5651649B2 (ja) | 物体位置検知装置 | |
CN111319620B (zh) | 车辆及其控制方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2013553725 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13816074 Country of ref document: EP Kind code of ref document: A1 |
|
REEP | Request for entry into the european phase |
Ref document number: 2013816074 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14362511 Country of ref document: US Ref document number: 2013816074 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2858309 Country of ref document: CA |
|
NENP | Non-entry into the national phase |
Ref country code: DE |