WO2013073310A1 - In-vehicle environment recognition device - Google Patents
In-vehicle environment recognition device
- Publication number
- WO2013073310A1 (application PCT/JP2012/075886)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- detection
- processing load
- accuracy
- unit
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/06—Improving the dynamic response of the control system, e.g. improving the speed of regulation or avoiding hunting or overshoot
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D15/00—Steering not otherwise provided for
- B62D15/02—Steering position indicators ; Steering position determination; Steering aids
- B62D15/029—Steering assistants using warnings or proposing actions to the driver without influencing the steering system
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/143—Alarm means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/10—Path keeping
- B60W30/12—Lane keeping
Definitions
- the present invention relates to an on-vehicle environment recognition device that recognizes the surrounding environment of a host vehicle based on an image captured by an imaging unit, and an alarm / vehicle control system using the same.
- in a known multiprocessor system, an application to be executed is dynamically selected from among multiple applications according to the vehicle operating situation, and the selected application is dynamically allocated to a plurality of processors so that the processors are used effectively.
- in such a system, however, the output performance of each application is limited. Even if applications are switched according to the surrounding environment of the vehicle, it is difficult to improve the performance of any single application.
- the present invention has been made in view of the above point, and its object is to provide an on-vehicle environment recognition device that can obtain a highly accurate recognition result in situations where an alarm or vehicle control is activated, thereby realizing more accurate recognition of the environment around the vehicle.
- an on-vehicle environment recognition apparatus for solving the above problems includes a detection unit that detects a plurality of preset detection elements from an image captured by an imaging unit, and an adjustment unit that adjusts the detection accuracy of at least one of the plurality of detection elements according to the situation of the vehicle.
- according to the present invention, a highly accurate recognition result can be obtained in situations where an alarm or vehicle control is activated, so that the environment around the vehicle is recognized with higher accuracy. The timing of warnings and vehicle control can therefore be corrected according to the vehicle condition, improving stability as well as estimation and calculation accuracy.
- problems, structures, and effects other than those mentioned above will be clarified by the description of the following embodiments.
- FIG. 1 is a block diagram of the in-vehicle environment recognition apparatus in the first embodiment; FIG. 2 is a block diagram of the vehicle behavior unit.
- FIG. 1 is a block diagram of a vehicle-mounted environment recognition apparatus 100 according to the present embodiment.
- the on-vehicle environment recognition device 100 is provided, for example, in a microcomputer in a camera device mounted on a vehicle, and is realized by a CPU executing a software program stored in ROM.
- the in-vehicle environment recognition apparatus 100 is provided with a lane recognition application for recognizing the lane in which the host vehicle is traveling (the traveling lane). The lane recognition unit 210 detects a plurality of detection elements from the image captured by the imaging unit 110; specifically, it detects the white lines dividing the lane, and the lateral position of the vehicle, the yaw angle, and the curvature indicating the degree of bending of the curve.
- the lane recognition unit 210 includes a lateral position detection unit 211, a yaw angle detection unit 212, and a curvature detection unit 213.
- each of the detection units 211 to 213 has a configuration in which the detection accuracy can be adjusted independently of the other detection units.
- hereinafter, an application program is simply referred to as an application.
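As a rough illustration of this structure, the three detection processes with independently adjustable accuracy, plus the fixed process Z, might be modeled as below. All class names and the normal-time load percentages are hypothetical placeholders, not values from the patent.

```python
class DetectionProcess:
    """One detection element (e.g. lateral position) whose accuracy,
    and hence CPU processing load, can be adjusted independently."""

    def __init__(self, name, normal_load_pct):
        self.name = name
        self.normal_load_pct = normal_load_pct  # CPU share at normal times
        self.load_pct = normal_load_pct         # current (adjusted) share

    def set_load(self, pct):
        # a higher share corresponds to higher resolution / accuracy
        self.load_pct = pct


class LaneRecognitionUnit:
    """Holds processes A-C plus the constant process Z (placeholder loads)."""

    def __init__(self):
        self.processes = {
            "A": DetectionProcess("lateral position", 40.0),
            "B": DetectionProcess("yaw angle", 25.0),
            "C": DetectionProcess("curvature", 15.0),
        }
        self.other_load_pct = 20.0  # process Z: constant, never redistributed
```

Each scene-dependent adjustment described later then amounts to calling `set_load` on individual processes while keeping the total within the prescribed cycle.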
- the lateral position detection unit 211 performs lateral position detection processing A for detecting the lateral position of the white line based on the image captured by the imaging unit 110.
- the lateral position detection unit 211 can improve the detection accuracy of the lateral position by increasing the resolution used to find the lateral position of the white line.
- the yaw angle detection unit 212 performs a yaw angle detection process B that detects the yaw angle of the host vehicle based on the image captured by the imaging unit 110.
- the yaw angle detection unit 212 can improve the yaw angle detection accuracy, as with the lateral position, by increasing the resolution of the white line inclination.
- in that case, however, the processing time also increases.
- the curvature detection unit 213 performs a curvature detection process C for detecting the curvature of the curve on which the host vehicle is traveling based on the image captured by the imaging unit 110.
- the curvature detection unit 213 may use a higher resolution image to improve the curvature detection accuracy.
- the lane recognition unit 210 outputs information on the lateral position, the yaw angle, and the curvature, which are detection elements detected by the detection units 211 to 213, to the alarm and vehicle control unit 400. Then, the vehicle behavior unit 500 outputs the information on the vehicle speed, the steering angle, and the yaw rate of the own vehicle to the alarm and vehicle control unit 400.
- the warning / vehicle control unit 400 judges whether to implement warnings and control based on the lateral position, yaw angle, and curvature output from the lane recognition unit 210 and the vehicle speed, steering angle, and yaw rate output from the vehicle behavior unit 500.
- the alarm and vehicle control unit 400 also estimates the implementation timing of the alarm and vehicle control before actually performing the alarm and vehicle control. The timing of lane departure can be estimated from the lateral position, the yaw angle, the vehicle speed, and the steering angle.
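The patent does not give a concrete formula for this timing estimate. A minimal sketch, assuming the lateral velocity is approximated from the vehicle speed and yaw angle, could look like this (function and parameter names are illustrative):

```python
import math


def estimate_lane_departure_time(lateral_offset_m, lane_half_width_m,
                                 yaw_angle_rad, vehicle_speed_mps):
    """Seconds until the vehicle is expected to cross the lane boundary.

    Hypothetical sketch: lateral velocity ~ v * sin(yaw); the patent
    leaves the exact estimation method unspecified.
    """
    lateral_speed = vehicle_speed_mps * math.sin(yaw_angle_rad)
    if lateral_speed <= 0.0:
        # drifting away from (or parallel to) the boundary: no departure
        return float("inf")
    remaining_m = lane_half_width_m - lateral_offset_m
    return remaining_m / lateral_speed
```

If this estimate falls below the threshold of S seconds described later, the situation is treated as a lane departure scene.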
- the application control unit 300 receives a request for the detection accuracy necessary to execute the alarm and control accurately and stably.
- the application control unit 300 changes the processing load distribution of the detection processing of the lane recognition unit 210, or the content of the detection processing itself, according to the accuracy requested by the alarm / vehicle control unit 400.
- the content of process execution is adjusted and controlled so that the processing completes within the prescribed processing cycle T.
- the accuracy required from the alarm / vehicle control unit 400 may be adjusted for each scene.
- the application control unit 300 controls the application so that, in the stage preceding the predicted time of the alarm and vehicle control, the output accuracy is changed to one suited to the content of the alarm and vehicle control to be implemented.
- FIG. 2 is a configuration diagram of a vehicle behavior unit.
- the vehicle behavior unit 500 outputs information on the behavior of the host vehicle to the alarm and vehicle control unit 400. Information on the behavior of the host vehicle is obtained from sensors attached to the vehicle.
- the vehicle behavior unit 500 includes a vehicle speed unit 510 that obtains the vehicle speed of the host vehicle from the vehicle speed sensor, a wheel speed unit 520 that obtains information from a sensor measuring the rotation speed of the wheels, a unit that obtains steering information from a sensor measuring the steering condition of the steering wheel, and the like.
- the alarm and vehicle control unit 400 integrates the vehicle behavior information and the information from the image recognition result and uses the information when judging the implementation of the alarm and vehicle control.
- FIG. 3 is a block diagram of the alarm and vehicle control unit.
- the alarm and vehicle control unit 400 uses as input information the lateral position, yaw angle, and curvature detected by the detection units 211 to 213 of the lane recognition unit 210, which are the recognition results of the in-vehicle environment recognition device 100 in this embodiment. The vehicle behavior information obtained from the vehicle behavior unit 500 described in FIG. 2 is also used as input information. From these inputs, the alarm / vehicle control unit 400 determines whether to carry out vehicle control for securing the safety of the host vehicle, such as lane departure suppression, or to sound an alarm calling the driver's attention to safety.
- the alarm and vehicle control unit 400 includes a prediction determination unit 410, a recognition output request unit 420, and an alarm and vehicle control execution unit 430.
- the prediction judgment unit 410 predicts whether there is a need for alarm / vehicle control for securing the safety of the host vehicle before actually performing alarm / vehicle control.
- depending on the prediction result, the value of the request in the recognition output request unit 420 is changed.
- the recognition output request unit 420 expresses, on a scale of degree of involvement, how much each recognition result affects the judgment and implementation of the alarm / vehicle control predicted by the prediction judgment unit 410, and requests the output accuracy of the recognition results according to the content of the alarm / vehicle control to be performed.
- the recognition output request unit 420 requests the application control unit 300 to change the output result of the environment recognition result to more appropriate accuracy.
- the recognition output request unit 420 requests that the recognition results be appropriately changed before the warning / vehicle control is performed; thereafter, when it is time to perform the warning / vehicle control, the alarm and vehicle control implementation unit 430 implements it using the more appropriate recognition output.
- FIG. 4 is a block diagram of the application control unit.
- the application control unit 300 adjusts the detection accuracy of at least one of the plurality of detection elements detected by the lane recognition unit 210 according to the situation of the vehicle (accuracy adjustment unit), and includes a required accuracy processing load conversion unit 310, a processing load adjustment unit 320, and an execution request unit 330.
- when alarm / vehicle control is predicted to be performed based on the situation of the vehicle, the required accuracy processing load conversion unit 310 converts the required detection accuracy, set in advance for each of the plurality of detection elements according to the vehicle situation, into a processing load for each of the detection units 211 to 213. Specifically, from the table of required detection accuracy values for each recognition output, expressed as degrees of involvement and created by the alarm / vehicle control unit 400, the processing load of each process A to C of the lane recognition application is converted according to the degree of involvement, together with the processing load for executing the other process Z, which does not affect recognition accuracy.
- the processing load is estimated for the case where the adjustment is made for the predicted alarm / vehicle control scene, while referring to the degree of involvement of the predicted scene, the degree of involvement at normal times, and the processing load at normal times.
- the processing load converted by the required accuracy processing load conversion unit 310 is calculated without considering whether all processes fit within the prescribed processing cycle T; when the processing of the detection units 211 to 213 is summed, it may not fit within the cycle T. Conversely, depending on the calculation result, the total processing time may be much shorter than the prescribed processing cycle T, leaving a large margin.
- the processing load adjustment unit 320 adjusts the processing load so that the conversion result fits within the prescribed processing cycle T. If there is a margin in the processing time, the overall accuracy is raised, changing to a more accurate and stable process.
- the execution request unit 330 issues an execution request to the lane recognition unit 210 so as to achieve more appropriate recognition output accuracy (detection accuracy).
- FIG. 5 is a table showing the degree of involvement of multiple processes in each scene and the ratio of the processing load, and shows the distribution of the processing load of the CPU based on the degree of involvement of alarm and vehicle control.
- FIG. 5 shows the degrees of involvement with which the alarm / vehicle control unit 400 requests the lane recognition application to output lane recognition results with more appropriate recognition accuracy.
- the table shows the situation (scene) of the vehicle together with the numerical degree of involvement of each recognition output result in each situation.
- the situation of the vehicle indicates a change in the vehicle condition relating to warning and vehicle control; in the present embodiment, low-speed lateral-velocity lane departure, high-speed lateral-velocity lane departure, and in-curve lane center traveling control are described.
- Normal refers to the lane recognition status when the vehicle is traveling within the lane and warnings and vehicle control for safety are not expected; specifically, the vehicle is traveling around the center of the traveling lane and the estimated lane departure time is S seconds or more.
- traveling along a curve with a radius of curvature of 800 m or less while the estimated departure time remains S seconds or more is referred to as in-curve lane center traveling control. When the predicted lane departure time is less than S seconds, as shown in FIG. 8, the yaw angle of the vehicle with respect to the lane is large and the lateral movement speed is high.
- when the lateral speed is F [m/s] or more, the scene is defined as high-speed lateral-velocity lane departure; conversely, when the lateral speed is less than the threshold F [m/s], it is defined as low-speed lateral-velocity lane departure.
- warning and vehicle control are therefore activated in three scenes: low-speed lateral-velocity lane departure (alarm and vehicle control), high-speed lateral-velocity lane departure (alarm and vehicle control), and in-curve lane center traveling control during curve driving.
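Putting the scene definitions above together, the classification might be sketched as follows. The default values for the thresholds S and F are placeholders, since the patent leaves them unspecified.

```python
def classify_scene(departure_time_s, lateral_speed_mps, curve_radius_m,
                   S=2.0, F=0.5):
    """Classify the driving scene per the definitions above.

    S [s] and F [m/s] are the patent's unspecified thresholds; the
    defaults here are illustrative only.
    """
    if departure_time_s >= S:
        # no imminent departure: either normal or in-curve center control
        if curve_radius_m <= 800.0:
            return "in-curve lane center traveling control"
        return "normal"
    # imminent departure: split by the lateral speed threshold F
    if lateral_speed_mps >= F:
        return "high-speed lateral-velocity lane departure"
    return "low-speed lateral-velocity lane departure"
```

The scene returned here is what selects a row in the involvement table of FIG. 5.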
- the degree of involvement of each detection element of the recognition output (lateral position, yaw angle, curvature) indicates how much each element is involved in performing the alarm or control in each scene.
- the degree of involvement is defined for the recognition outputs of the lane recognition application's recognition processes (lateral position detection process A, yaw angle detection process B, curvature detection process C) with respect to the warning and vehicle control to be implemented.
- the numerical values are set in advance so as to request the recognition accuracy suited to performing the alarm and vehicle control more accurately and stably.
- the degree of involvement is set by the recognition output request unit 420 of the alarm / vehicle control unit 400, and the application control unit 300 refers to the table of the degree of involvement in FIG.
- the table arranged in the center column of FIG. 5 is the processing load conversion table, calculated by the required accuracy processing load conversion unit 310 of the application control unit 300 upon receiving the degrees of involvement. It estimates how large the processing load on the CPU would be if the numerical values represented by the degrees of involvement were directly reflected in the processing.
- the rightmost column in FIG. 5 shows the processing load redistribution table, obtained by redistributing the processing load of each process so that the loads calculated by the processing load conversion table fall within the prescribed processing cycle T.
- the processing load redistribution table shows the result of redistributing the processing load of the CPU and adjusting the total to be 100% or less.
- the other process Z, which does not affect recognition accuracy, is a process that is always executed regardless of the lane recognition accuracy; its processing load is always constant, and it is calculated as being unaffected by the redistribution.
- first, the processing load of the processing load conversion table is secured for the portion of each degree of involvement at or below the normal degree of involvement. Then, for the portions exceeding the normal degree of involvement, the processing load is redistributed so that the total becomes 100%.
- the lateral position detection processing A is 46%
- the yaw angle detection processing B is 23%
- the curvature detection processing C is 16%
- the other processing Z is 20%
- the processing loads of processes B, C, and Z are kept as they are, without redistribution. That is, for process Z there is no way to reduce the processing load, and the CPU loads of processes B and C are already lower than at normal times, so further reduction must be avoided; hence redistribution is not applied to them.
- the processing load of process A is redistributed based on the following equation: A = 100% − (B + C + Z) = 100% − (23% + 16% + 20%) = 41%.
- process A is thus changed from 46% to 41% by redistribution.
- the processing load after redistribution at the time of departure from the low-speed lateral velocity lane is 41% for process A, 23% for process B, 16% for process C, and 20% for process Z.
- in other cases, the CPU load is redistributed in consideration of the magnitudes of the processing loads in the processing load conversion table.
- the conversion value of the processing load of processing A is 45%
- the conversion value of the processing load of processing B is 35%
- the total processing load after redistribution of processing A and processing B needs to be 75%.
- the redistribution values are calculated by the following formulas:
Redistribution value of process A = 75% × {A (45%) / (A (45%) + B (35%))} ≈ 42.2%
Redistribution value of process B = 75% × {B (35%) / (A (45%) + B (35%))} ≈ 32.8%
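Both redistribution cases above follow one rule: keep the fixed loads and rescale the remaining loads proportionally so the total fits the 100% budget. A sketch follows; in the second worked example the text states only that A and B must total 75%, so the 5% for process C and 20% for process Z used in the test are assumed values.

```python
def redistribute(loads, fixed, budget=100.0):
    """Proportionally rescale the non-fixed processing loads so the
    total fits the budget, as in the redistribution tables of FIG. 5.

    loads: dict of process name -> converted load (%).
    fixed: set of process names whose load must be kept as-is
           (e.g. process Z, or processes already below normal load).
    """
    fixed_total = sum(v for k, v in loads.items() if k in fixed)
    free = {k: v for k, v in loads.items() if k not in fixed}
    free_total = sum(free.values())
    target = budget - fixed_total
    if free_total <= target:
        # already fits within the cycle: keep the converted values
        return dict(loads)
    scale = target / free_total  # shrink adjustable loads proportionally
    return {k: (v if k in fixed else v * scale) for k, v in loads.items()}
```

With the low-speed-departure numbers (A fixed nowhere else but trimmed from 46% to 41%) and the proportional A/B example (45% and 35% scaled into a 75% budget), this reproduces the redistribution values in the text.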
- in the processing load redistribution table, the processing load adjustment unit 320 redistributes the processing load on the CPU so that the alarm and control can be performed accurately and stably while guaranteeing the minimum recognition accuracy.
- based on the redistribution values of the processing load, the execution request unit 330 issues a request to the lane recognition unit 210 to execute the recognition process for the next frame.
- FIG. 6 is a diagram showing the ratio of the processing load of a plurality of detection processes in each scene.
- by dynamically changing the distribution of the processing load among processes A to C of lane recognition according to the scene, more accurate and stable alarm and control are implemented. It can be confirmed that the processing load of a recognition process is increased in order to raise the accuracy of the recognition output regarded as important for the alarm or control in each scene.
- at the time of low-speed lateral-velocity lane departure, the processing load of process A is raised above normal.
- at the time of high-speed lateral-velocity lane departure, the processing load of process B is raised above normal.
- FIG. 7 is a processing flow diagram of a warning and vehicle control system using the in-vehicle environment recognition device.
- in step S1, the surroundings of the host vehicle are imaged by the imaging unit 110 (vehicle-mounted camera), and in step S2, lane recognition processing (processes A to C and Z) is performed using the image. Initially, each process is executed with its processing load at the normal time.
- in step S3, once the vehicle behavior and the recognition result of the lane recognition processing become available to the alarm and vehicle control unit 400, it is determined from the situation whether this is the first occurrence of an alarm / vehicle control scene. If the lane recognition result is not available, or if it is judged too early after start-up for the result to be used for alarm and vehicle control, the determination is No.
- the prediction determination unit 410 of the warning and vehicle control unit 400 predicts whether warning or vehicle control is likely to occur in the future and whether it is scheduled to be performed. When the situation is judged to be one of the alarm / vehicle control scenes described above for the first time, the determination is Yes; otherwise, or when implementation cannot be predicted, the determination is No.
- if No in step S3, only the normal processing of FIG. 5 is performed; the lane recognition application carries out recognition without changing the previous recognition process, the flow returns to the camera imaging, and the processing of the next frame is repeated.
- if Yes in step S3, that is, if the prediction determination unit 410 determines that warning and vehicle control are scheduled to be performed in the future, it predicts in what scene and in how many seconds the warning and vehicle control will be performed, and the process proceeds to step S4.
- in step S4, when the prediction by the prediction judgment unit 410 reaches the prescribed time beforehand, the content of the warning and vehicle control predicted to be implemented at that time, together with the time, is notified to the recognition output request unit 420.
- the recognition output request unit 420 then requests an improvement in the accuracy of the recognition outputs needed to carry out more appropriate alarm / vehicle control, according to the content of the alarm / vehicle control to be performed after the prescribed time.
- outputs not used for the alarm and vehicle control after the prescribed time are stopped, and outputs not regarded as important are reduced in accuracy, thereby reducing the processing load on the CPU; the processing load of outputs regarded as important is increased instead, which contributes to improving the accuracy and stability of the planned warning and vehicle control.
- stopping the recognition output, decreasing accuracy, maintaining accuracy, improving accuracy, and so on are adjusted based on the degree-of-involvement indicator for alarm and vehicle control.
- the degree of involvement is quantified by considering how much the recognition output is involved in alarm and vehicle control.
- the degree of involvement reflects, in a single indicator, the accuracy desired for performing highly accurate alarm and vehicle control and at least the accuracy to be guaranteed, without considering whether the recognition application completes within the prescribed processing cycle.
- in step S5, the required accuracy processing load conversion is performed. For example, if processes A to C and Z do not complete within the prescribed processing cycle T and the processing time is extended, the recognition result is output with a delay, which becomes a delay factor for the alarm and vehicle control and is fundamentally undesirable. For this reason, it is necessary first to determine whether the processing completes within the prescribed processing cycle.
- in step S5, the processing load in the case where the degrees of involvement for the predicted alarm / vehicle control scene are reflected without adjustment is estimated, referring to the degree of involvement of the predicted scene, the degree of involvement at normal times, and the processing load at normal times. The required accuracy processing load conversion is performed by the required accuracy processing load conversion unit 310 of the application control unit 300.
- in step S6, the processing load of each process of the lane recognition application is adjusted. If the lane recognition processes would finish early within the prescribed processing cycle T, or if it is clear that they would not finish within it, the processing load of each process is redistributed and adjusted. In the case of early termination, the loads are raised at the same rate so that the total reaches 100% of the prescribed processing cycle T, improving the overall recognition accuracy uniformly.
- the processing load rate in the processing load redistribution table is expressed with the total load that can be processed within one prescribed application cycle taken as 100%.
- the processing load adjustment is performed by the processing load adjustment unit 320 of the application control unit 300.
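The early-finish branch of step S6 raises the adjustable loads at a uniform rate until the total reaches 100% of the cycle. A sketch, with illustrative numbers and process Z treated as fixed per the description above:

```python
def scale_up(loads, fixed, budget=100.0):
    """When the converted loads would finish early (total < budget),
    raise the adjustable loads at a uniform rate so the total reaches
    the budget, improving overall recognition accuracy uniformly.

    loads: dict of process name -> converted load (%).
    fixed: set of process names kept as-is (e.g. process Z).
    """
    fixed_total = sum(v for k, v in loads.items() if k in fixed)
    free_total = sum(v for k, v in loads.items() if k not in fixed)
    scale = (budget - fixed_total) / free_total  # uniform magnification
    return {k: (v if k in fixed else v * scale) for k, v in loads.items()}
```

For example, converted loads of 30/20/10% for A/B/C with Z fixed at 20% total only 80%, so A/B/C are each magnified by 80/60 to fill the cycle.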
- step S7 after the adjustment of the processing load in step S6, an application execution request is issued so that each of the detection units 211 to 213 performs the recognition process according to the adjusted processing load.
- the application execution request is output from the execution request unit 330 of the application control unit 300 to the lane recognition unit 210.
- at the time of low-speed lateral-velocity lane departure, the lateral position is considered to be heavily involved in the alarm and vehicle control, so its degree of involvement is raised above normal, while the degrees of involvement of the other elements, the yaw angle and the curvature, are reduced.
- this improves the detection accuracy and stability of the lateral position, which is the important judgment criterion for warning and vehicle control at the time of low-speed lateral-velocity lane departure; by suppressing the processing load of the yaw angle and curvature, the accuracy and stability of the warning and vehicle control given to the driver can be improved.
- at the time of high-speed lateral-velocity lane departure, the yaw angle is heavily involved in the alarm and vehicle control, so its degree of involvement is raised above normal; for the lateral position the degree of involvement is substantially maintained so that its accuracy is maintained, and for the curvature the degree of involvement is greatly reduced.
- by improving the detection accuracy and stability of the yaw angle, which is the important judgment criterion for warning and vehicle control at the time of high-speed lateral-velocity lane departure, and suppressing the processing load of the curvature, the accuracy and stability of the warning and vehicle control given to the driver can be improved.
- during in-curve lane center traveling control, the degrees of involvement of the yaw angle and the curvature, which greatly influence the alarm and vehicle control, are raised above normal, and the degree of involvement of the lateral position, which does not greatly influence the control, is lowered.
- This improves the stability of the detection accuracy and stability of the yaw angle and the curvature, which are important judgment criteria for warning and vehicle control at the time of lane center traveling control in a curve, and suppresses the processing load at the lateral position. It is possible to improve the accuracy and stability of the alarm and vehicle control to be given.
- FIG. 8A is a view showing an output example of alarm/vehicle control at a sudden lane departure.
- FIG. 8B is a view schematically showing a sudden lane departure state.
- FIG. 8 shows an example in which the degree of involvement of the curvature is set to 0 at a sudden lane departure. Note that the pair of solid lines in FIG. 8B indicates the boundaries of the lane 11, and the dotted line indicates the center of the lane 11.
- with the degree of involvement of the curvature set to 0, only processes A and B are performed and process C is omitted. This is one embodiment in which the recognition output clearly changes before and after the alarm and vehicle control.
- FIG. 9(a) is a diagram showing an output example of lane keeping control along a curve.
- FIG. 9(b) is a diagram schematically showing a curve driving state.
- FIG. 9(a) shows an example in which the degree of involvement of the lateral position is set to 0 in two out of three frames during lane keeping control along the curve 12, reducing the number of executions of process A.
- the curvature term is weighted heavily in the calculation, and a slight delay in updating the lateral position is assumed to be acceptable.
- the lateral position output is skipped, the detection results of the yaw angle and the curvature are emphasized, and the accuracy of the alarm and vehicle control as a whole is improved.
- the processing load ratio among the plural recognition processes is changed according to the situation: the processing load is reduced for recognition processing of low importance and increased for recognition processing of high importance. Recognition processing can therefore be performed more accurately, and alarm and vehicle control based on the recognition results can be carried out precisely. That is, the recognition result can be obtained accurately in the situation where the alarm or the vehicle control is activated, and the environment around the vehicle can be recognized with higher accuracy. It thus becomes possible to correct the timing of the alarm and the vehicle control according to the vehicle condition, improve stability, and improve estimation and calculation accuracy.
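The scene-dependent reallocation described above can be sketched in a few lines. This is a minimal illustration, not the patent's implementation: the process names and the low-speed-departure percentages follow the worked examples in the description, while the other rows are assumptions for illustration only.

```python
# Degree-of-involvement tables per driving scene (percent of the
# lane-recognition budget). "normal" and "low_speed_departure" follow the
# figures given in the description; the remaining rows are illustrative.
INVOLVEMENT = {
    "normal":               {"A_lateral": 33, "B_yaw": 33, "C_curvature": 33},
    "low_speed_departure":  {"A_lateral": 50, "B_yaw": 25, "C_curvature": 25},
    "high_speed_departure": {"A_lateral": 30, "B_yaw": 50, "C_curvature": 20},
    "curve_centering":      {"A_lateral": 20, "B_yaw": 40, "C_curvature": 40},
}

def load_ratios(scene: str) -> dict:
    """Return each process's share of the recognition budget for a scene."""
    table = INVOLVEMENT[scene]
    total = sum(table.values())
    return {proc: share / total for proc, share in table.items()}

ratios = load_ratios("low_speed_departure")
# Lateral position gets the largest share when a slow drift-out is predicted.
assert max(ratios, key=ratios.get) == "A_lateral"
```

The point of the table-driven form is that switching scenes changes only the weights, not the recognition code itself.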
- FIG. 10 is a block diagram of the in-vehicle environment recognition device in the second embodiment.
- the detailed description thereof is omitted.
- the in-vehicle environment recognition device 101 includes an image processing unit 200 that processes an image captured by the imaging unit 110.
- the image processing unit 200 internally includes a plurality of applications, that is, a multi-application configuration that performs lane recognition and side vehicle detection, and includes a lane recognition unit 210 and a side vehicle detection unit 220.
- the side vehicle detection unit 220 has a left side detection unit 221 and a right side detection unit 222 as detection units whose output accuracy involved in alarm and vehicle control can be adjusted.
- the left side detection unit 221 detects the presence of a vehicle on the left side of the host vehicle
- the right side detection unit 222 detects the presence of a vehicle on the right side of the host vehicle.
- the image processing unit 200 outputs the recognition results of the lane recognition unit 210 and the side vehicle detection unit 220 to the alarm / vehicle control unit 400.
- the alarm and vehicle control unit 400 predicts and determines the implementation of the alarm and vehicle control together with the vehicle behavior information from the vehicle behavior unit 500.
- FIG. 11 is a table showing the degrees of involvement and processing loads of plural processes in each scene.
- FIG. 12 is a diagram showing the processing load ratios of plural processes in each scene.
- FIG. 13(a) shows an output example of alarm/vehicle control at lane departure.
- FIG. 13(b) schematically shows a lane departure state.
- FIG. 14(b) is a view schematically showing a sudden lane departure state.
- the % figures in the processing load redistribution table are calculated with the default processing time of the lane recognition unit, 50 msec, taken as 100%, and likewise with the default processing time of the vehicle detection unit, 50 msec, taken as 100%. If the default processing times of the applications differ, there is no problem with the total processing load as long as the calculation keeps the total within the specified period T. Therefore, if the default processing times of the applications differ, the total of the two applications' percentages in the processing load redistribution table may exceed 200%.
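The percentage bookkeeping described above can be made concrete with a small sketch. It is an illustration under the stated assumptions (each application's percentage is relative to its own default processing time, here 50 msec = 100%), so the constraint that matters is that the summed time fits the period T, not that the percentages sum to 200%.

```python
def percent_to_msec(percent: float, default_msec: float) -> float:
    """Convert an application's load percentage to actual processing time."""
    return default_msec * percent / 100.0

def fits_period(app_loads: dict, period_msec: float) -> bool:
    """app_loads maps application name -> (percent, default_msec).

    Returns True if the total processing time fits within the period.
    """
    total = sum(percent_to_msec(p, d) for p, d in app_loads.values())
    return total <= period_msec

# Lane recognition at 120% of a 50 msec default plus vehicle detection at
# 80% of a 50 msec default (200% in table terms) still fits a 100 msec period.
assert fits_period({"lane": (120, 50), "vehicle": (80, 50)}, 100)
```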
- the degree of involvement of the lateral position detection process A is raised from 33% at normal time to 50%.
- the degrees of involvement of the detection processes B and C are reduced from 33% at normal time to 25%.
- vehicle detection is performed only on the left side, with the same degree of involvement, 50%, as at normal time; setting the other side to 0 means a temporary stop of vehicle detection on that side.
- CPU load can be allocated to improve the recognition accuracy of the outputs related to alarm and vehicle control by reducing processing load unnecessary for alarm and vehicle control at low-speed lateral velocity lane departure.
- the accuracy and stability of alarm and vehicle control as perceived by the driver can be improved.
- in the processing load redistribution table, only the items related to recognition output accuracy are readjusted.
- the other process Z (20%) of the lane recognition unit 210 and the process Y (10%) of the vehicle detection unit 220 are processes required regardless of recognition output accuracy; they take processing time but do not affect recognition accuracy.
- there is no problem in performing load distribution in which the CPU processing load area originally allocated to side vehicle detection is used by lane recognition, as at low-speed or high-speed lateral velocity lane departure or in the lane-center control mode in a curve shown in the figure.
- the multi-application configuration can thus adaptively distribute the CPU processing load, making the accuracy improvement more effective.
- the lane recognition processing important for alarm and vehicle control can be made more accurate and stable by omitting unnecessary processing, such as detection of vehicles in the adjacent lane on the side opposite to the lane departure, thereby providing the driver with highly precise and highly stable alarm and vehicle control.
- the degree of involvement of the yaw angle detection process B is raised from 33% at normal time to 50%.
- the degree of involvement of the lateral position detection process A is substantially maintained at 30% so that its accuracy is preserved, and the degree of involvement of the curvature detection process C is greatly reduced from 33% at normal time to 20%.
- This improves the detection accuracy and stability of the yaw angle, which is an important judgment criterion for alarm and vehicle control at high-speed lateral velocity lane departure, and by suppressing the processing load of the curvature, the accuracy and stability of the alarm and vehicle control given to the driver can be improved.
- vehicle detection is performed only on the left side, with the same degree of involvement, 50%, as at normal time; setting the other side to 0 means a temporary stop of vehicle detection on that side.
- CPU load can be allocated to improve the recognition accuracy of the outputs related to alarm and vehicle control by reducing processing load unnecessary for alarm and vehicle control at high-speed lateral velocity lane departure. The accuracy and stability of alarm and vehicle control as perceived by the driver can thereby be improved.
- CPU processing load redistribution is performed.
- the CPU processing load area distributed as the processing load for side vehicle detection is assigned to the processing load for lane recognition during lane departure. By combining lane recognition, whose processing load is increased, with side vehicle detection, whose processing load can be reduced without affecting the user, the accuracy of recognition and judgment at the time of a lane departure can be improved without impairing the user's convenience.
- the processing load for vehicle detection is used for vehicle detection as it is, and the CPU load is redistributed so as to change the output accuracy within the processing load for lane recognition.
- the processing load after redistribution is shown in FIG.
- FIG. 13 is a diagram showing an output example of warning and vehicle control at the time of low-speed lateral velocity left lane departure
- FIG. 14 is a diagram showing an output example of warning and vehicle control at the time of high-speed lateral velocity left lane departure.
- FIG. 13 and FIG. 14 show recognition output examples when the degrees of involvement are manipulated to an extreme. In normal recognition output, the recognition outputs of lane recognition (lateral position, yaw angle, curvature) and vehicle detection (left vehicle, right vehicle) are all output. However, as shown in the figure, the degrees of involvement are adjusted greatly: the involvement is distributed so as to stop the recognition output results of the curvature (process C) and the left vehicle (process D), and the processing load is redistributed to the lateral position (process A), the yaw angle (process B), and the right vehicle (process E).
- the information on the recognition result is used in cooperation with the alarm/vehicle control unit that carries out the alarm or the vehicle control using that information.
- the processing resolution, image resolution, recognition distance, processing cycle, processing timing, and the like of each application can be changed so as to improve accuracy. Therefore, the recognition result can be obtained accurately in the situation where the alarm or the vehicle control is activated, and the environment around the vehicle can be recognized with higher accuracy.
- the present invention is not limited to the above-described embodiments, and various design changes can be made without departing from the spirit of the present invention described in the claims.
- the above-described embodiments have been described in detail in order to explain the present invention in an easy-to-understand manner, and the invention is not necessarily limited to one having all the described configurations.
- part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
- Vehicle recognition system, 110 Imaging unit, 210 Lane recognition unit, 211 Lateral position detection unit, 212 Yaw angle detection unit, 213 Curvature detection unit, 221 Left side detection unit, 222 Right side detection unit, 300 Application control unit (accuracy adjustment unit), 310 Required accuracy processing load conversion unit, 320 Processing load adjustment unit, 330 Execution request unit, 400 Alarm/vehicle control unit, 410 Prediction judgment unit, 420 Recognition output request unit, 430 Alarm/vehicle control implementation unit, 500 Vehicle behavior unit, 510 Vehicle speed unit, 520 Wheel speed unit, 530 Steering angle unit, 540 Yaw rate unit, 550 Lateral G unit
Abstract
Description
FIG. 1 is a configuration diagram of the in-vehicle environment recognition device 100 in the present embodiment.
The in-vehicle environment recognition device 100 is provided, for example, in a microcomputer of a camera device mounted on a vehicle, and is realized by executing a software program stored in ROM on a CPU.
The lane departure timing can be estimated from the lateral position, yaw angle, vehicle speed, and steering angle.
The vehicle behavior unit 500 outputs information on the behavior of the host vehicle to the alarm/vehicle control unit 400. The host vehicle behavior information is acquired from sensors mounted on the vehicle. The vehicle behavior unit 500 includes a vehicle speed unit 510 that obtains the host vehicle speed from a vehicle speed sensor, a wheel speed unit 520 that obtains information from sensors measuring wheel rotation speed, a steering angle unit 530 that obtains information from a sensor measuring the steering state of the steering wheel, a yaw rate unit 540 that obtains information from a sensor measuring the yaw rate, which indicates the degree of change in the vehicle's traveling direction, and a lateral G unit 550 that obtains information from a sensor measuring the vehicle's lateral acceleration. The alarm/vehicle control unit 400 integrates this vehicle behavior information with the image recognition results and uses them when determining whether to carry out alarm/vehicle control.
The alarm/vehicle control unit 400 uses as input the recognition results from the in-vehicle environment recognition device 100 — in this embodiment, the lateral position, yaw angle, and curvature detected by the detection units 211 to 213 of the lane recognition unit 210. It also uses as input the vehicle behavior information obtained from the vehicle behavior unit 500 described with reference to FIG. 2. From these inputs, the alarm/vehicle control unit 400 determines whether to carry out vehicle control for ensuring the safety of the host vehicle, such as lane departure suppression, or to sound an alarm to alert the driver.
The application control unit 300 adjusts the detection accuracy of at least one of the plural detection elements detected by the lane recognition unit 210 according to the situation of the host vehicle (accuracy adjustment unit), and includes a required accuracy processing load conversion unit 310, a processing load adjustment unit 320, and an execution request unit 330.
Lane recognition unit processing load (processing load of the lateral position at low-speed lateral velocity lane departure) =
([degree of involvement of lateral position detection process A at low-speed lateral velocity lane departure] / [degree of involvement of process A at normal time]) × [processing load of process A at normal time]
46 ≈ 45.454545… = (50 / 33) × 30
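The conversion shown above scales a process's normal-time load by the ratio of its scene involvement to its normal involvement. A minimal sketch follows; rounding the fractional result up to 46 matches the worked example, though the rounding rule itself is an assumption of this sketch.

```python
import math

def converted_load(scene_involvement: float,
                   normal_involvement: float,
                   normal_load: float) -> int:
    """Scale the normal-time processing load by the involvement ratio.

    Rounds up, on the assumption that a fractional percent of the CPU
    budget must be reserved whole (matching 45.45... -> 46 in the text).
    """
    return math.ceil(scene_involvement / normal_involvement * normal_load)

# Worked example from the description: (50 / 33) x 30 = 45.45... -> 46.
assert converted_load(50, 33, 30) == 46
```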
This processing load redistribution table shows the result of redistributing the CPU processing load so that the total fits within 100%. However, among the plural processes, the other process Z, which does not affect recognition accuracy, is always executed regardless of lane recognition accuracy; its processing load is constant and is calculated so as to be unaffected by the redistribution.
100 − (converted processing load of process B) − (converted processing load of process C) − (converted processing load of process Z)
= processing load of process A after redistribution
100 − 23% − 16% − 20% = 41%
Redistributed value of process A: 42.2% ≈ 75% × {A(45%) / (A(45%) + B(35%))}
Redistributed value of process B: 32.8% ≈ 75% × {B(35%) / (A(45%) + B(35%))}
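The two lines above renormalize the requested loads of processes A and B into the remaining 75% budget, each in proportion to its share of the combined request. A short sketch of that proportional redistribution, using the same numbers:

```python
def redistribute(budget: float, requests: dict) -> dict:
    """Scale each requested load so the total exactly fills the budget,
    preserving the ratio between the requests."""
    total = sum(requests.values())
    return {name: budget * req / total for name, req in requests.items()}

result = redistribute(75.0, {"A": 45.0, "B": 35.0})
# Matches the worked example: A -> 42.2%, B -> 32.8% (one decimal place).
assert round(result["A"], 1) == 42.2
assert round(result["B"], 1) == 32.8
```

The same function applies to any number of processes, since only the ratio of the requests matters.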
In scenes where alarm/vehicle control is carried out (low-speed lateral velocity lane departure, high-speed lateral velocity lane departure, lane-center traveling control in a curve), the allocation of the processing loads of the lane recognition processes A to C is changed dynamically relative to normal time, realizing more accurate and stable alarms and control. It can be confirmed that the processing load of recognition processing is raised in order to improve the accuracy of the recognition outputs regarded as important when carrying out alarms and control in each scene.
In step S1, the surroundings of the host vehicle are imaged by the imaging unit 110 (in-vehicle camera), and in step S2, lane recognition processing (processes A to C and Z) is carried out using the image. The initial operation is carried out with the normal-time processing load.
(1) Low-speed lateral velocity lane departure
In a lane departure at low lateral velocity, the inclination of the vehicle with respect to the lane can be assumed to be small, and the detection accuracy of the yaw angle, which indicates that inclination, need not be very high. Future alarm/vehicle control can be predicted simply by watching the constantly changing lateral position value. As for the curvature, the need for updates immediately before alarm/vehicle control is considered low.
In a lane departure at high lateral velocity, the inclination of the vehicle with respect to the lane is large, and the detection accuracy of the yaw angle, which indicates that inclination, becomes an important factor in predicting whether the vehicle will depart from the lane. The lateral position also requires a certain degree of accuracy. On the other hand, the need to update the curvature immediately before alarm/vehicle control is considered low.
During lane-center traveling control in a curve, making the host vehicle travel along the curve of the road is the important factor. Consequently, a slight error in the detection accuracy of the lateral position within the lane has little influence, and a somewhat slower update cycle of the lateral position has no major effect either. On the other hand, to make the host vehicle travel along the road, the yaw angle of the host vehicle and the curvature indicating how the road bends are the important factors. Therefore, the degrees of involvement of the yaw angle and the curvature, which greatly influence alarm/vehicle control, are raised above normal, and the degree of involvement of the lateral position, which does not greatly influence the control, is lowered. This improves the detection accuracy and stability of the yaw angle and curvature, which are important judgment criteria for alarm/vehicle control during lane-center traveling control in a curve, and by suppressing the processing load of the lateral position, the accuracy and stability of the alarm/vehicle control given to the driver can be improved.
FIG. 9(a) shows an example in which the involvement of the lateral position is set to 0 in two out of three frames during lane keeping control along the curve 12, reducing the number of executions of process A. During lane keeping control along the curve 12, traveling along the bend of the lane takes priority, so stable travel along the lane is regarded as more important than a slight deviation from the lane center. The curvature term is therefore weighted heavily in the calculation, and a slight delay in updating the lateral position is assumed to be acceptable. By setting the lateral position involvement to 0, its output is skipped, the detection results of the yaw angle and curvature are emphasized, and the accuracy of the alarm/vehicle control as a whole is improved.
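The frame-skipping described above — running the lateral position process A only once every three frames while yaw angle and curvature run every frame — can be sketched as a simple per-frame scheduler. The 1-in-3 cadence follows the example in the text; the function and process names are illustrative.

```python
def processes_for_frame(frame_index: int) -> list:
    """Return the detection processes scheduled for one frame during
    lane-center traveling control in a curve."""
    scheduled = ["B_yaw", "C_curvature"]   # run every frame
    if frame_index % 3 == 0:               # lateral position: 1 frame in 3
        scheduled.insert(0, "A_lateral")
    return scheduled

runs = [processes_for_frame(i) for i in range(6)]
assert sum("A_lateral" in r for r in runs) == 2   # 2 executions in 6 frames
assert all("B_yaw" in r and "C_curvature" in r for r in runs)
```

Skipping the low-priority process frees its share of the frame budget for the yaw angle and curvature processes, which is the effect the passage describes.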
Next, the second embodiment of the present invention will be described. FIG. 10 is a configuration diagram of the in-vehicle environment recognition device in the second embodiment. Components identical to those in the first embodiment are given the same reference numerals, and detailed descriptions thereof are omitted.
FIG. 11 is a table showing the degrees of involvement and processing load ratios of plural processes in each scene, FIG. 12 is a diagram showing the processing load ratios of plural processes in each scene, FIG. 13(a) is a diagram showing an output example of alarm/vehicle control at low-speed lateral velocity left lane departure, FIG. 13(b) is a diagram schematically showing a lane departure state, FIG. 14(a) is a diagram showing an output example of alarm/vehicle control at high-speed lateral velocity left lane departure, and FIG. 14(b) is a diagram schematically showing a sudden lane departure state.
In a lane departure at low lateral velocity, the inclination of the host vehicle 1 with respect to the lane 11 can be assumed to be small, as shown in FIG. 13(b). Therefore, as far as lane recognition is concerned, the recognition accuracy of the yaw angle, which indicates the inclination of the host vehicle 1 with respect to the lane 11, need not be very high. Future alarm/vehicle control can be predicted simply by watching the constantly changing lateral position value. As for the curvature, the need for updates immediately before alarm/vehicle control is considered low.
In a lane departure at high lateral velocity, the inclination of the host vehicle 1 with respect to the lane 11 can be assumed to be large, as shown in FIG. 14. Therefore, as far as lane recognition is concerned, the recognition accuracy of the yaw angle, which indicates the inclination of the host vehicle 1 with respect to the lane 11, becomes an important factor in predicting whether the vehicle will depart from the lane, and the lateral position also requires a certain degree of accuracy. On the other hand, the need to update the curvature immediately before alarm/vehicle control is considered low.
During lane-center traveling control in a curve, making the host vehicle travel along the curve of the road is the important factor. Consequently, a slight error in the recognition accuracy of the lateral position within the lane has little influence, and a somewhat slower update cycle of the lateral position has no major effect either.
On the other hand, to make the host vehicle travel along the road, the yaw angle of the host vehicle and the curvature indicating how the road bends are the important factors. Therefore, the degrees of involvement of the yaw angle and the curvature, which greatly influence alarm/vehicle control, are raised, and the degree of involvement of the lateral position, which does not greatly influence the control, is lowered.
FIG. 13 and FIG. 14 show recognition output examples when the degrees of involvement are manipulated to an extreme. In normal recognition output, the recognition outputs of lane recognition (lateral position, yaw angle, curvature) and vehicle detection (left vehicle, right vehicle) are all output. However, as shown in FIG. 13, in an example in which extreme processing load redistribution is performed at low-speed lateral velocity left lane departure, the degrees of involvement are adjusted greatly: the involvement is distributed so as to stop the recognition outputs of the yaw angle (process B), the curvature (process C), and the right vehicle (process E), and the processing load is redistributed to the lateral position (process A) and the left vehicle (process D). This allows the user of the alarm/vehicle control to perceive it as though its accuracy has improved.
110 Imaging unit
210 Lane recognition unit
211 Lateral position detection unit
212 Yaw angle detection unit
213 Curvature detection unit
221 Left side detection unit
222 Right side detection unit
300 Application control unit (accuracy adjustment unit)
310 Required accuracy processing load conversion unit
320 Processing load adjustment unit
330 Execution request unit
400 Alarm/vehicle control unit
410 Prediction judgment unit
420 Recognition output request unit
430 Alarm/vehicle control implementation unit
500 Vehicle behavior unit
510 Vehicle speed unit
520 Wheel speed unit
530 Steering angle unit
540 Yaw rate unit
550 Lateral G unit
Claims (5)
- An in-vehicle environment recognition device that recognizes the environment around a host vehicle based on an image captured by an imaging unit, the device comprising: a detection unit that detects a plurality of preset detection elements from the image; and an accuracy adjustment unit that adjusts the detection accuracy of at least one of the plurality of detection elements according to the situation of the host vehicle.
- The in-vehicle environment recognition device according to claim 1, wherein the accuracy adjustment unit includes: a required accuracy processing load conversion unit that, when it is predicted based on the situation of the host vehicle that alarm/vehicle control will be carried out, converts the processing load incurred when the detection unit executes the detection processes for detecting the plurality of detection elements, based on required detection accuracy values preset for each of the plurality of detection elements according to the situation of the host vehicle; and a processing load adjustment unit that determines whether the detection processing fits within a specified processing cycle when the detection unit executes the detection processing at the processing load converted by the required accuracy processing load conversion unit, and adjusts the processing load of at least one of the plurality of detection processes according to the result of the determination.
- The in-vehicle environment recognition device according to claim 2, wherein the processing load adjustment unit reduces the processing load of at least one of the plurality of detection processes when the detection processing does not fit within the specified processing cycle.
- The in-vehicle environment recognition device according to claim 2, wherein the processing load adjustment unit increases the processing load of at least one of the plurality of detection processes when the processing time of the detection processing is shorter than the specified processing cycle.
- The in-vehicle environment recognition device according to claim 1, wherein the accuracy adjustment unit omits at least one of the plurality of detection processes according to the situation of the host vehicle.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP12849641.1A EP2782085A1 (en) | 2011-11-15 | 2012-10-05 | Onboard environment-recognition device |
KR1020147008624A KR20140057643A (ko) | 2011-11-15 | 2012-10-05 | 차재용 환경 인식 장치 |
US14/358,390 US20140300731A1 (en) | 2011-11-15 | 2012-10-05 | Onboard Environment-Recognition Device |
CN201280056223.9A CN103930938A (zh) | 2011-11-15 | 2012-10-05 | 车载用环境识别装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011-249816 | 2011-11-15 | ||
JP2011249816A JP5628137B2 (ja) | 2011-11-15 | 2011-11-15 | 車載用環境認識装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013073310A1 true WO2013073310A1 (ja) | 2013-05-23 |
Family
ID=48429380
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/075886 WO2013073310A1 (ja) | 2011-11-15 | 2012-10-05 | 車載用環境認識装置 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20140300731A1 (ja) |
EP (1) | EP2782085A1 (ja) |
JP (1) | JP5628137B2 (ja) |
KR (1) | KR20140057643A (ja) |
CN (1) | CN103930938A (ja) |
WO (1) | WO2013073310A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10107640B2 (en) * | 2014-02-28 | 2018-10-23 | Bridgestone Corporation | Vehicular travel condition monitoring apparatus |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10367869B2 (en) * | 2014-12-30 | 2019-07-30 | Ford Global Technologies, Llc | Remote vehicle control and operation |
DE112016001839B4 (de) * | 2015-04-23 | 2022-03-24 | Mitsubishi Electric Corporation | Präsentationsplan-erzeugungsvorrichtung, informations-präsentationsvorrichtung und präsentationsplan-erzeugungsverfahren |
JP6316265B2 (ja) * | 2015-12-01 | 2018-04-25 | 本田技研工業株式会社 | 車線変更制御装置 |
JP6809023B2 (ja) * | 2016-08-02 | 2021-01-06 | いすゞ自動車株式会社 | 操舵補助装置及び操舵補助方法 |
DE102016217636A1 (de) * | 2016-09-15 | 2018-03-15 | Robert Bosch Gmbh | Bildverarbeitungsalgorithmus |
US10353393B2 (en) * | 2016-12-29 | 2019-07-16 | Baidu Usa Llc | Method and system for improving stability of autonomous driving vehicles |
JP7350188B2 (ja) * | 2020-08-27 | 2023-09-25 | 三菱電機株式会社 | 運転支援装置、学習装置、運転支援方法、運転支援プログラム、学習済モデルの生成方法、学習済モデル生成プログラム |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006185045A (ja) * | 2004-12-27 | 2006-07-13 | Mitsubishi Motors Corp | ノーズビューモニタ装置 |
WO2008062512A1 (fr) | 2006-11-21 | 2008-05-29 | Fujitsu Limited | Système multiprocesseur |
JP2009116539A (ja) * | 2007-11-05 | 2009-05-28 | Fujitsu Ten Ltd | 周辺監視装置 |
WO2010038851A1 (ja) * | 2008-10-02 | 2010-04-08 | 日立オートモティブシステムズ株式会社 | 車両走行に関する情報処理装置 |
JP2011100338A (ja) * | 2009-11-06 | 2011-05-19 | Hitachi Automotive Systems Ltd | 車載用マルチアプリ実行装置 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4055656B2 (ja) * | 2003-05-30 | 2008-03-05 | トヨタ自動車株式会社 | 衝突予測装置 |
JP2007172035A (ja) * | 2005-12-19 | 2007-07-05 | Fujitsu Ten Ltd | 車載画像認識装置、車載撮像装置、車載撮像制御装置、警告処理装置、画像認識方法、撮像方法および撮像制御方法 |
JP4372804B2 (ja) * | 2007-05-09 | 2009-11-25 | トヨタ自動車株式会社 | 画像処理装置 |
JP5094658B2 (ja) * | 2008-09-19 | 2012-12-12 | 日立オートモティブシステムズ株式会社 | 走行環境認識装置 |
- 2011
- 2011-11-15 JP JP2011249816A patent/JP5628137B2/ja not_active Expired - Fee Related
- 2012
- 2012-10-05 CN CN201280056223.9A patent/CN103930938A/zh active Pending
- 2012-10-05 KR KR1020147008624A patent/KR20140057643A/ko not_active Application Discontinuation
- 2012-10-05 US US14/358,390 patent/US20140300731A1/en not_active Abandoned
- 2012-10-05 WO PCT/JP2012/075886 patent/WO2013073310A1/ja active Application Filing
- 2012-10-05 EP EP12849641.1A patent/EP2782085A1/en not_active Withdrawn
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006185045A (ja) * | 2004-12-27 | 2006-07-13 | Mitsubishi Motors Corp | ノーズビューモニタ装置 |
WO2008062512A1 (fr) | 2006-11-21 | 2008-05-29 | Fujitsu Limited | Système multiprocesseur |
JP2009116539A (ja) * | 2007-11-05 | 2009-05-28 | Fujitsu Ten Ltd | 周辺監視装置 |
WO2010038851A1 (ja) * | 2008-10-02 | 2010-04-08 | 日立オートモティブシステムズ株式会社 | 車両走行に関する情報処理装置 |
JP2011100338A (ja) * | 2009-11-06 | 2011-05-19 | Hitachi Automotive Systems Ltd | 車載用マルチアプリ実行装置 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10107640B2 (en) * | 2014-02-28 | 2018-10-23 | Bridgestone Corporation | Vehicular travel condition monitoring apparatus |
Also Published As
Publication number | Publication date |
---|---|
KR20140057643A (ko) | 2014-05-13 |
JP2013105385A (ja) | 2013-05-30 |
EP2782085A1 (en) | 2014-09-24 |
CN103930938A (zh) | 2014-07-16 |
JP5628137B2 (ja) | 2014-11-19 |
US20140300731A1 (en) | 2014-10-09 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12849641 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20147008624 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012849641 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14358390 Country of ref document: US |