US11772644B2 - Damage reduction device, damage reduction method, and program - Google Patents


Info

Publication number
US11772644B2
Authority
US
United States
Prior art keywords
collision
vehicle
subject vehicle
passenger
damage reduction
Prior art date
Legal status
Active
Application number
US17/570,319
Other versions
US20220126821A1
Inventor
Hideki Oyaizu
Yuhi Kondo
Yasutaka Hirasawa
Suguru Aoki
Taketo Akama
Current Assignee
Sony Group Corp
Original Assignee
Sony Group Corp
Priority date
Filing date
Publication date
Application filed by Sony Group Corp filed Critical Sony Group Corp
Priority to US17/570,319
Assigned to SONY GROUP CORPORATION: change of name (see document for details). Assignors: SONY CORPORATION
Assigned to SONY CORPORATION: assignment of assignors' interest (see document for details). Assignors: AKAMA, Taketo; KONDO, YUHI; AOKI, SUGURU; HIRASAWA, YASUTAKA; OYAIZU, HIDEKI
Publication of US20220126821A1
Application granted
Publication of US11772644B2

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095Predicting travel path or likelihood of collision
    • B60W30/0956Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/013Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over
    • B60R21/0134Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over responsive to imminent contact with an obstacle, e.g. using radar systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09Taking automatic action to avoid collision, e.g. braking and steering
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095Predicting travel path or likelihood of collision
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D15/00Steering not otherwise provided for
    • B62D15/02Steering position indicators ; Steering position determination; Steering aids
    • B62D15/025Active steering aids, e.g. helping the driver by actively influencing the steering system after environment evaluation
    • B62D15/0265Automatic obstacle avoidance by steering
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D6/00Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/584Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/166Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R2021/01204Actuation parameters of safety arrangements
    • B60R2021/01252Devices other than bags
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2710/00Output or target parameters relating to particular sub-units
    • B60W2710/20Steering systems
    • B60W2710/207Steering angle of wheels
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details

Definitions

  • the present technology relates to an apparatus, method, and program for reducing damage of a collision accident that occurs in driving of an automobile, for example.
  • the present technology is applicable not only to automobiles but also to various moving body apparatuses such as ships and autonomous traveling robots and is also applicable to various technical fields including simulation apparatuses and games of those above.
  • An automatic emergency brake that brakes automatically and an automatic avoidance technology of automatically avoiding collisions have been developed recently. Further, there has also been known a technology of minimizing, in a case where a collision with an object cannot be avoided by the automatic emergency brake, damage to a subject vehicle, a pedestrian, or the surrounding environment by use of a collision damage reduction brake, a collision avoidance system, and the like.
  • Patent Literature 1 discloses a technology of controlling, when a collision is unavoidable and when another vehicle is about to collide with a subject vehicle, the subject vehicle to collide at an area of the subject vehicle where there is no passenger or where the strength of components is high.
  • Patent Literature 2 discloses a technology of causing a collision at an area of the subject vehicle where a collision object receives less shock.
  • Patent Literature 3 discloses a technology of controlling, when a collision is unavoidable, the travel of the subject vehicle such that the other vehicle receiving the collision is struck at a collision site excluding its cabin.
  • Patent Literatures 1 and 2 are for minimizing damage to the subject vehicle and do not disclose a technology for also minimizing damage to the other party's vehicle.
  • Patent Literature 3 simply avoids a collision with the cabin portion of the other party's vehicle and does not actually determine whether there is a passenger therein.
  • Patent Literatures 1 to 3 disclose nothing about performing different control depending on the objects in a case where there are multiple collision objects other than the vehicle.
  • a damage reduction device includes an input unit, a prediction unit, a recognition unit, and a determination unit.
  • the input unit inputs status data regarding a status in a moving direction of a moving body apparatus.
  • the prediction unit predicts a collision with an object in the moving direction on the basis of the status data.
  • the recognition unit recognizes whether the object includes a person.
  • the determination unit determines, when the collision with the object is predicted and it is recognized that the object includes a person, a steering direction of the moving body apparatus in which a collision with the person is avoidable, on the basis of the status data.
  • the damage reduction device determines a steering direction of the moving body apparatus in which a collision with the person is avoidable, on the basis of the status data. This enables movement control of the moving body apparatus in which priority is given to avoidance of the collision with the person.
  • the determination unit may be configured to determine, when it is predicted that a collision with the object other than the person is unavoidable, a collision mode of the moving body apparatus to the object on the basis of a type of the object.
  • the determination unit may be configured to determine, when the object is recognized as a manned moving body, a target collision site of the object, with which the moving body apparatus is to collide, and determine, when the object is recognized as an unmanned structure, a target collision site of the moving body apparatus that collides with the object.
  • the status data may include object passenger data regarding a sitting position of a passenger in the manned moving body.
  • the determination unit determines, when the object is recognized as the manned moving body, a target collision site of the manned moving body on the basis of the object passenger data.
  • the damage reduction device may further include an object passenger estimation unit that estimates the sitting position of the passenger of the manned moving body on the basis of the object passenger data.
  • the determination unit determines a non-sitting position of the passenger of the manned moving body or a vicinity thereof as a target collision site of the manned moving body.
  • the input unit may be configured to further input moving-body passenger data regarding a sitting position of a passenger of the moving body apparatus.
  • the damage reduction device further includes a passenger grasping unit that grasps the sitting position of the passenger of the moving body apparatus on the basis of the moving-body passenger data, and the determination unit further determines a target collision site of the moving body apparatus that collides with the object, on the basis of the status data and the moving-body passenger data.
  • the determination unit may determine a non-sitting position of the passenger of the moving body apparatus or a vicinity thereof as a target collision site of the moving body apparatus.
  • the damage reduction device may further include an output unit that outputs control data for moving the moving body apparatus in the steering direction determined by the determination unit, to a control unit that controls a moving operation of the moving body apparatus.
  • the damage reduction device may be mounted to the moving body apparatus that is equipped with at least a distance sensor for the moving direction and a front camera.
  • the input unit inputs data from the distance sensor and the front camera as the status data.
  • a damage reduction method includes inputting status data regarding a status in a moving direction of a moving body apparatus.
  • a collision with an object in the moving direction is predicted on the basis of the status data.
  • Whether the object includes a person is recognized.
  • a steering direction of the moving body apparatus in which a collision with the person is avoidable is determined on the basis of the status data.
  • a program causes a computer to execute the steps of: inputting status data regarding a status in a moving direction of a moving body apparatus; predicting a collision with an object in the moving direction on the basis of the status data; recognizing whether the object includes a person; and determining, when the collision with the object is predicted and it is recognized that the object includes a person, a steering direction of the moving body apparatus in which a collision with the person is avoidable, on the basis of the status data.
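The claimed steps (input, prediction, recognition, determination) can be sketched as a simple decision pipeline. The following Python sketch is illustrative only; the names `Status` and `determine_steering`, and the idea of a precomputed list of person-free directions, are assumptions for the example, not the patent's implementation.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Status:
    """Hypothetical status data for the moving direction."""
    collision_predicted: bool      # output of the prediction unit
    object_includes_person: bool   # output of the recognition unit
    clear_directions: List[str]    # steering directions judged free of the person

def determine_steering(status: Status) -> Optional[str]:
    """Return a steering direction in which a collision with the person
    is avoidable, or None when no such determination is required."""
    if status.collision_predicted and status.object_includes_person:
        # priority is given to avoiding the collision with the person
        return status.clear_directions[0] if status.clear_directions else None
    return None
```

For example, with a predicted collision, a person recognized, and the left side clear, the sketch returns "left"; with no person recognized it returns None and other control (e.g., braking) would apply.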
  • FIG. 1 is an outer appearance view of a configuration of an automobile as an example of a moving body apparatus equipped with a damage reduction device according to an embodiment of the present technology.
  • FIG. 2 is a block diagram showing a configuration of the automobile.
  • FIG. 3 is a block diagram showing a configuration of the damage reduction device.
  • FIG. 4 is a flowchart for describing an action of the damage reduction device.
  • FIG. 5 is a schematic plan view for describing an operation example of the automobile.
  • FIG. 6 is a schematic plan view for describing another operation example of the automobile.
  • FIG. 7 is a diagram for describing another embodiment of the damage reduction device.
  • FIG. 8 is a flowchart for describing another action of the damage reduction device.
  • FIG. 9 is a schematic plan view for describing an operation example of an automobile in another embodiment of the present technology.
  • FIG. 10 is a schematic plan view for describing another operation example of the automobile.
  • FIG. 11 is a schematic plan view for describing a modified example of an operation of the automobile in another embodiment of the present technology.
  • FIG. 12 is a schematic plan view for describing another modified example of an operation of the automobile.
  • a technology disclosed in this specification aims at suppressing, to a minimum, damage caused by a collision, particularly human damage, in a situation where a moving body apparatus cannot avoid a collision with an object, such as a different manned moving body, a person, or a structure, that is present in the moving direction of the moving body apparatus while it is traveling.
  • the moving body apparatus described above is, typically, a vehicle (subject vehicle) such as an automobile.
  • the different manned moving body described above is typically the other party's vehicle (different vehicle) that receives a collision from the subject vehicle.
  • the person described above typically corresponds to a passer-by such as a pedestrian and includes, in addition thereto, a passenger of the subject vehicle or the different vehicle.
  • the structure described above typically corresponds to a road installation object such as a utility pole, a signal, a street tree, a wall, or a guardrail that is installed on a road or the like and includes, in addition thereto, an unmanned and parked vehicle, for example.
  • a collision object (object) will be exemplified mainly as a vehicle, a structure, or a person, but it is needless to say that the present technology is not limited to those embodiments.
  • FIG. 1 is an outer appearance view of a configuration of an automobile as an example of a moving body apparatus equipped with a damage reduction device according to an embodiment of the present technology.
  • FIG. 2 is a block diagram thereof.
  • an automobile 100 (hereinafter, also referred to as subject vehicle) includes a distance sensor 110 for a moving direction, a front camera 120 , and a vehicle interior imaging camera 130 that images a passenger status within the vehicle.
  • the automobile 100 includes a steering device 140 , a brake device 150 , a vehicle body acceleration device 160 , a steering angle sensor 141 , a wheel speed sensor 151 , a brake switch 152 , an accelerator sensor 161 , a control unit 10 , and a damage reduction device 1 .
  • the distance sensor 110 is installed, for example, substantially at the center of the front part of the automobile 100 and outputs, to the control unit 10 , data regarding a distance between the automobile 100 and a physical object present in a moving direction thereof.
  • the output of the distance sensor 110 is referred to, for example, for calculation of a relative distance, a relative speed, or a relative acceleration with respect to a physical object (vehicle, pedestrian, structure, or the like) present in front of the subject vehicle, as will be described later.
  • the distance sensor 110 includes, for example, various sensors using a millimeter-wave radar, an infrared laser, and the like.
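The relative distance, speed, and acceleration that the distance sensor's output is referred to for can be estimated by finite differences over successive range samples. This is a minimal sketch under that assumption; the function name and the three-sample scheme are illustrative, not the patent's method.

```python
def relative_kinematics(ranges, dt):
    """Estimate relative speed and acceleration of a forward object from
    successive distance-sensor readings (metres) sampled every dt seconds,
    using finite differences. A negative speed means the object is closing."""
    if len(ranges) < 3:
        raise ValueError("need at least three range samples")
    v_prev = (ranges[-2] - ranges[-3]) / dt   # speed one sample ago
    v_now = (ranges[-1] - ranges[-2]) / dt    # current speed
    accel = (v_now - v_prev) / dt
    return ranges[-1], v_now, accel
```

With readings of 30 m, 28 m, and 26 m at 0.1 s intervals, the sketch reports a closing speed of about 20 m/s and zero relative acceleration.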
  • the front camera 120 is installed in, for example, a cabin or roof part of the automobile 100 and images the forward field of view of the automobile 100 at a predetermined frame rate.
  • the image data captured by the front camera 120 is output to the control unit 10 and is, as will be described later, referred to for determination of the type of the physical object (vehicle, pedestrian, structure, or the like) present in front of the subject vehicle, for calculation of a riding position of a passenger within the vehicle, and relative positions of the physical object described above and the subject vehicle, and the like.
  • the front camera 120 includes, for example, an image sensor such as a CMOS or a CCD.
  • the vehicle interior imaging camera 130 is installed in the cabin of the automobile 100 and images the interior status of the cabin at a predetermined frame rate.
  • the image data captured by the vehicle interior imaging camera 130 is output to the control unit 10 and is, as will be described later, referred to for determination of the presence or absence of a passenger of the subject vehicle and the riding position thereof.
  • the vehicle interior imaging camera 130 includes, for example, an image sensor such as a CMOS or a CCD.
  • the distance sensor 110 , the front camera 120 , and the vehicle interior imaging camera 130 may be configured such that the outputs therefrom are supplied to the damage reduction device 1 , instead of the configuration in which the outputs therefrom are supplied to the control unit 10 as shown in FIG. 2 .
  • the steering device 140 typically includes a power steering device and transmits a driver's steering wheel operation to the steered wheels.
  • the brake device 150 includes brake actuators attached to respective wheels and a hydraulic circuit that actuates those brake actuators, and transmits an operational force by depressing a brake pedal to the brake actuators via the hydraulic circuit.
  • the brake device 150 typically has an ABS control function for preventing lock (slip) of the wheels or a traction control function for preventing driving slip of drive wheels.
  • the vehicle body acceleration device 160 includes a throttle valve, a fuel injector, and the like and controls a rotational acceleration of the drive wheels.
  • the control unit 10 controls the steering device 140 , the brake device 150 , and the vehicle body acceleration device 160 .
  • the control unit 10 detects a steering amount and a steering direction and controls the steering device 140 , on the basis of the output of the steering angle sensor 141 that detects the driver's steering wheel operation.
  • the control unit 10 calculates a vehicle body speed of the vehicle and also controls the brake device 150 so as to prevent the lock (slip) of the wheels, on the basis of the outputs of the wheel speed sensors 151 installed on all of the wheels or some of the wheels.
  • the brake switch 152 is for detecting a brake operation (depression of the brake pedal) by the driver, and is referred to in the ABS control and the like.
  • the control unit 10 controls the vehicle body acceleration device 160 on the basis of the output of the accelerator sensor 161 that detects an operation amount of an accelerator pedal of the driver.
  • the control unit 10 may control some of the steering device 140, the brake device 150, and the vehicle body acceleration device 160 in cooperation with one another, in addition to controlling them individually. This enables the automobile 100 to be controlled to a desired posture in turning, braking, acceleration, and the like.
  • the control unit 10 is configured to be capable of controlling the steering device 140, the brake device 150, and the vehicle body acceleration device 160 irrespective of the above-mentioned various operations of the driver.
  • the automobile 100 may have an automated driving function.
  • the control unit 10 takes the initiative in controlling the devices described above on the basis of the outputs of the sensors and cameras described above.
  • the control unit 10 is configured to be capable of controlling at least one of the devices described above on the basis of the output of the damage reduction device 1 that will be described later.
  • the control unit 10 may be an aggregate of ECUs that individually control the steering device 140, the brake device 150, and the vehicle body acceleration device 160, or may be a single controller that collectively controls those devices. Further, the steering device 140, the brake device 150, and the vehicle body acceleration device 160 may individually include the ECUs described above. In this case, the control unit 10 is configured to individually output a control signal to the ECUs of the respective devices.
  • the damage reduction device 1 executes damage reduction behavior, which will be described later, in an emergency where there is a high possibility of a collision, to thus reduce damage to a passenger of a vehicle on the other party, a passenger of a subject vehicle (automobile 100 ), and the like.
  • examples of a collision object include a vehicle, a person, and a road installation object.
  • as a vehicle, typically, an oncoming vehicle or a preceding vehicle traveling in front of the automobile 100, a vehicle parked in front of the automobile 100, and the like (hereinafter collectively referred to as an object vehicle) will be described as examples of the collision object.
  • FIG. 3 is a block diagram showing a configuration of the damage reduction device 1 .
  • the damage reduction device 1 includes an input unit 11 , a prediction unit 12 , an object recognition unit 13 , a passenger estimation unit 14 , a passenger grasping unit 15 , a determination unit 16 , and an output unit 17 .
  • the input unit 11 inputs status data regarding a status in the moving direction (traveling direction) of the automobile 100 .
  • the status data is, for example, data regarding an object (a vehicle, a person, or a structure such as a road installation object) located in front of the automobile 100 .
  • the status data includes traveling data regarding a traveling state of the object vehicle approaching the automobile 100 (object vehicle traveling data), passenger data regarding a riding position of a passenger of the object vehicle (object vehicle passenger data), and the like.
  • the traveling data and the passenger data of the object vehicle correspond to output data of the distance sensor 110 and imaging data of the front camera 120 that are input via the control unit 10 .
  • the input unit 11 further inputs passenger data regarding a sitting position of a passenger of the automobile 100 (subject vehicle passenger data).
  • the subject vehicle passenger data corresponds to imaging data of the vehicle interior imaging camera 130 that is input via the control unit 10 .
  • the input unit 11 is configured to be capable of inputting various types of data associated with the traveling state of the automobile 100 , e.g., outputs of various sensors such as the steering angle sensor 141 and the wheel speed sensor 151 , control information for the brake device or the like in the control unit 10 , and the like.
  • the prediction unit 12 is configured to be capable of predicting a collision between the automobile 100 and the object vehicle and a collision site on the basis of the object vehicle traveling data.
  • the prediction unit 12 includes a relative speed arithmetic unit 121 that calculates a relative distance, a relative speed, and the like between the automobile 100 and the object vehicle from the output data of the distance sensor 110 .
  • the prediction unit 12 determines whether the automobile 100 and the object vehicle are likely to collide with each other from the traveling states of the automobile 100 and the object vehicle at the present moment.
  • the prediction unit 12 compares, on the basis of the object vehicle traveling data, the traveling state of the subject vehicle (automobile 100 ) and the traveling state of the object vehicle, and estimates a possibility of a collision, a position of a collision, and further collision sites of the subject vehicle and the vehicle on the other party, from an intersection point of traveling tracks of both of the vehicles.
  • as the traveling state of the subject vehicle, a vehicle body speed and a steering angle that are calculated on the basis of the outputs of the wheel speed sensor 151 and the steering angle sensor 141 are referred to.
  • as the traveling state of the object vehicle, a relative position and a relative speed of the object vehicle with respect to the subject vehicle, which are calculated on the basis of the output of the distance sensor 110, are referred to.
  • the prediction unit 12 may be configured to refer to the image data of the front camera 120 as well, to predict a collision between the subject vehicle and the object vehicle. In this case, a captured image of the front camera 120 is analyzed, so that the course of the object vehicle or a collision site can be predicted with accuracy. Predicted data generated in the prediction unit 12 is output to the determination unit 16 .
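A collision prediction from the intersection of the two vehicles' traveling tracks can be sketched with a constant-velocity closest-approach check. This is an illustrative simplification, assuming straight tracks and point-like vehicles with a collision radius; the function name and parameters are not from the patent.

```python
def predict_collision(p_self, v_self, p_obj, v_obj, radius, horizon):
    """Constant-velocity collision check in the plane: find the time of
    closest approach of the two tracks and flag a collision when their
    separation falls below `radius` within `horizon` seconds. Returns
    (hit, time, predicted collision point or None)."""
    rx, ry = p_obj[0] - p_self[0], p_obj[1] - p_self[1]   # relative position
    vx, vy = v_obj[0] - v_self[0], v_obj[1] - v_self[1]   # relative velocity
    vv = vx * vx + vy * vy
    # time of closest approach, clamped to [0, horizon]
    t = 0.0 if vv == 0 else max(0.0, min(horizon, -(rx * vx + ry * vy) / vv))
    dx, dy = rx + vx * t, ry + vy * t
    if (dx * dx + dy * dy) ** 0.5 <= radius:
        px = p_self[0] + v_self[0] * t   # subject-vehicle position at impact
        py = p_self[1] + v_self[1] * t
        return True, t, (px, py)
    return False, t, None
```

For two vehicles approaching head-on at 10 m/s each from 100 m apart, the sketch predicts a collision after 5 s at the midpoint of the tracks.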
  • the object recognition unit 13 is configured to be capable of recognizing a type of the object with which the automobile 100 collides.
  • the object recognition unit 13 classifies the object into three types of a vehicle, a person, and a structure such as a road installation object, but the object is not limited thereto as a matter of course.
  • as the recognition technology, various person recognition technologies, vehicle recognition technologies, and the like can be used.
  • the object recognition unit 13 recognizes whether the object includes a person. For example, in a case where the object is a vehicle, the object recognition unit 13 recognizes the presence or absence of a passenger in the vehicle. In this case, the object recognition unit 13 analyzes image data (object vehicle passenger data) output from the front camera 120 and detects the presence or absence of a passenger in the object vehicle. Recognized data generated in the object recognition unit 13 is output to the determination unit 16 .
  • the passenger estimation unit 14 estimates a sitting position of a passenger of the object vehicle on the basis of the object vehicle passenger data. In this embodiment, the passenger estimation unit 14 estimates the sitting position of the passenger of the object vehicle on the basis of the output of the object recognition unit 13 . The passenger estimation unit 14 estimates the presence or absence of a person in a driver's seat, a front passenger seat, or a rear seat of the object vehicle from a result of the analysis for the interior of the object vehicle, which is imaged by the front camera 120 and recognized as a person by the object recognition unit 13 . Estimated data generated in the passenger estimation unit 14 is output to the determination unit 16 .
  • the presence or absence of a passenger in a seat other than the driver's seat is estimated.
  • the presence or absence of a passenger in all the seats including the driver's seat is estimated. It should be noted that the determination on whether the object vehicle is traveling or parked is made on the basis of the output of the distance sensor 110 or the front camera 120 .
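The passenger estimation unit's seat assignment can be sketched as follows. This is an illustrative assumption, not the patent's implementation: it supposes the front camera's person detections inside the oncoming vehicle's windshield region are reduced to normalized horizontal centers (0.0 = left edge, 1.0 = right edge of that region), and that for a right-hand-drive oncoming vehicle seen head-on the driver appears in the left half of the image.

```python
# Illustrative sketch of the passenger estimation unit: map the
# horizontal centers of person detections inside the oncoming vehicle's
# windshield region to seat labels. Assumes a head-on view of a
# right-hand-drive vehicle, so the driver appears on the image's left.
# Function and label names are assumptions, not patent terms.

def estimate_seats(person_centers_x: list[float]) -> set[str]:
    """Map normalized detection centers to occupied-seat labels."""
    seats = set()
    for x in person_centers_x:
        # Left half of the mirrored head-on view = driver's seat.
        seats.add("driver" if x < 0.5 else "front_passenger")
    return seats
```

A production system would also need to handle left-hand-drive vehicles, rear seats, and oblique viewing angles; this sketch covers only the front-seat, head-on case discussed in the text.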
  • the passenger grasping unit 15 grasps a sitting position of a passenger of the automobile 100 on the basis of the image data (subject vehicle passenger data) output from the vehicle interior imaging camera 130 .
  • the passenger grasping unit 15 grasps the presence or absence of a passenger other than the driver within the automobile 100 , and in a case where there is a passenger, a riding position thereof.
  • Grasped data generated in the passenger grasping unit 15 is output to the determination unit 16 .
  • as the passenger grasping technology, typically, the person recognition technology is used.
  • a technology of executing matching with an image of the interior of an unmanned vehicle to estimate the presence or absence of a person may be employed.
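The matching idea mentioned above can be sketched minimally: compare a seat region of the interior image against a stored image of the same seat when empty, and treat a large pixel difference as "person present". The grayscale representation and the threshold below are illustrative assumptions, not values from the patent.

```python
# Minimal sketch of empty-cabin matching: mean absolute grayscale
# difference between the current seat patch and a reference image of
# the same seat when unoccupied. A large difference suggests a person
# is present. The threshold of 20 gray levels is an assumed value.

def seat_occupied(seat_patch: list[list[int]],
                  empty_reference: list[list[int]],
                  threshold: float = 20.0) -> bool:
    """True when the patch differs strongly from the empty-cabin image."""
    total, count = 0, 0
    for row, ref_row in zip(seat_patch, empty_reference):
        for px, ref in zip(row, ref_row):
            total += abs(px - ref)
            count += 1
    return (total / count) > threshold
```

Real systems would normalize for lighting and camera pose before differencing; this sketch only conveys the matching principle.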
  • the determination unit 16 determines, when it is predicted that a collision with the object vehicle is unavoidable on the basis of the outputs of the prediction unit 12 , the passenger estimation unit 14 , and the passenger grasping unit 15 , a collision mode of the automobile 100 against the object vehicle (in this example, a target collision site of the object vehicle, at which the automobile 100 is to collide) according to the riding position of the passenger of the object vehicle. This is for the purpose of causing the automobile 100 to collide at the site capable of reducing human damage to the object vehicle on the other party and the automobile 100 .
  • the determination unit 16 is configured to determine a non-sitting position of a passenger of the object vehicle or its vicinity as a target collision site of the object vehicle. In other words, when it is recognized that the object vehicle includes a person, a target collision position is determined for a site other than the riding position of the passenger of the object vehicle. This enables human damage on the other party due to the collision to be suppressed at a minimum.
  • the determination unit 16 determines the front part on the seat (front passenger seat) side next to the driver's seat of the object vehicle as a target collision site of the object vehicle. This enables reduction in damage to the driver of the object vehicle.
  • the determination unit 16 determines the center of the front part of the object vehicle as a target collision site of the object vehicle. This enables reduction in damage to each passenger on the other party due to the collision.
  • in this way, a target collision site of the object vehicle, at which the automobile 100 is to collide, is determined on the basis of the object vehicle passenger data.
  • this makes it possible to cause the automobile 100 to collide at the site capable of reducing human damage to the object vehicle on the other party.
  • the determination unit 16 may be configured to refer to a sitting position of a passenger of the automobile 100 (output of the passenger grasping unit 15 ) as well as that of the object vehicle, to determine a target collision site of the automobile 100 that collides at the target collision site of the object vehicle. This enables reduction not only in damage to the passenger of the object vehicle that receives a collision, but also in damage to the passenger of the automobile 100 that collides.
  • the determination unit 16 determines a non-sitting position of a passenger of the automobile 100 or its vicinity as a target collision site of the automobile 100 . This enables not only damage to the other party but also damage to the passenger of the subject vehicle due to the collision to be suppressed at a minimum.
  • the determination unit 16 determines the center of the front part of the automobile 100 as a target collision site of the automobile 100 . This enables reduction in damage to the passengers of the subject vehicle.
  • the determination unit 16 generates, at the time when a collision with the object vehicle is predicted, control data by which an automatic brake function of actuating the brake device of the automobile 100 is executed. The determination unit 16 then generates, at the time when it is predicted that a collision with the object vehicle is unavoidable, control data by which a steering direction of the automobile 100 is determined such that a target collision site of the automobile 100 can be caused to collide toward a target collision site of the object vehicle.
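The control data the determination unit emits at this stage can be sketched as a brake command plus a steering direction chosen so that the subject vehicle's target site meets the oncoming vehicle's target site. The mapping below mirrors the collision modes of FIGS. 5 and 6 (passenger-side-to-passenger-side requires a right turn in right-hand traffic; center-to-center goes straight); the field and site names are assumptions, not patent terms.

```python
# Hedged sketch of control-data generation once a collision is
# unavoidable: always brake, and pick the steering direction that
# brings the subject vehicle's target site onto the oncoming
# vehicle's target site. Names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class ControlData:
    brake: bool
    steering: str  # "right" | "left" | "straight"

def make_control_data(subject_site: str, other_site: str) -> ControlData:
    if subject_site == "front_center" and other_site == "front_center":
        # Head-on center collision: keep going straight while braking.
        return ControlData(brake=True, steering="straight")
    if subject_site.endswith("_side_front"):
        # Offset the vehicle toward the unoccupied side before impact
        # (right turn for passenger-side contact in right-hand traffic).
        return ControlData(brake=True, steering="right")
    return ControlData(brake=True, steering="straight")
```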
  • the output unit 17 is configured to output the control data, which is for moving the automobile 100 toward the target collision site of the object vehicle, the target collision site being determined by the determination unit 16 , to the control unit 10 that controls a moving operation of the automobile 100 .
  • the damage reduction device 1 configured as described above includes hardware necessary for a computer, e.g., a CPU, a RAM, and a ROM.
  • the CPU loads a program according to the present technology, which is recorded in the ROM in advance, to the RAM for execution, and thus a damage reduction method according to the present technology is executed.
  • a specific configuration of the damage reduction device 1 is not limited.
  • a PLD (Programmable Logic Device)
  • an FPGA (Field Programmable Gate Array)
  • an ASIC (Application Specific Integrated Circuit)
  • the damage reduction device 1 may be configured as a part of the control unit 10 .
  • FIG. 4 is a flowchart showing a control flow of the determination unit 16 in the damage reduction device 1 .
  • FIGS. 5 and 6 are diagrams for describing examples of the movement control of the automobile 100 .
  • the damage reduction device 1 first determines whether there is a possibility that the automobile 100 collides with the oncoming vehicle 200 (Step 101 ). This determination is executed on the basis of the output of the prediction unit 12 . When there is a possibility of a collision, the damage reduction device 1 determines whether a collision is avoidable by steering and braking (Step 102 ).
  • when it is determined that a collision with the oncoming vehicle 200 is avoidable by sudden turning, sudden braking, and the like, an operation of avoiding that collision is executed (Step 103). This operation does not require an operation by the driver; the control unit 10 directly outputs a control command to the steering device 140 and the brake device 150.
  • to determine the steering direction, for example, the image data from the front camera 120 is referred to.
  • the determination unit 16 determines whether the steering direction of the automobile 100 is to be controlled toward right turning, left turning, or straight ahead (see reference symbols 18 R, 18 L, and 18 S, respectively, in part A of FIG. 5 ) in order to reduce damage due to a collision. In this regard, the determination unit 16 determines whether there is a passenger in the front passenger seat of either the automobile 100 (subject vehicle) or the oncoming vehicle 200 (Step 104 ). In this step, the outputs of the passenger estimation unit 14 and the passenger grasping unit 15 are referred to.
  • the determination unit 16 determines that the front part on the front passenger seat side of the subject vehicle and the front part on the front passenger seat side of the oncoming vehicle 200 are target collision positions.
  • the determination unit 16 then outputs control data regarding steering and braking of the subject vehicle to the control unit 10 , so as to turn the automobile 100 to the right while applying a brake and cause the front part on the front passenger seat side of the subject vehicle to collide with the front part on the front passenger seat side of the oncoming vehicle 200 in the collision mode shown in part B of FIG. 5 (Step 105 ). It should be noted that in a case where a driving operation by the driver of the subject vehicle is being performed during the steering control described above, priority is given to the steering control by the control unit 10 .
  • the determination unit 16 determines that the center of the front part of the subject vehicle and the center of the front part of the oncoming vehicle 200 are target collision sites.
  • the determination unit 16 then outputs, to the control unit 10 , control data regarding steering and braking of the subject vehicle to cause the automobile 100 to go straight while applying a brake and cause the oncoming vehicle 200 and the subject vehicle to collide head-on with each other in the collision mode shown in FIG. 6 (Step 106 ). It should be noted that in a case where a driving operation by the driver of the subject vehicle is being performed during the steering control described above, priority is given to the steering control by the control unit 10 .
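The Step 101 to Step 106 flow of FIG. 4 can be condensed into a single decision function. The boolean inputs are simplifications: in the real device they would be derived from the prediction unit, the passenger estimation unit, and the passenger grasping unit. All return labels are illustrative.

```python
# Sketch of the first embodiment's decision flow (FIG. 4, Steps
# 101-106), with booleans standing in for the unit outputs.

def first_embodiment_step(collision_possible: bool,
                          avoidable: bool,
                          front_passenger_present: bool) -> str:
    if not collision_possible:
        return "monitor"                      # Step 101: no collision risk
    if avoidable:
        return "avoid_by_steer_and_brake"     # Steps 102-103
    if not front_passenger_present:
        # Steps 104-105: neither front passenger seat is occupied, so
        # collide passenger-side front against passenger-side front.
        return "collide_passenger_side_fronts"
    # Steps 104, 106: a front passenger exists somewhere; a centered
    # head-on collision spreads the deformation across both vehicles.
    return "collide_head_on_centers"
```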
  • each cabin is inhibited from being locally deformed as in the offset collision. Therefore, compared with a collision at the front parts on the driver's seat sides or the front passenger seat sides, damage to all the passengers of the subject vehicle and the oncoming vehicle 200 can be suppressed at a minimum.
  • positions of the passengers of the automobile 100 and the oncoming vehicle 200 are referred to, so that a collision site of each of the automobile 100 and the oncoming vehicle 200 is determined. This enables damage not only to the passengers of the subject vehicle but also to the passengers of the vehicle on the other party to be suppressed at a minimum.
  • a non-riding position of a passenger of the object vehicle is determined as a target collision site of the object vehicle in a similar manner to the case where the object vehicle is a preceding vehicle or a parked vehicle. This enables human damage on the other party to be suppressed at a minimum while protecting a passenger of the subject vehicle.
  • referring to FIG. 7 , a case where an object having a possibility of a collision with an automobile 100 includes not only a vehicle 210 but also a pedestrian 220 and a road installation object such as a utility pole 230 will be described as an example.
  • a damage reduction device 1 according to this embodiment and an automobile 100 equipped with this damage reduction device 1 have configurations similar to those of the first embodiment.
  • a configuration different from that of the first embodiment will be mainly described, and a configuration similar to that of the first embodiment will be denoted by a similar reference symbol and description thereof will be omitted or simply described.
  • the damage reduction device 1 of this embodiment includes an input unit 11 , a prediction unit 12 , an object recognition unit 13 , and a determination unit 16 (see FIG. 3 ).
  • the input unit 11 inputs status data regarding a status in a moving direction (traveling direction) of the automobile 100 .
  • the status data is, for example, data regarding an object (vehicle 210 , pedestrian 220 , or utility pole 230 ) located in front of the automobile 100 .
  • the object recognition unit 13 is configured to be capable of recognizing a type of the object with which the automobile 100 collides on the basis of the status data, and classifies the object into three types, e.g., a car (vehicle 210 ), a person (pedestrian 220 ), and an unmanned structure (a road installation object such as the utility pole 230 ).
  • the determination unit 16 determines, when a collision with the object described above is predicted and when it is recognized that the object described above includes a person, a steering direction of the automobile 100 in which a collision with the person is avoidable, on the basis of the status data described above. In other words, when the object includes the pedestrian 220 , avoidance of a collision with the pedestrian 220 is set as a control target having the highest priority.
  • the determination unit 16 is configured to determine, when it is predicted that a collision with the object is unavoidable, a collision mode of the automobile 100 against the object on the basis of the type of the object.
  • the damage reduction device 1 of this embodiment is configured to use different collision modes depending on whether the object is the vehicle 210 , the pedestrian 220 , or a road installation object such as the utility pole 230 .
  • the determination unit 16 determines, when the object is recognized as the vehicle 210 , a target collision site of the vehicle 210 (object vehicle) by a method similar to that of the first embodiment described above. At that time, the determination unit 16 refers to a riding position of a passenger of the subject vehicle as well, and determines a target collision site capable of reducing damage to the passengers of both of the vehicle on the other party and the subject vehicle.
  • the determination unit 16 determines, when the object is recognized as an unmanned structure such as the utility pole 230 , a site having relatively high rigidity in the automobile 100 , e.g., a pillar site, as a target collision site of the automobile 100 .
  • the riding position of the passenger of the subject vehicle can be referred to on the basis of the output of the passenger grasping unit 15 (see FIG. 3 ).
  • the pillar on the front passenger seat side is determined as a target collision site, thus enabling damage to the driver to be suppressed at a minimum.
  • when the object is recognized as a person, a site having a high shock-absorbing property in the automobile 100 is determined as a target collision position.
  • examples of the site having a high shock-absorbing property include a site having relatively low rigidity, such as the hood or the front glass, and a site at which a shock-absorbing apparatus such as an air-bag for pedestrians is installed.
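The per-type collision modes of this embodiment can be condensed into a small dispatch. The site labels below are illustrative placeholders, not patent terms; the vehicle case defers to the site-selection logic of the first embodiment.

```python
# Sketch of the second embodiment's type-dependent collision mode:
# map the recognized object type to the subject vehicle's target
# collision site. Labels are illustrative assumptions.

def subject_collision_site_for(object_type: str) -> str:
    if object_type == "vehicle":
        # Handled by the first embodiment's passenger-aware selection.
        return "per_first_embodiment"
    modes = {
        # Unmanned structure (e.g. utility pole): strike a rigid pillar
        # on the unoccupied side to protect the cabin.
        "unmanned_structure": "pillar_front_passenger_side",
        # Person: present the softest, most energy-absorbing surface
        # (hood / front glass area, pedestrian air-bag zone).
        "person": "hood_shock_absorbing_zone",
    }
    return modes[object_type]
```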
  • FIG. 8 is a flowchart showing a control example of the damage reduction device 1 in this embodiment.
  • the damage reduction device 1 first determines whether there is a possibility that the automobile 100 collides with those objects (Step 201 ). This determination is executed on the basis of the output of the prediction unit 12 . When there is a possibility of a collision, the damage reduction device 1 determines whether a collision is avoidable by steering and braking (Step 202 ).
  • when it is determined that a collision with each object described above is avoidable by sudden turning, sudden braking, and the like, an operation of avoiding that collision is executed (Step 203). This operation does not require an operation by the driver; the control unit 10 directly outputs a control command to the steering device 140 and the brake device 150.
  • to determine the steering direction, for example, image data from the front camera 120 is referred to.
  • the damage reduction device 1 determines whether the objects include a person (Step 204 ). When the objects include a person, the damage reduction device 1 determines whether there is an object other than the person (Step 205 ).
  • the damage reduction device 1 controls the steering direction such that the automobile 100 collides with the objects other than the person.
  • the damage reduction device 1 determines the steering direction of the automobile 100 such that damage to the vehicle 210 and the automobile 100 becomes minimum (Steps 206 and 207 ). Such control is similar to that of the first embodiment described above, and thus description thereof will be omitted here.
  • when the vehicle 210 is recognized as an unmanned parked vehicle, it may be considered as an unmanned structure, and steering control similar to that performed when the object is the utility pole 230 may be executed, as will be described later.
  • the damage reduction device 1 determines the steering direction of the automobile 100 such that damage to the automobile 100 becomes minimum (Steps 206 and 208 ).
  • the damage reduction device 1 determines whether the steering direction of the automobile 100 is to be controlled toward right turning, left turning, or straight ahead. At that time, the damage reduction device 1 refers to the riding position of the passenger of the automobile 100 . When there is no passenger in the front passenger seat, the damage reduction device 1 determines a pillar site on the front passenger seat side of the automobile 100 as a target collision site. The damage reduction device 1 then outputs control data regarding steering and braking of the automobile 100 to the control unit 10 , so as to turn the automobile 100 to the right while applying a brake and cause the pillar part described above to collide with the utility pole 230 in the collision mode shown in part B of FIG. 9 (Step 208 ). It should be noted that in a case where a driving operation by the driver of the subject vehicle is being performed during the steering control described above, priority is given to the steering control by the control unit 10 .
  • the damage reduction device 1 determines the steering direction of the automobile 100 such that damage to the pedestrian becomes minimum (Step 209 ).
  • the damage reduction device 1 determines whether the steering direction of the automobile 100 is to be controlled toward right turning, left turning, or straight ahead. At that time, the damage reduction device 1 determines a site having the highest shock-absorbing effect in the automobile 100 (a relatively soft site such as the hood or the front glass, or a site where an air-bag for pedestrians 170 actuates) as a target collision site. The damage reduction device 1 then outputs control data regarding steering and braking of the automobile 100 to the control unit 10 , so as to turn the automobile 100 to the right while applying a brake and cause the front of the automobile 100 to collide with the pedestrian 220 in the collision mode shown in part B of FIG. 10 (Step 209 ). It should be noted that in a case where a driving operation by the driver of the subject vehicle is being performed during the steering control described above, priority is given to the steering control by the control unit 10 .
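FIG. 8's Step 201 to Step 209 flow can be sketched as a single function. The highest-priority rule is never to steer into a person; among the remaining objects, the collision mode follows the object type. The inputs and labels below are simplified assumptions, not patent terms.

```python
# Sketch of the second embodiment's decision flow (FIG. 8, Steps
# 201-209). `objects` is the set of recognized object types among
# {"person", "vehicle", "structure"} with unavoidable collisions.

def second_embodiment_step(collision_possible: bool,
                           avoidable: bool,
                           objects: set[str]) -> str:
    if not collision_possible:
        return "monitor"                      # Step 201: no collision risk
    if avoidable:
        return "avoid_by_steer_and_brake"     # Steps 202-203
    if "person" in objects and objects != {"person"}:
        # Steps 204-205: a person is present alongside other objects;
        # steer toward a non-person object instead.
        non_person = objects - {"person"}
        if "vehicle" in non_person:
            return "collide_with_vehicle_min_damage"  # Steps 206-207
        return "collide_pillar_with_structure"        # Steps 206, 208
    if objects == {"person"}:
        # Step 209: contact with the person is unavoidable; present
        # the shock-absorbing front of the vehicle.
        return "collide_with_shock_absorbing_site"
    if "vehicle" in objects:
        return "collide_with_vehicle_min_damage"
    return "collide_pillar_with_structure"
```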
  • the collision mode is made different depending on the type of the object.
  • damage not only to the subject vehicle but also to another vehicle and a pedestrian can be minimized appropriately and comprehensively.
  • the present technology is not limited thereto.
  • the present technology is also effective in, for example, a right-turn/straight accident (a collision accident between a vehicle turning right and a vehicle going straight) at an intersection or a collision between vehicles in a sudden encounter.
  • FIG. 11 shows an example of controlling a collision between an automobile 100 traveling straight and a right-turning vehicle 201 .
  • the damage reduction device 1 determines a left-side rear part of the right-turning vehicle 201 as a target collision site and executes steering control to turn the automobile 100 to the right.
  • FIG. 12 shows an example of controlling a collision between the automobile 100 traveling straight and a passing vehicle 202 cutting across in front of the automobile 100 .
  • the damage reduction device 1 determines a left-side rear part of the passing vehicle 202 as a target collision site and executes steering control to turn the automobile 100 to the right.
  • the present technology is also effective for a vehicle at a junction, a vehicle rushing out from an alley, and the like as the vehicle whose collision with the subject vehicle is unavoidable.
  • although the automobile has been described as an example of the moving body apparatus, the present technology is also applicable to a boat traveling on water, an autonomous traveling robot, and the like.
  • a damage reduction device including:
  • an input unit that inputs object traveling data regarding a traveling state of a manned moving body approaching a moving body apparatus and object passenger data regarding a sitting position of a passenger of the manned moving body;
  • a prediction unit that predicts a collision with the manned moving body on the basis of the object traveling data
  • a passenger estimation unit that estimates the sitting position of the passenger of the manned moving body on the basis of the object passenger data
  • a determination unit that determines, when it is predicted that the collision with the manned moving body is unavoidable, a target collision site of the manned moving body, at which the moving body apparatus is to collide, on the basis of the sitting position of the passenger of the manned moving body.
  • the determination unit determines a non-sitting position of the passenger of the manned moving body or a vicinity thereof as a target collision site of the manned moving body.
  • the determination unit determines, when the manned moving body is moving toward the moving body apparatus and when the sitting position of the passenger of the manned moving body is estimated as only a driver's seat, a front part on a seat side next to the driver's seat as a target collision site of the manned moving body.
  • the determination unit determines, when the manned moving body is moving toward the moving body apparatus and when the sitting position of the passenger of the manned moving body is estimated as a driver's seat and a seat next thereto, the center of a front part of the manned moving body as a target collision site of the manned moving body.
  • the input unit further inputs moving-body passenger data regarding a sitting position of a passenger of the moving body apparatus
  • the damage reduction device further includes a passenger grasping unit that grasps the sitting position of the passenger of the moving body apparatus on the basis of the moving-body passenger data, and
  • the determination unit further determines a target collision site of the moving body apparatus that collides with the target collision site of the manned moving body, on the basis of the object traveling data, the object passenger data, and the moving-body passenger data.
  • the determination unit determines a non-sitting position of the passenger of the moving body apparatus or a vicinity thereof as a target collision site of the moving body apparatus.
  • the determination unit determines, when the sitting position of the passenger of the moving body apparatus is grasped as a driver's seat and a seat next thereto, the center of a front part of the moving body apparatus as a target collision site of the moving body apparatus.
  • an output unit that outputs control data for moving the moving body apparatus toward the target collision site of the manned moving body, which is determined by the determination unit, to a control unit that controls a moving operation of the moving body apparatus.
  • the damage reduction device is mounted to the moving body apparatus that is equipped with at least a distance sensor for a moving direction and a front camera, and
  • the input unit inputs data from the distance sensor and the front camera as the object traveling data and the object passenger data.
  • a damage reduction method including:
  • a damage reduction device including:
  • an input unit that inputs status data regarding a status in a moving direction of a moving body apparatus
  • a prediction unit that predicts a collision with an object in the moving direction on the basis of the status data
  • a recognition unit that recognizes a type of the object
  • a determination unit that determines, when it is predicted that the collision with the object is unavoidable, a collision mode of the moving body apparatus to the object on the basis of the type of the object.
  • the recognition unit recognizes whether the object is any of a manned moving body, an unmanned structure, and a person, and
  • the determination unit determines, when the object is recognized as the manned moving body, a target collision site of the object, at which the moving body apparatus is to collide, and determines, when the object is recognized as the unmanned structure or the person, a target collision site of the moving body apparatus that collides with the object.
  • the status data includes object passenger data regarding a sitting position of a passenger in the manned moving body, and
  • the determination unit determines, when the object is recognized as the manned moving body, a target collision site of the manned moving body on the basis of the object passenger data.
  • an object passenger estimation unit that estimates the sitting position of the passenger of the manned moving body on the basis of the object passenger data
  • the determination unit determines a non-sitting position of the passenger of the manned moving body or a vicinity thereof as a target collision site of the manned moving body.
  • the determination unit determines, when the object is recognized as the unmanned structure, a pillar site of the moving body apparatus as a target collision site of the moving body apparatus.
  • the determination unit determines, when the object is recognized as the person, a shock-absorbing site of the moving body apparatus as a target collision site of the moving body apparatus.
  • the input unit further inputs moving-body passenger data regarding a sitting position of a passenger of the moving body apparatus
  • the damage reduction device further includes a passenger grasping unit that grasps the sitting position of the passenger of the moving body apparatus on the basis of the moving-body passenger data, and
  • the determination unit further determines a target collision site of the moving body apparatus that collides with the object, on the basis of the status data and the moving-body passenger data.
  • the determination unit determines a non-sitting position of the passenger of the moving body apparatus or a vicinity thereof as a target collision site of the moving body apparatus.
  • an output unit that outputs control data for moving the moving body apparatus to the object in the collision mode, which is determined by the determination unit, to a control unit that controls a moving operation of the moving body apparatus.
  • the damage reduction device is mounted to the moving body apparatus that is equipped with at least a distance sensor for the moving direction and a front camera, and
  • the input unit inputs data from the distance sensor and the front camera as the status data.
  • a damage reduction method including:
  • a damage reduction device including:
  • an input unit that inputs status data regarding a status in a moving direction of a moving body apparatus
  • a prediction unit that predicts a collision with an object in the moving direction on the basis of the status data
  • a recognition unit that recognizes whether the object includes a person
  • a determination unit that determines, when the collision with the object is predicted and it is recognized that the object includes a person, a steering direction of the moving body apparatus in which a collision with the person is avoidable, on the basis of the status data.
  • the determination unit determines, when it is predicted that a collision with the object other than the person is unavoidable, a collision mode of the moving body apparatus to the object on the basis of a type of the object.
  • the determination unit determines, when the object is recognized as a manned moving body, a target collision site of the object, with which the moving body apparatus is to collide, and determines, when the object is recognized as an unmanned structure, a target collision site of the moving body apparatus that collides with the object.
  • the status data includes object passenger data regarding a sitting position of a passenger in the manned moving body, and
  • the determination unit determines, when the object is recognized as the manned moving body, a target collision site of the manned moving body on the basis of the object passenger data.
  • an object passenger estimation unit that estimates the sitting position of the passenger of the manned moving body on the basis of the object passenger data
  • the determination unit determines a non-sitting position of the passenger of the manned moving body or a vicinity thereof as a target collision site of the manned moving body.
  • the input unit further inputs moving-body passenger data regarding a sitting position of a passenger of the moving body apparatus
  • the damage reduction device further includes a passenger grasping unit that grasps the sitting position of the passenger of the moving body apparatus on the basis of the moving-body passenger data, and
  • the determination unit further determines a target collision site of the moving body apparatus that collides with the object, on the basis of the status data and the moving-body passenger data.
  • the determination unit determines a non-sitting position of the passenger of the moving body apparatus or a vicinity thereof as a target collision site of the moving body apparatus.
  • an output unit that outputs control data for moving the moving body apparatus in the steering direction determined by the determination unit, to a control unit that controls a moving operation of the moving body apparatus.
  • the damage reduction device is mounted to the moving body apparatus that is equipped with at least a distance sensor for the moving direction and a front camera, and
  • the input unit inputs data from the distance sensor and the front camera as the status data.
  • a damage reduction method including:


Abstract

A damage reduction device according to an embodiment of the present technology includes an input unit, a prediction unit, a recognition unit, and a determination unit. The input unit inputs status data regarding a status in a moving direction of a moving body apparatus. The prediction unit predicts a collision with an object in the moving direction on the basis of the status data. The recognition unit recognizes whether the object includes a person. The determination unit determines, when the collision with the object is predicted and it is recognized that the object includes a person, a steering direction of the moving body apparatus in which a collision with the person is avoidable, on the basis of the status data.

Description

CROSS REFERENCE TO RELATED APPLICATIONS
The present application claims the benefit under 35 U.S.C. § 120 as a continuation application of U.S. application Ser. No. 16/599,740, filed on Oct. 11, 2019, now U.S. Pat. No. 11,254,307, which claims the benefit under 35 U.S.C. § 120 as a continuation application of U.S. application Ser. No. 15/761,844, filed on Mar. 21, 2018, now U.S. Pat. No. 10,464,559, which claims the benefit under 35 U.S.C. § 371 as a U.S. National Stage Entry of International Application No. PCT/JP2016/003769, filed in the Japanese Patent Office as a Receiving Office on Aug. 18, 2016, which claims priority to Japanese Patent Application Number JP 2015-190780, filed in the Japanese Patent Office on Sep. 29, 2015, each of which is hereby incorporated by reference in its entirety.
TECHNICAL FIELD
The present technology relates to an apparatus, method, and program for reducing damage of a collision accident that occurs in driving of an automobile, for example. The present technology is applicable not only to automobiles but also to various moving body apparatuses such as ships and autonomous traveling robots and is also applicable to various technical fields including simulation apparatuses and games of those above.
BACKGROUND ART
Automatic emergency braking, which automatically applies the brakes, and automatic avoidance technologies, which automatically avoid collisions, have recently been developed. Further, there are also known technologies for minimizing, in a case where a collision with an object cannot be avoided by the automatic emergency brake, damage to a subject vehicle, a pedestrian, or the surrounding environment by use of a collision damage reduction brake, a collision avoidance system, and the like.
For example, Patent Literature 1 discloses a technology of controlling, when a collision is unavoidable and another vehicle is about to collide with a subject vehicle, the subject vehicle so that the collision occurs at an area of the subject vehicle where there is no passenger or where the strength of components is high. Further, Patent Literature 2 discloses a technology of causing a collision at an area of the subject vehicle where a collision object receives less shock. Furthermore, Patent Literature 3 discloses a technology of controlling, when a collision is unavoidable, the travel of the subject vehicle such that the other vehicle receiving the collision is struck at a site excluding its cabin.
CITATION LIST

Patent Literature
  • Patent Literature 1: Japanese Patent Application Laid-open No. 2015-041222
  • Patent Literature 2: Japanese Patent Application Laid-open No. 2005-254923
  • Patent Literature 3: Japanese Patent Application Laid-open No. 2008-037313
DISCLOSURE OF INVENTION

Technical Problem
However, the technologies disclosed in Patent Literatures 1 and 2 are for minimizing damage to the subject vehicle and do not disclose a technology for also minimizing damage to the other party's vehicle. Meanwhile, the technology disclosed in Patent Literature 3 simply avoids a collision with the cabin portion of the other party's vehicle and does not actually determine whether a passenger is present there. Furthermore, Patent Literatures 1 to 3 disclose nothing about performing, in a case where there are multiple collision objects other than the vehicle, different control depending on the objects.
In view of the circumstances as described above, it is an object of the present technology to provide a damage reduction device, a damage reduction method, and a program that are capable of achieving reduction in human damage.
Solution to Problem
A damage reduction device according to an embodiment of the present technology includes an input unit, a prediction unit, a recognition unit, and a determination unit.
The input unit inputs status data regarding a status in a moving direction of a moving body apparatus.
The prediction unit predicts a collision with an object in the moving direction on the basis of the status data.
The recognition unit recognizes whether the object includes a person.
The determination unit determines, when the collision with the object is predicted and it is recognized that the object includes a person, a steering direction of the moving body apparatus in which a collision with the person is avoidable, on the basis of the status data.
Under a situation where a collision with an object is predicted, when it is recognized that the object includes a person, the damage reduction device determines a steering direction of the moving body apparatus in which a collision with the person is avoidable, on the basis of the status data. This enables movement control of the moving body apparatus in which priority is given to avoidance of the collision with the person.
The determination unit may be configured to determine, when it is predicted that a collision with the object other than the person is unavoidable, a collision mode of the moving body apparatus to the object on the basis of a type of the object.
This can suitably achieve reduction in human damage depending on the type of a collision object.
The determination unit may be configured to determine, when the object is recognized as a manned moving body, a target collision site of the object, with which the moving body apparatus is to collide, and determine, when the object is recognized as an unmanned structure, a target collision site of the moving body apparatus that collides with the object.
This can achieve reduction in damage to a passenger of the manned moving body when the object is a manned moving body, and achieve reduction in damage to a passenger of the moving body apparatus when the object is recognized as an unmanned structure.
The status data may include object passenger data regarding a sitting position of a passenger in the manned moving body. In this case, the determination unit determines, when the object is recognized as the manned moving body, a target collision site of the manned moving body on the basis of the object passenger data.
This can achieve reduction in damage to the passenger of the manned moving body.
The damage reduction device may further include an object passenger estimation unit that estimates the sitting position of the passenger of the manned moving body on the basis of the object passenger data. In this case, the determination unit determines a non-sitting position of the passenger of the manned moving body or a vicinity thereof as a target collision site of the manned moving body.
This enables human damage to the other party due to a collision to be suppressed to a minimum.
The input unit may be configured to further input moving-body passenger data regarding a sitting position of a passenger of the moving body apparatus. In this case, the damage reduction device further includes a passenger grasping unit that grasps the sitting position of the passenger of the moving body apparatus on the basis of the moving-body passenger data, and the determination unit further determines a target collision site of the moving body apparatus that collides with the object, on the basis of the status data and the moving-body passenger data.
This can reduce damage to the passenger of the moving body apparatus.
In this case, the determination unit may determine a non-sitting position of the passenger of the moving body apparatus or a vicinity thereof as a target collision site of the moving body apparatus.
This enables damage to a passenger due to a collision to be suppressed to a minimum.
The damage reduction device may further include an output unit that outputs control data for moving the moving body apparatus in the steering direction determined by the determination unit, to a control unit that controls a moving operation of the moving body apparatus.
The damage reduction device may be mounted to the moving body apparatus that is equipped with at least a distance sensor for the moving direction and a front camera. In this case, the input unit inputs data from the distance sensor and the front camera as the status data.
A damage reduction method according to an embodiment of the present technology includes inputting status data regarding a status in a moving direction of a moving body apparatus.
A collision with an object in the moving direction is predicted on the basis of the status data.
Whether the object includes a person is recognized.
When the collision with the object is predicted and it is recognized that the object includes a person, a steering direction of the moving body apparatus in which a collision with the person is avoidable is determined on the basis of the status data.
A program according to an embodiment of the present technology causes a computer to execute the steps of: inputting status data regarding a status in a moving direction of a moving body apparatus; predicting a collision with an object in the moving direction on the basis of the status data; recognizing whether the object includes a person; and determining, when the collision with the object is predicted and it is recognized that the object includes a person, a steering direction of the moving body apparatus in which a collision with the person is avoidable, on the basis of the status data.
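The four steps enumerated above (input, prediction, recognition, determination) can be sketched as a minimal pipeline. The following Python sketch is illustrative only: the `StatusData` container, the function names, and the two-second prediction horizon are assumptions of this sketch, not values given in the specification.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class StatusData:
    """Hypothetical container for status data in the moving direction."""
    distance_m: float          # relative distance to the object ahead
    closing_speed_mps: float   # relative (closing) speed toward the object
    object_has_person: bool    # output of the recognition step

def predict_collision(status: StatusData, horizon_s: float = 2.0) -> bool:
    """Predict a collision when the object is reached within the horizon."""
    if status.closing_speed_mps <= 0.0:
        return False  # not closing in on the object
    return status.distance_m / status.closing_speed_mps <= horizon_s

def determine_steering(status: StatusData) -> Optional[str]:
    """Determine a steering direction that avoids the person, if any."""
    if not predict_collision(status):
        return None
    if status.object_has_person:
        # Placeholder policy: steer toward the side away from the person.
        return "avoid_person_direction"
    return "straight"
```

In the actual device these steps are distributed over dedicated units (input, prediction, recognition, and determination), as described in the embodiment below.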
Advantageous Effects of Invention
As described above, according to the present technology, it is possible to achieve reduction in human damage.
It should be noted that the effects described herein are not necessarily limited and any one of the effects described in the present disclosure may be produced.
BRIEF DESCRIPTION OF DRAWINGS
FIG. 1 is an outer appearance view of a configuration of an automobile as an example of a moving body apparatus equipped with a damage reduction device according to an embodiment of the present technology.
FIG. 2 is a block diagram showing a configuration of the automobile.
FIG. 3 is a block diagram showing a configuration of the damage reduction device.
FIG. 4 is a flowchart for describing an action of the damage reduction device.
FIG. 5 is a schematic plan view for describing an operation example of the automobile.
FIG. 6 is a schematic plan view for describing another operation example of the automobile.
FIG. 7 is a diagram for describing another embodiment of the damage reduction device.
FIG. 8 is a flowchart for describing another action of the damage reduction device.
FIG. 9 is a schematic plan view for describing an operation example of an automobile in another embodiment of the present technology.
FIG. 10 is a schematic plan view for describing another operation example of the automobile.
FIG. 11 is a schematic plan view for describing a modified example of an operation of the automobile in another embodiment of the present technology.
FIG. 12 is a schematic plan view for describing another modified example of an operation of the automobile.
MODE(S) FOR CARRYING OUT THE INVENTION

General Outline
A technology disclosed in this specification aims at suppressing damage caused by a collision, particularly human damage, to a minimum in a situation where a moving body apparatus cannot avoid a collision with an object such as a different manned moving body, a person, or a structure that is present in the moving direction of the moving body apparatus while it is traveling.
Here, the moving body apparatus described above is, typically, a vehicle (subject vehicle) such as an automobile. In this case, the different manned moving body described above is also a vehicle on the other party (different vehicle) that receives a collision of the subject vehicle.
Meanwhile, the person described above typically corresponds to a passer-by such as a pedestrian and includes, in addition thereto, a passenger of the subject vehicle or the different vehicle.
Furthermore, the structure described above typically corresponds to a road installation object such as a utility pole, a signal, a street tree, a wall, or a guardrail that is installed on a road or the like and includes, in addition thereto, an unmanned and parked vehicle, for example.
In the following embodiments, a collision object (object) will be exemplified mainly as a vehicle, a structure, or a person, but it is needless to say that the present technology is not limited to those embodiments.
Hereinafter, embodiments according to the present technology will be described with reference to the drawings.
First Embodiment
[Configuration of Automobile]
FIG. 1 is an outer appearance view of a configuration of an automobile as an example of a moving body apparatus equipped with a damage reduction device according to an embodiment of the present technology. FIG. 2 is a block diagram thereof.
(Basic Configuration)
As shown in FIG. 1 , an automobile 100 (hereinafter, also referred to as subject vehicle) includes a distance sensor 110 for a moving direction, a front camera 120, and a vehicle interior imaging camera 130 that images a passenger status within the vehicle.
Further, as shown in FIG. 2 , the automobile 100 includes a steering device 140, a brake device 150, a vehicle body acceleration device 160, a steering angle sensor 141, a wheel speed sensor 151, a brake switch 152, an accelerator sensor 161, a control unit 10, and a damage reduction device 1.
The distance sensor 110 is installed, for example, substantially at the center of the front part of the automobile 100 and outputs, to the control unit 10, data regarding a distance between the automobile 100 and a physical object present in a moving direction thereof. The output of the distance sensor 110 is referred to, for example, for calculation of a relative distance, a relative speed, or a relative acceleration with respect to a physical object (vehicle, pedestrian, structure, or the like) present in front of the subject vehicle, as will be described later. The distance sensor 110 includes, for example, various sensors using a millimeter-wave radar, an infrared laser, and the like.
The front camera 120 is installed in, for example, a cabin or roof part of the automobile 100 and images the forward field of view of the automobile 100 at a predetermined frame rate. The image data captured by the front camera 120 is output to the control unit 10 and is, as will be described later, referred to for determination of the type of a physical object (vehicle, pedestrian, structure, or the like) present in front of the subject vehicle, for calculation of a riding position of a passenger within that vehicle, for calculation of the relative positions of the physical object and the subject vehicle, and the like. The front camera 120 includes, for example, an image sensor such as a CMOS or a CCD.
The vehicle interior imaging camera 130 is installed in the cabin of the automobile 100 and images the interior status of the cabin at a predetermined frame rate. The image data captured by the vehicle interior imaging camera 130 is output to the control unit 10 and is, as will be described later, referred to for determination of the presence or absence of a passenger of the subject vehicle and the riding position thereof. The vehicle interior imaging camera 130 includes, for example, an image sensor such as a CMOS or a CCD.
It should be noted that the distance sensor 110, the front camera 120, and the vehicle interior imaging camera 130 may be configured such that the outputs therefrom are supplied to the damage reduction device 1, instead of the configuration in which the outputs therefrom are supplied to the control unit 10 as shown in FIG. 2 .
The steering device 140 typically includes a power steering device and transmits a driver's steering wheel operation to a steering wheel. The brake device 150 includes brake actuators attached to respective wheels and a hydraulic circuit that actuates those brake actuators, and transmits an operational force by depressing a brake pedal to the brake actuators via the hydraulic circuit. The brake device 150 typically has an ABS control function for preventing lock (slip) of the wheels or a traction control function for preventing driving slip of drive wheels. The vehicle body acceleration device 160 includes a throttle valve, a fuel injector, and the like and controls a rotational acceleration of the drive wheels.
The control unit 10 controls the steering device 140, the brake device 150, and the vehicle body acceleration device 160. In other words, the control unit 10 detects a steering amount and a steering direction and controls the steering device 140, on the basis of the output of the steering angle sensor 141 that detects the driver's steering wheel operation. Further, the control unit 10 calculates a vehicle body speed of the vehicle and also controls the brake device 150 so as to prevent the lock (slip) of the wheels, on the basis of the outputs of the wheel speed sensors 151 installed on all of the wheels or some of the wheels. The brake switch 152 is for detecting a brake operation (depression of the brake pedal) by the driver, and is referred to in the ABS control and the like. Furthermore, the control unit 10 controls the vehicle body acceleration device 160 on the basis of the output of the accelerator sensor 161 that detects an operation amount of an accelerator pedal of the driver.
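As one illustration, the vehicle body speed calculation from the wheel speed sensors 151 can be sketched by averaging the wheel rotational speeds. The RPM-based sensor output and the tire radius value below are assumptions of this sketch, not parameters stated in the specification.

```python
import math

def vehicle_body_speed(wheel_speeds_rpm, tire_radius_m=0.3):
    """Estimate the vehicle body speed (m/s) from wheel speed sensor outputs.

    Averages the wheel rotational speeds (assumed here to be reported in
    RPM) and converts to a linear speed via an assumed tire radius.
    """
    mean_rpm = sum(wheel_speeds_rpm) / len(wheel_speeds_rpm)
    omega_rad_s = mean_rpm * 2.0 * math.pi / 60.0  # RPM -> rad/s
    return omega_rad_s * tire_radius_m
```

An ABS controller would additionally compare each individual wheel speed against such an estimate to detect an impending wheel lock.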
The control unit 10 may control some of the steering device 140, the brake device 150, and the vehicle body acceleration device 160 in cooperation with one another as well as when the control unit 10 controls them individually. This enables the automobile 100 to be controlled to have a desired posture in turning, braking, acceleration, and the like.
Further, the control unit 10 is configured to be capable of controlling the steering device 140, the brake device 150, and the vehicle body acceleration device 160 irrespective of the above-mentioned various operations of the driver. For example, the automobile 100 may have an automated driving function. In this case, the control unit 10 takes the initiative in controlling the devices described above on the basis of the outputs of the sensors and cameras described above. In particular, in this embodiment, the control unit 10 is configured to be capable of controlling at least one of the devices described above on the basis of the output of the damage reduction device 1 that will be described later.
It should be noted that the control unit 10 may be an aggregate of ECUs that individually control the steering device 140, the brake device 150, and the vehicle body acceleration device 160 or may be a single controller that collectively controls those devices. Further, the steering device 140, the brake device 150, and the vehicle body acceleration device 160 may individually include the ECUs described above. In this case, the control unit 10 is configured to individually output a control signal to the ECUs of the respective devices.
(Damage Reduction Device)
The damage reduction device 1 executes damage reduction behavior, which will be described later, in an emergency where there is a high possibility of a collision, to thus reduce damage to a passenger of a vehicle on the other party, a passenger of a subject vehicle (automobile 100), and the like. Examples of a collision object include a vehicle, a person, and a road installation object. In this embodiment, a vehicle, typically, an oncoming vehicle or preceding vehicle that is traveling in front of the automobile 100, a vehicle parked in front of the automobile 100, and the like (hereinafter, collectively referred to as object vehicle) will be described as examples of the collision object.
FIG. 3 is a block diagram showing a configuration of the damage reduction device 1.
As shown in FIG. 3 , the damage reduction device 1 includes an input unit 11, a prediction unit 12, an object recognition unit 13, a passenger estimation unit 14, a passenger grasping unit 15, a determination unit 16, and an output unit 17.
The input unit 11 inputs status data regarding a status in the moving direction (traveling direction) of the automobile 100. The status data is, for example, data regarding an object (a vehicle, a person, or a structure such as a road installation object) located in front of the automobile 100.
The status data includes traveling data regarding a traveling state of the object vehicle approaching the automobile 100 (object vehicle traveling data), passenger data regarding a riding position of a passenger of the object vehicle (object vehicle passenger data), and the like. The traveling data and the passenger data of the object vehicle correspond to output data of the distance sensor 110 and imaging data of the front camera 120 that are input via the control unit 10.
The input unit 11 further inputs passenger data regarding a sitting position of a passenger of the automobile 100 (subject vehicle passenger data). The subject vehicle passenger data corresponds to imaging data of the vehicle interior imaging camera 130 that is input via the control unit 10.
It should be noted that the input unit 11 is configured to be capable of inputting various types of data associated with the traveling state of the automobile 100, e.g., outputs of various sensors such as the steering angle sensor 141 and the wheel speed sensor 151, control information for the brake device or the like in the control unit 10, and the like.
The prediction unit 12 is configured to be capable of predicting a collision between the automobile 100 and the object vehicle and a collision site on the basis of the object vehicle traveling data.
As shown in FIG. 3 , the prediction unit 12 includes a relative speed arithmetic unit 121 that calculates a relative distance, a relative speed, and the like between the automobile 100 and the object vehicle from the output data of the distance sensor 110. The prediction unit 12 determines whether the automobile 100 and the object vehicle are likely to collide with each other from the traveling states of the automobile 100 and the object vehicle at the present moment.
Typically, the prediction unit 12 compares, on the basis of the object vehicle traveling data, the traveling state of the subject vehicle (automobile 100) and the traveling state of the object vehicle, and estimates a possibility of a collision, a position of a collision, and further collision sites of the subject vehicle and the vehicle on the other party, from an intersection point of traveling tracks of both of the vehicles. For the traveling track of the subject vehicle, a vehicle body speed and a steering angle that are calculated on the basis of the outputs of the wheel speed sensor 151 and the steering angle sensor 141 are referred to. For the traveling track of the object vehicle, a relative position and a relative speed of the object vehicle with respect to the subject vehicle, which are calculated on the basis of the output of the distance sensor 110, are referred to.
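The prediction described above — comparing the two traveling tracks and estimating a collision from their intersection point — can be sketched as follows, under the simplifying assumption that both vehicles continue on straight tracks at constant velocity (the actual unit would refresh this continuously from the sensor outputs):

```python
import math

def time_to_collision(rel_distance_m, closing_speed_mps):
    """Time until the gap closes; infinity if the vehicles are not closing."""
    if closing_speed_mps <= 0.0:
        return math.inf
    return rel_distance_m / closing_speed_mps

def track_intersection(p_subject, v_subject, p_object, v_object):
    """Intersection point of two straight traveling tracks.

    p_* are (x, y) positions in meters, v_* are (vx, vy) velocities in m/s.
    Returns None when the tracks are parallel (no intersection point).
    """
    # Solve p_subject + t1 * v_subject == p_object + t2 * v_object.
    det = v_subject[0] * (-v_object[1]) - (-v_object[0]) * v_subject[1]
    if abs(det) < 1e-9:
        return None
    dx = p_object[0] - p_subject[0]
    dy = p_object[1] - p_subject[1]
    t1 = (dx * (-v_object[1]) - (-v_object[0]) * dy) / det
    return (p_subject[0] + t1 * v_subject[0],
            p_subject[1] + t1 * v_subject[1])
```

For example, a subject vehicle at the origin heading along x and an object vehicle at (5, -5) heading along y have tracks meeting at (5, 0).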
The prediction unit 12 may be configured to refer to the image data of the front camera 120 as well, to predict a collision between the subject vehicle and the object vehicle. In this case, a captured image of the front camera 120 is analyzed, so that the course of the object vehicle or a collision site can be predicted with accuracy. Predicted data generated in the prediction unit 12 is output to the determination unit 16.
The object recognition unit 13 is configured to be capable of recognizing a type of the object with which the automobile 100 collides. In this embodiment, the object recognition unit 13 classifies the object into three types of a vehicle, a person, and a structure such as a road installation object, but the object is not limited thereto as a matter of course. For the recognition method, various person recognition technologies, vehicle recognition technologies, and the like can be used.
In particular, in this embodiment, the object recognition unit 13 recognizes whether the object includes a person. For example, in a case where the object is a vehicle, the object recognition unit 13 recognizes the presence or absence of a passenger in the vehicle. In this case, the object recognition unit 13 analyzes image data (object vehicle passenger data) output from the front camera 120 and detects the presence or absence of a passenger in the object vehicle. Recognized data generated in the object recognition unit 13 is output to the determination unit 16.
The passenger estimation unit 14 estimates a sitting position of a passenger of the object vehicle on the basis of the object vehicle passenger data. In this embodiment, the passenger estimation unit 14 estimates the sitting position of the passenger of the object vehicle on the basis of the output of the object recognition unit 13. The passenger estimation unit 14 estimates the presence or absence of a person in a driver's seat, a front passenger seat, or a rear seat of the object vehicle from a result of the analysis for the interior of the object vehicle, which is imaged by the front camera 120 and recognized as a person by the object recognition unit 13. Estimated data generated in the passenger estimation unit 14 is output to the determination unit 16.
Here, in a case where the object vehicle is traveling, the presence or absence of a passenger in a seat other than the driver's seat is estimated. When it is impossible to determine whether there is a passenger in the front passenger seat or the rear seat, it may be estimated that there is a passenger. Meanwhile, in a case where the object vehicle is parked, the presence or absence of a passenger in all the seats including the driver's seat is estimated. It should be noted that the determination on whether the object vehicle is traveling or parked is made on the basis of the output of the distance sensor 110 or the front camera 120.
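The estimation rules in the two preceding paragraphs can be summarized in a short sketch. The seat names and the convention that `None` means "undeterminable from the image" are assumptions of this sketch:

```python
def estimate_sitting_positions(seat_detections, vehicle_moving):
    """Estimate which seats of the object vehicle are occupied.

    seat_detections maps a seat name ("driver", "front_passenger", "rear")
    to True, False, or None, where None means the camera image did not
    allow a determination.
    """
    occupied = {}
    for seat, detected in seat_detections.items():
        if vehicle_moving and seat == "driver":
            occupied[seat] = True   # a traveling vehicle must have a driver
        elif detected is None:
            occupied[seat] = True   # undeterminable -> assume a passenger
        else:
            occupied[seat] = bool(detected)
    return occupied
```

For a parked vehicle, `vehicle_moving` is False and all seats, including the driver's seat, are estimated from the detections alone.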
The passenger grasping unit 15 grasps a sitting position of a passenger of the automobile 100 on the basis of the image data (subject vehicle passenger data) output from the vehicle interior imaging camera 130. The passenger grasping unit 15 grasps the presence or absence of a passenger other than the driver within the automobile 100, and in a case where there is a passenger, a riding position thereof. Grasped data generated in the passenger grasping unit 15 is output to the determination unit 16.
It should be noted that various methods can be employed for the passenger grasping technology. Typically, the person recognition technology is used. Other than that technology, a technology of executing matching with an image of the interior of an unmanned vehicle to estimate the presence or absence of a person may be employed.
The determination unit 16 determines, when it is predicted that a collision with the object vehicle is unavoidable on the basis of the outputs of the prediction unit 12, the passenger estimation unit 14, and the passenger grasping unit 15, a collision mode of the automobile 100 against the object vehicle (in this example, a target collision site of the object vehicle, at which the automobile 100 is to collide) according to the riding position of the passenger of the object vehicle. This is for the purpose of causing the automobile 100 to collide at the site capable of reducing human damage to the object vehicle on the other party and the automobile 100.
Specifically, the determination unit 16 is configured to determine a non-sitting position of a passenger of the object vehicle or its vicinity as a target collision site of the object vehicle. In other words, when it is recognized that the object vehicle includes a person, a target collision position is determined at a site other than the riding position of the passenger of the object vehicle. This enables human damage to the other party due to the collision to be suppressed to a minimum.
Further, in a case where the object vehicle is moving toward the automobile 100 and it is estimated that a sitting position of a passenger of the object vehicle is only the driver's seat, the determination unit 16 determines the front part on the seat (front passenger seat) side next to the driver's seat of the object vehicle as a target collision site of the object vehicle. This enables reduction in damage to the driver of the object vehicle.
Meanwhile, in a case where the object vehicle is moving toward the automobile 100 and it is estimated that a sitting position of a passenger of the object vehicle includes the driver's seat and the seat next thereto (front passenger seat), the determination unit 16 determines the center of the front part of the object vehicle as a target collision site of the object vehicle. This enables reduction in damage to each passenger on the other party due to the collision.
In such a manner, under the situation where a collision with the object vehicle is unavoidable, a target collision site of the object vehicle, at which the automobile 100 is to collide, is determined on the basis of the object vehicle passenger data. Thus, it is possible to cause the automobile 100 to collide at the site capable of reducing human damage to the object vehicle on the other party.
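The selection rules above can be condensed into a small decision function. The site labels below are illustrative placeholders, not terms from the specification:

```python
def target_site_on_object(occupied_seats):
    """Choose a target collision site on the object vehicle.

    occupied_seats is a set of occupied seats of the object vehicle,
    e.g. {"driver"} or {"driver", "front_passenger"}.
    """
    if occupied_seats == {"driver"}:
        # Only the driver's seat is occupied: aim at the front part on
        # the empty front-passenger side.
        return "front_passenger_side_front"
    if {"driver", "front_passenger"} <= occupied_seats:
        # Both front seats occupied: aim at the center of the front part.
        return "front_center"
    # General rule: any non-sitting position or its vicinity.
    return "non_sitting_position"
```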
Furthermore, the determination unit 16 may be configured to refer to a sitting position of a passenger of the automobile 100 (output of the passenger grasping unit 15) as well as that of the object vehicle, to determine a target collision site of the automobile 100 that collides at the target collision site of the object vehicle. This enables reduction not only in damage to the passenger of the object vehicle that receives a collision, but also in damage to the passenger of the automobile 100 that collides.
Specifically, the determination unit 16 determines a non-sitting position of a passenger of the automobile 100 or its vicinity as a target collision site of the automobile 100. This enables not only damage to the other party but also damage to the passenger of the subject vehicle due to the collision to be suppressed to a minimum.
For example, in a case where a sitting position of a passenger of the automobile 100 is grasped as the driver's seat and the seat next thereto, the determination unit 16 determines the center of the front part of the automobile 100 as a target collision site of the automobile 100. This enables reduction in damage to the passengers of the subject vehicle.
The determination unit 16 generates, at the time when a collision with the object vehicle is predicted, control data by which an automatic brake function of actuating the brake device of the automobile 100 is executed. The determination unit 16 then generates, at the time when it is predicted that a collision with the object vehicle is unavoidable, control data by which a steering direction of the automobile 100 is determined such that a target collision site of the automobile 100 can be caused to collide toward a target collision site of the object vehicle.
The output unit 17 is configured to output the control data, which is for moving the automobile 100 toward the target collision site of the object vehicle, the target collision site being determined by the determination unit 16, to the control unit 10 that controls a moving operation of the automobile 100.
The damage reduction device 1 configured as described above includes hardware necessary for a computer, e.g., a CPU, a RAM, and a ROM. The CPU loads a program according to the present technology, which is recorded in the ROM in advance, to the RAM for execution, and thus a damage reduction method according to the present technology is executed.
A specific configuration of the damage reduction device 1 is not limited. For example, a PLD (Programmable Logic Device) such as an FPGA (Field Programmable Gate Array) or other devices such as an ASIC (Application Specific Integrated Circuit) may be used. Further, the damage reduction device 1 may be configured as a part of the control unit 10.
[Operation of Automobile]
Next, a damage reduction method according to this embodiment will be described together with a typical operation of the automobile 100 configured as described above.
FIG. 4 is a flowchart showing a control flow of the determination unit 16 in the damage reduction device 1. FIGS. 5 and 6 are diagrams for describing examples of the movement control of the automobile 100.
As shown in part A of FIG. 5 , considered is a case where the automobile 100 that goes straight ahead or goes to the left in the figure and an oncoming vehicle 200 as an object vehicle that goes straight ahead or goes to the right in the figure approach each other.
The damage reduction device 1 first determines whether there is a possibility that the automobile 100 collides with the oncoming vehicle 200 (Step 101). This determination is executed on the basis of the output of the prediction unit 12. When there is a possibility of a collision, the damage reduction device 1 determines whether a collision is avoidable by steering and braking (Step 102).
When it is determined that a collision with the oncoming vehicle 200 is avoidable by sudden turning, sudden braking, or the like, an operation of avoiding the collision is executed (Step 103). This operation requires no action by the driver; the control unit 10 directly outputs a control command to the steering device 140 and the brake device 150. In order to determine the steering direction (avoidance direction), for example, the image data from the front camera 120 is referred to.
Meanwhile, when it is determined that a collision with the oncoming vehicle 200 is unavoidable, the determination unit 16 determines in which direction, among right turning, left turning, and going straight (see reference symbols 18R, 18L, and 18S, respectively, in part A of FIG. 5 ), the steering of the automobile 100 is to be controlled in order to reduce damage due to the collision. In this regard, the determination unit 16 determines whether there is a passenger in the front passenger seat of either the automobile 100 (subject vehicle) or the oncoming vehicle 200 (Step 104). In this step, the outputs of the passenger estimation unit 14 and the passenger grasping unit 15 are referred to.
As a result, when it is determined that there is no passenger in the front passenger seat of either the subject vehicle or the oncoming vehicle 200, the determination unit 16 determines the front part on the front passenger seat side of the subject vehicle and the front part on the front passenger seat side of the oncoming vehicle 200 as target collision sites. The determination unit 16 then outputs control data regarding steering and braking of the subject vehicle to the control unit 10, so as to turn the automobile 100 to the right while applying a brake and cause the front part on the front passenger seat side of the subject vehicle to collide with the front part on the front passenger seat side of the oncoming vehicle 200 in the collision mode shown in part B of FIG. 5 (Step 105). It should be noted that in a case where a driving operation by a driver of the subject vehicle is being performed during the steering control described above, priority is given to the steering control by the control unit 10.
In such a manner, when the front passenger seat side of the subject vehicle and the front passenger seat side of the oncoming vehicle 200 are caused to collide with each other, the shock of the collision is absorbed in the unoccupied spaces of both vehicles. Therefore, compared with a collision at the front parts on the driver's seat sides or a head-on collision, damage to the drivers of the subject vehicle and the oncoming vehicle 200 can be suppressed to a minimum.
Meanwhile, when it is determined that there is a passenger in the front passenger seat of either the subject vehicle or the oncoming vehicle 200, the determination unit 16 determines the center of the front part of the subject vehicle and the center of the front part of the oncoming vehicle 200 as target collision sites. The determination unit 16 then outputs, to the control unit 10, control data regarding steering and braking of the subject vehicle to cause the automobile 100 to go straight while applying a brake and cause the oncoming vehicle 200 and the subject vehicle to collide head-on with each other in the collision mode shown in FIG. 6 (Step 106). It should be noted that in a case where a driving operation by a driver of the subject vehicle is being performed during the steering control described above, priority is given to the steering control by the control unit 10.
In such a manner, when the subject vehicle and the oncoming vehicle 200 are caused to collide head-on with each other, each cabin is inhibited from being locally deformed as in an offset collision. Therefore, compared with a collision at the front parts on the driver's seat sides or the front passenger seat sides, damage to all the passengers of the subject vehicle and the oncoming vehicle 200 can be suppressed to a minimum.
As described above, in this embodiment, in a case where a collision between the automobile 100 and the oncoming vehicle 200 is unavoidable, the positions of the passengers of the automobile 100 and the oncoming vehicle 200 are referred to, so that a collision site of each of the automobile 100 and the oncoming vehicle 200 is determined. This enables damage not only to the passengers of the subject vehicle but also to the passengers of the other party's vehicle to be suppressed to a minimum.
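The decision of Steps 104 to 106 can be sketched as follows. This is a minimal illustration only, assuming hypothetical function and key names not in the original; it shows the branch between the offset collision into the empty front-passenger-seat sides and the head-on collision.

```python
def choose_oncoming_collision_plan(subject_front_passenger: bool,
                                   oncoming_front_passenger: bool) -> dict:
    """Sketch of Steps 104-106 for an unavoidable collision with an
    oncoming vehicle (names hypothetical).

    If neither front passenger seat is occupied, steer so that the
    front parts on the front-passenger-seat sides of both vehicles
    collide, absorbing the shock in the unoccupied spaces (Step 105).
    Otherwise, go straight and collide head-on so that neither cabin
    is locally deformed as in an offset collision (Step 106).
    """
    if not subject_front_passenger and not oncoming_front_passenger:
        return {
            "steering": "right",  # turn right while applying a brake
            "subject_site": "front passenger-seat side",
            "object_site": "front passenger-seat side",
        }
    return {
        "steering": "straight",  # brake and collide head-on
        "subject_site": "front center",
        "object_site": "front center",
    }
```

As in the description above, any driving operation by the driver during this control would be overridden by the control unit's steering control.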
It should be noted that, also in a case where the object vehicle is a preceding vehicle or a parked vehicle, a non-riding position of a passenger of the object vehicle is determined as a target collision site of the object vehicle in a similar manner. This enables injury to the other party to be suppressed to a minimum while protecting a passenger of the subject vehicle.
Second Embodiment
Subsequently, another embodiment of the present technology will be described.
In this embodiment, as shown in FIG. 7 , a case where an object having a possibility of a collision with an automobile 100 includes not only a vehicle 210 but also a pedestrian 220 and a road installation object such as a utility pole 230 will be described as an example.
A damage reduction device 1 according to this embodiment and an automobile 100 equipped with this damage reduction device 1 have configurations similar to those of the first embodiment. Hereinafter, a configuration different from that of the first embodiment will be mainly described, and a configuration similar to that of the first embodiment will be denoted by a similar reference symbol and description thereof will be omitted or simply described.
The damage reduction device 1 of this embodiment includes an input unit 11, a prediction unit 12, an object recognition unit 13, and a determination unit 16 (see FIG. 3 ).
The input unit 11 inputs status data regarding a status in a moving direction (traveling direction) of the automobile 100. The status data is, for example, data regarding an object (vehicle 210, pedestrian 220, or utility pole 230) located in front of the automobile 100.
The object recognition unit 13 is configured to be capable of recognizing the type of the object with which the automobile 100 collides on the basis of the status data, and classifies the object into three types, e.g., a car (vehicle 210), a person (pedestrian 220), and an unmanned structure (a road installation object such as the utility pole 230).
The determination unit 16 determines, when a collision with the object described above is predicted and when it is recognized that the object described above includes a person, a steering direction of the automobile 100 in which a collision with the person is avoidable, on the basis of the status data described above. In other words, when the object includes the pedestrian 220, avoidance of a collision with the pedestrian 220 is set as a control target having the highest priority.
Further, the determination unit 16 is configured to determine, when it is predicted that a collision with the object is unavoidable, a collision mode of the automobile 100 against the object on the basis of the type of the object. In other words, the damage reduction device 1 of this embodiment is configured to employ a different collision mode depending on whether the object is the vehicle 210, the pedestrian 220, or a road installation object such as the utility pole 230.
The determination unit 16 determines, when the object is recognized as the vehicle 210, a target collision site of the vehicle 210 (object vehicle) by a method similar to that of the first embodiment described above. At that time, the determination unit 16 refers to the riding position of a passenger of the subject vehicle as well, and determines a target collision site capable of reducing damage to the passengers of both the other party's vehicle and the subject vehicle.
Meanwhile, the determination unit 16 determines, when the object is recognized as an unmanned structure such as the utility pole 230, a site having relatively high rigidity in the automobile 100, e.g., a pillar site, as a target collision site of the automobile 100. This enables damage to the passenger of the subject vehicle to be suppressed. In this case as well, the riding position of the passenger of the subject vehicle can be referred to on the basis of the output of the passenger grasping unit 15 (see FIG. 3 ). For example, in a case where there is no passenger in the front passenger seat, the pillar on the front passenger seat side is determined as a target collision site, thus enabling damage to the driver to be suppressed to a minimum.
Alternatively, under a situation where there is no object other than a pedestrian and where a collision with the pedestrian is already unavoidable, in order to reduce damage to the pedestrian as much as possible, a site having a high shock-absorbing property in the automobile 100 is determined as a target collision site. Examples of the site having a high shock-absorbing property include a site having relatively low rigidity, such as the hood or the front glass, and a site at which a shock-absorbing apparatus such as an air-bag for pedestrians is installed.
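The selection of the subject vehicle's own collision site for non-vehicle objects can be sketched as follows. This is a minimal sketch with hypothetical function names and string labels; the exact pillar choice when the front passenger seat is occupied is not specified in the description, so the sketch falls back to a generic pillar site in that case.

```python
def subject_collision_site(object_type: str, front_passenger_occupied: bool) -> str:
    """Choose the subject vehicle's target collision site (sketch).

    - Unmanned structure (e.g., a utility pole): aim a high-rigidity
      pillar site; prefer the front-passenger side when that seat is
      empty, keeping the driver away from the impact.
    - Person: aim a high shock-absorbing site (hood, front glass, or
      a pedestrian air-bag site) to reduce injury to the pedestrian.
    """
    if object_type == "unmanned_structure":
        if not front_passenger_occupied:
            return "front-passenger-side pillar"
        return "pillar site"  # high-rigidity site; side unspecified here
    if object_type == "person":
        return "shock-absorbing site (hood / front glass / pedestrian air-bag)"
    raise ValueError("vehicle objects use the first-embodiment logic")
```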
FIG. 8 is a flowchart showing a control example of the damage reduction device 1 in this embodiment.
As shown in FIG. 7 , consider a situation in which objects including the vehicle 210, the pedestrian 220, and the utility pole 230 are present in front of the automobile 100.
The damage reduction device 1 first determines whether there is a possibility that the automobile 100 collides with those objects (Step 201). This determination is executed on the basis of the output of the prediction unit 12. When there is a possibility of a collision, the damage reduction device 1 determines whether a collision is avoidable by steering and braking (Step 202).
When it is determined that a collision with each object described above is avoidable by sudden turning, sudden braking, or the like, an operation of avoiding the collision is executed (Step 203). This operation requires no action by the driver; the control unit 10 directly outputs a control command to the steering device 140 and the brake device 150. In order to determine the steering direction (avoidance direction), for example, image data from the front camera 120 is referred to.
Meanwhile, when it is determined that a collision with any of the objects is unavoidable, the damage reduction device 1 determines whether the objects include a person (Step 204). When the objects include a person, the damage reduction device 1 determines whether there is an object other than the person (Step 205).
In this example, although the objects include the pedestrian 220, the vehicle 210 and the utility pole 230 are also present as the objects other than the pedestrian 220. Thus, the damage reduction device 1 controls the steering direction such that the automobile 100 collides with the objects other than the person.
When it is determined that a collision with the object other than the person is unavoidable and the object is the vehicle 210, the damage reduction device 1 determines the steering direction of the automobile 100 such that damage to the vehicle 210 and the automobile 100 becomes minimum (Steps 206 and 207). Such control is similar to that of the first embodiment described above, and thus description thereof will be omitted here.
It should be noted that when the vehicle 210 is recognized as an unmanned parked vehicle, the vehicle 210 may be considered as an unmanned structure, and steering control similar to that performed when the object is the utility pole 230 may be executed, as will be described later.
Meanwhile, when it is determined that a collision with the object other than the person is unavoidable and the object is not the vehicle 210 (in this example, the utility pole 230), the damage reduction device 1 determines the steering direction of the automobile 100 such that damage to the automobile 100 becomes minimum (Steps 206 and 208).
In this case, as shown in part A of FIG. 9 , the damage reduction device 1 determines in which direction, among right turning, left turning, and going straight, the steering of the automobile 100 is to be controlled. At that time, the damage reduction device 1 refers to the riding position of the passenger of the automobile 100. When there is no passenger in the front passenger seat, the damage reduction device 1 determines a pillar site on the front passenger seat side of the automobile 100 as a target collision site. The damage reduction device 1 then outputs control data regarding steering and braking of the automobile 100 to the control unit 10, so as to turn the automobile 100 to the right while applying a brake and cause the pillar site described above to collide with the utility pole 230 in the collision mode shown in part B of FIG. 9 (Step 208). It should be noted that in a case where a driving operation by a driver of the subject vehicle is being performed during the steering control described above, priority is given to the steering control by the control unit 10.
In such a manner, when the pillar site on the front passenger seat side of the automobile 100 is caused to collide with the utility pole 230, destruction of the automobile 100 in the collision is suppressed. Therefore, compared with a case where the front part of the automobile 100 collides with the utility pole 230, damage to the passenger of the automobile 100 can be suppressed to a minimum.
In contrast, when it is determined that a collision with the pedestrian 220 is unavoidable, the damage reduction device 1 determines the steering direction of the automobile 100 such that damage to the pedestrian becomes minimum (Step 209).
In this case, as shown in part A of FIG. 10 , the damage reduction device 1 determines in which direction, among right turning, left turning, and going straight, the steering of the automobile 100 is to be controlled. At that time, the damage reduction device 1 determines a site having the highest shock-absorbing effect in the automobile 100 (a relatively soft site such as the hood or the front glass, or a site where an air-bag 170 for pedestrians actuates) as a target collision site. The damage reduction device 1 then outputs control data regarding steering and braking of the automobile 100 to the control unit 10, so as to turn the automobile 100 to the right while applying a brake and cause the front of the automobile 100 to collide with the pedestrian 220 in the collision mode shown in part B of FIG. 10 (Step 209). It should be noted that in a case where a driving operation by a driver of the subject vehicle is being performed during the steering control described above, priority is given to the steering control by the control unit 10.
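The top-level branching of FIG. 8 for an unavoidable collision (Steps 204 to 209) can be sketched as follows. This is a minimal sketch assuming hypothetical function names and type labels ('person', 'vehicle', 'structure'); it encodes the priority of steering toward non-person objects, then choosing a collision mode by object type.

```python
def plan_unavoidable_collision(objects: set) -> str:
    """Sketch of Steps 204-209 when a collision is already unavoidable.

    `objects` holds the recognized type labels of the objects ahead:
    'person', 'vehicle', and/or 'structure' (unmanned structure).
    """
    non_person = objects - {"person"}
    if "person" in objects and non_person:
        # Highest priority: avoid the person by steering toward a
        # non-person object (Steps 204 and 205).
        objects = non_person
    if objects == {"person"}:
        # No alternative: present the most shock-absorbing site so
        # that damage to the pedestrian is minimized (Step 209).
        return "hit pedestrian with shock-absorbing site"
    if "vehicle" in objects:
        # Minimize damage to both vehicles, as in the first
        # embodiment (Step 207).
        return "hit vehicle at damage-minimizing sites"
    # Unmanned structure: minimize damage to the subject vehicle by
    # colliding at a high-rigidity pillar site (Step 208).
    return "hit structure with subject pillar site"
```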
As described above, according to this embodiment, when a collision with an object is unavoidable, the collision mode is varied depending on the type of the object. Thus, damage not only to the subject vehicle but also to another vehicle or a pedestrian can be suitably and comprehensively minimized.
Hereinabove, the embodiments of the present technology have been described, but the present technology is not limited to the embodiments described above and can be variously modified as a matter of course.
For example, in the first embodiment described above, the case where the other party's vehicle whose unavoidable collision is predicted is the oncoming vehicle 200 has been described, but the present technology is not limited thereto. The present technology is also effective in, for example, a right-turn/straight accident (a collision between a vehicle turning right and a vehicle going straight) at an intersection or a collision between vehicles that encounter each other suddenly.
FIG. 11 shows an example of controlling a collision between an automobile 100 traveling straight and a right-turning vehicle 201. In this example, in a case where there is a passenger in the front passenger seat of the right-turning vehicle 201 and it is predicted that the automobile 100 would collide with the front passenger seat side of the right-turning vehicle 201 if it proceeded straight, the damage reduction device 1 determines a left-side rear part of the right-turning vehicle 201 as a target collision site and executes steering control to turn the automobile 100 to the right.
On the other hand, FIG. 12 shows an example of controlling a collision between the automobile 100 traveling straight and a passing vehicle 202 cutting across in front of the automobile 100. In this example, in a case where there is a passenger in the front passenger seat of the passing vehicle 202 and it is predicted that the automobile 100 would collide with the front passenger seat side of the passing vehicle 202 if it proceeded straight, the damage reduction device 1 determines a left-side rear part of the passing vehicle 202 as a target collision site and executes steering control to turn the automobile 100 to the right.
Other than the above examples, the present technology is also effective for a vehicle at a junction, a vehicle rushing out from an alley, and the like as the vehicle whose collision with the subject vehicle is unavoidable.
Furthermore, in the embodiments described above, although the automobile has been described as an example of the moving body apparatus, in addition thereto, the present technology is applicable to a boat traveling on water, an autonomous traveling robot, and the like.
It should be noted that the present technology can have the following configurations.
(1) A damage reduction device, including:
an input unit that inputs object traveling data regarding a traveling state of a manned moving body approaching a moving body apparatus and object passenger data regarding a sitting position of a passenger of the manned moving body;
a prediction unit that predicts a collision with the manned moving body on the basis of the object traveling data;
a passenger estimation unit that estimates the sitting position of the passenger of the manned moving body on the basis of the object passenger data; and
a determination unit that determines, when it is predicted that the collision with the manned moving body is unavoidable, a target collision site of the manned moving body, at which the moving body apparatus is to collide, on the basis of the sitting position of the passenger of the manned moving body.
(2) The damage reduction device according to (1), in which
the determination unit determines a non-sitting position of the passenger of the manned moving body or a vicinity thereof as a target collision site of the manned moving body.
(3) The damage reduction device according to (1) or (2), in which
the determination unit determines, when the manned moving body is moving toward the moving body apparatus and when the sitting position of the passenger of the manned moving body is estimated as only a driver's seat, a front part on a seat side next to the driver's seat as a target collision site of the manned moving body.
(4) The damage reduction device according to (1) or (2), in which
the determination unit determines, when the manned moving body is moving toward the moving body apparatus and when the sitting position of the passenger of the manned moving body is estimated as a driver's seat and a seat next thereto, the center of a front part of the manned moving body as a target collision site of the manned moving body.
(5) The damage reduction device according to any one of (1) to (4), in which
the input unit further inputs moving-body passenger data regarding a sitting position of a passenger of the moving body apparatus,
the damage reduction device further includes a passenger grasping unit that grasps the sitting position of the passenger of the moving body apparatus on the basis of the moving-body passenger data, and
the determination unit further determines a target collision site of the moving body apparatus that collides with the target collision site of the manned moving body, on the basis of the object traveling data, the object passenger data, and the moving-body passenger data.
(6) The damage reduction device according to (5), in which
the determination unit determines a non-sitting position of the passenger of the moving body apparatus or a vicinity thereof as a target collision site of the moving body apparatus.
(7) The damage reduction device according to (5), in which
the determination unit determines, when the sitting position of the passenger of the moving body apparatus is grasped as a driver's seat and a seat next thereto, the center of a front part of the moving body apparatus as a target collision site of the moving body apparatus.
(8) The damage reduction device according to any one of (1) to (7), further including
an output unit that outputs control data for moving the moving body apparatus toward the target collision site of the manned moving body, which is determined by the determination unit, to a control unit that controls a moving operation of the moving body apparatus.
(9) The damage reduction device according to any one of (1) to (8), in which
the damage reduction device is mounted to the moving body apparatus that is equipped with at least a distance sensor for a moving direction and a front camera, and
the input unit inputs data from the distance sensor and the front camera as the object traveling data and the object passenger data.
(10) A damage reduction method, including:
inputting object traveling data regarding a traveling state of a manned moving body approaching a moving body apparatus and object passenger data regarding a sitting position of a passenger of the manned moving body;
predicting a collision with the manned moving body on the basis of the object traveling data;
estimating the sitting position of the passenger of the manned moving body on the basis of the object passenger data; and
determining, when it is predicted that the collision with the manned moving body is unavoidable, a target collision site of the manned moving body, at which the moving body apparatus is to collide, on the basis of the sitting position of the passenger of the manned moving body.
(11) A program causing a computer to execute the steps of:
inputting object traveling data regarding a traveling state of a manned moving body approaching a moving body apparatus and object passenger data regarding a sitting position of a passenger of the manned moving body;
predicting a collision with the manned moving body on the basis of the object traveling data;
estimating the sitting position of the passenger of the manned moving body on the basis of the object passenger data; and
determining, when it is predicted that the collision with the manned moving body is unavoidable, a target collision site of the manned moving body, at which the moving body apparatus is to collide, on the basis of the sitting position of the passenger of the manned moving body.
It should be noted that the present technology can further have the following configurations.
(12) A damage reduction device, including:
an input unit that inputs status data regarding a status in a moving direction of a moving body apparatus;
a prediction unit that predicts a collision with an object in the moving direction on the basis of the status data;
a recognition unit that recognizes a type of the object; and
a determination unit that determines, when it is predicted that the collision with the object is unavoidable, a collision mode of the moving body apparatus to the object on the basis of the type of the object.
(13) The damage reduction device according to (12), in which
the recognition unit recognizes whether the object is any of a manned moving body, an unmanned structure, and a person, and
the determination unit determines, when the object is recognized as the manned moving body, a target collision site of the object, at which the moving body apparatus is to collide, and determines, when the object is recognized as the unmanned structure or the person, a target collision site of the moving body apparatus that collides with the object.
(14) The damage reduction device according to (13), in which
the status data includes object passenger data regarding a sitting position of a passenger in the manned moving body, and
the determination unit determines, when the object is recognized as the manned moving body, a target collision site of the manned moving body on the basis of the object passenger data.
(15) The damage reduction device according to (14), further including
an object passenger estimation unit that estimates the sitting position of the passenger of the manned moving body on the basis of the object passenger data, in which
the determination unit determines a non-sitting position of the passenger of the manned moving body or a vicinity thereof as a target collision site of the manned moving body.
(16) The damage reduction device according to (13), in which
the determination unit determines, when the object is recognized as the unmanned structure, a pillar site of the moving body apparatus as a target collision site of the moving body apparatus.
(17) The damage reduction device according to (13), in which
the determination unit determines, when the object is recognized as the person, a shock-absorbing site of the moving body apparatus as a target collision site of the moving body apparatus.
(18) The damage reduction device according to any one of (13) to (17), in which
the input unit further inputs moving-body passenger data regarding a sitting position of a passenger of the moving body apparatus,
the damage reduction device further includes a passenger grasping unit that grasps the sitting position of the passenger of the moving body apparatus on the basis of the moving-body passenger data, and
the determination unit further determines a target collision site of the moving body apparatus that collides with the object, on the basis of the status data and the moving-body passenger data.
(19) The damage reduction device according to (18), in which
the determination unit determines a non-sitting position of the passenger of the moving body apparatus or a vicinity thereof as a target collision site of the moving body apparatus.
(20) The damage reduction device according to any one of (12) to (19), further including
an output unit that outputs control data for moving the moving body apparatus to the object in the collision mode, which is determined by the determination unit, to a control unit that controls a moving operation of the moving body apparatus.
(21) The damage reduction device according to any one of (12) to (20), in which
the damage reduction device is mounted to the moving body apparatus that is equipped with at least a distance sensor for the moving direction and a front camera, and
the input unit inputs data from the distance sensor and the front camera as the status data.
(22) A damage reduction method, including:
inputting status data regarding a status in a moving direction of a moving body apparatus;
predicting a collision with an object in the moving direction on the basis of the status data;
recognizing a type of the object; and
determining, when it is predicted that the collision with the object is unavoidable, a collision mode of the moving body apparatus to the object on the basis of the type of the object.
(23) A program causing a computer to execute the steps of:
inputting status data regarding a status in a moving direction of a moving body apparatus;
predicting a collision with an object in the moving direction on the basis of the status data;
recognizing a type of the object; and
determining, when it is predicted that the collision with the object is unavoidable, a collision mode of the moving body apparatus to the object on the basis of the type of the object.
It should be noted that the present technology can still further have the following configurations.
(24) A damage reduction device, including:
an input unit that inputs status data regarding a status in a moving direction of a moving body apparatus;
a prediction unit that predicts a collision with an object in the moving direction on the basis of the status data;
a recognition unit that recognizes whether the object includes a person; and
a determination unit that determines, when the collision with the object is predicted and it is recognized that the object includes a person, a steering direction of the moving body apparatus in which a collision with the person is avoidable, on the basis of the status data.
(25) The damage reduction device according to (24), in which
the determination unit determines, when it is predicted that a collision with the object other than the person is unavoidable, a collision mode of the moving body apparatus to the object on the basis of a type of the object.
(26) The damage reduction device according to (25), in which
the determination unit determines, when the object is recognized as a manned moving body, a target collision site of the object, with which the moving body apparatus is to collide, and determines, when the object is recognized as an unmanned structure, a target collision site of the moving body apparatus that collides with the object.
(27) The damage reduction device according to (26), in which
the status data includes object passenger data regarding a sitting position of a passenger in the manned moving body, and
the determination unit determines, when the object is recognized as the manned moving body, a target collision site of the manned moving body on the basis of the object passenger data.
(28) The damage reduction device according to (27), further including
an object passenger estimation unit that estimates the sitting position of the passenger of the manned moving body on the basis of the object passenger data, in which
the determination unit determines a non-sitting position of the passenger of the manned moving body or a vicinity thereof as a target collision site of the manned moving body.
(29) The damage reduction device according to any one of (25) to (28), in which
the input unit further inputs moving-body passenger data regarding a sitting position of a passenger of the moving body apparatus,
the damage reduction device further includes a passenger grasping unit that grasps the sitting position of the passenger of the moving body apparatus on the basis of the moving-body passenger data, and
the determination unit further determines a target collision site of the moving body apparatus that collides with the object, on the basis of the status data and the moving-body passenger data.
(30) The damage reduction device according to (29), in which
the determination unit determines a non-sitting position of the passenger of the moving body apparatus or a vicinity thereof as a target collision site of the moving body apparatus.
(31) The damage reduction device according to any one of (24) to (30), further including
an output unit that outputs control data for moving the moving body apparatus in the steering direction determined by the determination unit, to a control unit that controls a moving operation of the moving body apparatus.
(32) The damage reduction device according to any one of (24) to (31), in which
the damage reduction device is mounted to the moving body apparatus that is equipped with at least a distance sensor for the moving direction and a front camera, and
the input unit inputs data from the distance sensor and the front camera as the status data.
(33) A damage reduction method, including:
inputting status data regarding a status in a moving direction of a moving body apparatus;
predicting a collision with an object in the moving direction on the basis of the status data;
recognizing whether the object includes a person; and
determining, when the collision with the object is predicted and it is recognized that the object includes a person, a steering direction of the moving body apparatus in which a collision with the person is avoidable, on the basis of the status data.
(34) A program causing a computer to execute the steps of:
inputting status data regarding a status in a moving direction of a moving body apparatus;
predicting a collision with an object in the moving direction on the basis of the status data;
recognizing whether the object includes a person; and
determining, when the collision with the object is predicted and it is recognized that the object includes a person, a steering direction of the moving body apparatus in which a collision with the person is avoidable, on the basis of the status data.
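The method of (33) and the program of (34) recite a simple control flow: input status data, predict a collision, recognize whether the object includes a person, and, if so, determine a steering direction in which a collision with the person is avoidable. A minimal sketch of that flow follows; it is illustrative only, and every callable below (the collision predictor, the person recognizer, and the per-direction avoidability check) is a hypothetical placeholder standing in for the units recited above, not the patented implementation.

```python
def determine_steering(status_data,
                       predict_collision,     # hypothetical: True if a collision is predicted
                       contains_person,       # hypothetical: True if the object includes a person
                       candidate_directions,  # e.g. ["left", "straight", "right"]
                       is_person_free):       # hypothetical: True if this direction avoids the person
    """Return a steering direction that avoids the person, or None if no
    intervention is needed or no avoiding direction exists (sketch only)."""
    if not predict_collision(status_data):
        return None  # no collision predicted; no steering determination needed
    if not contains_person(status_data):
        return None  # object includes no person; outside the scope of (33)/(34)
    # determine a steering direction in which a collision with the person is avoidable
    for direction in candidate_directions:
        if is_person_free(status_data, direction):
            return direction
    return None
```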
REFERENCE SIGNS LIST
  • 1 damage reduction device
  • 10 control unit
  • 11 input unit
  • 12 prediction unit
  • 13 object recognition unit
  • 14 passenger estimation unit
  • 15 passenger grasping unit
  • 16 determination unit
  • 17 output unit
  • 100 automobile
  • 110 distance sensor
  • 120 front camera
  • 130 vehicle interior imaging camera
  • 140 steering device
  • 150 brake device
  • 200 oncoming vehicle
  • 201 right-turning vehicle
  • 202 passing vehicle
  • 210 vehicle
  • 220 pedestrian
  • 230 utility pole

Claims (26)

The invention claimed is:
1. A damage reduction system for a subject vehicle, the damage reduction system comprising:
a camera having a forward field of view from the subject vehicle and configured to generate image data; and
control circuitry configured to:
predict a possibility of a collision with an object based on the image data;
recognize a type of the object as a person or an unmanned structure based on the image data;
determine, when it is predicted that the collision with the object is avoidable, a steering direction of the subject vehicle so as to avoid the collision with the object;
determine, when it is predicted that the collision with the object is unavoidable and the object is recognized as a person, the steering direction of the subject vehicle so as to minimize damage to the person; and
determine, when it is predicted that the collision with the object is unavoidable and the object is recognized as an unmanned structure, the steering direction of the subject vehicle so as to minimize damage to the subject vehicle.
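Outside the claim language itself, the three determination branches of claim 1 can be read as a selection over candidate steering directions: avoid the object if possible; otherwise minimize damage to the person, or to the subject vehicle when the object is an unmanned structure. The sketch below is an assumption-laden illustration, not the claimed control circuitry; `collision_risk`, `person_damage`, and `vehicle_damage` are hypothetical estimators, and the zero-risk test for avoidability is a simplification.

```python
from typing import Callable, List

def claim1_steering(candidates: List[str],
                    collision_risk: Callable[[str], float],  # hypothetical: risk of hitting the object
                    person_damage: Callable[[str], float],   # hypothetical: predicted damage to the person
                    vehicle_damage: Callable[[str], float],  # hypothetical: predicted damage to subject vehicle
                    object_type: str) -> str:
    """Select a steering direction following the three branches of claim 1 (sketch)."""
    # Branch 1: collision avoidable -> steer so as to avoid the collision entirely
    avoidable = [d for d in candidates if collision_risk(d) == 0.0]
    if avoidable:
        return avoidable[0]
    # Branch 2: collision unavoidable and object recognized as a person
    #           -> steer so as to minimize damage to the person
    if object_type == "person":
        return min(candidates, key=person_damage)
    # Branch 3: collision unavoidable and object recognized as an unmanned structure
    #           -> steer so as to minimize damage to the subject vehicle
    return min(candidates, key=vehicle_damage)
```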
2. The damage reduction system according to claim 1, wherein the control circuitry is configured to determine, when it is predicted that the collision with the object is unavoidable and the object is recognized as an unmanned structure, a target collision site of the subject vehicle that collides with the object.
3. The damage reduction system according to claim 1, wherein the control circuitry is configured to recognize the type of the object based at least in part on status data regarding a status in a moving direction of the subject vehicle, and
wherein the status data includes the image data.
4. The damage reduction system according to claim 1, wherein the control circuitry is configured to determine, when it is predicted that the collision with the object is unavoidable and the object is recognized as a manned vehicle, a target collision site of the manned vehicle based on passenger data regarding a sitting position of a passenger in the manned vehicle.
5. The damage reduction system according to claim 4, wherein the control circuitry is configured to:
estimate the sitting position of the passenger in the manned vehicle, and
determine a non-sitting position of the passenger in the manned vehicle or a vicinity thereof as the target collision site of the manned vehicle based on the sitting position of the passenger in the manned vehicle.
6. The damage reduction system according to claim 2, wherein the control circuitry is configured to:
determine a sitting position of a passenger in the subject vehicle, and
determine the target collision site of the subject vehicle based on the sitting position of the passenger in the subject vehicle.
7. The damage reduction system according to claim 6, wherein the control circuitry is configured to determine a non-sitting position of the passenger in the subject vehicle or a vicinity thereof as the target collision site of the subject vehicle.
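Claims 2 and 5 to 7 recite determining a target collision site at a non-sitting position of a passenger or a vicinity thereof. One illustrative way to realize that selection, in which the candidate sites, the occupied seats, and the distance function are all hypothetical inputs rather than anything the claims specify, is to pick the candidate site farthest from any occupied seat:

```python
def select_collision_site(candidate_sites, occupied_seats, distance):
    """Return the candidate collision site farthest from every occupied seat,
    i.e. a non-sitting position or its vicinity (illustrative sketch only)."""
    def clearance(site):
        # distance from this candidate site to the nearest occupied seat
        return min(distance(site, seat) for seat in occupied_seats)
    return max(candidate_sites, key=clearance)
```

With, say, the front-left and front-right corners as candidate sites and only the driver's seat occupied, this sketch selects the corner on the unoccupied side.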
8. The damage reduction system according to claim 1, further comprising a steering controller configured to control steering of the subject vehicle based on the determined steering direction.
9. The damage reduction system according to claim 1, further comprising a distance sensor configured to generate distance data representative of a distance between the subject vehicle and the object,
wherein the control circuitry is configured to predict the possibility of the collision with the object based on the image data and the distance data.
10. The damage reduction system according to claim 1, wherein the camera is mounted in a cabin of the subject vehicle.
11. The damage reduction system according to claim 1, wherein the control circuitry is configured to determine, when it is predicted that the collision with the object is unavoidable and the object is recognized as a manned vehicle, the steering direction of the subject vehicle so as to minimize damage to the subject vehicle and the manned vehicle.
12. The damage reduction system according to claim 1, wherein the control circuitry is configured to determine, when it is predicted that the collision with the object is unavoidable and the object is recognized as a manned vehicle, the steering direction of the subject vehicle so as to minimize damage to passengers of the subject vehicle and the manned vehicle.
13. A damage reduction method, comprising:
generating image data captured by a camera having a forward field of view from a subject vehicle;
predicting a possibility of a collision with an object based at least in part on the image data;
recognizing a type of the object as a person or an unmanned structure based at least in part on the image data;
determining, when it is predicted that the collision with the object is avoidable, a steering direction of the subject vehicle so as to avoid the collision with the object;
determining, when it is predicted that the collision with the object is unavoidable and the object is recognized as a person, the steering direction of the subject vehicle so as to minimize damage to the person; and
determining, when it is predicted that the collision with the object is unavoidable and the object is recognized as an unmanned structure, the steering direction of the subject vehicle so as to minimize damage to the subject vehicle.
14. The damage reduction method according to claim 13, further comprising:
determining, when it is predicted that the collision with the object is unavoidable and the object is recognized as an unmanned structure, a target collision site of the subject vehicle that collides with the object.
15. The damage reduction method according to claim 13, further comprising:
recognizing the type of the object based at least in part on status data regarding a status in a moving direction of the subject vehicle,
wherein the status data includes the image data.
16. The damage reduction method according to claim 13, further comprising:
determining, when it is predicted that the collision with the object is unavoidable and the object is recognized as a manned vehicle, a target collision site of the manned vehicle based on passenger data regarding a sitting position of a passenger in the manned vehicle.
17. The damage reduction method according to claim 16, further comprising:
estimating the sitting position of the passenger in the manned vehicle, and
determining a non-sitting position of the passenger in the manned vehicle or a vicinity thereof as the target collision site of the manned vehicle based on the sitting position of the passenger in the manned vehicle.
18. The damage reduction method according to claim 14, further comprising:
determining a sitting position of a passenger in the subject vehicle, and
determining the target collision site of the subject vehicle based on the sitting position of the passenger in the subject vehicle.
19. The damage reduction method according to claim 18, further comprising:
determining a non-sitting position of the passenger in the subject vehicle or a vicinity thereof as the target collision site of the subject vehicle.
20. The damage reduction method according to claim 13, further comprising:
controlling steering of the subject vehicle based on the determined steering direction.
21. The damage reduction method according to claim 13, further comprising:
generating, by a distance sensor, distance data representative of a distance between the subject vehicle and the object,
wherein the possibility of the collision with the object is predicted based on the image data and the distance data.
22. The damage reduction method according to claim 13, wherein the camera is mounted in a cabin of the subject vehicle.
23. The damage reduction method according to claim 13, further comprising determining, when it is predicted that the collision with the object is unavoidable and the object is recognized as a manned vehicle, the steering direction of the subject vehicle so as to minimize damage to the subject vehicle and the manned vehicle.
24. The damage reduction method according to claim 13, further comprising determining, when it is predicted that the collision with the object is unavoidable and the object is recognized as a manned vehicle, the steering direction of the subject vehicle so as to minimize damage to passengers of the subject vehicle and the manned vehicle.
25. A damage reduction apparatus, comprising:
control circuitry configured to:
acquire image data captured by a camera having a front view from a subject vehicle;
predict a possibility of a collision with an object based on the image data;
recognize a type of the object as a person or an unmanned structure based on the image data;
determine, when it is predicted that the collision with the object is avoidable, a steering direction of the subject vehicle so as to avoid the collision with the object;
determine, when it is predicted that the collision with the object is unavoidable and the object is recognized as a person, the steering direction of the subject vehicle so as to minimize damage to the person; and
determine, when it is predicted that the collision with the object is unavoidable and the object is recognized as an unmanned structure, the steering direction of the subject vehicle so as to minimize damage to the subject vehicle.
26. A non-transitory computer readable medium storing instructions that, when executed by control circuitry, perform a damage reduction method comprising:
acquiring image data captured by a camera having a forward field of view from a subject vehicle;
predicting a possibility of a collision with an object based at least in part on the image data;
recognizing a type of the object as a person or an unmanned structure based at least in part on the image data;
determining, when it is predicted that the collision with the object is avoidable, a steering direction of the subject vehicle so as to avoid the collision with the object;
determining, when it is predicted that the collision with the object is unavoidable and the object is recognized as a person, the steering direction of the subject vehicle so as to minimize damage to the person; and
determining, when it is predicted that the collision with the object is unavoidable and the object is recognized as an unmanned structure, the steering direction of the subject vehicle so as to minimize damage to the subject vehicle.
US17/570,319 2015-09-29 2022-01-06 Damage reduction device, damage reduction method, and program Active US11772644B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/570,319 US11772644B2 (en) 2015-09-29 2022-01-06 Damage reduction device, damage reduction method, and program

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2015190780 2015-09-29
JP2015-190780 2015-09-29
PCT/JP2016/003769 WO2017056374A1 (en) 2015-09-29 2016-08-18 Damage reduction device, damage reduction method, and program
US201815761844A 2018-03-21 2018-03-21
US16/599,740 US11254307B2 (en) 2015-09-29 2019-10-11 Damage reduction device, damage reduction method, and program
US17/570,319 US11772644B2 (en) 2015-09-29 2022-01-06 Damage reduction device, damage reduction method, and program

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/599,740 Continuation US11254307B2 (en) 2015-09-29 2019-10-11 Damage reduction device, damage reduction method, and program

Publications (2)

Publication Number Publication Date
US20220126821A1 US20220126821A1 (en) 2022-04-28
US11772644B2 true US11772644B2 (en) 2023-10-03

Family

ID=58422814

Family Applications (3)

Application Number Title Priority Date Filing Date
US15/761,844 Active US10464559B2 (en) 2015-09-29 2016-08-18 Damage reduction device, damage reduction method, and program
US16/599,740 Active 2037-03-06 US11254307B2 (en) 2015-09-29 2019-10-11 Damage reduction device, damage reduction method, and program
US17/570,319 Active US11772644B2 (en) 2015-09-29 2022-01-06 Damage reduction device, damage reduction method, and program

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US15/761,844 Active US10464559B2 (en) 2015-09-29 2016-08-18 Damage reduction device, damage reduction method, and program
US16/599,740 Active 2037-03-06 US11254307B2 (en) 2015-09-29 2019-10-11 Damage reduction device, damage reduction method, and program

Country Status (5)

Country Link
US (3) US10464559B2 (en)
EP (1) EP3357790A4 (en)
JP (1) JP6760297B2 (en)
CN (1) CN108025763B (en)
WO (1) WO2017056374A1 (en)

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6760297B2 (en) * 2015-09-29 2020-09-23 ソニー株式会社 Signal processing equipment, signal processing methods and programs
WO2017214581A1 (en) 2016-06-10 2017-12-14 Duke University Motion planning for autonomous vehicles and reconfigurable motion planning processors
JP7021891B2 (en) * 2017-09-28 2022-02-17 株式会社Subaru Vehicle control device and vehicle control method
JP7004534B2 (en) * 2017-09-28 2022-02-10 株式会社Subaru Vehicle control device and vehicle control method
JP6676025B2 (en) * 2017-10-23 2020-04-08 本田技研工業株式会社 Vehicle control device, vehicle control method, and program
WO2019139815A1 (en) 2018-01-12 2019-07-18 Duke University Apparatus, method and article to facilitate motion planning of an autonomous vehicle in an environment having dynamic objects
TWI822729B (en) 2018-02-06 2023-11-21 美商即時機器人股份有限公司 Method and apparatus for motion planning of a robot storing a discretized environment on one or more processors and improved operation of same
KR102532741B1 (en) * 2018-02-28 2023-05-16 삼성전자주식회사 Autonomous driving device and driving method thereof
EP3769174B1 (en) 2018-03-21 2022-07-06 Realtime Robotics, Inc. Motion planning of a robot for various environments and tasks and improved operation of same
JP7196448B2 (en) * 2018-07-30 2022-12-27 株式会社アドヴィックス Collision control device
CN109878513A (en) * 2019-03-13 2019-06-14 百度在线网络技术(北京)有限公司 Defensive driving strategy generation method, device, equipment and storage medium
CN113905855B (en) 2019-04-17 2023-08-25 实时机器人有限公司 User interface, system, method and rules for generating motion planning diagrams
WO2020247207A1 (en) 2019-06-03 2020-12-10 Realtime Robotics, Inc. Apparatus, methods and articles to facilitate motion planning in environments having dynamic obstacles
CN110329151B (en) * 2019-06-04 2021-08-17 郑龙海 Intelligent collision recognition warning method and system based on automobile safety
EP3993963A4 (en) 2019-08-23 2022-08-24 Realtime Robotics, Inc. Motion planning for robots to optimize velocity while maintaining limits on acceleration and jerk
CN110466514B (en) * 2019-08-30 2020-10-27 北京小马慧行科技有限公司 Vehicle control method and device
CN110466513A (en) * 2019-08-30 2019-11-19 北京小马慧行科技有限公司 Control method for vehicle and device
TW202146189A (en) 2020-01-22 2021-12-16 美商即時機器人股份有限公司 Configuration of robots in multi-robot operational environment
US11535245B2 (en) * 2020-05-08 2022-12-27 The Boeing Company Systems and methods for reducing a severity of a collision
CN112172806B (en) * 2020-08-31 2022-03-22 恒大新能源汽车投资控股集团有限公司 Vehicle state adjusting device and method and electronic equipment
WO2022097387A1 (en) * 2020-11-05 2022-05-12 日立Astemo株式会社 Vehicle control device
JP7414025B2 (en) * 2021-01-21 2024-01-16 トヨタ自動車株式会社 Collision avoidance support device
US20240025395A1 (en) * 2022-07-22 2024-01-25 Motional Ad Llc Path generation based on predicted actions

Patent Citations (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6285778B1 (en) 1991-09-19 2001-09-04 Yazaki Corporation Vehicle surroundings monitor with obstacle avoidance lighting
US6168198B1 (en) 1992-05-05 2001-01-02 Automotive Technologies International, Inc. Methods and arrangements for controlling an occupant restraint device in a vehicle
US6085151A (en) 1998-01-20 2000-07-04 Automotive Systems Laboratory, Inc. Predictive collision sensing system
JP2000095130A (en) 1998-09-21 2000-04-04 Toyota Motor Corp Vehicle collision control unit
DE10328062A1 (en) 2003-06-23 2005-01-20 Robert Bosch Gmbh Method for improving the safety of road users involved in a prematurely recognized accident
JP2005254923A (en) 2004-03-10 2005-09-22 Toyota Motor Corp Vehicular shock-eliminating device
US20080097699A1 (en) 2004-12-28 2008-04-24 Kabushiki Kaisha Toyota Chuo Kenkyusho Vehicle motion control device
WO2006070865A1 (en) 2004-12-28 2006-07-06 Kabushiki Kaisha Toyota Chuo Kenkyusho Vehicle motion control device
US7954587B2 (en) 2005-01-17 2011-06-07 Kabushiki Kaisha Toyota Chuo Kenkyusyo Collision behavior control apparatus
DE102005003274A1 (en) 2005-01-25 2006-07-27 Robert Bosch Gmbh Collision occurrence preventing or reducing method for use in vehicle, involves identifying obstacle using field sensor, determining data of obstacle, and determining vehicle deceleration based on the data of obstacle and data of vehicle
CN1836939A (en) 2005-03-22 2006-09-27 高田株式会社 Object detection system, protection system, and vehicle
JP2007125997A (en) 2005-11-04 2007-05-24 Nissan Motor Co Ltd Vehicular intelligent brake assist system
JP2008037313A (en) 2006-08-08 2008-02-21 Toyota Motor Corp Vehicle control device, vehicle control system and vehicle control method
JP2009012538A (en) 2007-07-02 2009-01-22 Mazda Motor Corp Occupant crash protection device of vehicle
DE102008005310A1 (en) 2008-01-21 2009-07-23 Bayerische Motoren Werke Aktiengesellschaft Method for influencing the movement of a vehicle in case of premature detection of an unavoidable collision with an obstacle
DE102011115875A1 (en) 2011-10-12 2013-04-18 Volkswagen Aktiengesellschaft Driver assistance method for e.g. passenger car, involves determining crash severity forecasts of vehicle in response to collision situation between vehicle and object, and determining engagement of actuator of vehicle based upon forecasts
US9487195B2 (en) * 2012-09-04 2016-11-08 Toyota Jidosha Kabushiki Kaisha Collision avoidance assistance device and collision avoidance assistance method
DE102012021004A1 (en) 2012-10-26 2014-04-30 Volkswagen Aktiengesellschaft Method and device for reducing medical consequences of accidents in case of unavoidable accidents in cross traffic
WO2014164327A1 (en) 2013-03-11 2014-10-09 Honda Motor Co., Ltd. Real time risk assessments using risk functions
JP2015041222A (en) 2013-08-21 2015-03-02 株式会社デンソー Collision relaxing device
US9315192B1 (en) 2013-09-30 2016-04-19 Google Inc. Methods and systems for pedestrian avoidance using LIDAR
CN104842995A (en) 2014-02-14 2015-08-19 现代自动车株式会社 Apparatus and method for preventing vehicle collision
US9440649B2 (en) 2014-10-29 2016-09-13 Robert Bosch Gmbh Impact mitigation by intelligent vehicle positioning
US9701306B2 (en) * 2014-12-23 2017-07-11 Toyota Motor Engineering & Manufacturing North America, Inc. Risk mitigation for autonomous vehicles relative to turning objects
US9481366B1 (en) 2015-08-19 2016-11-01 International Business Machines Corporation Automated control of interactions between self-driving vehicles and animals
US20180281786A1 (en) 2015-09-29 2018-10-04 Sony Corporation Damage reduction device, damage reduction method, and program
US10464559B2 (en) * 2015-09-29 2019-11-05 Sony Corporation Damage reduction device, damage reduction method, and program
US20200039509A1 (en) 2015-09-29 2020-02-06 Sony Corporation Damage reduction device, damage reduction method, and program
US11254307B2 (en) * 2015-09-29 2022-02-22 Sony Corporation Damage reduction device, damage reduction method, and program

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
Chinese Office Action dated Apr. 28, 2020 in connection with Chinese Application No. 201680055024.4, and English translation thereof.
Chinese Office Action dated Dec. 22, 2020 in connection with Chinese Application No. 201680055024.4, and English translation thereof.
Communication pursuant to Article 94(3) EPC dated Oct. 5, 2022 in connection with European Application No. 16850557.6.
Extended European Search Report dated May 16, 2019 in connection with European Application No. 16850557.6.
International Preliminary Report on Patentability and English translation thereof dated Apr. 12, 2018 in connection with International Application No. PCT/JP2016/003769.
International Search Report and English translation thereof dated Oct. 25, 2016 in connection with International Application No. PCT/JP2016/003769.
Japanese Office Action dated Apr. 27, 2020 in connection with Japanese Application No. 2017-542689, and English translation thereof.
Written Opinion and English translation thereof dated Oct. 25, 2016 in connection with International Application No. PCT/JP2016/003769.

Also Published As

Publication number Publication date
US20180281786A1 (en) 2018-10-04
CN108025763B (en) 2021-09-07
WO2017056374A1 (en) 2017-04-06
US20220126821A1 (en) 2022-04-28
US10464559B2 (en) 2019-11-05
EP3357790A4 (en) 2019-06-19
CN108025763A (en) 2018-05-11
US20200039509A1 (en) 2020-02-06
JP6760297B2 (en) 2020-09-23
US11254307B2 (en) 2022-02-22
JPWO2017056374A1 (en) 2018-07-12
EP3357790A1 (en) 2018-08-08

Similar Documents

Publication Publication Date Title
US11772644B2 (en) Damage reduction device, damage reduction method, and program
US20180086338A1 (en) Travel assist device and travel assist method
US9758176B2 (en) Vehicle control apparatus
JP4937656B2 (en) Vehicle collision control device
Isermann et al. Collision-avoidance systems PRORETA: Situation analysis and intervention control
US9008957B2 (en) Method and device for avoiding and/or reducing the consequences of collisions
WO2021124794A1 (en) Vehicle control device, and vehicle control system
JP7429589B2 (en) Vehicle movement support system
CN108357492A (en) For mitigating the device and method that forward direction collides between road vehicle
JP6526832B2 (en) Object tracking before and during a collision
US9290177B2 (en) Vehicle control apparatus
WO2017056381A1 (en) Damage reduction device, damage reduction method, and program
US20170259815A1 (en) Rear accident protection
CN110225853A (en) It is avoided collision with cross traffic
CN114523928A (en) Method and apparatus for controlling safety device, safety system for vehicle, and storage medium
WO2017056375A1 (en) Damage reduction device, damage reduction method, and program
CN114454839A (en) Method and apparatus for controlling vehicle safety device, vehicle safety system, and storage medium
JP2005254923A (en) Vehicular shock-eliminating device
JP2012045984A (en) Collision reducing device
CN103144597A (en) Method for operating motor vehicle and motor vehicle
JP2016016743A (en) Vehicle control apparatus
JP2019209910A (en) Vehicle control system
WO2017056454A1 (en) Damage reduction device, damage reduction method, and program
JP5065151B2 (en) Vehicle motion control apparatus and vehicle motion control method

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: SONY GROUP CORPORATION, JAPAN

Free format text: CHANGE OF NAME;ASSIGNOR:SONY CORPORATION;REEL/FRAME:059634/0542

Effective date: 20210401

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OYAIZU, HIDEKI;KONDO, YUHI;HIRASAWA, YASUTAKA;AND OTHERS;SIGNING DATES FROM 20180220 TO 20180301;REEL/FRAME:059540/0834

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE