WO2018116461A1 - Vehicle control system, vehicle control method, and vehicle control program


Info

Publication number
WO2018116461A1
WO2018116461A1 (PCT Application No. PCT/JP2016/088467)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
occupant
control unit
seat
seat arrangement
Prior art date
Application number
PCT/JP2016/088467
Other languages
English (en)
Japanese (ja)
Inventor
嘉崇 味村
正彦 朝倉
博典 高埜
淳一 丸山
熊切 直隆
Original Assignee
本田技研工業株式会社 (Honda Motor Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 本田技研工業株式会社 (Honda Motor Co., Ltd.)
Priority to PCT/JP2016/088467 (WO2018116461A1)
Priority to US16/468,306 (US20200086764A1)
Priority to CN201680091671.0A (CN110087939A)
Priority to JP2018557493A (JPWO2018116461A1)
Publication of WO2018116461A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60N: SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
    • B60N 2/00: Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
    • B60N 2/002: Seats provided with an occupancy detection means mounted therein or thereon
    • B60N 2/005: Arrangement or mounting of seats in vehicles, e.g. dismountable auxiliary seats
    • B60N 2/01: Arrangement of seats relative to one another
    • B60N 2/02: Seats specially adapted for vehicles, the seat or part thereof being movable, e.g. adjustable
    • B60N 2/0224: Non-manual adjustments, e.g. with electrical operation
    • B60N 2/0244: Non-manual adjustments, e.g. with electrical operation, with logic circuits
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 10/00: Conjoint control of vehicle sub-units of different type or different function
    • B60W 10/04: Conjoint control including control of propulsion units
    • B60W 10/20: Conjoint control including control of steering systems
    • B60W 30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit
    • B60W 30/18: Propelling the vehicle
    • B60W 2710/00: Output or target parameters relating to a particular sub-unit
    • B60W 2710/20: Steering systems
    • B60W 2720/00: Output or target parameters relating to overall vehicle dynamics
    • B60W 2720/10: Longitudinal speed
    • B60W 2720/106: Longitudinal acceleration
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/0088: Control characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours

Definitions

  • the present invention relates to a vehicle control system, a vehicle control method, and a vehicle control program.
  • The device according to the above-mentioned prior art performs control only so that long packages can be loaded into the vehicle; other matters are not considered.
  • The present invention has been made in consideration of such circumstances, and one of its objects is to provide a vehicle control system, a vehicle control method, and a vehicle control program capable of effectively utilizing the space in the vehicle according to the configuration or state of the occupants.
  • The invention according to claim 1 comprises: a seat provided in a vehicle; an occupant detection unit that detects a configuration or a state of an occupant in a vehicle compartment of the vehicle; and a seat arrangement control unit that performs seat arrangement control to change at least one of the posture, position, and orientation of the seat according to the configuration or state of the occupant detected by the occupant detection unit.
  • The invention according to claim 2 is the invention according to claim 1, further comprising an automatic driving control unit that executes automatic driving to automatically control at least one of acceleration/deceleration and steering of the vehicle, wherein the seat arrangement control unit performs the seat arrangement control while automatic driving is being executed by the automatic driving control unit.
  • The invention according to claim 3 is the invention according to claim 1 or 2, wherein, when the occupant detection unit detects a state in which a plurality of occupants are in conversation, the seat arrangement control unit performs the seat arrangement control so that the bodies of at least two of the plurality of occupants face each other.
  • According to a fourth aspect, the occupant detection unit can detect the degree to which direct sunlight strikes the occupant, and the seat arrangement control unit performs the seat arrangement control so as to avoid a state in which direct sunlight strikes the occupant when the occupant detection unit detects that direct sunlight of a predetermined degree or more is striking the occupant.
  • The invention according to a fifth aspect is the invention according to any one of the first to fourth aspects, wherein, when the occupant detection unit determines that a plurality of occupants are in a state requiring private space, the seat arrangement control unit performs the seat arrangement control so that the bodies of at least two of the plurality of occupants do not face each other.
  • A sixth aspect of the present invention is the fifth aspect, wherein, when a plurality of occupants ride together, it is determined that at least one of the plurality of occupants is in a state requiring private space.
  • The invention according to a seventh aspect is the invention according to any one of the first to sixth aspects, further comprising an imaging unit that images the landscape outside the vehicle, wherein, when the landscape outside the vehicle imaged by the imaging unit includes a landmark, the seat arrangement control unit performs the seat arrangement control so that the occupant's body faces the landmark.
  • The invention according to claim 8 is a vehicle control method in which a computer mounted on a vehicle equipped with a seat detects a configuration or state of an occupant in a vehicle compartment of the vehicle and performs seat arrangement control to change at least one of the posture, position, and orientation of the seat according to the configuration or state of the occupant.
  • The invention according to claim 9 is a vehicle control program that causes a computer mounted on a vehicle equipped with a seat to detect the configuration or state of an occupant in the vehicle compartment of the vehicle and to perform seat arrangement control to change at least one of the posture, position, and orientation of the seat according to the configuration or state of the occupant.
  • the space in the vehicle can be effectively utilized.
  • the space in the vehicle during automatic driving can be effectively utilized.
  • a plurality of occupants can easily talk.
  • private spaces for a plurality of occupants can be secured.
  • the occupant can easily view the landmark.
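Read as a control flow, the claims above describe a detect-then-actuate loop. The following Python sketch is purely illustrative: every class, method, seat identifier, and angle value here is a hypothetical stand-in introduced for this example, not terminology or a specification from the patent itself.

```python
from dataclasses import dataclass
from enum import Enum, auto

class OccupantState(Enum):
    NEUTRAL = auto()
    CONVERSING = auto()          # claim 3
    IN_DIRECT_SUNLIGHT = auto()  # claim 4
    WANTS_PRIVACY = auto()       # claim 5

@dataclass
class SeatPose:
    recline_deg: float  # posture
    slide_mm: float     # position along the seat rail
    yaw_deg: float      # orientation within the cabin

class SeatArrangementController:
    """Changes at least one of posture, position, and orientation (claim 1)."""

    def __init__(self, seats: dict[str, SeatPose]):
        self.seats = seats

    def apply(self, state: OccupantState) -> None:
        if state is OccupantState.CONVERSING:
            # Claim 3: make at least two occupants' bodies face each other.
            self.seats["front_left"].yaw_deg = 180.0
            self.seats["front_right"].yaw_deg = 180.0
        elif state is OccupantState.IN_DIRECT_SUNLIGHT:
            # Claim 4: turn away from the sun (see the sunlight sketch below).
            self.seats["rear_left"].yaw_deg = 45.0
        elif state is OccupantState.WANTS_PRIVACY:
            # Claim 5: make at least two occupants' bodies not face each other.
            self.seats["rear_left"].yaw_deg = -30.0
            self.seats["rear_right"].yaw_deg = 30.0
```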
  • FIG. 1 is a block diagram of the vehicle system 1 to which the vehicle control system of the first embodiment is applied. FIG. 2 is a detailed diagram of the sharing control unit 164 and the landmark visual recognition control unit 168 shown in FIG. 1. FIG. 3 is a diagram showing how the relative position and posture of the host vehicle M with respect to the travel lane are recognized.
  • FIG. 9 is a diagram for describing another example of the configuration or state of the occupant detected by the occupant detection unit 160, and the seat arrangement control performed in step S206 of FIG. 8.
  • FIG. 11 is a diagram for describing another example of the configuration or state of the occupant detected by the occupant detection unit 160 and the seat arrangement control performed in step S306 of FIG. 10. FIG. 12 is a diagram showing an example of content output toward the outside of the vehicle. FIG. 13 is a diagram showing an example of the movement of the character strings shown in the images 300F and 300L. FIG. 14 is a diagram for explaining how the boarding candidate determination unit 166 determines boarding candidates.
  • FIG. 17 is a diagram for describing another example of the configuration or state of the occupant detected by the occupant detection unit 160 and the seat arrangement control performed in step S506 of FIG. 16.
  • FIG. 7 is a diagram showing an example of the positional relationship between the host vehicle M and the landmark 600 in the case where a landscape is captured by the camera 10 and the landscape outside the vehicle includes the landmark.
  • FIG. 1 is a block diagram of a vehicle system 1 to which the vehicle control system of the first embodiment is applied.
  • FIG. 2 is a detailed view of the sharing control unit 164 and the landmark visual recognition control unit 168 shown in FIG. 1.
  • the vehicle on which the vehicle system 1 is mounted is, for example, a vehicle such as a two-wheeled vehicle, a three-wheeled vehicle, or a four-wheeled vehicle, and a drive source thereof is an internal combustion engine such as a diesel engine or a gasoline engine, an electric motor, or a combination thereof.
  • the electric motor operates using the power generated by a generator connected to the internal combustion engine or the discharge power of a secondary battery or a fuel cell.
  • The vehicle system 1 includes, for example, a camera 10, a radar device 12, a finder 14, an object recognition device 16, a communication device 20, an HMI (Human Machine Interface) 30, a navigation device 50, an MPU (Micro-Processing Unit) 60, a vehicle sensor 70, a drive operator 80, an in-vehicle camera 90, an automatic driving control unit 100, a traveling driving force output device 200, a brake device 210, and a steering device 220.
  • These devices and devices are mutually connected by a multiplex communication line such as a CAN (Controller Area Network) communication line, a serial communication line, a wireless communication network or the like.
  • the vehicle system 1 to which the vehicle control system of the first embodiment is applied includes, for example, the seats 82-1 to 82-5 in addition to the above configuration.
  • The seats 82-1 to 82-5 include a driver's seat 82-1 in which the driver sits and occupant seats 82-2 to 82-5 in which occupants of the host vehicle M other than the driver sit. The seats 82-1 to 82-5 include actuators that change at least one of the posture, position, and orientation of each seat.
  • the camera 10 is, for example, a digital camera using a solid-state imaging device such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS).
  • One or more cameras 10 are attached to any part of a vehicle (hereinafter referred to as a host vehicle M) on which the vehicle system 1 is mounted.
  • When imaging the area ahead, the camera 10 is attached to the top of the front windshield, the back surface of the rearview mirror, or the like.
  • the camera 10 periodically and repeatedly captures the periphery of the vehicle M.
  • the camera 10 may be a stereo camera.
  • the radar device 12 emits radio waves such as millimeter waves around the host vehicle M and detects radio waves (reflected waves) reflected by the object to detect at least the position (distance and direction) of the object.
  • One or more of the radar devices 12 are attached to any part of the host vehicle M.
  • the radar device 12 may detect the position and the velocity of the object by a frequency modulated continuous wave (FM-CW) method.
  • The finder 14 is a LIDAR (Light Detection and Ranging, or Laser Imaging Detection and Ranging) sensor that measures scattered light relative to the emitted light and detects the distance to an object.
  • One or more finders 14 are attached to any part of the host vehicle M.
  • the object recognition device 16 performs sensor fusion processing on the detection result of a part or all of the camera 10, the radar device 12, and the finder 14 to recognize the position, type, speed, etc. of the object.
  • the object recognition device 16 outputs the recognition result to the automatic driving control unit 100.
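The sensor fusion step can be pictured as associating detections from different sensors that refer to the same object. The toy example below merges radar tracks (position and speed) with camera tracks (position and type) by nearest-neighbor gating; it is a generic illustration under assumed data shapes, not the actual fusion algorithm of the object recognition device 16.

```python
import math

def fuse(radar_tracks, camera_tracks, gate_m=2.0):
    """radar_tracks: [(x, y, speed)], camera_tracks: [(x, y, type)].
    Pair each radar track with the nearest camera track within gate_m."""
    fused = []
    for rx, ry, speed in radar_tracks:
        best_d, best_type = None, "unknown"
        for cx, cy, obj_type in camera_tracks:
            d = math.hypot(rx - cx, ry - cy)
            if d < gate_m and (best_d is None or d < best_d):
                best_d, best_type = d, obj_type
        fused.append({"x": rx, "y": ry, "speed": speed, "type": best_type})
    return fused

# e.g. fuse([(10.0, 1.2, 8.3)], [(10.4, 1.0, "car")])
# -> [{'x': 10.0, 'y': 1.2, 'speed': 8.3, 'type': 'car'}]
```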
  • The communication device 20 communicates with other vehicles around the host vehicle M using, for example, a cellular network, a Wi-Fi network, Bluetooth (registered trademark), or DSRC (Dedicated Short Range Communication), and communicates with various server devices via a wireless base station.
  • the HMI 30 presents various information to the occupants in the vehicle, and accepts input operations by the occupants.
  • the HMI 30 includes, for example, an in-vehicle device 31.
  • the in-vehicle device 31 is, for example, various display devices, a speaker, a buzzer, a touch panel, a switch, a key, and the like.
  • the HMI 30 also presents information to the outside of the vehicle.
  • the HMI 30 includes, for example, an external display 32 and an external speaker 33.
  • the external speaker 33 outputs sound to a predetermined range outside the vehicle.
  • the vehicle exterior speaker 33 may output sound having directivity in a predetermined direction.
  • The navigation device 50 includes, for example, a GNSS (Global Navigation Satellite System) receiver 51, a navigation HMI 52, and a route determination unit 53, and holds the first map information 54 in a storage device such as an HDD (Hard Disk Drive) or flash memory.
  • The GNSS receiver 51 specifies the position of the host vehicle M based on signals received from GNSS satellites. The position of the host vehicle M may also be identified or supplemented by an INS (Inertial Navigation System) using the output of the vehicle sensor 70.
  • the navigation HMI 52 includes a display device, a speaker, a touch panel, keys and the like. The navigation HMI 52 may be partially or entirely shared with the above-described HMI 30.
  • The route determination unit 53 determines, with reference to the first map information 54, the route from the position of the host vehicle M specified by the GNSS receiver 51 (or from any input position) to the destination input by the occupant using the navigation HMI 52.
  • The first map information 54 is, for example, information in which road shapes are expressed by links indicating roads and nodes connected by the links.
  • the first map information 54 may include road curvature, POI (Point Of Interest) information, and the like.
  • The route determined by the route determination unit 53 is output to the MPU 60.
  • the navigation device 50 may perform route guidance using the navigation HMI 52 based on the route determined by the route determination unit 53.
  • the navigation device 50 may be realized, for example, by the function of a terminal device such as a smartphone or a tablet terminal owned by the user.
  • the navigation device 50 may transmit the current position and the destination to the navigation server via the communication device 20, and acquire the route returned from the navigation server.
  • the MPU 60 functions as, for example, a recommended lane determination unit 61, and holds the second map information 62 in a storage device such as an HDD or a flash memory.
  • the recommended lane determination unit 61 divides the route provided from the navigation device 50 into a plurality of blocks (for example, in units of 100 [m] in the traveling direction of the vehicle), and refers to the second map information 62 for each block. Determine the recommended lanes.
  • For example, the recommended lane determination unit 61 determines in which lane from the left the vehicle should travel.
  • the recommended lane determination unit 61 determines the recommended lane so that the host vehicle M can travel on a reasonable route for traveling to a branch destination when a branch point, a junction point, or the like exists in the route.
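As a concrete reading of the block-wise determination above, the sketch below splits a route into 100 m blocks and picks a lane per block. The branch-approach heuristic and the 300 m look-ahead figure are invented for illustration only; the patent does not specify how the lane is chosen per block.

```python
def split_into_blocks(route_length_m: float, block_m: float = 100.0):
    """Divide the route into blocks of about 100 m in the traveling direction."""
    blocks, start = [], 0.0
    while start < route_length_m:
        end = min(start + block_m, route_length_m)
        blocks.append((start, end))
        start = end
    return blocks

def recommend_lane(block_start_m: float, branch_at_m: float,
                   branch_lane: int, default_lane: int) -> int:
    """Lane index counted from the left; switch toward the branch-side lane
    once the block is within 300 m of the branch point (illustrative)."""
    if 0.0 <= branch_at_m - block_start_m <= 300.0:
        return branch_lane
    return default_lane
```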
  • the second map information 62 is map information that is more accurate than the first map information 54.
  • the second map information 62 includes, for example, information on the center of the lane or information on the boundary of the lane. Further, the second map information 62 may include road information, traffic regulation information, address information (address / zip code), facility information, telephone number information, and the like.
  • The road information includes information indicating the type of road, such as expressways, toll roads, national roads, and prefectural roads, the number of lanes of the road, the width of each lane, the slope of the road, the position of the road (three-dimensional coordinates including longitude, latitude, and height), the curvature of lane curves, the locations of lane merging and branching points, and information such as signs provided on roads.
  • the second map information 62 may be updated as needed by accessing another device using the communication device 20.
  • Vehicle sensor 70 includes a vehicle speed sensor that detects the speed of vehicle M, an acceleration sensor that detects acceleration, a yaw rate sensor that detects an angular velocity around the vertical axis, an orientation sensor that detects the direction of vehicle M, and the like.
  • The drive operator 80 includes, for example, an accelerator pedal, a brake pedal, a shift lever, a steering wheel, and other operators. A sensor that detects the amount of operation or the presence or absence of operation is attached to the drive operator 80, and the detection result is output to the automatic driving control unit 100, to the traveling driving force output device 200, the brake device 210, and the steering device 220, or to both.
  • The in-vehicle camera 90 captures images of the occupants of the host vehicle M. The in-vehicle camera 90 is also provided with means, such as a microphone, for acquiring sound inside the vehicle. The image captured by the in-vehicle camera 90 and the in-vehicle sound it acquires are output to the automatic driving control unit 100.
  • The automatic driving control unit 100 includes, for example, a first control unit 120, a second control unit 140, an occupant detection unit 160, a seat arrangement control unit 162, a sharing control unit 164, and a landmark visual recognition control unit 168. Each of these units is realized by a processor such as a CPU (Central Processing Unit) executing software.
  • the first control unit 120 includes, for example, an external world recognition unit 121, a host vehicle position recognition unit 122, and an action plan generation unit 123.
  • the external world recognition unit 121 recognizes the position of the surrounding vehicle and the state of the speed, acceleration, and the like based on the information input from the camera 10, the radar device 12, and the finder 14 via the object recognition device 16.
  • the position of the nearby vehicle may be represented by a representative point such as the center of gravity or a corner of the nearby vehicle, or may be represented by an area represented by the contour of the nearby vehicle.
  • the "state" of the surrounding vehicle may include the acceleration or jerk of the surrounding vehicle, or the "action state” (e.g., whether or not a lane change is being made or is going to be made).
  • the external world recognition unit 121 may also recognize the positions of guardrails, utility poles, parked vehicles, pedestrians, and other objects in addition to surrounding vehicles.
  • the host vehicle position recognition unit 122 recognizes, for example, the lane in which the host vehicle M is traveling (traveling lane) and the relative position and posture of the host vehicle M with respect to the traveling lane.
  • Specifically, the host vehicle position recognition unit 122 recognizes the travel lane by comparing a pattern of road division lines obtained from the second map information 62 (for example, an arrangement of solid and broken lines) with a pattern of road division lines around the host vehicle M recognized from an image captured by the camera 10. In this recognition, the position of the host vehicle M acquired from the navigation device 50 or the processing result of the INS may be taken into account.
  • FIG. 3 is a diagram showing how the host vehicle position recognition unit 122 recognizes the relative position and posture of the host vehicle M with respect to the traveling lane L1.
  • The host vehicle position recognition unit 122 recognizes, for example, the deviation OS of the reference point (for example, the center of gravity) of the host vehicle M from the travel lane center CL, and the angle θ between the traveling direction of the host vehicle M and a line along the travel lane center CL, as the relative position and posture of the host vehicle M with respect to the travel lane L1.
  • Instead, the host vehicle position recognition unit 122 may recognize the position of the reference point of the host vehicle M relative to either side edge of the travel lane L1 as the relative position of the host vehicle M with respect to the travel lane.
  • the relative position of the vehicle M recognized by the vehicle position recognition unit 122 is provided to the recommended lane determination unit 61 and the action plan generation unit 123.
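The deviation OS and angle θ of FIG. 3 can be computed with elementary geometry. The sketch below assumes the lane center CL is locally approximated by a segment from point a to point b; the function name and frame conventions are illustrative.

```python
import math

def relative_pose(vehicle_xy, heading_rad, lane_a, lane_b):
    """Return (OS, theta): signed lateral offset of the vehicle reference
    point from the lane-center segment a->b, and the angle between the
    vehicle heading and the lane direction."""
    ax, ay = lane_a
    bx, by = lane_b
    vx, vy = vehicle_xy
    lane_dir = math.atan2(by - ay, bx - ax)
    dx, dy = vx - ax, vy - ay
    # Cross product of the lane unit vector with a->vehicle gives the
    # signed lateral offset (positive = left of the lane direction).
    os_m = math.cos(lane_dir) * dy - math.sin(lane_dir) * dx
    # Wrap the heading difference into (-pi, pi].
    theta = (heading_rad - lane_dir + math.pi) % (2 * math.pi) - math.pi
    return os_m, theta
```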
  • the action plan generation unit 123 determines events to be sequentially executed in automatic driving so as to travel on the recommended lane determined by the recommended lane determination unit 61 and to correspond to the surrounding situation of the host vehicle M.
  • Events include, for example, a constant-speed travel event in which the vehicle travels in the same lane at a constant speed, a follow-up travel event in which the vehicle follows a preceding vehicle, a lane change event, a merging event, a branch event, an emergency stop event, and a handover event for switching from automatic driving to manual driving.
  • During the execution of these events, an avoidance action may be planned based on the surrounding conditions of the host vehicle M (the presence of nearby vehicles and pedestrians, lane narrowing due to road construction, and the like).
  • the action plan generation unit 123 generates a target track on which the vehicle M travels in the future.
  • the target trajectory includes, for example, a velocity component.
  • A target trajectory is generated as a set of target points (track points) to be reached at a plurality of future reference times set for each predetermined sampling interval (for example, a fraction of a second). Accordingly, a wide spacing between track points indicates that the section between those points is traveled at high speed.
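The relation between point spacing and speed can be seen in a small constant-acceleration sketch: each track point stores the reference time, the distance along the trajectory, and the speed, so faster sections produce wider spacing. The sampling interval and horizon below are arbitrary example values, not values from the patent.

```python
def track_points(v0_mps: float, accel_mps2: float,
                 dt_s: float = 0.2, horizon_s: float = 3.0):
    """Track points (reference time, arc length, speed) under constant
    acceleration; wider gaps in arc length mean a faster section."""
    pts, s, v = [], 0.0, v0_mps
    for i in range(1, round(horizon_s / dt_s) + 1):
        v = max(0.0, v + accel_mps2 * dt_s)
        s += v * dt_s
        pts.append((i * dt_s, s, v))
    return pts
```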
  • FIG. 4 is a diagram showing how a target track is generated based on the recommended lane.
  • the recommended lanes are set to be convenient to travel along the route to the destination.
  • When the host vehicle M approaches a predetermined distance before a recommended-lane switching point (the distance may be determined according to the type of event), the action plan generation unit 123 activates a lane change event, a branch event, a merging event, or the like. When it is necessary to avoid an obstacle during the execution of an event, an avoidance trajectory is generated as illustrated.
  • the action plan generation unit 123 generates, for example, a plurality of target trajectory candidates, and selects an optimal target trajectory at that time based on the viewpoint of safety and efficiency.
  • the second control unit 140 includes a traveling control unit 141.
  • The traveling control unit 141 controls the traveling driving force output device 200, the steering device 220, and the brake device 210 so that the host vehicle M passes along the target trajectory generated by the action plan generation unit 123 on schedule.
  • the traveling driving force output device 200 outputs traveling driving force (torque) for the vehicle to travel to the driving wheels.
  • the traveling driving force output device 200 includes, for example, a combination of an internal combustion engine, an electric motor, a transmission, and the like, and an ECU that controls these.
  • the ECU controls the above configuration in accordance with the information input from the traveling control unit 141 or the information input from the drive operator 80.
  • the brake device 210 includes, for example, a brake caliper, a cylinder that transmits hydraulic pressure to the brake caliper, an electric motor that generates hydraulic pressure in the cylinder, and a brake ECU.
  • the brake ECU controls the electric motor in accordance with the information input from the travel control unit 141 or the information input from the drive operator 80 so that the brake torque corresponding to the braking operation is output to each wheel.
  • the brake device 210 may include, as a backup, a mechanism for transmitting the hydraulic pressure generated by the operation of the brake pedal included in the drive operator 80 to the cylinder via the master cylinder.
  • The brake device 210 is not limited to the above-described configuration, and may be an electronically controlled hydraulic brake device that transmits the hydraulic pressure of the master cylinder to the cylinder by controlling an actuator according to the information input from the traveling control unit 141.
  • the steering device 220 includes, for example, a steering ECU and an electric motor.
  • the electric motor for example, applies a force to the rack and pinion mechanism to change the direction of the steered wheels.
  • the steering ECU drives the electric motor to change the direction of the steered wheels in accordance with the information input from the traveling control unit 141 or the information input from the drive operator 80.
  • the occupant detection unit 160 detects the configuration or state of the occupant based on the image of the occupant captured by the in-vehicle camera 90 and the in-vehicle audio acquired by the in-vehicle camera 90.
  • The seat arrangement control unit 162 performs seat arrangement control to change at least one of the posture, position, and orientation of some or all of the seats 82-1 to 82-5 according to the configuration or state of the occupants detected by the occupant detection unit 160.
  • the sharing control unit 164 executes sharing control, which will be described in detail later.
  • the landmark visibility control unit 168 executes landmark visibility control which will be described in detail later.
  • The automatic driving control unit 100, including the first control unit 120 and the second control unit 140 described above, functions as an automatic driving control unit that executes automatic driving to automatically control at least one of acceleration/deceleration and steering of the host vehicle M.
  • the automatic driving performed by the automatic driving control unit 100 includes, for example, a first mode, a second mode, and a third mode.
  • the first mode of automatic driving is a mode in which the degree of automatic driving is the highest compared to other modes.
  • In the first mode of automatic driving, all vehicle control, such as complex merging control, is performed automatically, so the driver has no required driving duties. For example, the driver does not have to monitor the surroundings or the state of the host vehicle M (there is no surroundings-monitoring duty). The driver also does not have to operate the accelerator pedal, the brake pedal, the steering wheel, and so on (there is no required driving-operation duty), and may direct attention to matters other than driving the vehicle.
  • the seat arrangement control unit 162 performs the seat arrangement control on the driver's seat 82-1 while the first mode automatic operation is being performed.
  • The second mode of automatic driving is the mode with the next highest degree of automation after the first mode. In the second mode, all vehicle control is in principle performed automatically, but the driving operation of the host vehicle M may be delegated to the driver depending on the scene. The driver therefore needs to monitor the surroundings and the state of the host vehicle M and pay attention to driving (the driving duties increase compared to the first mode). Since the driver may be required to perform driving operations during execution of the second mode, the seat arrangement control unit 162 does not perform the seat arrangement control on the driver's seat 82-1.
  • The third mode of automatic driving is the mode with the next highest degree of automation after the second mode.
  • the driver needs to perform a confirmation operation according to the scene on the HMI 30 (the duty on the vehicle driving is increased compared to the second mode).
  • In the third mode, for example, when the driver is notified of the timing of a lane change and instructs the HMI 30 to change lanes, an automatic lane change is performed. The driver therefore needs to monitor the surroundings and the state of the host vehicle M (the driving duties increase compared to the second mode). Since the driver is required to perform confirmation operations and the like during execution of the third mode, the seat arrangement control unit 162 does not perform the seat arrangement control on the driver's seat 82-1.
  • FIG. 5 is a flow chart showing an example of a flow of processing for selecting a mode of automatic driving which is executed by the automatic driving control unit 100.
  • the processing of this flowchart is repeatedly performed, for example, in a predetermined cycle.
  • the automatic driving control unit 100 determines whether or not the automatic driving in the first mode can be performed (step S10). If the automatic driving in the first mode is executable, the automatic driving control unit 100 executes the automatic driving in the first mode (step S11). On the other hand, when the automatic driving in the first mode can not be performed, the automatic driving control unit 100 determines whether the automatic driving in the second mode can be performed (step S12). If the automatic driving in the second mode is executable, the automatic driving control unit 100 executes the automatic driving in the second mode (step S13).
  • When the automatic driving in the second mode cannot be performed, the automatic driving control unit 100 determines whether the automatic driving in the third mode can be performed (step S14). If the automatic driving in the third mode is executable, the automatic driving control unit 100 executes it (step S15). Otherwise, the processing of one routine of this flowchart ends.
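The flowchart of FIG. 5 is a simple priority cascade, sketched below; the boolean feasibility checks stand in for the (unspecified) conditions the automatic driving control unit 100 evaluates in steps S10, S12, and S14.

```python
def select_mode(first_ok: bool, second_ok: bool, third_ok: bool):
    """Try the most automated mode first (FIG. 5, steps S10-S15);
    return None when no automatic driving mode is executable."""
    for mode, ok in ((1, first_ok), (2, second_ok), (3, third_ok)):
        if ok:
            return mode
    return None
```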
  • FIG. 6 is a flow chart showing an example of the flow of processing executed by the automatic driving control unit 100 to effectively utilize the space in the vehicle during automatic driving.
  • FIG. 7 is a view for explaining an example of the configuration or state of the occupant detected by the occupant detection unit 160, and the seat arrangement control performed in step S106 of FIG.
  • the process of the flowchart shown in FIG. 6 is repeatedly performed, for example, in a predetermined cycle.
  • the automatic driving control unit 100 determines whether automatic driving is being performed (step S100). Specifically, the automatic driving control unit 100 determines whether the automatic driving is being performed in any one of the first mode, the second mode, and the third mode. When the automatic operation is not performed in any of the first mode, the second mode and the third mode, the processing of one routine of this flowchart ends.
  • When automatic driving is being performed in any of the first, second, and third modes, the occupant detection unit 160 detects the configuration or state of the occupants based on the image captured by the in-vehicle camera 90 and the in-vehicle sound acquired by the in-vehicle camera 90 (step S102). The occupant detection unit 160 then determines whether the occupants sitting in the seats 82-1 to 82-5 are in conversation (step S104). For example, in the situation shown in FIG. 7(A), the occupant detection unit 160 determines that the occupants sitting in the seats 82-1 to 82-5 are in conversation. If they are not, the processing of one routine of this flowchart ends.
  • If they are, the seat arrangement control unit 162 performs seat arrangement control to change at least one of the posture, position, and orientation of the seats 82-1 to 82-5 according to the configuration or state of the occupants detected by the occupant detection unit 160 (step S106). Specifically, as shown in FIG. 7(B), the seat arrangement control unit 162 changes at least one of the posture, position, and orientation of the seats 82-1 to 82-5 so that the occupants' bodies face each other.
  • In the example shown in FIGS. 7(A) and 7(B), the seat arrangement control unit 162 turns the seats 82-1 to 82-5; instead, it may, for example, move the seats 82-1 to 82-5 so that the occupants seated in them face each other. Also, in this example the seat arrangement control unit 162 turns the seats 82-1 and 82-2 as well as the seats 82-3 and 82-5. Alternatively, it may turn the seats 82-1 and 82-2 without turning the seats 82-3 and 82-5, since even then the bodies of the occupants sitting in the seats 82-1 to 82-5 face each other.
  • When the automatic driving in the first mode is being performed, the seat arrangement control unit 162 includes the driver's seat 82-1 in the seat arrangement control of step S106 in FIG. 6. When the automatic driving in the second or third mode is being performed, the seat arrangement control of step S106 is not executed for the driver's seat 82-1. A sketch of this combined logic follows.
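The sketch below combines the conversation case with the mode-dependent handling of the driver's seat; the seat identifiers and the 180-degree rotation are illustrative assumptions, not values from the patent.

```python
def face_to_face_targets(occupied_seats: list[str], mode: int) -> dict[str, float]:
    """Target yaw angles so occupants face each other: turn the front seats
    toward the rear. The driver's seat '82-1' is moved only in mode 1."""
    targets = {}
    for seat in occupied_seats:
        if seat in ("82-1", "82-2"):          # front seats
            if seat == "82-1" and mode != 1:  # driver's seat stays put
                continue
            targets[seat] = 180.0             # rotate to face rearward
    return targets
```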
  • FIG. 8 is a flow chart showing another example of the flow of processing executed by the automatic driving control unit 100 in order to make efficient use of the space in the vehicle during automatic driving.
  • FIG. 9 is a view for explaining another example of the configuration or state of the occupant detected by the occupant detection unit 160, and the seat arrangement control performed in step S206 of FIG.
  • The process of the flowchart shown in FIG. 8 is also repeated in a predetermined cycle. In steps S100 and S102 of FIG. 8, the same processes as in steps S100 and S102 of FIG. 6 are performed.
  • In step S204, the occupant detection unit 160 detects the degree to which direct sunlight strikes the occupants and determines whether the occupants sitting in the seats 82-1 to 82-5 are being struck by direct sunlight of a predetermined degree or more. For example, as shown in FIG. 9(A), when an occupant sitting in the seats 82-1 to 82-5 twists the upper body relative to the lower body so as to avoid direct sunlight, or when such an occupant is exposed to direct sunlight, the occupant detection unit 160 determines that the occupant is being struck by direct sunlight of a predetermined degree or more. If not, the processing of one routine of this flowchart ends.
  • If so, the seat arrangement control unit 162 performs seat arrangement control to change at least one of the posture, position, and orientation of the seats 82-1 to 82-5 according to the configuration or state of the occupants detected by the occupant detection unit 160 (step S206). Specifically, the seat arrangement control unit 162 changes at least one of the posture, position, and orientation of the seats 82-1 to 82-5 so that direct sunlight no longer strikes the occupants seated in them. In the example shown in FIG. 9(B), the seat arrangement control unit 162 changes at least one of the posture, position, and orientation of the seats 82-1 to 82-5 so that the occupants can avoid direct sunlight without twisting the upper body relative to the lower body.
  • In the example shown in FIGS. 9(A) and 9(B), the seat arrangement control unit 162 turns the seats 82-1 to 82-5; instead, it may, for example, move the seats 82-1 to 82-5 to keep direct sunlight from striking the seated occupants.
  • When the automatic driving in the first mode is being performed, the seat arrangement control unit 162 includes the driver's seat 82-1 in the seat arrangement control of step S206 in FIG. 8. When the automatic driving in the second or third mode is being performed, the seat arrangement control of step S206 is not executed for the driver's seat 82-1. One possible actuation for this case is sketched below.
  • FIG. 10 is a flow chart showing another example of the flow of processing executed by the automatic driving control unit 100 in order to make effective use of the space in the vehicle during automatic driving.
  • FIG. 11 is a view for explaining another example of the configuration or state of the occupant detected by the occupant detection unit 160, and the seat arrangement control performed in step S306 in FIG.
  • In steps S100 and S102 of FIG. 10, processing similar to that of steps S100 and S102 of FIG. 6 is performed.
  • In step S304, the occupant detection unit 160 determines whether the occupants sitting in the seats 82-1 to 82-5 require private space. For example, as shown in FIG. 11(A), when an occupant seated in the seats 82-1 to 82-5 twists the upper body relative to the lower body so that the occupant's body does not face the body of the adjacent occupant, the occupant detection unit 160 determines that the occupants sitting in the seats 82-1 to 82-5 require private space.
  • In that case, the seat arrangement control unit 162 performs seat arrangement control to change at least one of the posture, position, and orientation of the seats 82-1 to 82-5 according to the configuration or state of the occupants detected by the occupant detection unit 160 (step S306). Specifically, the seat arrangement control unit 162 changes at least one of the posture, position, and orientation of the seats 82-1 to 82-5 so that the bodies of at least two of the seated occupants do not face each other. In the example shown in FIG. 11(B), the seat arrangement control unit 162 changes at least one of the posture, position, and orientation of the seats 82-1 to 82-5 so that the body of one occupant does not face the body of the adjacent occupant.
  • In the example shown in FIGS. 11(A) and 11(B), the seat arrangement control unit 162 turns the seats 82-1 to 82-5; instead, it may, for example, move the seats 82-1 to 82-5 so that each occupant seated in them does not face the adjacent occupant.
  • When the automatic driving in the first mode is being performed, the seat arrangement control unit 162 includes the driver's seat 82-1 in the seat arrangement control of step S306 in FIG. 10. When the automatic driving in the second or third mode is being performed, the seat arrangement control of step S306 is not executed for the driver's seat 82-1.
  • As described above, the space inside the vehicle can be effectively utilized by providing a seat provided in the vehicle, an occupant detection unit that detects the configuration or state of the occupants seated in the seat, and a seat arrangement control unit that performs seat arrangement control to change at least one of the posture, position, and orientation of the seat according to the configuration or state of the occupants detected by the occupant detection unit.
  • The vehicle control system according to the second embodiment is applied to a vehicle system 1 used for ride sharing.
  • the sharing control unit 164 includes an interface control unit 165, a boarding candidate determination unit 166, and a sharing adjustment unit 167.
  • The action plan generation unit 123 generates a target trajectory in consideration of the processing results of, for example, the occupant detection unit 160 functioning as an in-vehicle condition acquisition unit, the interface control unit 165, and the boarding candidate determination unit 166.
  • the host vehicle M according to the second embodiment outputs information outside the vehicle by interface control described later, based on, for example, the in-vehicle condition and a predetermined condition.
  • The host vehicle M according to the second embodiment performs stop control to let a person board when the person outside the vehicle is determined to be a boarding candidate. The host vehicle M according to the second embodiment also performs stop control when a ride-sharing passenger gets off.
  • the occupant detection unit 160 acquires the situation in the host vehicle M.
  • To that end, the vehicle system 1 includes the external display 32 and the in-vehicle camera 90.
  • the display for outside the vehicle 32 includes a front display 32F, a right side display, a left side display 32L, and a rear display 32B of the host vehicle M.
  • the front display 32F is, for example, a light transmissive liquid crystal panel formed on at least a part of a windshield.
  • the front display 32F secures the driver's front view and displays an image that can be seen by a person in front of the vehicle.
  • each of the right side display, the left side display 32L, and the rear display 32B is a light transmission type liquid crystal panel formed on at least a part of the glass provided in each direction, similarly to the front display 32F.
  • In the present embodiment, the right side display and the left side display 32L are formed in the rear-seat side windows of the host vehicle M; however, the present invention is not limited to this, and they may be formed in the front-seat side windows, or in both the front- and rear-seat side windows.
  • The external display 32 is provided on at least part of the glass of the host vehicle M as described above; instead of (or in addition to) this, it may be provided on an outer body part of the host vehicle M.
  • The occupant detection unit 160 acquires a captured image from the in-vehicle camera 90, analyzes it, and determines whether an occupant is seated in any of the seats 82-1 to 82-5 of the host vehicle M. For example, the occupant detection unit 160 determines whether a face area containing facial feature information (for example, eyes, nose, mouth, and face outline) is present in the captured image. When a face area is determined to be present, the occupant detection unit 160 determines, based on the position (center position) of the face area in the captured image, in which of the seats 82-1 to 82-5 the occupant is seated.
  • When load sensors are provided in the seats 82-1 to 82-5, the occupant detection unit 160 may determine that an occupant is seated in a seat when the load value from that seat's load sensor is equal to or greater than a threshold value.
  • The occupant detection unit 160 may also analyze the occupant's hairstyle, clothing, face shape, color, and the like from the image captured by the in-vehicle camera 90, and estimate the occupant's gender based on the analysis result. For example, when an occupant's hair is long and the lip color is red, the occupant detection unit 160 determines that the occupant is a woman. The occupant detection unit 160 may also use the in-vehicle device 31 to receive input of gender information when an occupant boards. The occupant detection unit 160 may then obtain, for example, the male-to-female ratio of the occupants based on the acquired gender information for each occupant.
  • the occupant detection unit 160 calculates the remaining number of people who can get on the host vehicle M, based on the total number of the seats 82-1 to 82-5 and the number of seats on which the occupants are seated (the number of occupants).
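Putting the face-area check, the optional load sensors, and the capacity count together, a plausible sketch looks like this; the 200 N threshold and the data shapes are assumptions made for illustration.

```python
LOAD_THRESHOLD_N = 200.0  # illustrative seat-occupancy threshold

def occupied_seats(load_by_seat: dict[str, float],
                   seats_with_faces: set[str]) -> set[str]:
    """A seat counts as occupied if its load sensor reads above the
    threshold or a face was located in that seat's image region."""
    return {seat for seat, load in load_by_seat.items()
            if load >= LOAD_THRESHOLD_N or seat in seats_with_faces}

def remaining_capacity(total_seats: int, occupied: set[str]) -> int:
    """Remaining riders = total seats minus occupied seats."""
    return total_seats - len(occupied)
```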
  • the occupant detection unit 160 acquires information on in-vehicle equipment set for the host vehicle M.
  • the information on the in-vehicle equipment is, for example, information on whether or not the charging equipment for charging the terminal device is provided, and whether or not the humidifying equipment for humidifying the interior of the vehicle is provided.
  • the information on the in-vehicle equipment may be held, for example, in a storage device such as an HDD or a flash memory (not shown) in the automatic driving control unit 100.
  • the information on the in-vehicle equipment may be preset, for example, at the time of factory shipment, and may be updated when the equipment is attached to the vehicle M or removed.
  • the interface control unit 165 outputs information to the outside of the vehicle using at least one of the display for outside the vehicle 32 and the speaker 33 for the outside of the vehicle.
  • the information is, for example, content such as an image displayed on the display for outside of the vehicle 32 or a sound output from the speaker 33 for outside of the vehicle.
  • the information presented by the content is, for example, information for recruiting passengers.
  • the information presented by the content is, for example, information related to the number of people who can get in the host vehicle M obtained from the occupant detection unit 160. Further, the information presented by the content may be the in-vehicle equipment acquired by the occupant detection unit 160 or information of the sex ratio of the occupant or the like.
  • the information presented by the content may be information on a travel plan of the host vehicle M.
  • The information related to the travel plan of the host vehicle M includes, for example, at least one of the destination and the via points of the host vehicle M. By outputting a via point, it becomes possible to pick up a person whose destination lies along the way.
  • the interface control unit 165 may appropriately combine each of the information presented by the content described above and output the information to the outside of the vehicle.
  • FIG. 12 is a diagram showing an example of content output toward the outside of the vehicle.
  • For example, the interface control unit 165 outputs the content using the external display 32 so that it can be seen from the position of the person P3.
  • the front display 32F and the left side display 32L of the host vehicle M traveling in the traveling lane L1 display images 300F and 300L regarding the destination and the number of people who can get into the host vehicle M.
  • the interface control unit 165 may display the images 300F and 300L in a flickering manner, or may change the color between daytime and nighttime.
  • the interface control unit 165 outputs a voice having the same content as the information shown in the image 300L, using the vehicle exterior speaker 33.
  • The interface control unit 165 may also use the vehicle exterior speaker 33 to output music or an alert sound that draws the attention of the surroundings.
  • the interface control unit 165 may also display the character strings shown in the images 300F and 300L while sequentially moving them from the beginning of the character.
  • FIG. 13 is a diagram showing an example of movement of the character strings shown in the images 300F and 300L.
  • the interface control unit 165 moves the image 300F displayed on the front display 32F in the arrow D1 direction, and moves the image 300L displayed on the left side display 32L in the arrow D2 direction.
  • the interface control unit 165 repeatedly displays the images 300F and 300L.
  • the interface control unit 165 controls the direction and the display speed for moving the images 300F and 300L based on the walking direction and the walking speed of the person recognized by the external world recognition unit 121.
  • For example, the interface control unit 165 displays the image 300L while moving it in the direction opposite to the walking direction of the person P3. The display of the image 300L preferably moves at the same speed as the walking speed of the person P3. In this way, the interface control unit 165 can make the image 300L easy for the person P3 to see, and the person P3 can recognize that the host vehicle M is aware of him or her.
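Matching the scroll speed of the displayed string to the pedestrian makes the text approximately stationary in the pedestrian's view. A minimal sketch of that offset computation, with an assumed pixels-per-meter panel scale:

```python
def scroll_offset_px(elapsed_s: float, walk_speed_mps: float,
                     px_per_m: float, panel_width_px: int) -> int:
    """Horizontal pixel offset for the string: move opposite to the
    pedestrian's walking direction at their walking speed."""
    return int(-walk_speed_mps * px_per_m * elapsed_s) % panel_width_px
```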
  • The interface control unit 165 may also instruct the action plan generation unit 123 to reduce the traveling speed of the host vehicle M based on the traveling speed of the person P3. For example, by causing the host vehicle M to travel at the same speed as, or a speed close to, the traveling speed of the person P3, the interface control unit 165 can make the images 300F and 300L easy for the person P3 to see.
  • the interface control unit 165 causes the display for outside the vehicle 32 to output an image, for example, for the person recognized first. Further, the interface control unit 165 may cause the display for outside the vehicle 32 to output an image for the person closest to the vehicle M.
  • The predetermined conditions for outputting content outside the vehicle concern, for example, (1) the traveling position of the host vehicle M, (2) the traveling speed of the host vehicle M, (3) the actions of a person outside the vehicle, and (4) the number of people who can still board the host vehicle M.
  • the interface control unit 165 outputs the content to the outside of the vehicle when all of the set conditions are satisfied.
  • Regarding the traveling position of the host vehicle M, the interface control unit 165 outputs content outside the vehicle when, for example, the host vehicle M is traveling in a predefined section, based on the position information of the host vehicle M recognized by the host vehicle position recognition unit 122.
  • the setting of the section may be performed at the time of factory shipment, or may be performed by an occupant or the like.
  • A prohibited section, such as an expressway, may also be set.
  • the interface control unit 165 outputs the content outside the vehicle, for example, when the travel speed of the own vehicle M is equal to or less than a threshold.
  • the threshold may be set in advance for each road, or may be set by an occupant.
  • In this way, the interface control unit 165 can suppress content output toward the outside of the vehicle in situations where no one can board, such as on an expressway. A person outside the vehicle can also more easily read content output by the host vehicle M when it is traveling at low speed. Moreover, outputting content during low-speed travel makes it possible to stop the host vehicle M smoothly when a person who wants to board appears.
  • the interface control unit 165 may output the content outside the vehicle, for example, when it is estimated that the person outside the vehicle is raising a hand.
  • The interface control unit 165 analyzes the image captured by the camera 10 and estimates which person is raising a hand by pattern matching between the contour shape of a person in the captured image and a preset contour shape of a person raising a hand. In this way, the interface control unit 165 can output content toward a person who is highly likely to be a boarding candidate.
  • The interface control unit 165 may output content outside the vehicle, for example, when the number of people who can board the host vehicle M is one or more. As a result, the interface control unit 165 can suppress the output of content when the vehicle is full.
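Conditions (1) to (4) combine conjunctively: content is shown outside the vehicle only when all set conditions hold. A sketch, with an invented speed threshold standing in for the unspecified per-road value:

```python
def may_output_content(in_allowed_section: bool, speed_mps: float,
                       someone_raising_hand: bool, seats_left: int,
                       speed_limit_mps: float = 8.0) -> bool:
    """Output outside the vehicle only when all set conditions hold:
    (1) position, (2) low speed, (3) a person's action, (4) capacity."""
    return (in_allowed_section
            and speed_mps <= speed_limit_mps
            and someone_raising_hand
            and seats_left >= 1)
```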
  • The interface control unit 165 may also present the content to the occupants of the host vehicle M using the in-vehicle device 31 of the HMI 30, and output the content outside the vehicle only after receiving an input from the occupants indicating that the output is permitted. In this way, the interface control unit 165 can refrain from outputting the recruiting content, for example, at the request of an occupant who does not want to share the ride.
  • While the interface control unit 165 is outputting content toward the outside of the vehicle, the boarding person determination unit 166 determines whether a person recognized by the external world recognition unit 121 wishes to board.
  • FIG. 14 is a diagram for explaining how the boarding person determination unit 166 determines a person who wishes to board.
  • FIG. 14 shows the host vehicle M, the persons P4 to P6, the terminal devices 400-1 and 400-2 possessed by the persons P4 and P5 (hereinafter abbreviated as "terminal device 400" when not distinguished), and the server device 500. Communication among the host vehicle M, the terminal device 400, and the server device 500 is performed via the network NW.
  • the network NW is, for example, a wide area network (WAN) or a local area network (LAN).
  • the terminal device 400 is, for example, a smartphone or a tablet terminal.
  • The terminal device 400 has a function of communicating with a vehicle M in the vicinity using a cellular network, a Wi-Fi network, Bluetooth (registered trademark), DSRC, or the like, and of communicating with the server device 500 via a wireless base station.
  • the server device 500 manages traveling positions, situations, and the like of one or more vehicles.
  • The server device 500 is, for example, a single information processing device. Alternatively, the server device 500 may be a cloud server composed of one or more information processing devices.
  • When information indicating that its holder wishes to board is transmitted from the terminal device 400-1 of the person P4 outside the vehicle, the person P4 recognized by the external world recognition unit 121 is determined to be a person who wishes to board. In the example of FIG. 14, the person P4 uses the terminal device 400-1 to output, to the surroundings, a signal indicating that he or she wishes to board.
  • Here, the surroundings refer to the communicable range defined by the communication standard.
  • The host vehicle M receives the signal from the terminal device 400-1 via the communication device 20.
  • The boarding person determination unit 166 then determines that the person P4, recognized near the host vehicle M by the external world recognition unit 121, is a person who wishes to board.
  • The person P5 uses the terminal device 400-2 to transmit information indicating that he or she wishes to board, together with the position information of the terminal device 400-2, to the server device 500 via the network NW.
  • Based on the information received from the terminal device 400-2, the server device 500 extracts the host vehicle M traveling closest to the position of the terminal device 400-2, and transmits to the extracted host vehicle M the information indicating that there is a person who wishes to board together with the position information of the terminal device 400-2.
  • The boarding person determination unit 166 then determines that the person P5 near the received position of the terminal device 400-2 is a person who wishes to board. The server-side matching step just described is sketched below.
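This Python sketch is an assumption-laden illustration: the function `nearest_vehicle`, the flat Euclidean distance, and the coordinate values are hypothetical stand-ins for the server device 500's actual logic.

```python
import math

def nearest_vehicle(terminal_pos, vehicle_positions):
    """Return the id of the vehicle traveling closest to the terminal.
    vehicle_positions maps vehicle id -> (x, y); plain Euclidean distance
    is used for brevity, whereas a real service would consider road
    distance and reachability."""
    return min(vehicle_positions,
               key=lambda vid: math.dist(terminal_pos, vehicle_positions[vid]))

vehicles = {"M": (120.0, 40.0), "N": (950.0, 310.0)}
request = {"wants_to_board": True, "terminal_pos": (130.0, 45.0)}
target = nearest_vehicle(request["terminal_pos"], vehicles)
# The server device 500 then transmits the request and the position of the
# terminal device 400-2 to the extracted vehicle over the network NW.
```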
  • The boarding person determination unit 166 may also analyze the image captured by the camera 10 and determine that a person included in the captured image who is determined to be raising a hand is a person who wishes to board. In the example of FIG. 14, the person P6 is raising a hand, so the boarding person determination unit 166 determines that the person P6 is a person who wishes to board based on analysis of the image captured by the camera 10. A contour-matching sketch of this estimation follows.
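A minimal OpenCV sketch of the contour-based raised-hand estimation follows. It assumes a preset template contour (`template_contour`) and a similarity threshold, both hypothetical; the specification only states that pattern matching against a preset contour shape is used.

```python
import cv2

def looks_like_raised_hand(frame_bgr, template_contour,
                           score_threshold: float = 0.15) -> bool:
    """Compare silhouettes in the captured frame against a preset contour
    of a person raising a hand; lower matchShapes scores mean a closer
    match."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return any(cv2.matchShapes(c, template_contour,
                               cv2.CONTOURS_MATCH_I1, 0.0) < score_threshold
               for c in contours)
```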
  • When there is a person who wishes to board, the boarding person determination unit 166 outputs an instruction to the action plan generating unit 123 to stop the host vehicle M near that person.
  • The action plan generating unit 123 generates a target trajectory for stopping the vehicle according to the instruction from the boarding person determination unit 166, and outputs the generated target trajectory to the travel control unit 141. The host vehicle M can thus be stopped near the person who wishes to board.
  • The interface control unit 165 may output information indicating that the host vehicle M is stopping, toward the outside of the vehicle, using at least one of the vehicle exterior display 32 and the vehicle exterior speaker 33. Furthermore, the interface control unit 165 may output information about the point where the person who wishes to board is to get on (the planned stop position), using at least one of the vehicle exterior display 32 and the vehicle exterior speaker 33.
  • For example, the interface control unit 165 acquires the planned stop position based on the target trajectory generated by the action plan generating unit 123, and presents information regarding the acquired planned stop position to the person who wishes to board, using at least one of the vehicle exterior display 32 and the vehicle exterior speaker 33.
  • For example, the interface control unit 165 uses the front display 32F to display an image regarding the planned stop position.
  • The image includes, for example, information such as "Stopping 15 m ahead".
  • When a plurality of people share the host vehicle M, the sharing settlement unit 167 calculates the cost borne by each occupant based on conditions such as the number of occupants, the section traveled, the distance, and the actual costs (fuel cost, expressway tolls). Because the sharing settlement unit 167 divides the total amount among the persons sharing the ride, each occupant can reach the destination at low cost. When an occupant gets off, the sharing settlement unit 167 may present the settlement result to the occupant using the in-vehicle device 31.
  • The sharing settlement unit 167 may calculate points for each sharing occupant instead of calculating an amount of money.
  • The calculated amount or points may be settled on the spot, or may be transmitted to the server device 500 shown in FIG. 14 via the communication device 20.
  • The server device 500 manages the amount or points for each occupant. As a result, an occupant can settle the amount used each month, and can obtain benefits such as applying accumulated points to a future shared ride or exchanging points for goods. An arithmetic sketch of the fare split follows.
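As an arithmetic illustration, the sketch below divides the actual costs plus a distance-based charge evenly among the sharing occupants. The per-kilometer rate is a hypothetical parameter; the specification does not fix a pricing formula.

```python
def split_fare(num_occupants: int, distance_km: float,
               fuel_cost: float, toll_cost: float,
               rate_per_km: float = 20.0) -> float:
    """Return the amount (or points) borne by each sharing occupant."""
    if num_occupants < 1:
        raise ValueError("at least one occupant is required")
    total = fuel_cost + toll_cost + rate_per_km * distance_km
    return total / num_occupants

# Example: 12 km shared by three occupants with 600 in fuel and 400 in
# tolls costs each occupant (600 + 400 + 240) / 3 = 413.33...
per_person = split_fare(3, 12.0, 600.0, 400.0)
```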
  • FIG. 15 is a flow chart showing another example of the flow of processing executed by the automatic driving control unit 100 in order to make effective use of the space in the vehicle during automatic driving.
  • the process of the flowchart shown in FIG. 15 is repeatedly performed, for example, in a predetermined cycle.
  • In steps S100 and S102 of FIG. 15, processing similar to that of steps S100 and S102 of FIG. 6 is performed.
  • In step S404, the sharing control unit 164 determines whether or not a plurality of occupants are sharing the vehicle.
  • For example, a sharing switch (not shown) is provided in the vehicle system 1. An occupant operates the sharing switch when sharing a ride, thereby making the vehicle system 1 recognize that he or she is a sharing occupant.
  • When the sharing switch has been operated by a plurality of occupants, the sharing control unit 164 determines that a plurality of occupants are sharing.
  • Alternatively, an image captured by the in-vehicle camera 90 and voice acquired inside the vehicle may be used.
  • When a plurality of occupants are detected from the image or the voice, the sharing control unit 164 determines that a plurality of occupants are sharing.
  • For example, the faces of occupants imaged by the in-vehicle camera 90 are stored in advance as occupant information in a storage device such as an HDD or flash memory.
  • When the faces imaged by the in-vehicle camera 90 match the stored occupant information, the sharing control unit 164 determines that those faces are the faces of accompanying occupants, and it is then determined that a plurality of occupants are riding together. A sketch of the step S404 decision follows.
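The step S404 decision just described can be reduced to the following sketch. The two inputs are assumptions standing in for the sharing-switch count and the number of occupants recognized from the in-vehicle camera 90 image or in-vehicle voice.

```python
def is_ride_sharing(sharing_switch_operations: int,
                    occupants_detected: int) -> bool:
    # Sharing is judged when multiple occupants have operated the sharing
    # switch, or when the cabin image or voice indicates multiple occupants.
    return sharing_switch_operations >= 2 or occupants_detected >= 2

assert is_ride_sharing(0, 3)       # detected from the cabin image
assert not is_ride_sharing(1, 1)   # single occupant, no sharing
```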
  • If it is determined in step S404 that a plurality of occupants are not sharing, the processing of one routine of this flowchart ends.
  • If it is determined that a plurality of occupants are sharing, and the occupant detection unit 160 determines that at least one of the plurality of occupants needs a private space, the seat arrangement control unit 162 performs seat arrangement control to change at least one of the posture, position, and orientation of the seats 82-1 to 82-5 in accordance with the configuration or state of the occupants detected by the occupant detection unit 160 (step S406).
  • For example, the seat arrangement control unit 162 changes at least one of the posture, position, and orientation of the seats 82-1 to 82-5 so that the bodies of the occupants sitting on the seats 82-1 to 82-5 do not face each other.
  • Here, it is assumed that an occupant sitting on the seats 82-1 to 82-5 sits without twisting the upper body with respect to the lower body.
  • For example, the seat arrangement control unit 162 changes at least one of the posture, position, and orientation of the seats 82-1 to 82-5 so that the body of one occupant does not face the body of the adjacent occupant. One simple arrangement rule is sketched below.
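One trivially safe arrangement rule, sketched below, turns every occupied seat to the same yaw angle: if all bodies face the same direction, no two bodies face each other. Seat ids and the angle convention (degrees, 0 = vehicle forward) are illustrative assumptions.

```python
def avoid_facing(occupied_seat_yaws: dict) -> dict:
    """Map seat id -> yaw angle so that occupants do not face each other.
    Turning every occupied seat to face forward is one simple way to
    guarantee that no occupant's body is opposed to another's."""
    return {seat_id: 0.0 for seat_id in occupied_seat_yaws}

arrangement = avoid_facing({"82-1": 0.0, "82-2": 180.0, "82-4": 180.0})
# -> {'82-1': 0.0, '82-2': 0.0, '82-4': 0.0}
```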
  • the camera 10 functions as an imaging unit that images a landscape outside the vehicle.
  • the landmark visual recognition control unit 168 includes a determination unit 169.
  • the determination unit 169 determines whether there is a predetermined landmark around the host vehicle M.
  • Information indicating landmarks is stored, for example, in association with the first map information 54 of the navigation device 50. With reference to the first map information 54 of the navigation device 50, the determination unit 169 determines whether or not the position of the host vehicle M specified by the GNSS receiver 51 of the navigation device 50 has entered the visible region of a landmark. The visible region of a landmark is a region predetermined as a place from which the landmark can be viewed from inside the vehicle; for example, it is a region of a predetermined shape centered on the landmark.
  • When the position of the host vehicle M has moved from outside the visible region of a landmark to inside it, the determination unit 169 determines that there is a landmark around the host vehicle M. This transition check is sketched below.
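The visible-region transition check can be sketched as below, with the region modeled as a circle of predetermined radius centered on the landmark. The circular model and local metric coordinates are assumptions; the specification allows any predetermined shape.

```python
import math

def in_visible_region(vehicle_xy, landmark_xy, radius_m: float) -> bool:
    return math.dist(vehicle_xy, landmark_xy) <= radius_m

def landmark_became_nearby(prev_xy, cur_xy, landmark_xy, radius_m) -> bool:
    # The determination unit 169 judges a landmark to be around the host
    # vehicle M on the transition from outside the visible region to inside.
    return (not in_visible_region(prev_xy, landmark_xy, radius_m)
            and in_visible_region(cur_xy, landmark_xy, radius_m))
```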
  • FIG. 16 is a flow chart showing another example of the flow of processing executed by the automatic driving control unit 100 in order to make effective use of the space in the vehicle during automatic driving.
  • FIG. 17 is a view for explaining another example of the configuration or state of the occupants detected by the occupant detection unit 160, and the seat arrangement control performed in step S506 of FIG. 16.
  • In steps S100 and S102 of FIG. 16, processing similar to that of steps S100 and S102 of FIG. 6 is performed.
  • In step S504, the landmark visual recognition control unit 168 determines whether a landmark is included in the landscape outside the vehicle imaged by the camera 10 functioning as the imaging unit.
  • FIG. 18 is a diagram showing an example of the positional relationship between the host vehicle M and the landmark 600 in the case where the landmark 600 is included in the landscape outside the vehicle imaged by the camera 10.
  • When the landmark 600 is included in the landscape outside the vehicle, the seat arrangement control unit 162 performs seat arrangement control to change at least one of the posture, position, and orientation of the seats 82-1 to 82-5 in accordance with the configuration or state of the occupants detected by the occupant detection unit 160 (step S506).
  • For example, the seat arrangement control unit 162 changes at least one of the posture, position, and orientation of the seats 82-1 to 82-5 so that the bodies of the occupants sitting on the seats 82-1 to 82-5 face the landmark 600. In the example shown in FIG. 17, the seat arrangement control unit 162 changes at least one of the posture, position, and orientation of the seats 82-1 to 82-5 so that the body of each occupant in the vehicle is directed toward the landmark 600.
  • In this example, the seat arrangement control unit 162 turns the seats 82-1 to 82-5; instead, the seat arrangement control unit 162 may, for example, move the seats 82-1 to 82-5 so that the bodies of the occupants sitting on the seats 82-1 to 82-5 are directed toward the landmark 600. A bearing computation for such control is sketched below.
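The bearing computation below shows one way to derive the seat yaw that points an occupant's body at the landmark 600. Coordinates are in a vehicle-fixed frame (x forward, y left) and the function name is hypothetical; it is a sketch of the rotation target, not the specification's control law.

```python
import math

def seat_yaw_toward_landmark(seat_xy, landmark_xy) -> float:
    """Return the yaw angle in degrees (0 = vehicle forward) that directs
    a seat at the landmark."""
    dx = landmark_xy[0] - seat_xy[0]
    dy = landmark_xy[1] - seat_xy[1]
    return math.degrees(math.atan2(dy, dx))

# A landmark ahead and to the left of seat 82-2 yields a positive yaw.
yaw_82_2 = seat_yaw_toward_landmark((1.0, 0.5), (30.0, 20.0))
```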
  • In step S506 of FIG. 16, the seat arrangement control unit 162 may execute seat arrangement control for the driver's seat 82-1 as well, for example while automatic driving continues. When the seat arrangement control unit 162 determines otherwise in step S506, the seat arrangement control for the driver's seat 82-1 is not executed.

Abstract

The invention relates to a vehicle control system comprising: a seat provided in a vehicle; an occupant detection unit for detecting the configuration or state of occupants inside a cabin of the vehicle; and a seat arrangement control unit for executing seat arrangement control to change at least one of the posture, position, and orientation of the seats, in accordance with the configuration or state of occupants detected by the occupant detection unit.
PCT/JP2016/088467 2016-12-22 2016-12-22 Système de commande de véhicule, procédé de commande de véhicule et programme de commande de véhicule WO2018116461A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
PCT/JP2016/088467 WO2018116461A1 (fr) 2016-12-22 2016-12-22 Système de commande de véhicule, procédé de commande de véhicule et programme de commande de véhicule
US16/468,306 US20200086764A1 (en) 2016-12-22 2016-12-22 Vehicle control system, vehicle control method, and vehicle control program
CN201680091671.0A CN110087939A (zh) 2016-12-22 2016-12-22 车辆控制系统、车辆控制方法及车辆控制程序
JP2018557493A JPWO2018116461A1 (ja) 2016-12-22 2016-12-22 車両制御システム、車両制御方法、および車両制御プログラム

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/088467 WO2018116461A1 (fr) 2016-12-22 2016-12-22 Système de commande de véhicule, procédé de commande de véhicule et programme de commande de véhicule

Publications (1)

Publication Number Publication Date
WO2018116461A1 true WO2018116461A1 (fr) 2018-06-28

Family

ID=62626129

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/088467 WO2018116461A1 (fr) 2016-12-22 2016-12-22 Système de commande de véhicule, procédé de commande de véhicule et programme de commande de véhicule

Country Status (4)

Country Link
US (1) US20200086764A1 (fr)
JP (1) JPWO2018116461A1 (fr)
CN (1) CN110087939A (fr)
WO (1) WO2018116461A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3683091A1 (fr) * 2019-01-16 2020-07-22 Toyota Jidosha Kabushiki Kaisha Dispositif de commande de cabine de véhicule
JP2020111186A (ja) * 2019-01-11 2020-07-27 株式会社オートネットワーク技術研究所 仕切開閉システム
JP2020117029A (ja) * 2019-01-22 2020-08-06 トヨタ自動車株式会社 車室内制御システム
JP2021123311A (ja) * 2020-02-10 2021-08-30 トヨタ自動車株式会社 情報処理装置、車両システム、情報処理方法、およびプログラム
JPWO2020157991A1 (ja) * 2019-02-01 2021-11-18 本田技研工業株式会社 空間管理システム、移動体、プログラム及び空間管理方法

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6940969B2 (ja) * 2017-03-29 2021-09-29 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America 車両制御装置、車両制御方法及びプログラム
CN112585660B (zh) * 2018-06-29 2022-09-27 日产自动车株式会社 行驶辅助方法及车辆控制装置
CN115014383A (zh) * 2019-02-14 2022-09-06 御眼视觉技术有限公司 用于车辆的导航系统和用于导航车辆的方法
DE102019128880A1 (de) * 2019-10-25 2021-04-29 Bayerische Motoren Werke Aktiengesellschaft Vorrichtung für einen Sitz, Sitz und Fahrzeug mit einer solchen Vorrichtung, und Verfahren zur Wiedergabe von Medieninhalten
US11511756B2 (en) * 2020-01-13 2022-11-29 Ford Global Technologies, Llc Passenger authentication system for a vehicle
JP2022014373A (ja) * 2020-07-06 2022-01-19 トヨタ自動車株式会社 車両用シート及び車両
JP2022026321A (ja) * 2020-07-30 2022-02-10 株式会社Subaru 車両用シート制御装置
CN114557566B (zh) * 2022-02-08 2023-06-27 珠海格力电器股份有限公司 床体姿态调整系统、方法、存储介质及电子设备

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5163776U (fr) * 1974-11-13 1976-05-19
JPS5688375U (fr) * 1979-12-10 1981-07-15
JP2008290624A (ja) * 2007-05-25 2008-12-04 Aisin Seiki Co Ltd 車両用シートシステム
CN201082684Y (zh) * 2007-05-25 2008-07-09 东风柳州汽车有限公司 汽车用多位置快速拆装座椅

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005271771A (ja) * 2004-03-25 2005-10-06 Nissan Motor Co Ltd 運転姿勢調節装置
JP2009149263A (ja) * 2007-12-21 2009-07-09 Toyota Motor Corp 乗物用シート装置
JP2009149264A (ja) * 2007-12-21 2009-07-09 Toyota Motor Corp 乗物用シート装置
WO2015011866A1 (fr) * 2013-07-23 2015-01-29 日産自動車株式会社 Dispositif et procédé d'assistance à la conduite de véhicule

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113302085A (zh) * 2019-01-11 2021-08-24 株式会社自动网络技术研究所 间壁开闭系统
JP2020111186A (ja) * 2019-01-11 2020-07-27 株式会社オートネットワーク技術研究所 仕切開閉システム
JP7092042B2 (ja) 2019-01-11 2022-06-28 株式会社オートネットワーク技術研究所 仕切開閉システム
CN111483359B (zh) * 2019-01-16 2022-05-13 丰田自动车株式会社 车室内控制装置
CN111483359A (zh) * 2019-01-16 2020-08-04 丰田自动车株式会社 车室内控制装置
EP3683091A1 (fr) * 2019-01-16 2020-07-22 Toyota Jidosha Kabushiki Kaisha Dispositif de commande de cabine de véhicule
US11338706B2 (en) 2019-01-16 2022-05-24 Toyota Jidosha Kabushiki Kaisha Vehicle cabin control device
JP7092045B2 (ja) 2019-01-16 2022-06-28 トヨタ自動車株式会社 車室内制御装置
JP2020111292A (ja) * 2019-01-16 2020-07-27 トヨタ自動車株式会社 車室内制御装置
JP2020117029A (ja) * 2019-01-22 2020-08-06 トヨタ自動車株式会社 車室内制御システム
JP7047786B2 (ja) 2019-01-22 2022-04-05 トヨタ自動車株式会社 車室内制御システム
JPWO2020157991A1 (ja) * 2019-02-01 2021-11-18 本田技研工業株式会社 空間管理システム、移動体、プログラム及び空間管理方法
JP7261824B2 (ja) 2019-02-01 2023-04-20 本田技研工業株式会社 空間管理システム、移動体、プログラム及び空間管理方法
JP2021123311A (ja) * 2020-02-10 2021-08-30 トヨタ自動車株式会社 情報処理装置、車両システム、情報処理方法、およびプログラム
JP7347249B2 (ja) 2020-02-10 2023-09-20 トヨタ自動車株式会社 情報処理装置および車両システム

Also Published As

Publication number Publication date
US20200086764A1 (en) 2020-03-19
CN110087939A (zh) 2019-08-02
JPWO2018116461A1 (ja) 2019-07-04

Similar Documents

Publication Publication Date Title
JP6458792B2 (ja) 車両制御システム、車両制御方法、および車両制御プログラム
WO2018116461A1 (fr) Système de commande de véhicule, procédé de commande de véhicule et programme de commande de véhicule
US10337872B2 (en) Vehicle control system, vehicle control method, and vehicle control program
JP6493923B2 (ja) 情報表示装置、情報表示方法、および情報表示プログラム
JP6428746B2 (ja) 車両制御システム、車両制御方法、および車両制御プログラム
JP6715959B2 (ja) 車両制御システム、車両制御方法、および車両制御プログラム
US20170313321A1 (en) Vehicle control system, vehicle control method, and vehicle control program
WO2018116409A1 (fr) Système, procédé et programme de commande de véhicule
WO2018138769A1 (fr) Appareil, procédé et programme de commande de véhicule
JP7071173B2 (ja) 車両制御装置、車両制御方法、およびプログラム
WO2018138768A1 (fr) Système de commande de véhicule, procédé de commande de véhicule et programme de commande de véhicule
JP6327424B2 (ja) 車両制御システム、車両制御方法、および車両制御プログラム
WO2018083778A1 (fr) Système de commande de véhicule, procédé de commande de véhicule et programme de commande de véhicule
WO2018122973A1 (fr) Système de commande de véhicule, procédé de commande de véhicule et programme de commande de véhicule
JP2018203006A (ja) 車両制御システムおよび車両制御方法
WO2018087862A1 (fr) Système, procédé et programme de commande de véhicule
WO2018142560A1 (fr) Système, procédé et programme de commande de véhicule
JP2018076027A (ja) 車両制御システム、車両制御方法、および車両制御プログラム
JP6796145B2 (ja) 車両制御装置、車両制御方法、及びプログラム
JP6696006B2 (ja) 車両制御システム、車両制御方法、および車両制御プログラム
JP6460420B2 (ja) 情報表示装置、情報表示方法、および情報表示プログラム
JPWO2018142566A1 (ja) 通過ゲート決定装置、車両制御システム、通過ゲート決定方法、およびプログラム
JP2019158646A (ja) 車両制御装置、車両制御方法、及びプログラム
JP6916852B2 (ja) 車両制御システム、車両制御方法、および車両制御プログラム
JP6627128B2 (ja) 車両制御システム、車両制御方法、および車両制御プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16924826

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2018557493

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16924826

Country of ref document: EP

Kind code of ref document: A1