CN117916139A - Vehicle control system, vehicle control method, and program


Info

Publication number
CN117916139A
Authority
CN
China
Prior art keywords
vehicle
display
content
occupant
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280060884.2A
Other languages
Chinese (zh)
Inventor
三浦幸治
今村洋
松田干宪
宮川健太
长良彻
笼谷毅
长田浩二
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Group Corp
Original Assignee
Sony Group Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corp filed Critical Sony Group Corp
Publication of CN117916139A


Classifications

    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G09F21/04 Mobile visual advertising by land vehicles
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/10 Intensity circuits
    • G09G5/377 Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
    • G09G5/38 Display of a graphic pattern with means for controlling the display position

Abstract

A vehicle control system (11) includes a human-machine interface (31). The human-machine interface (31) performs display control of entertainment content (CT) based on a monitoring result of an occupant (PA).

Description

Vehicle control system, vehicle control method, and program
Technical Field
The invention relates to a vehicle control system, a vehicle control method, and a program.
Background
A technique is known for assisting a driver's driving by displaying information on a windshield or the like. In recent years, efforts related to autonomous driving have progressed, and Level 3 autonomous driving, in which the driver has no driving obligation, is also being put into practice.
CITATION LIST
Patent literature
Patent document 1: JP 2017-174043A
Disclosure of Invention
Technical problem
In the case of autonomous driving, the vehicle interior can be used as an entertainment space. The occupant can enjoy various entertainment content presented inside the vehicle. However, conventional vehicle control systems do not take into account the occupant's convenience in enjoying entertainment content.
Accordingly, the present disclosure proposes a vehicle control system, a vehicle control method, and a program capable of enhancing the convenience with which an occupant enjoys entertainment content.
Solution to the problem
According to the present disclosure, there is provided a vehicle control system including a human-machine interface that performs display control of entertainment content based on a monitoring result of an occupant. According to the present disclosure, there are also provided a vehicle control method in which the information processing of the vehicle control system is performed by a computer, and a program for causing a computer to perform the information processing of the vehicle control system.
Drawings
Fig. 1 is a diagram showing an outline of display control of entertainment content.
Fig. 2 is a block diagram showing a configuration example of a vehicle control system.
Fig. 3 is a diagram showing an example of the sensing area of the external recognition sensor 25.
Fig. 4 is a diagram showing a configuration example of an in-vehicle sensor and an image display unit.
Fig. 5 is a diagram showing a display example of an image provided as content.
Fig. 6 is a diagram showing another configuration example of the image display unit.
Fig. 7 is a diagram showing a display example of an image provided as content.
Fig. 8 is a diagram listing states of the inside and outside of the vehicle that can be detected by the sensor.
Fig. 9 is a diagram showing an example of a display control method of content.
Fig. 10 is a table showing an example of correspondence among information transmission levels, surrounding conditions, and display methods.
Fig. 11 is a flowchart showing an example of display control of content based on the monitoring results of the inside and outside of the vehicle.
Fig. 12 is a diagram showing details of the processing.
Fig. 13 is a diagram showing details of the processing.
Fig. 14 is a diagram showing details of the processing.
Fig. 15 is a diagram showing details of the processing.
Fig. 16 is a diagram showing details of the processing.
Fig. 17 is a diagram showing a notification example of an event.
Fig. 18 is a diagram showing a transparent display example of an event.
Fig. 19 is a diagram showing a control example of a content display position based on a luminance distribution.
Fig. 20 is a diagram showing a control example of a content display position based on a luminance distribution.
Fig. 21 is a diagram showing a control example of a notification method of an event based on an information transmission level.
Fig. 22 is a diagram showing a control example of a content display position based on an occurrence position of an event.
Fig. 23 is a diagram showing a control example of the content display position based on the viewing state of the occupant.
Fig. 24 is a diagram showing a control example of the content display position based on the viewing state of the occupant.
Fig. 25 is a diagram showing an example of display control based on the viewing states of a plurality of occupants.
Fig. 26 is a diagram showing an example of display control based on the viewing states of a plurality of occupants.
Fig. 27 is a diagram showing an example of display control based on the viewing states of a plurality of occupants.
Fig. 28 is a diagram showing an example of display control based on the viewing states of a plurality of occupants.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described in detail based on the drawings. In the following embodiments, the same reference numerals are given to the same parts, and redundant description is omitted.
Note that description will be given in the following order.
[1. Overview of display control of entertainment content]
[2. Configuration example of vehicle control system]
[3. Sensing regions of the external recognition sensor]
[4. Configuration example of the in-vehicle sensor and image display unit]
[5. Content display control based on the monitoring results of the inside and outside of the vehicle]
[5-1. Monitoring targets]
[5-2. Content display control according to the information transmission level]
[5-3. Process flow]
[6. Display control examples]
[6-1. Notification of an event]
[6-2. Transparent display of an event]
[6-3. Control of the content display position based on the luminance distribution]
[6-4. Control of the event notification method based on the information transmission level]
[6-5. Control of the content display position based on the event occurrence position]
[6-6. Control of the content display position based on the occupant's viewing state]
[6-7. Display control based on the viewing states of a plurality of occupants]
[7. Effects]
[1. Overview of display control of entertainment content]
Fig. 1 is a diagram showing an overview of display control of entertainment content CT.
The vehicle 1 includes a vehicle control system 11 capable of performing autonomous driving. The occupant PA can use the vehicle interior as an entertainment space. The vehicle 1 includes an image display unit 93 capable of displaying entertainment content CT. The entertainment content CT is not for assisting the driver in driving but for entertaining the occupant PA. Examples of entertainment content CT include movies, music, games, news, mail, map or travel information, SNS, and chat.
The image display unit 93 can be switched between, for example, a display state and a transparent state. The image display unit 93 is disposed so as to surround the vehicle interior space along an inner wall or partition wall of the vehicle 1. In the display state, the image display unit 93 presents the content CT; in the transparent state, it allows the occupant PA to see the outside of the vehicle through it.
The "display state" includes a semi-transparent state in which information such as an image displayed on a scene outside the vehicle can be overlapped like Augmented Reality (AR). The image display unit 93 may perform AR display by superimposing an image on a scene outside the vehicle. The image display unit 93 adjusts, for example, display luminance, transparency, or haze (haze) of the image display unit 93 according to the luminance outside the vehicle. This may enhance the visibility of the content CT.
The vehicle control system 11 includes a monitoring unit MU and a human-machine interface (HMI) 31. The monitoring unit MU acquires various types of information about the inside and outside of the vehicle using various sensors mounted on the vehicle 1. The monitoring unit MU outputs the acquired information to the HMI 31 as monitoring information. The monitoring unit MU includes, for example, the DMS 30 and the recognition unit 73, which will be described later.
The monitoring information includes information on the viewing state of the occupant PA (viewing state information VS), information on events outside the vehicle (event information ET), information on the surrounding environment (environment information CE), information on the driving state DS, and the like. The HMI 31 performs display control of the entertainment content CT based on the monitoring information. For example, the monitoring unit MU records the display position of the content CT and the gaze state of the occupant PA with respect to the content CT as a record of the in-vehicle situation during traveling, and stores the record in the storage unit 28 (see fig. 2). A part of the vehicle control system 11 including the storage unit 28 may be implemented as functions on the cloud via the communication unit 22.
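Purely as an illustration of how the monitoring information handed from the monitoring unit MU to the HMI 31 could be organized, a minimal sketch is shown below; all type and field names are hypothetical and are not part of the disclosure.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class ViewingState:                                  # viewing state information VS (illustrative fields)
    occupant_id: int
    is_awake: bool
    gaze_direction: Tuple[float, float, float]       # unit vector in the cabin coordinate frame
    watching_content: bool                           # whether this occupant PA is looking at the content CT

@dataclass
class EventInfo:                                     # event information ET
    kind: str                                        # e.g. "accident", "nearby_vehicle"
    bearing_deg: float                               # direction of the event as seen from the vehicle 1
    distance_m: float

@dataclass
class MonitoringInfo:                                # bundle passed from the monitoring unit MU to the HMI 31
    viewing_states: List[ViewingState]               # one entry per occupant PA
    events: List[EventInfo]                          # events outside the vehicle
    exterior_luminance_lux: Optional[float]          # part of the environment information CE
    driving_state: str                               # driving state DS: "manual", "autonomous", or "takeover"
```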
[2. Configuration example of vehicle control system]
Fig. 2 is a block diagram showing a configuration example of the vehicle control system 11.
The vehicle control system 11 is included in the vehicle 1, and performs processing related to driving assistance and autonomous driving of the vehicle 1.
The vehicle control system 11 includes a vehicle control Electronic Control Unit (ECU) 21, a communication unit 22, a map information accumulation unit 23, a position information acquisition unit 24, an external recognition sensor 25, an in-vehicle sensor 26, a vehicle sensor 27, a storage unit 28, a travel assist and autonomous driving control unit 29, a Driver Monitoring System (DMS) 30, a human-machine interface (HMI) 31, and a vehicle control unit 32.
The vehicle control ECU 21, the communication unit 22, the map information accumulation unit 23, the position information acquisition unit 24, the external recognition sensor 25, the in-vehicle sensor 26, the vehicle sensor 27, the storage unit 28, the travel assist and autonomous driving control unit 29, the Driver Monitoring System (DMS) 30, the human-machine interface (HMI) 31, and the vehicle control unit 32 are communicably connected to each other via the communication network 41. The communication network 41 includes, for example, an in-vehicle communication network, a bus, or the like conforming to a digital bidirectional communication standard such as a Controller Area Network (CAN), a Local Interconnect Network (LIN), a Local Area Network (LAN), FlexRay (registered trademark), or Ethernet (registered trademark). The communication network 41 may be used selectively according to the type of data to be transmitted. For example, CAN may be applied to data related to vehicle control, while Ethernet may be applied to large-volume data. Note that the units of the vehicle control system 11 may also be directly connected to each other, not via the communication network 41, using wireless communication premised on relatively short-range communication, such as near field communication (NFC) or Bluetooth (registered trademark).
Note that, in the following, in the case where each unit of the vehicle control system 11 performs communication via the communication network 41, the communication network 41 will not be mentioned. For example, in the case where the vehicle control ECU 21 and the communication unit 22 perform communication via the communication network 41, it is simply described that the vehicle control ECU 21 and the communication unit 22 communicate with each other.
The vehicle control ECU 21 is implemented by various types of processors such as a Central Processing Unit (CPU) or a Micro Processing Unit (MPU). The vehicle control ECU 21 controls all or part of the functions of the vehicle control system 11.
The communication unit 22 communicates with various devices inside and outside the vehicle, other vehicles, servers, base stations, and the like, and transmits and receives various types of data. At this time, the communication unit 22 may perform communication using a variety of communication schemes.
External communication that can be performed by the communication unit 22 will be schematically described. The communication unit 22 communicates with a server (hereinafter, referred to as an external server) or the like existing on an external network via a base station or an access point by a wireless communication scheme such as a 5 th generation mobile communication system (5G), long Term Evolution (LTE), or Dedicated Short Range Communication (DSRC). The external network with which the communication unit 22 communicates is, for example, the internet, a cloud network, a network unique to a company, or the like. The communication scheme performed by the communication unit 22 with respect to the external network is not particularly limited as long as it is a wireless communication scheme capable of performing digital bidirectional communication at a communication speed equal to or higher than a predetermined speed and at a distance longer than or equal to the predetermined distance.
Further, for example, the communication unit 22 may communicate with a terminal existing in the vicinity of the own vehicle using peer-to-peer (P2P) technology. Terminals existing in the vicinity of the own vehicle include, for example, terminals worn by moving bodies that move at relatively low speed, such as pedestrians or bicycles, fixed-position terminals installed in stores or the like, and machine type communication (MTC) terminals. Further, the communication unit 22 may also perform V2X communication. V2X communication refers to communication between the own vehicle and something else, such as vehicle-to-vehicle communication with another vehicle, vehicle-to-infrastructure communication with a roadside device, vehicle-to-home communication, and vehicle-to-pedestrian communication with a terminal carried by a pedestrian.
For example, the communication unit 22 may receive a program for updating software for controlling the operation of the vehicle control system 11 from outside (over the air). The communication unit 22 may also receive map information, traffic information, information around the vehicle 1, and the like from the outside. Further, for example, the communication unit 22 may transmit information about the vehicle 1, information around the vehicle 1, and the like to the outside. Examples of the information about the vehicle 1 transmitted to the outside through the communication unit 22 include data indicating the state of the vehicle 1, the recognition result of the recognition unit 73, and the like. Further, for example, the communication unit 22 performs communication conforming to a vehicle emergency call system such as eCall.
For example, the communication unit 22 receives electromagnetic waves transmitted by a road traffic information communication system (vehicle information and communication system (VICS) (registered trademark)) such as a radio wave beacon, an optical beacon, or FM multiplex broadcasting.
Communication with the vehicle interior that the communication unit 22 can perform will be schematically described. The communication unit 22 may communicate with each device in the vehicle using, for example, wireless communication. The communication unit 22 may perform wireless communication with in-vehicle devices by a communication scheme capable of performing digital two-way communication by wireless communication at a predetermined communication speed or higher, such as wireless LAN, bluetooth, NFC, or Wireless USB (WUSB). The communication unit 22 may also communicate with each device in the vehicle using wired communication, not limited to the above. For example, the communication unit 22 may communicate with each device in the vehicle by wired communication via a cable connected to a connection terminal (not shown). The communication unit 22 may communicate with each device in the vehicle through a communication scheme capable of performing digital bidirectional communication by wired communication at a predetermined communication speed or higher, such as Universal Serial Bus (USB), high-definition multimedia interface (HDMI) (registered trademark), or mobile high-definition link (MHL).
Here, the device in the vehicle refers to, for example, a device in the vehicle that is not connected to the communication network 41. As examples of devices in the vehicle, a mobile device or a wearable device carried by an occupant such as a driver, an information device brought into the vehicle and temporarily mounted, or the like can be considered.
The map information accumulation unit 23 accumulates one or both of a map acquired from the outside and a map generated by the vehicle 1. For example, the map information accumulating unit 23 accumulates a three-dimensional high-precision map, a global map having a lower precision than the high-precision map but a wide coverage, and the like.
The high-precision map is, for example, a dynamic map, a point cloud map, a vector map, or the like. The dynamic map is, for example, a map including four layers of dynamic information, semi-dynamic information, semi-static information, and static information, and is provided to the vehicle 1 from an external server or the like. The point cloud map includes a point cloud (point group data). The vector map is, for example, a map in which traffic information such as the positions of lanes and traffic lights is associated with a point cloud map, and is adapted to an advanced driver assistance system (ADAS) or autonomous driving (AD).
The point cloud map and the vector map may be provided from, for example, an external server or the like, or may be created by the vehicle 1 based on the sensing results of the camera 51, the radar 52, the LiDAR 53, and the like as a map for matching with a local map described later, and accumulated in the map information accumulation unit 23. In addition, in the case where a high-precision map is provided from an external server or the like, map data of, for example, several hundred meters square around the planned path along which the vehicle 1 will travel is acquired from the external server or the like in order to reduce the communication volume.
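As a minimal sketch of that idea, the snippet below requests only the map tiles that the planned path actually passes through; the 500 m tile size and the function name are assumptions for illustration, not part of the disclosure.

```python
def tiles_along_path(path_points, tile_size_m=500.0):
    """Return the set of map-tile indices that the planned path passes through.

    Only these tiles would be requested from the external server, which keeps the
    communication volume small compared with downloading an entire region.
    """
    tiles = set()
    for x, y in path_points:                          # planned path in metres (local plane coordinates)
        tiles.add((int(x // tile_size_m), int(y // tile_size_m)))
    return tiles

# Example: a straight 2 km path only touches four 500 m tiles.
path = [(i * 10.0, 0.0) for i in range(200)]
print(tiles_along_path(path))                         # {(0, 0), (1, 0), (2, 0), (3, 0)}
```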
The position information acquisition unit 24 receives GNSS signals from Global Navigation Satellite System (GNSS) satellites and acquires position information of the vehicle 1. The acquired position information is supplied to the travel assist and autonomous driving control unit 29. Note that the position information acquisition unit 24 is not limited to the method using GNSS signals, and may acquire position information using, for example, a beacon.
The external recognition sensor 25 includes various sensors for recognizing the condition outside the vehicle 1, and provides sensor data from each sensor to units in the vehicle control system 11. Any type and any number of sensors may be included in the external identification sensor 25.
For example, the external recognition sensor 25 includes a camera 51, a radar 52, a light detection and ranging or laser imaging detection and ranging (LiDAR) 53, and an ultrasonic sensor 54. Without being limited thereto, the external recognition sensor 25 may include one or more types of sensors among the camera 51, the radar 52, the LiDAR 53, and the ultrasonic sensor 54. The numbers of cameras 51, radars 52, LiDARs 53, and ultrasonic sensors 54 are not particularly limited as long as they can be practically installed in the vehicle 1. Further, the types of sensors included in the external recognition sensor 25 are not limited to this example, and the external recognition sensor 25 may include other types of sensors. An example of the sensing area of each sensor included in the external recognition sensor 25 will be described later.
Note that the imaging method of the camera 51 is not particularly limited. For example, cameras of various imaging methods, such as a time-of-flight (ToF) camera using an imaging method capable of ranging, a stereoscopic camera, a monocular camera, and an infrared camera, may be applied to the camera 51 as needed. Without being limited thereto, the camera 51 may simply acquire a photographed image regardless of ranging.
Further, for example, the external recognition sensor 25 may include an environment sensor for detecting the environment of the vehicle 1. The environment sensor is a sensor for detecting an environment such as weather, climate, or brightness, and may include various sensors such as a raindrop sensor, a fog sensor, a sun light sensor, a snow sensor, and an illuminance sensor.
Further, for example, the external recognition sensor 25 includes a microphone for detecting sounds around the vehicle 1, the position of a sound source, and the like.
The in-vehicle sensor 26 includes various sensors for detecting information inside the vehicle, and provides sensor data from each sensor to each unit of the vehicle control system 11. The types and the number of the various sensors included in the in-vehicle sensor 26 are not particularly limited as long as they can be actually installed in the vehicle 1.
For example, the in-vehicle sensors 26 may include one or more types of sensors from among cameras, radars, seat sensors, steering wheel sensors, microphones, and biosensors. As the camera included in the in-vehicle sensor 26, for example, cameras of various imaging methods capable of ranging such as a ToF camera, a stereo camera, a monocular camera, and an infrared camera can be used. Without being limited thereto, the camera included in the in-vehicle sensor 26 may simply acquire a photographed image regardless of ranging. The biological sensor included in the in-vehicle sensor 26 is provided on, for example, a seat, a steering wheel, or the like, and detects various types of biological information of an occupant such as a driver.
The vehicle sensors 27 include various sensors for detecting the state of the vehicle 1, and provide sensor data from each sensor to each unit of the vehicle control system 11. The types or the number of the various sensors included in the vehicle sensor 27 are not particularly limited as long as they can be actually installed in the vehicle 1.
For example, the vehicle sensor 27 includes a speed sensor, an acceleration sensor, an angular velocity sensor (gyro sensor), and an Inertial Measurement Unit (IMU) integrating these sensors. For example, the vehicle sensor 27 includes a steering angle sensor that detects a steering angle of a steering wheel, a yaw rate sensor, an accelerator sensor that detects an operation amount of an accelerator pedal, and a brake sensor that detects an operation amount of a brake pedal. For example, the vehicle sensor 27 includes a rotation sensor that detects the number of revolutions of an engine or a motor, an air pressure sensor that detects the air pressure of a tire, a slip rate sensor that detects the slip rate of a tire, and a wheel speed sensor that detects the rotation speed of a wheel. For example, the vehicle sensor 27 includes a battery sensor that detects the remaining amount and temperature of the battery and an impact sensor that detects an impact from the outside. The vehicle sensor 27 includes a sensor that detects the driving state DS.
The storage unit 28 includes at least one of a nonvolatile storage medium or a volatile storage medium, and stores data and programs. The storage unit 28 is used as, for example, an electrically erasable programmable read-only memory (EEPROM) and a random access memory (RAM), and a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, or a magneto-optical storage device can be used as the storage medium. The storage unit 28 stores various programs and data used by each unit of the vehicle control system 11. For example, the storage unit 28 includes an event data recorder (EDR) and a data storage system for automated driving (DSSAD), and stores information of the vehicle 1 before and after an event such as an accident and information acquired by the in-vehicle sensor 26.
The travel assist and autonomous driving control unit 29 controls travel assistance and autonomous driving of the vehicle 1. The travel assist and autonomous driving control unit 29 includes, for example, an analysis unit 61, an action planning unit 62, and an operation control unit 63.
The analysis unit 61 performs analysis processing of the conditions of the vehicle 1 and the surrounding environment. The analysis unit 61 includes a self-position estimation unit 71, a sensor fusion unit 72, and a recognition unit 73.
The self-position estimation unit 71 estimates the own position of the vehicle 1 based on the sensor data from the external recognition sensor 25 and the high-precision map accumulated in the map information accumulation unit 23. For example, the self-position estimation unit 71 generates a local map from the sensor data from the external recognition sensor 25 and estimates the own position of the vehicle 1 by matching the local map with the high-precision map. The position of the vehicle 1 is based on, for example, the center of the axle of the pair of rear wheels.
The local map is, for example, a three-dimensional high-precision map created by using a technique such as simultaneous localization and mapping (SLAM), an occupancy grid map, or the like. The three-dimensional high-precision map is, for example, the above-described point cloud map or the like. The occupancy grid map is a map in which a three-dimensional or two-dimensional space around the vehicle 1 is divided into grids of a predetermined size and the occupancy state of an object is represented for each grid. The occupancy state of an object is indicated, for example, by the presence or absence of the object or its existence probability. The local map is also used, for example, for the detection processing and recognition processing of the situation outside the vehicle 1 by the recognition unit 73.
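The occupancy grid map mentioned above can be pictured as a regular grid that stores, per cell, the probability that an object occupies the cell. The sketch below is a generic two-dimensional version with assumed parameters and is not the actual map format used by the system.

```python
import numpy as np

class OccupancyGrid2D:
    """2-D occupancy grid: each cell stores the probability that an object occupies it."""

    def __init__(self, size_m=100.0, cell_m=0.5):
        self.cell_m = cell_m
        n = int(size_m / cell_m)
        self.prob = np.full((n, n), 0.5)              # 0.5 = unknown

    def _index(self, x, y):
        # The vehicle 1 sits at the centre of the grid; convert metres to cell indices.
        n = self.prob.shape[0]
        return int(x / self.cell_m) + n // 2, int(y / self.cell_m) + n // 2

    def mark_occupied(self, x, y, p=0.9):
        i, j = self._index(x, y)
        if 0 <= i < self.prob.shape[0] and 0 <= j < self.prob.shape[1]:
            self.prob[i, j] = max(self.prob[i, j], p)

grid = OccupancyGrid2D()
grid.mark_occupied(12.0, -3.5)                        # e.g. a LiDAR return 12 m ahead, 3.5 m to the right
```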
Note that the self-position estimation unit 71 may estimate the own position of the vehicle 1 based on the position information acquired by the position information acquisition unit 24 and the sensor data from the vehicle sensor 27.
The sensor fusion unit 72 performs a sensor fusion process of combining a plurality of different types of sensor data (for example, image data supplied from the camera 51 and sensor data supplied from the radar 52) to obtain new information. Methods for combining different types of sensor data include integration, fusion, correlation, and the like.
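As a rough, hypothetical illustration of the "association" style of combination mentioned above, the sketch below pairs camera detections with radar detections by angular proximity, so that each fused object carries both a class label (from the camera 51) and a range (from the radar 52); the threshold and data layout are assumptions.

```python
def fuse_by_bearing(camera_objs, radar_objs, max_diff_deg=3.0):
    """camera_objs: [(bearing_deg, label)]; radar_objs: [(bearing_deg, range_m)]."""
    fused = []
    for cam_bearing, label in camera_objs:
        # Associate each camera detection with the radar detection closest in bearing.
        best = min(radar_objs, key=lambda r: abs(r[0] - cam_bearing), default=None)
        if best is not None and abs(best[0] - cam_bearing) <= max_diff_deg:
            fused.append({"label": label, "bearing_deg": cam_bearing, "range_m": best[1]})
    return fused

print(fuse_by_bearing([(10.0, "pedestrian")], [(9.2, 35.0), (40.0, 12.0)]))
# [{'label': 'pedestrian', 'bearing_deg': 10.0, 'range_m': 35.0}]
```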
The recognition unit 73 performs detection processing of detecting the situation outside the vehicle 1 and recognition processing of recognizing the situation outside the vehicle 1.
For example, the recognition unit 73 performs the detection processing and the recognition processing of the situation outside the vehicle 1 based on information from the external recognition sensor 25, information from the self-position estimation unit 71, information from the sensor fusion unit 72, and the like.
Specifically, the recognition unit 73 performs, for example, detection processing, recognition processing, and the like of objects in the vicinity of the vehicle 1. The object detection process is, for example, a process of detecting the presence or absence, size, shape, position, movement, and the like of an object. The object recognition processing is, for example, processing of recognizing an attribute such as the type of an object or distinguishing a specific object. However, the detection process and the identification process are not necessarily clearly distinguished, and may overlap each other.
For example, the recognition unit 73 detects objects around the vehicle 1 by performing clustering in which a point cloud based on sensor data of the radar 52, the LiDAR 53, or the like is classified into clusters of points. As a result, the presence or absence, size, shape, and position of objects around the vehicle 1 are detected.
For example, the recognition unit 73 detects the movement of objects around the vehicle 1 by tracking the movement of the clusters of points obtained by the clustering. As a result, the speed and traveling direction (motion vector) of the objects around the vehicle 1 are detected.
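The disclosure does not name a specific clustering or tracking algorithm; as one common choice, the sketch below uses DBSCAN to group returns into clusters and estimates a motion vector from the centroid displacement between two frames. All parameters are assumptions.

```python
import numpy as np
from sklearn.cluster import DBSCAN                    # one common clustering choice, used here only as an example

def cluster_centroids(points):
    """points: (N, 2) array of radar/LiDAR returns in the vehicle frame -> {cluster_id: centroid}."""
    labels = DBSCAN(eps=0.8, min_samples=5).fit_predict(points)
    return {k: points[labels == k].mean(axis=0) for k in set(labels) if k != -1}

def motion_vector(prev_centroid, curr_centroid, dt):
    """Velocity (m/s) of a tracked cluster, estimated from the centroid displacement between frames."""
    return (curr_centroid - prev_centroid) / dt

# Ten returns around (10, 2) m form one cluster; 0.1 s later the cluster has moved 0.5 m forward.
frame1 = np.random.normal(loc=(10.0, 2.0), scale=0.1, size=(10, 2))
frame2 = frame1 + np.array([0.5, 0.0])
c1, c2 = cluster_centroids(frame1)[0], cluster_centroids(frame2)[0]
print(motion_vector(c1, c2, dt=0.1))                  # approximately (5.0, 0.0) m/s
```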
For example, the recognition unit 73 detects or recognizes a vehicle, a person, a bicycle, an obstacle, a structure, a road, a signal lamp, a traffic sign, a road sign, or the like from the image data supplied from the camera 51. Further, the recognition unit 73 can recognize the kind of the object in the periphery of the vehicle 1 by performing recognition processing such as semantic division.
For example, the recognition unit 73 can perform recognition processing of the traffic rules around the vehicle 1 based on the map accumulated in the map information accumulation unit 23, the estimation result of the own position by the self-position estimation unit 71, and the recognition result of the recognition unit 73 for objects around the vehicle 1. Through this processing, the recognition unit 73 can recognize the positions and states of traffic lights, the contents of traffic signs and road signs, the contents of traffic rules, usable lanes, and the like.
For example, the recognition unit 73 can perform recognition processing of the surrounding environment of the vehicle 1. As the surrounding environment to be recognized by the recognition unit 73, the presence or absence of pedestrians and surrounding vehicles, weather, temperature, humidity, brightness, road surface condition, and the like can be considered.
The action plan unit 62 generates an action plan of the vehicle 1. For example, the action planning unit 62 generates an action plan by performing processing of global path planning and path tracking.
Note that global path planning is processing of planning a rough path from the start to the target. This path planning is called trajectory planning, and includes processing of trajectory generation (local path planning) that allows safe and smooth traveling in the vicinity of the vehicle 1 in consideration of the motion characteristics of the vehicle 1 on the planned path.
Path tracking is processing of planning an operation for traveling safely and accurately, within a planned time, on the path planned by the global path planning. For example, the action planning unit 62 can calculate the target speed and the target angular velocity of the vehicle 1 based on the result of the path tracking processing.
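The disclosure only states that a target speed and a target angular velocity are computed from the path-tracking result; as one textbook way of doing so (pure pursuit, an assumption here rather than the method of the disclosure), the angular velocity follows from the curvature of the arc that reaches a look-ahead point on the planned path.

```python
def pure_pursuit(lookahead_point, target_speed_mps):
    """lookahead_point: (x, y) of a point on the planned path, in the vehicle frame (x forward, y left).

    Returns (target_speed, target_angular_velocity). Pure pursuit is used purely as an
    illustrative stand-in; the tracking algorithm is not specified in the disclosure.
    """
    x, y = lookahead_point
    l2 = x * x + y * y                                # squared distance to the look-ahead point
    curvature = 2.0 * y / l2                          # curvature of the arc through the point
    return target_speed_mps, target_speed_mps * curvature   # omega = v * kappa

print(pure_pursuit((8.0, 1.0), 10.0))                 # gentle left turn at 10 m/s
```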
The operation control unit 63 controls the operation of the vehicle 1 in order to implement the action plan generated by the action plan unit 62.
For example, the operation control unit 63 controls the steering control unit 81, the brake control unit 82, and the drive control unit 83 included in the vehicle control unit 32, which will be described later, to perform acceleration/deceleration control and direction control so that the vehicle 1 travels on the trajectory calculated by the trajectory planning. For example, the operation control unit 63 performs cooperative control to realize ADAS functions such as collision avoidance or impact mitigation, follow-up traveling, constant-speed traveling, collision warning of the own vehicle, and lane departure warning of the own vehicle. The operation control unit 63 also performs cooperative control intended for autonomous driving in which the vehicle travels autonomously without depending on the operation of the driver.
The DMS 30 performs an authentication process of the driver, an identification process of the state of the driver, and the like based on sensor data from the in-vehicle sensor 26, input data input to the HMI 31 described later, and the like. As the state of the driver to be recognized, for example, a physical state, a degree of wakefulness, a degree of concentration, a degree of fatigue, a direction of line of sight, a degree of intoxication, a driving operation, a posture, and the like can be considered.
Note that the DMS 30 may perform authentication processing of an occupant other than the driver and identification processing of the state of the occupant. Further, for example, the DMS 30 may perform the recognition processing of the condition of the vehicle interior based on the sensor data from the in-vehicle sensor 26. As the condition of the vehicle interior to be identified, for example, temperature, humidity, brightness, smell, fragrance, or the like can be considered.
The HMI 31 inputs various types of data, instructions, and the like, and presents the various types of data to an occupant including a driver.
The data input by the HMI 31 will be schematically described. The HMI 31 includes an input device for human input data. The HMI 31 generates an input signal according to data, instructions, and the like input by the input device, and supplies the input signal to each unit of the vehicle control system 11. The HMI 31 includes an operator such as a touch panel, a button, a switch, or a lever as an input device. Without being limited thereto, the HMI 31 may also include an input device capable of inputting information by a method other than manual operation, for example, by voice, gesture, or the like. Further, the HMI 31 may use, for example, a remote control device using infrared rays or radio waves or an external connection device such as a mobile device or a wearable device supporting the operation of the vehicle control system 11 as an input device.
The presentation of data by the HMI 31 will be schematically described. The HMI 31 generates visual information, auditory information, and tactile information for the occupant or for the outside of the vehicle. In addition, the HMI 31 performs output control for controlling the output, output content, output timing, output method, and the like of each piece of generated information. The HMI 31 generates and outputs, as visual information, for example, an operation screen, a status display of the vehicle 1, a warning display, a monitoring image indicating the situation around the vehicle 1, and information indicated by an image or light, such as the entertainment content CT. Further, the HMI 31 generates and outputs, as auditory information, information indicated by sound, such as voice guidance, a warning sound, or a warning message. Further, the HMI 31 generates and outputs, as tactile information, information that gives the occupant a tactile sensation by, for example, force, vibration, or motion.
As an output device of the HMI 31 outputting visual information, for example, a display device presenting visual information by displaying an image thereon or a projector device presenting visual information by projecting an image is applicable. Note that the display device may be a device that displays visual information in the occupant's field of view, such as a heads-up display, a transmissive display, or a wearable device having an Augmented Reality (AR) function, in addition to a display device having a normal display. In addition, the HMI 31 may use a display device included in a navigation device, a dashboard, a Camera Monitoring System (CMS), an electronic mirror, a lamp, or the like included in the vehicle 1 as an output device that outputs visual information.
As an output device from which the HMI 31 outputs auditory information, for example, an audio speaker, a headphone, or an earphone is applicable.
As an output device to which the HMI 31 outputs haptic information, for example, a haptic element using a haptic technique is applicable. The haptic element is for example provided at a portion of the vehicle 1 where an occupant contacts, such as a steering wheel or a seat.
The vehicle control unit 32 controls each unit of the vehicle 1. The vehicle control unit 32 includes a steering control unit 81, a brake control unit 82, a drive control unit 83, a vehicle body system control unit 84, a lamp control unit 85, and a horn control unit 86.
The steering control unit 81 detects and controls the state of the steering system of the vehicle 1. The steering system includes, for example, a steering mechanism including a steering wheel or the like, an electric power steering device, and the like. The steering control unit 81 includes, for example, a steering ECU that controls the steering system, an actuator that drives the steering system, and the like.
The brake control unit 82 performs detection and control of the state of the brake system of the vehicle 1. The brake system includes, for example, a brake mechanism including a brake pedal, an Antilock Brake System (ABS), a regenerative brake mechanism, and the like. The brake control unit 82 includes, for example, a brake ECU that controls a brake system, an actuator that drives the brake system, and the like.
The drive control unit 83 detects and controls the state of the drive system of the vehicle 1. The drive system includes, for example, a drive force generating device (such as an accelerator pedal, an internal combustion engine, and a drive motor) for generating a drive force, a drive force transmitting mechanism for transmitting the drive force to wheels, and the like. The drive control unit 83 includes, for example, a drive ECU that controls the drive system, an actuator that drives the drive system, and the like.
The vehicle body system control unit 84 detects and controls the state of the vehicle body system of the vehicle 1. The vehicle body system includes, for example, a keyless entry system, a smart key system, a power window apparatus, a power seat, an air conditioner, an air bag, a seat belt, a shift lever, and the like. The vehicle body system control unit 84 includes, for example, a vehicle body system ECU that controls the vehicle body system, an actuator that drives the vehicle body system, and the like.
The lamp control unit 85 detects and controls the states of various lamps of the vehicle 1. As the lamp to be controlled, for example, a headlight, a backlight, a fog lamp, a turn signal, a brake lamp, a projection, a display on a bumper, and the like are conceivable. The lamp control unit 85 includes a lamp ECU that controls the lamp, an actuator that drives the lamp, and the like.
The horn control unit 86 detects and controls the state of the automobile horn of the vehicle 1. The horn control unit 86 includes, for example, a horn ECU that controls the horn of the automobile, an actuator that drives the horn of the automobile, and the like.
[3. Sensing regions of the external recognition sensor]
Fig. 3 is a diagram showing examples of sensing areas of the camera 51, the radar 52, the LiDAR 53, the ultrasonic sensor 54, and the like of the external recognition sensor 25. Note that fig. 3 schematically shows the vehicle 1 as viewed from above. When fig. 3 is viewed with the reference signs upright, the left end side is the front end (front) side of the vehicle 1, and the right end side is the rear end (rear) side of the vehicle 1.
The sensing regions 101F and 101B represent examples of sensing regions of the ultrasonic sensor 54. The sensing region 101F covers the periphery of the front end of the vehicle 1 with a plurality of ultrasonic sensors 54. The sensing region 101B covers the periphery of the rear end of the vehicle 1 with a plurality of ultrasonic sensors 54.
The sensing results in the sensing region 101F and the sensing region 101B are used for parking assistance of the vehicle 1, for example.
The sensing regions 102F to 102B represent examples of sensing regions of the radar 52 for short or medium range. The sensing region 102F covers a position farther ahead of the vehicle 1 than the sensing region 101F. The sensing region 102B covers a position farther behind the vehicle 1 than the sensing region 101B. The sensing region 102L covers the rear periphery of the left side face of the vehicle 1. The sensing region 102R covers the rear periphery of the right side face of the vehicle 1.
The sensing result in the sensing region 102F is used, for example, to detect a vehicle, a pedestrian, or the like existing in front of the vehicle 1. The detection result in the sensing region 102B is used for, for example, a collision prevention function or the like behind the vehicle 1. The sensing results in the sensing region 102L and the sensing region 102R are used, for example, to detect objects in blind spots on the side of the vehicle 1.
The sensing regions 103F to 103B represent examples of sensing regions of the camera 51. The sensing region 103F covers a position farther ahead of the vehicle 1 than the sensing region 102F. The sensing region 103B covers a position farther behind the vehicle 1 than the sensing region 102B. The sensing region 103L covers the periphery of the left side face of the vehicle 1. The sensing region 103R covers the periphery of the right side face of the vehicle 1.
The sensing result in the sensing region 103F may be used for, for example, recognition of traffic lights or traffic signs, a lane departure prevention assistance system, and an automatic headlight control system. The sensing result in the sensing region 103B may be used for, for example, parking assistance and a surround view system. The sensing results in the sensing regions 103L and 103R may be used, for example, in a surround view system.
The sensing region 104 represents an example of a sensing region of LiDAR 53. The sensing region 104 covers a position farther forward of the vehicle 1 than the sensing region 103F. Meanwhile, the sensing region 104 has a narrower area than the sensing region 103F in the left-right direction.
For example, an object such as a surrounding vehicle is detected using the sensing result in the sensing region 104.
The sensing region 105 represents an example of a sensing region of the long-range radar 52. The sensing region 105 covers a position farther forward of the vehicle 1 than the sensing region 104. Meanwhile, the sensing region 105 has a narrower area than the sensing region 104 in the left-right direction.
The sensing results in the sensing region 105 are used for, for example, adaptive Cruise Control (ACC), emergency braking, collision avoidance, and the like.
Note that the sensing areas of the sensors of the camera 51, radar 52, liDAR 53, and ultrasonic sensor 54 included in the external recognition sensor 25 may have various configurations other than those in fig. 3. Specifically, the ultrasonic sensor 54 may also perform sensing on the side of the vehicle 1, or the LiDAR 53 may perform sensing on the rear of the vehicle 1. In addition, the mounting position of the sensor is not limited to the above example. The number of sensors may be one or more.
[4. Configuration example of the in-vehicle sensor and image display unit]
Fig. 4 is a diagram showing a configuration example of the in-vehicle sensor 26 and the image display unit 93. Fig. 5 is a diagram showing a display example of an image IM provided as a content CT.
In the example of fig. 4 and 5, the image display unit 93 is provided in such a manner as to partition between the front-row seats (driver seat and passenger seat) and the rear-row seats. A camera CM that monitors the occupant PA in the rear seat is mounted in the vehicle. The camera CM is used as the in-vehicle sensor 26. In the example of fig. 4, two cameras CM monitor the entire in-vehicle space (rear space) behind the front seat.
The DMS 30 acquires viewing state information VS of the occupant PA in the rear space from the image captured by the camera CM. The viewing state information VS includes, for example, information on the number of occupants PA, whether the occupants PA are awake, the field of view of the occupants PA, and whether the occupants PA are looking at the content CT. In the case where there are a plurality of occupants PA, for each occupant PA, a field of view of the occupant PA, whether the occupant PA is awake, and whether the occupant PA is looking at the content CT are determined. The HMI 31 controls the display of the content CT based on the viewing state information VS.
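How "whether the occupant PA is looking at the content CT" is computed is not detailed in the disclosure; one plausible way, sketched below with assumed geometry and threshold, is to compare the measured gaze direction with the direction from the eye position toward the display.

```python
import math

def is_watching(gaze_dir, eye_pos, screen_center, max_angle_deg=10.0):
    """True if the gaze ray from eye_pos along gaze_dir points at the screen centre (within a cone)."""
    to_screen = [s - e for s, e in zip(screen_center, eye_pos)]
    dot = sum(g * t for g, t in zip(gaze_dir, to_screen))
    norm = math.sqrt(sum(g * g for g in gaze_dir)) * math.sqrt(sum(t * t for t in to_screen))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= max_angle_deg

# An occupant looking almost straight ahead at a screen mounted in front of the rear seats.
print(is_watching((1.0, 0.0, 0.0), (0.0, 0.0, 1.2), (1.5, 0.1, 1.1)))   # True
```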
The image display unit 93 is an optical see-through type or a video see-through type display capable of transmitting and displaying a background. In the present disclosure, as an example, an optical see-through display is used as the image display unit 93. As the image display unit 93, a known transparent display or technology such as a head-up display (HUD), a transparent Organic Light Emitting Diode (OLED), or a transparent Light Emitting Diode (LED) is used. In the present disclosure, as the image display unit 93, for example, a configuration for projecting an image on the transparent screen SCR is adopted.
The image display unit 93 includes, for example, a transparent screen SCR and a projector PJ. The transparent screen SCR includes an optical lens film that reflects light projected from the projector PJ. A light control film capable of electrically controlling the transmittance of external light is laminated on the optical lens film. The HMI 31 adjusts the display luminance, transparency, or haze of the image display unit 93 (transparent screen SCR) based on the outside luminance information extracted from the environment information CE. When the content CT is not displayed on the image display unit 93, the occupant PA can see the scenery outside the vehicle through the transparent screen SCR from the gap between the driver seat and the passenger seat.
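The mapping from the outside luminance to the display luminance and to the transmittance of the light control film is not specified in the disclosure; the sketch below is a plausible monotone mapping with made-up breakpoints.

```python
def display_settings(exterior_lux):
    """Return (display_luminance_ratio, film_transmittance_ratio) for a given outside illuminance.

    Bright surroundings: raise the display luminance and darken the light control film so that
    the content CT stays visible; dark surroundings: the opposite. Breakpoints are assumptions.
    """
    if exterior_lux > 10000.0:                        # direct sunlight, e.g. glare of the western sun
        return 1.0, 0.2
    if exterior_lux > 1000.0:                         # overcast daylight
        return 0.7, 0.5
    return 0.4, 0.9                                   # dusk, tunnel, night

print(display_settings(25000.0))                      # (1.0, 0.2)
```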
Fig. 6 is a diagram showing another configuration example of the image display unit 93. Fig. 7 is a diagram showing a display example of the image IM provided as the content CT.
In the example of fig. 6 and 7, the image display unit 93 is mounted on the side of the occupant PA in the rear seat. The side glass of the vehicle 1 is replaced by a transparent screen SCR. Instead of using the side glass as the transparent screen SCR, a film-like transparent screen may be attached to the surface of the side glass or sandwiched inside the side glass. When the image IM is not displayed, the occupant PA can see the outside scenery through the transparent screen SCR.
In the examples of fig. 6 and 7, the side glass of the rear seat functions as the transparent screen SCR; however, the part functioning as the transparent screen SCR is not limited to the side glass of the rear seat. The windshield, the side glass of the front seats, or the rear glass may also function as the transparent screen SCR.
[5. Content display control based on the monitoring results of the inside and outside of the vehicle]
[5-1. Monitoring targets]
Fig. 8 is a diagram listing states of the inside and outside of the vehicle that can be detected by the sensor.
Ways of spending time in the vehicle include "A1: Movie, music, or game", "A2: Watch news or read mail", "A3: View a VR image", "A4: View a map or travel information", "A5: Voice chat (VC)", "A6: View the external scenery", "A7: Talk with another occupant", "A8: Talk on a smartphone", and "A9: Sleep or relax".
A1 to A5 correspond to a state of "viewing something alone", A4 to A7 to a state of "viewing with another person", A6 to A7 to a state of "not particularly viewing", A8 to a state of "using a device carried in the hand", and A9 to a state of "sleeping". "Viewing something alone" and "viewing with another person" are states in which the image display unit 93 is being used, while "not particularly viewing", "using a device carried in the hand", and "sleeping" are states in which the image display unit 93 is not being used. The DMS 30 detects, as the viewing state, in which of these modes the occupant PA is using the image display unit 93.
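The grouping of activities A1 to A9 into viewing states can be captured in a small lookup table, as in the sketch below; the group names are hypothetical, and an activity may belong to more than one group exactly as listed above.

```python
# Groupings taken from the description above; an activity code can belong to more than one group.
GROUPS = {
    "viewing_alone":       {"A1", "A2", "A3", "A4", "A5"},
    "viewing_with_others": {"A4", "A5", "A6", "A7"},
    "not_viewing":         {"A6", "A7"},
    "handheld_device":     {"A8"},
    "sleeping":            {"A9"},
}
SCREEN_IN_USE = {"viewing_alone", "viewing_with_others"}   # states in which the image display unit 93 is used

def viewing_modes(activity):
    """Return the groups an activity code belongs to and whether the image display unit 93 is in use."""
    modes = [g for g, codes in GROUPS.items() if activity in codes]
    return modes, any(m in SCREEN_IN_USE for m in modes)

print(viewing_modes("A2"))   # (['viewing_alone'], True)
print(viewing_modes("A9"))   # (['sleeping'], False)
```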
The states of the vehicle 1 include "during manual driving", "during autonomous driving", and "during takeover control". "During autonomous driving" and "during manual driving" are states in which driving is performed autonomously or manually, respectively. "During takeover control" is a state in which authority is being transferred from autonomous driving, or a request to hand over to manual driving is being issued. "During autonomous driving" is a state in which the content CT can be enjoyed freely, whereas "during manual driving" and "during takeover control" are states in which the provision of the content CT should be restricted so that the driver can concentrate on driving. "During takeover control" can also be said to be a state in which it is necessary to notify the occupants PA other than the driver that some abnormality has occurred in the vehicle 1. The vehicle sensor 27 detects which of these three states the vehicle 1 is currently in as the driving state DS.
Examples of the external environment include "danger in the surroundings", "glare of the western sun", and "clear sky". "Danger in the surroundings" is a state in which there is an event of which the occupant PA should be notified, and "glare of the western sun" and "clear sky" are states in which luminance control or the like according to the surrounding environment is required. The recognition unit 73 recognizes the external environment based on the sensor information of the external recognition sensor 25.
The vehicle control system 11 appropriately controls the display of the content CT in accordance with the viewing state of the occupant PA, the driving state DS, events outside the vehicle, and the surrounding environment.
[5-2. Content display control according to the information transmission level]
Fig. 9 is a diagram showing an example of a display control method of the content CT.
The HMI 31 performs display control of the entertainment content CT based on the monitoring result of the occupant PA. For example, the HMI 31 includes an information transmission method control unit 91, a display method control unit 92, and an image display unit 93.
The information transmission method control unit 91 determines the information transmission level of an event occurring outside the vehicle based on the monitoring information acquired from the monitoring unit MU. The information transmission level indicates the urgency or importance of transmitting information about the event occurring outside the vehicle. The information transmission method control unit 91 determines a notification method for the event based on the information transmission level. The notification method determined here is, in a broad sense, a method of transmitting information about the event. Based on the monitoring information, the display method control unit 92 then determines a detailed display method (display position, display timing, and the like) for various types of information, including the content CT, in accordance with the notification method.
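Read as a data flow, the split between the two units could look like the sketch below; the function names and the tiny two-entry table exist only for illustration (the full correspondence is the subject of fig. 10).

```python
# Stage 1 (information transmission method control unit 91): level -> broad notification method.
def decide_notification(level, table):
    return table[level]

# Stage 2 (display method control unit 92): broad method -> concrete display positions for content and warning.
def plan_display(notification, current_content_area):
    if notification["stop_content"]:
        return {"content": None, "warning": notification["warning_area"]}
    if notification["warning_area"] == current_content_area:
        # Move the content CT out of the area that must stay clear and show the warning.
        return {"content": "opposite side", "warning": notification["warning_area"]}
    return {"content": current_content_area, "warning": notification["warning_area"]}

# Hypothetical two-entry table just to exercise the flow; fig. 10 defines the actual five levels.
table = {
    "S": {"stop_content": True,  "warning_area": "front"},
    "A": {"stop_content": False, "warning_area": "right"},
}
print(plan_display(decide_notification("A", table), current_content_area="right"))
# {'content': 'opposite side', 'warning': 'right'}
```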
Fig. 10 is a table showing an example of correspondence among information transmission levels, surrounding conditions, and display methods.
In the example of fig. 10, five levels, from level S to level D, are defined as information transmission levels. Level S corresponds to the case where an accident has occurred ahead of the vehicle. At level S, viewing of the entertainment content CT that has been displayed is stopped, a forward view of the outside of the vehicle is ensured, and warning information is displayed in the front display area.
Level A corresponds to a case where an accident has occurred to the left or right of the vehicle 1. At level A, the necessary outside view on the side face side (the side on which the accident has occurred) is ensured. The entertainment content CT displayed in the left or right display area on the side where the accident has occurred is moved to the display area on the side where no accident has occurred. Warning information is then displayed in the display area on the side where the accident has occurred.
Level B corresponds to a case where another vehicle is present at a short distance. At level B, a view of the outside of the vehicle in the necessary direction (the side on which the other vehicle is present) is ensured. The entertainment content CT is moved to a position where the other vehicle is not obscured (for example, a position that does not overlap the other vehicle). Warning information is then displayed in the display area on the side where the other vehicle is present.
Level C corresponds to a case where another vehicle is present at a long distance. At level C, a view of the outside of the vehicle in the necessary direction (the side on which the other vehicle is present) is ensured. Since the risk of collision with the other vehicle is small, the view to be ensured is smaller than at levels S, A, and B, and no warning information is displayed.
Level D corresponds to the case where there is no other vehicle in the surroundings. At level D, no event presumed to pose a hazard, such as a collision, exists, so viewing of the entertainment content CT continues.
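The correspondence shown in fig. 10 amounts to a lookup from an information transmission level to a display method. As a minimal sketch only, not part of the specification, it could be held as a table such as the following; the key names, field names, and values are hypothetical, and the specification states only that such a table is stored in the storage unit 28.

```python
# Illustrative lookup of the level definitions in fig. 10 (hypothetical names and values).
LEVEL_TABLE = {
    "S": {"situation": "accident ahead",             "stop_content": True,  "warning_side": "front"},
    "A": {"situation": "accident to the left/right", "stop_content": False, "warning_side": "accident side"},
    "B": {"situation": "other vehicle, short range", "stop_content": False, "warning_side": "vehicle side"},
    "C": {"situation": "other vehicle, long range",  "stop_content": False, "warning_side": None},
    "D": {"situation": "no other vehicle around",    "stop_content": False, "warning_side": None},
}

def display_method_for(level):
    """Return the display actions defined for an information transmission level."""
    return LEVEL_TABLE[level]

print(display_method_for("A"))
```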
[5-3. Process flow ]
Fig. 11 is a flowchart showing an example of display control of content CT based on monitoring results of the inside and outside of the vehicle. Fig. 12 to 16 are diagrams showing details of the processing.
As shown in fig. 11, the monitoring unit MU acquires information about the inside of the vehicle, information about the surroundings of the vehicle, and the driving condition DS based on the sensor information (steps SA1 to SA3). The monitoring unit MU supplies the acquired information to the HMI 31 as monitoring information.
For example, information about the inside of the vehicle is acquired according to the flow of fig. 12. First, the monitoring unit MU checks the number of occupants from the captured image of the camera CM (step SB1). The monitoring unit MU then checks the activity level of each occupant PA (step SB2) and determines whether all the occupants PA are sleeping or one or more of them are awake. If one or more occupants PA are awake, the monitoring unit MU checks the action content of each awake occupant PA, that is, whether the occupant PA is watching the content CT (step SB3).
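The flow of fig. 12 can be pictured as three successive checks. The following sketch is illustrative only; the Occupant fields and the function name are assumptions, not identifiers taken from the specification.

```python
from dataclasses import dataclass

@dataclass
class Occupant:
    awake: bool                 # activity level (step SB2)
    looking_at_content: bool    # action content (step SB3)

def monitor_inside(occupants):
    count = len(occupants)                                  # step SB1: number of occupants
    awake = [o for o in occupants if o.awake]               # step SB2: asleep or awake
    watching = [o for o in awake if o.looking_at_content]   # step SB3: watching the content CT?
    return {"count": count, "awake": len(awake), "watching": len(watching)}

print(monitor_inside([Occupant(True, True), Occupant(False, False)]))
```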
For example, information about the surroundings of the vehicle is acquired according to the flow of fig. 13. First, the monitoring unit MU checks the weather based on the sensor information acquired from the external recognition sensor 25 and determines whether the view is good (step SC1). If the view is good, the monitoring unit MU obtains the direction of the sun using the own position detected by SLAM and the time information (step SC2). The monitoring unit MU then acquires surrounding accident information and surrounding vehicle information from the sensor information acquired from the external recognition sensor 25 (steps SC3 to SC4).
Returning to fig. 11, the information transmission method control unit 91 detects the situation around the vehicle 1 based on the monitoring information. The storage unit 28 stores a table in which definition information of the information transmission level is defined for each situation; in this table, the correspondence among information transmission levels, surrounding situations, and display methods is defined as the definition information. The information transmission method control unit 91 identifies the information transmission level corresponding to the situation based on this table (step SA4). The information transmission method control unit 91 selects the information transmission level corresponding to the surrounding situation and determines the notification method of the event based on the selected information transmission level (steps SA5 to SA6).
The information transmission level is determined, for example, according to the flow of fig. 14. First, the information transmission method control unit 91 acquires accident information and surrounding vehicle information from the monitoring unit MU (steps SD1 to SD2). Based on the acquired information, the information transmission method control unit 91 determines the urgency of information transmission regarding the event that has occurred outside the vehicle (step SD3).
If it determines that information transmission is urgent, the information transmission method control unit 91 determines, from the position at which the event has occurred, the direction in which information related to the event (for example, a threat to the vehicle 1) needs to be displayed (step SD4). If the threat is in front, the information transmission method control unit 91 selects level S as the information transmission level; if the threat is in a direction other than the front, it selects level A.
If it determines that information transmission is not urgent, the information transmission method control unit 91 determines the degree of risk posed by nearby vehicles (step SD5). For example, if another vehicle is present in the vicinity, it determines the direction in which information about the other vehicle needs to be displayed (step SD6). If another vehicle is present to the left or right, the information transmission method control unit 91 selects level B as the information transmission level; if another vehicle is present behind, it selects level C; if no other vehicle is present in the vicinity, it selects level D.
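Read as a decision procedure, the flow of fig. 14 maps the presence and position of a threat or nearby vehicle to one of the five levels. The sketch below is a simplified illustration under the assumption that directions are already given as strings; it is not the actual implementation.

```python
def select_level(urgent_threat_direction=None, nearby_vehicle_direction=None):
    # Steps SD3/SD4: urgent information transmission is required.
    if urgent_threat_direction is not None:
        return "S" if urgent_threat_direction == "front" else "A"
    # Steps SD5/SD6: no urgency; judge the risk posed by nearby vehicles.
    if nearby_vehicle_direction in ("left", "right"):
        return "B"
    if nearby_vehicle_direction == "behind":
        return "C"
    return "D"

assert select_level("front") == "S"
assert select_level(None, "right") == "B"
assert select_level() == "D"
```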
Returning to fig. 11, the display method control unit 92 detects the viewing state information VS of the occupant PA based on the monitoring information (steps SA7 to SA8). The viewing state information VS includes information related to the visual field of the occupant PA (visual field information) and information related to the state of the occupant PA (state information). The state information includes information on whether the occupant PA is awake and whether the occupant PA is looking at the content CT.
For example, the viewing state of the occupant PA is detected according to the flow of fig. 15. First, the display method control unit 92 selects all the occupants PA detected based on the monitoring information as detection targets (step SE1). The display method control unit 92 determines the state of each occupant PA based on the monitoring information and judges whether each occupant PA is looking at the content CT (step SE2). For example, if the line of sight is directed to the display area, it is determined that the occupant PA is looking at the content CT; if not, it is determined that the occupant PA is not looking at the content CT.
The display method control unit 92 then narrows the detection targets to the occupants PA who are looking at the content CT (step SE3) and acquires the visual field information of each of these occupants PA based on the monitoring information (step SE4).
The display method control unit 92 determines the display area of the content CT from the visual field information of the occupant PA (step SA9) and determines the display timing of the event information and the content CT based on the state information of the occupant PA (step SA10). The display method is determined by the display area and the display timing. The display method control unit 92 then displays the content CT according to the determined display method (step SA11).
For example, the display method is determined according to the flow of fig. 16. First, the display method control unit 92 calculates a display area in which a luminance distribution exceeding the allowable level is generated by external light (for example, sunlight). The allowable level is preset based on the visibility of the content CT and the like; if the visibility falls below the allowable limit because of the luminance distribution, the allowable level is determined to have been exceeded. For example, the display method control unit 92 calculates a display area in which the sun is reflected based on the direction of the sun, and recognizes the calculated display area as an external light influence area (step SF1).
When there are a plurality of occupants PA looking at the content CT, the display method control unit 92 calculates information related to the common field of view of the plurality of occupants PA (common view information) (step SF2). The common field of view is the portion of the transparent screen SCR where the fields of view of the plurality of occupants PA overlap one another.
If there is a common field of view, the display method control unit 92 switches the display area so that it corresponds to the common field of view (step SF3). For example, the display method control unit 92 sets the display area of the content CT in an area that is not affected by external light and that includes the common field of view. If there is no common field of view, the display area is switched so that it corresponds to the field of view of the specific occupant PA with a high level of gaze (step SF4). For example, the display method control unit 92 sets the display area of the content CT in an area that is not affected by external light and that includes the field of view of the specific occupant PA. The display method control unit 92 then displays the content CT according to the defined level (step SF5).
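The selection in steps SF1 to SF4 can be illustrated with a simple rectangle model: exclude candidate display areas affected by external light, then prefer a candidate that covers the common field of view (or the field of view of the specific occupant). The rectangle representation and helper names below are assumptions made only for illustration.

```python
def overlaps(a, b):
    # Axis-aligned rectangles given as (x0, y0, x1, y1).
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def choose_display_area(candidates, glare_areas, common_view=None, single_view=None):
    target = common_view if common_view is not None else single_view   # SF3 / SF4
    for area in candidates:
        if any(overlaps(area, g) for g in glare_areas):                 # SF1: external light influence area
            continue
        if target is None or overlaps(area, target):
            return area
    return None

# Two candidate areas; the left one is washed out by reflected sunlight.
candidates = [(0, 0, 4, 3), (4, 0, 8, 3)]
print(choose_display_area(candidates, glare_areas=[(0, 0, 2, 2)], common_view=(5, 1, 7, 2)))
```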
[6. Display control example ]
[6-1. Notification of event ]
Hereinafter, an example of display control of the HMI 31 will be described with reference to fig. 17 to 28. Fig. 17 is a diagram showing a notification example of an event.
When an event of which the occupant PA should be notified occurs outside the vehicle, the HMI 31 performs the event notification NT, at the timing at which the event occurs outside the vehicle, in the display area of the portion overlapping the view VA of the occupant PA. Conceivable events of which the occupant PA should be notified include an accident that threatens the safety of the own vehicle and another vehicle that is driving dangerously.
For example, the HMI 31 displays the event information ET on the image display unit 93 to execute the event notification NT. The event information ET includes, for example, information such as the event type and the event occurrence position. In the example of fig. 17, at the timing at which another vehicle has approached the own vehicle to within a predetermined distance, the message "emergency" is displayed at the lower left of the view VA of the occupant PA, and an arrow indicating the direction from which the other vehicle is approaching is displayed at the lower right of the view VA. "Emergency" indicates the type of the event, and the arrow indicates the position at which the event is occurring.
Within the entire display area of the image display unit 93 (displayable area DA), the event notification NT is performed in the display area of the portion overlapping the view VA of the occupant PA. Therefore, the occupant PA can obtain the event information ET without moving the view VA while viewing the content CT.
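Placing the notification in "the display area of the portion overlapping the view VA" reduces, in geometric terms, to intersecting the displayable area DA with the view VA. A minimal sketch follows, again modeling both as rectangles; the rectangle model and function names are assumptions.

```python
def intersection(a, b):
    # Rectangles as (x0, y0, x1, y1); returns None if they do not overlap.
    x0, y0 = max(a[0], b[0]), max(a[1], b[1])
    x1, y1 = min(a[2], b[2]), min(a[3], b[3])
    return (x0, y0, x1, y1) if x0 < x1 and y0 < y1 else None

def notification_area(displayable_da, view_va):
    # The event notification NT is drawn only inside this intersection,
    # so the occupant can read it without moving the view VA.
    return intersection(displayable_da, view_va)

print(notification_area((0, 0, 10, 4), (6, 1, 12, 3)))  # -> (6, 1, 10, 3)
```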
[6-2. Transparent display of events ]
Fig. 18 is a diagram showing a transparent display example of an event.
The HMI 31 brings the display area of the portion overlapping the view VA of the occupant PA into a transparent state at the timing at which an event occurring outside the vehicle is confirmed within the view VA of the occupant PA. With this configuration, the situation of the event can be visually recognized through the display area in the transparent state (transparent area TA) without moving the view VA.
For example, the HMI 31 moves the display position of the content CT displayed in the display area of the portion overlapping the view VA, or stops the display in that display area. As a result, the HMI 31 brings the display area of the portion overlapping the view VA into a transparent state.
In the example of fig. 18, the display area of the content CT (content display area CDA) is moved from a position at the lower right of the screen, overlapping the view VA, to a position at the upper left of the screen, not overlapping the view VA. The lower right area of the screen, in which the content CT had been displayed, becomes the transparent area TA, and the occupant PA can visually recognize the event outside the vehicle through the transparent area TA. With this configuration, the continuity of the content CT before and after the event is displayed is maintained, so the occupant PA viewing the content CT is unlikely to feel discomfort. That is, since the occupant PA can visually recognize the event outside the vehicle and grasp the situation, the occupant PA can continue viewing the content CT if it can be determined that there is no significant problem. The vehicle control system 11 may have means by which the user notifies the HMI 31 of the determination as to whether there is a problem.
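The behavior of fig. 18 can be sketched as follows: when the content display area CDA overlaps the view VA, move the content to a candidate area outside the view (the vacated area becomes the transparent area TA), and stop the display only if no such area exists. The rectangle model and function names are assumptions for illustration.

```python
def _overlaps(a, b):
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def make_view_transparent(cda, view_va, candidate_areas):
    if not _overlaps(cda, view_va):
        return cda, None                  # view already clear; keep the content where it is
    for area in candidate_areas:
        if not _overlaps(area, view_va):
            return area, cda              # move the content; the old area becomes the transparent TA
    return None, cda                      # no free area: stop the display of that area instead

# Content at the lower right overlapping the view -> moved to the upper left, as in fig. 18.
print(make_view_transparent((6, 0, 10, 2), (5, 0, 10, 3), [(0, 2, 4, 4), (6, 2, 10, 4)]))
```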
[6-3. Control of content display position based on luminance distribution ]
Fig. 19 and 20 are diagrams showing control examples of content display positions based on luminance distribution.
When a luminance distribution exceeding the allowable level is generated in the content display area CDA by external light, the HMI 31 moves the display position of the content CT to another display area in which the luminance distribution satisfies the allowable level. For example, when the ratio of the luminance of the brightest portion to that of the darkest portion of the content display area CDA is greater than a preset threshold, the HMI 31 determines that the luminance distribution exceeds the allowable level.
In the example of fig. 19, the sun SN is reflected in the content display area CDA, so a luminance distribution exceeding the allowable level is generated in the content display area CDA. The HMI 31 therefore moves the content display area CDA so as to avoid the reflection of the sun SN. As a result, the decrease in the visibility of the content CT caused by the external light is suppressed.
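The allowable-level test described above compares the brightest and darkest portions of the content display area CDA against a preset threshold. A minimal sketch follows; the threshold value and luminance samples are purely illustrative assumptions.

```python
def luminance_exceeds_allowable(luminance_samples, threshold=4.0):
    brightest = max(luminance_samples)
    darkest = max(min(luminance_samples), 1e-6)   # guard against division by zero
    return brightest / darkest > threshold        # True -> move the content display area CDA

print(luminance_exceeds_allowable([120.0, 480.0, 900.0]))  # True: the sun SN is reflected in the area
print(luminance_exceeds_allowable([200.0, 260.0, 300.0]))  # False: roughly even illumination
```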
When the display position of the content CT is moved to the other display area, the HMI 31 displays guidance information GI for guiding the line of sight of the occupant PA to the other display area. In the example of fig. 19, an arrow indicating the destination of the content display area CDA is displayed as the guidance information GI at the lower left of the content display area CDA, together with the message "the display position is to be changed". The guidance information GI lets the occupant PA know in advance that the content display area CDA will move and in which direction. Therefore, the display position of the content CT can be moved without causing discomfort to the occupant PA.
In the example of fig. 20, external light is directly incident on a portion of the content display area CDA, so a luminance distribution exceeding the allowable level is generated in the content display area CDA. When the direct incidence area DL of the external light is significantly brighter than the other (shadowed) areas, the luminance distribution is not sufficiently eliminated even if the transparency and haze of the transparent screen SCR are adjusted. Therefore, after giving notice by means of the guidance information GI, the HMI 31 moves the content display area CDA to a position that avoids the direct incidence area DL.
Fig. 19 and 20 show examples in which a luminance distribution is generated in the content display area CDA by sunlight; however, a luminance distribution may also be generated by external light other than sunlight. For example, a luminance distribution may be generated in the content display area CDA by a streetlight, town lights in a night scene, or the like. In such cases as well, the HMI 31 moves the content display area CDA to an area with a small luminance distribution.
[6-4. Control of event notification method based on information Transmission level ]
Fig. 21 is a diagram showing a control example of the notification method of an event based on the information transmission level. Note that this example assumes that the occupant PA is using the image display unit 93 at the vehicle center shown in fig. 4.
The HMI 31 determines the notification method of an event based on the urgency or importance of information transmission regarding the event that has occurred outside the vehicle. For example, in the upper example of fig. 21, there is no other vehicle OT around the own vehicle OW. Accordingly, the HMI 31 determines that no event of which the occupant PA should be notified has occurred, and does not provide notification of an event.
In the middle example of fig. 21, another vehicle OT is present at a position slightly apart from the right rear of the own vehicle OW. The HMI 31 determines that an event of low urgency has occurred on the right rear side of the own vehicle OW and lights up the right end of the content CT to match the direction in which the event has occurred. As a result, the event notification NT is performed.
The degree of urgency is indicated by the intensity of the light. The HMI 31 changes the intensity of the light according to the distance between the own vehicle OW and the other vehicle OT: the shorter the distance, the higher the urgency and the higher the intensity of the light. In the middle example of fig. 21, the distance between the own vehicle OW and the other vehicle OT is relatively large, so the intensity of the light illuminating the right end of the content CT is relatively low.
In the lower example of fig. 21, the other vehicle OT is running close to the right side of the own vehicle OW. The HMI 31 determines that an event of high urgency has occurred on the right side of the own vehicle OW and performs the event notification NT by strongly illuminating the right end of the content CT.
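The rule illustrated in fig. 21 ties the intensity of the edge lighting to the distance to the other vehicle OT. The distance bands and intensity values in the following sketch are assumptions chosen only to illustrate the monotonic relationship (shorter distance, stronger light).

```python
def edge_light_intensity(distance_m):
    if distance_m is None:      # no other vehicle around: no event notification NT
        return 0.0
    if distance_m < 5.0:        # running right beside the own vehicle OW: strong light
        return 1.0
    if distance_m < 20.0:       # slightly apart at the right rear: weak light
        return 0.4
    return 0.0                  # far enough away that no edge lighting is needed

print(edge_light_intensity(3.0), edge_light_intensity(12.0), edge_light_intensity(None))
```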
[6-5. Control of content display position based on event occurrence position ]
Fig. 22 is a diagram showing a control example of the content display position based on the occurrence position of an event. Note that this example assumes that the occupant PA is using the image display unit 93 at the vehicle center shown in fig. 4.
The HMI 31 controls the display position of the content CT based on the occurrence position of the event. For example, in the upper example of fig. 22, there is no other vehicle OT around the own vehicle OW. Therefore, the HMI 31 determines that no event of which the occupant PA should be notified has occurred, and does not change the display position of the content CT.
In the middle example of fig. 22, an accident (event) has occurred in front of the own vehicle OW. The HMI 31 reduces the display size of the content CT so as to secure a forward view, and places the reduced content CT at a position that does not overlap the line of sight of the occupant PA (for example, the lower end of the transparent screen SCR) so as not to obstruct the occupant PA from viewing the accident scene AS.
In the lower example of fig. 22, an accident has occurred on the right side of the own vehicle OW. The HMI 31 reduces the display size of the content CT so as to secure a view on the right side, and places the reduced content CT at a position that does not overlap the line of sight of the occupant PA (for example, the left end of the transparent screen SCR) so as not to obstruct the occupant PA from viewing the accident scene AS.
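The behavior of fig. 22 can be summarized as: shrink the content CT and anchor it on the side opposite the event so that the line of sight to the accident scene AS stays clear. The direction names, scale factor, and anchor positions in the following sketch are assumptions made only for illustration.

```python
def relocate_content(event_direction=None):
    if event_direction is None:
        return {"scale": 1.0, "anchor": "center"}             # no event: leave the content as it is
    opposite = {"front": "bottom", "right": "left", "left": "right"}
    return {"scale": 0.4, "anchor": opposite.get(event_direction, "bottom")}

print(relocate_content("right"))   # accident on the right -> small content at the left edge
print(relocate_content("front"))   # accident ahead -> small content at the lower edge
```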
[6-6. Control of content display position based on occupant's viewing state ]
Fig. 23 and 24 are diagrams showing control examples of the content display position based on the viewing state of the occupant PA. These examples assume that the occupant PA is using the image display unit 93 at the vehicle side portion shown in fig. 6.
In the example of fig. 23, the occupant PA is leaning back and viewing the content CT. The HMI 31 detects from the viewing state information VS that the occupant PA is looking at the content CT, and displays the content CT on the entire transparent screen SCR so that the occupant PA can view the content CT on a large screen.
In the example of fig. 24, the occupant PA has brought the face close to the transparent screen SCR and is looking at the landscape GA outside the vehicle. The HMI 31 detects, based on the viewing state information VS, that the occupant PA wants to view the landscape GA outside the vehicle. The HMI 31 reduces the display size of the content CT so as to secure a view on that side, and places the reduced content CT at a position that does not overlap the line of sight of the occupant PA (for example, a corner of the transparent screen SCR) so as not to obstruct the occupant PA from viewing the landscape GA.
[6-7. Display control based on the viewing states of a plurality of occupants ]
Fig. 25 to 28 are diagrams showing examples of display control based on the viewing states of a plurality of occupants PA. In fig. 25 to 28, individual occupants PA are distinguished, as needed, by numerals appended after the symbol, and the view VA of each occupant PA is distinguished in a similar manner.
The HMI 31 detects the view VA of each of the one or more occupants PA gazing at the content CT based on the viewing state information VS, and displays the content CT, for each detected view VA, in the display area of the portion overlapping that view VA.
In the examples of fig. 25 and 26, the occupant PA-2 is looking at the landscape outside the vehicle. Accordingly, the HMI 31 selectively displays the content CT in the display area of the portion overlapping the view VA-1 of the occupant PA-1, and makes the display area of the portion overlapping the view VA-2 of the occupant PA-2 transparent so that the occupant PA-2 can see the outside of the vehicle through it.
When it is detected, based on the monitoring result in the vehicle, that an occupant PA has started driving, the HMI 31 stops displaying the entertainment content CT at which that occupant PA is looking. With this configuration, the occupant PA can concentrate on driving, so driving safety is improved. Note that the fact that manual driving has started is detected based on the driving condition DS, and which occupant PA has started driving is determined by image analysis of images from a camera mounted in the vehicle.
Fig. 27 and 28 are diagrams showing examples of display control based on the common view CVA of the plurality of occupants PA.
When a common view CVA exists among the views VA of the plurality of occupants PA, the HMI 31 performs display control of the event information ET and the like based on the position of the common view CVA.
For example, in fig. 27, a plurality of occupants PA are viewing the same content CT, and the common view CVA lies on the content CT. When the approach of another vehicle OT (an event) is detected in this situation, the HMI 31 performs the notification NT of the approach of the other vehicle OT in the display area of the portion overlapping the common view CVA. When the other vehicle OT reaches a position overlapping the common view CVA, the HMI 31 moves the content display area CDA to a position not overlapping the common view CVA and brings the display area of the portion overlapping the common view CVA into a transparent state.
In the example of fig. 28, the content display area CDA is moved so as to avoid the reflection of the sun SN. When the display position of the content CT is moved to another display area, the HMI 31 displays the guidance information GI in the display area of the portion overlapping the common view CVA. This allows all the occupants PA viewing the common content CT to recognize that the content display area CDA will move and in which direction.
[7. Effect ]
The vehicle control system 11 includes an HMI 31. The HMI 31 performs display control of the entertainment content CT based on the monitoring result of the occupant PA. In the vehicle control method of the present disclosure, the processing of the vehicle control system 11 is performed by a computer. The program of the present disclosure causes a computer to realize the processing of the vehicle control system 11.
According to this configuration, display control that takes the viewing state of the occupant PA into consideration can be performed. Therefore, the convenience for the occupant PA in enjoying the content CT is improved.
The content CT is displayed by the image display unit 93, which can be switched between a display state and a transparent state. The HMI 31 adjusts the display brightness, or the transparency or haze of the display area of the content CT (transparent screen SCR), based on the brightness information outside the vehicle.
According to this configuration, the image display unit 93 can serve both for content display and as a window. AR display can also be performed by superimposing the image IM on the landscape outside the vehicle. In addition, by adjusting the display brightness and the transparency or haze of the display area of the content CT according to the brightness outside the vehicle, the visibility of the content CT is improved.
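One way to picture the adjustment described above is a simple mapping from the brightness outside the vehicle to the display brightness, transparency, and haze of the transparent screen SCR. The threshold and output values in the following sketch are assumptions, not figures from the specification, and the direction of each adjustment is only one plausible choice.

```python
def adjust_screen(outside_luminance_cd_m2):
    bright_outside = outside_luminance_cd_m2 > 1000.0   # hypothetical day/night threshold
    return {
        "display_brightness": 0.9 if bright_outside else 0.4,   # brighter image against daylight
        "transparency":       0.2 if bright_outside else 0.6,   # less see-through when used as a screen
        "haze":               0.7 if bright_outside else 0.1,   # more scattering to keep the image visible
    }

print(adjust_screen(5000.0))   # daytime
print(adjust_screen(50.0))     # night
```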
The HMI 31 determines a notification method of an event based on the urgency or importance of information transmission regarding an event that has occurred outside the vehicle.
According to this configuration, the occupant PA can recognize the presence of an event occurring outside the vehicle and the urgency or importance of the event while enjoying the content CT.
The vehicle control system 11 includes a monitoring unit MU. The monitoring unit MU records the display position of the content CT and the gaze condition of the occupant PA with respect to the content CT.
According to this configuration, when an accident occurs, it is easier to investigate the cause of the accident.
Note that the effects described herein are merely examples and are not limiting, and other effects may also be achieved.
[ Appendix ]
Note that the present technology may also have the following configuration.
(1) A vehicle control system includes a human-machine interface that performs display control of entertainment content based on a result of monitoring an occupant.
(2) The vehicle control system according to (1),
Wherein the human-machine interface determines a notification method of an event that has occurred outside the vehicle based on the urgency or importance of information transmission regarding the event.
(3) The vehicle control system according to (1) or (2),
Wherein the human-machine interface provides notification of an event in a display area of a portion overlapping a field of view of the occupant at a timing when the event occurs outside the vehicle.
(4) The vehicle control system according to any one of (1) to (3),
Wherein the human-machine interface brings a display area of a portion overlapping with the view of the occupant into a transparent state at a timing when occurrence of an event outside the vehicle is confirmed in the view of the occupant.
(5) The vehicle control system according to (4),
Wherein the human-machine interface brings the display area into the transparent state by moving the display position of the content displayed in the display area or by stopping the display of the display area.
(6) The vehicle control system according to any one of (1) to (5),
Wherein when a luminance distribution exceeding an allowable level is generated in a display area of the content by external light, the human-machine interface moves a display position of the content to a second display area where the luminance distribution satisfies the allowable level.
(7) The vehicle control system according to (6),
Wherein the human-machine interface displays guide information for guiding the line of sight of the occupant to the second display area when the display position of the content is moved to the second display area.
(8) The vehicle control system according to any one of (1) to (7),
Wherein the human-machine interface stops displaying the content at which an occupant who has started driving is gazing.
(9) The vehicle control system according to any one of (1) to (8), further comprising:
A monitoring unit that records the display position of the content and the gaze condition of the occupant with respect to the content.
(10) The vehicle control system according to any one of (1) to (9),
Wherein the human-machine interface adjusts the brightness of the display or the transparency or haze of the display area of the content based on the brightness information outside the vehicle.
(11) A vehicle control method executed by a computer, the vehicle control method comprising a step of performing display control of entertainment content based on a monitoring result of an occupant.
(12) A program for causing a computer to realize display control of entertainment content based on a monitoring result of an occupant.
List of reference numerals
11 Vehicle control system
31 Human-machine interface (HMI)
93 Image display unit
CT Content
GI Guidance information
MU Monitoring unit
PA Occupant
VA View

Claims (12)

1. A vehicle control system includes a human-machine interface that performs display control of entertainment content based on a result of monitoring an occupant.
2. The vehicle control system according to claim 1,
Wherein the human-machine interface determines a notification method of an event that has occurred outside the vehicle based on the urgency or importance of information transmission regarding the event.
3. The vehicle control system according to claim 1,
Wherein the human-machine interface provides notification of an event in a display area of a portion overlapping a field of view of the occupant at a timing when the event occurs outside the vehicle.
4. The vehicle control system according to claim 1,
Wherein the human-machine interface brings a display area of a portion overlapping with the view of the occupant into a transparent state at a timing when occurrence of an event outside the vehicle is confirmed in the view of the occupant.
5. The vehicle control system according to claim 4,
Wherein the human-machine interface brings the display area into the transparent state by moving the display position of the content displayed in the display area or by stopping the display of the display area.
6. The vehicle control system according to claim 1,
Wherein when a luminance distribution exceeding an allowable level is generated in a display area of the content by external light, the human-machine interface moves a display position of the content to a second display area where the luminance distribution satisfies the allowable level.
7. The vehicle control system according to claim 6,
Wherein the human-machine interface displays guide information for guiding the line of sight of the occupant to the second display area when the display position of the content is moved to the second display area.
8. The vehicle control system according to claim 1,
Wherein the human-machine interface stops displaying the content at which an occupant who has started driving is gazing.
9. The vehicle control system according to claim 1, further comprising:
A monitoring unit that records the display position of the content and the gaze condition of the occupant with respect to the content.
10. The vehicle control system according to claim 1,
Wherein the human-machine interface adjusts the brightness of the display or the transparency or haze of the display area of the content based on the brightness information outside the vehicle.
11. A vehicle control method executed by a computer, the vehicle control method comprising a step of performing display control of entertainment content based on a monitoring result of an occupant.
12. A program for causing a computer to realize display control of entertainment content based on a monitoring result of an occupant.
CN202280060884.2A 2021-09-15 2022-02-16 Vehicle control system, vehicle control method, and program Pending CN117916139A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2021150607 2021-09-15
JP2021-150607 2021-09-15
PCT/JP2022/006036 WO2023042418A1 (en) 2021-09-15 2022-02-16 Vehicle control system, vehicle control method, and program

Publications (1)

Publication Number Publication Date
CN117916139A true CN117916139A (en) 2024-04-19

Family

ID=85602619

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280060884.2A Pending CN117916139A (en) 2021-09-15 2022-02-16 Vehicle control system, vehicle control method, and program

Country Status (2)

Country Link
CN (1) CN117916139A (en)
WO (1) WO2023042418A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017174043A (en) 2016-03-23 2017-09-28 パイオニア株式会社 Display device
US11586223B2 (en) * 2017-06-16 2023-02-21 Honda Motor Co., Ltd. Vehicle and service management device
JP2019133260A (en) * 2018-01-29 2019-08-08 トヨタ自動車株式会社 Vehicle display system
JP2019135614A (en) * 2018-02-05 2019-08-15 トヨタ自動車株式会社 Display system for vehicle
JP2020192877A (en) * 2019-05-28 2020-12-03 本田技研工業株式会社 Control device, control method and program

Also Published As

Publication number Publication date
WO2023042418A1 (en) 2023-03-23


Legal Events

Date Code Title Description
PB01 Publication