US20210024098A1 - Vehicle position sensing system - Google Patents

Vehicle position sensing system

Info

Publication number
US20210024098A1
Authority
US
United States
Prior art keywords
vehicle
regions
region
driving
driving regions
Prior art date
Legal status
Granted
Application number
US16/889,169
Other versions
US11535279B2
Inventor
Yuki Ito
Tsukasa NAKANISHI
Yuta MORIKAWA
Naoki YAMAMURO
Yuki TATSUMOTO
Current Assignee
Toyota Motor Corp
Original Assignee
Toyota Motor Corp
Application filed by Toyota Motor Corp
Assigned to TOYOTA JIDOSHA KABUSHIKI KAISHA. Assignors: TATSUMOTO, Yuki; YAMAMURO, Naoki; MORIKAWA, Yuta; NAKANISHI, Tsukasa; ITO, Yuki
Publication of US20210024098A1
Application granted
Publication of US11535279B2
Legal status: Active

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60K: ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K 35/00: Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K 35/20: Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K 35/28: Output arrangements characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    • B60K 35/29: Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
    • B60K 35/85: Arrangements for transferring vehicle- or driver-related data
    • B60K 2360/166: Navigation (type of output information)
    • B60K 2360/172: Driving mode indication
    • B60K 2360/175: Autonomous driving
    • B60K 2360/1876: Displaying information according to relevancy according to vehicle situations
    • B60K 2360/197: Blocking or enabling of input functions
    • B60K 2360/592: Data transfer involving external databases
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 50/082: Selecting or switching between different modes of propelling (interaction between the driver and the control system)
    • B60W 50/14: Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W 2050/146: Display means
    • B60W 60/0018: Planning or execution of driving tasks specially adapted for safety by employing degraded modes, e.g. reducing speed, in response to suboptimal conditions
    • B60W 60/0053: Handover processes from vehicle to occupant
    • B60W 60/0059: Estimation of the risk associated with autonomous or manual driving, e.g. situation too complex, sensor failure or driver incapacity
    • G: PHYSICS
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/3667: Display of a road map (input/output arrangements for on-board computers; route searching and guidance in a road network)
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/0038: Control of position, course, altitude or attitude of land, water, air or space vehicles associated with a remote control arrangement, by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
    • G05D 1/0061: Control of position, course, altitude or attitude with safety arrangements for transition from automatic pilot to manual pilot and vice versa
    • G05D 1/0278: Control of position or course in two dimensions specially adapted to land vehicles, using satellite positioning signals, e.g. GPS

Definitions

  • the present disclosure relates to a vehicle position sensing system.
  • Patent Document 1 discloses an invention relating to remote control for a group of autonomous transport vehicles.
  • in this remote control for a group of autonomous transport vehicles, if an event occurs, the autonomous vehicle notifies a control center of that event and starts interacting with the control center, or the control center takes over the driving of the autonomous vehicle.
  • an object of the present disclosure is to provide a vehicle position sensing system that enables an occupant of a vehicle to perceive regions in which manual driving is needed, and at least one type of region among regions in which automatic driving of the vehicle is possible and regions in which remote operation driving of the vehicle is possible.
  • a vehicle position sensing system of a first aspect includes: a position information acquiring section that acquires position information of a vehicle; a storage section that stores first region information, which expresses at least one type of region among automatic driving regions where automatic driving of the vehicle is possible and remote driving regions where remote operation driving of the vehicle is possible, and second region information that expresses manual driving regions where only manual driving of the vehicle is permitted; a display portion that can be viewed by an occupant of the vehicle; and a control section that, based on the position information, the first region information and the second region information, causes the display portion to display relative positional relationships among the vehicle, the manual driving regions and the at least one type of region.
  • position information of the vehicle is acquired by the position information acquiring section.
  • first region information which expresses at least one type of region among automatic driving regions where automatic driving of the vehicle is possible and remote driving regions where remote operation driving of the vehicle is possible, and second region information that expresses manual driving regions where only manual driving of the vehicle is permitted, are stored in the storage section.
  • the control section controls the display portion that can be seen by an occupant of the vehicle. Due thereto, the relative positional relationships among the vehicle, the manual driving regions, and at least one type of region among the automatic driving regions and the remote driving regions, are displayed on the display portion.
  • the automatic driving regions and the remote driving regions are included in the first region information, and the display portion can display the automatic driving regions and the remote driving regions.
  • the automatic driving regions and the remote driving regions are stored in the storage section as the first region information.
  • the automatic driving regions and the remote driving regions are displayed on the display portion.
  • the relative positional relationships among a position of the vehicle, the manual driving regions and the at least one type of region can be displayed in a planar form on the display portion.
  • the position of the vehicle, the manual driving regions, and at least one type of region among the automatic driving regions and the remote driving regions are displayed in a planar form on the display portion.
  • a route of the vehicle to a destination, the manual driving regions, and at least the one type of region can be displayed on the display portion.
  • the route of the vehicle to the destination, the manual driving regions, and at least one type of region among the automatic driving regions and the remote driving regions are displayed on the display portion.
  • the vehicle has a notification section that, in a state in which the vehicle is positioned in the at least one type of region on the route, issues a warning to an occupant of the vehicle when the vehicle is positioned at a position that is within a predetermined distance from a manual driving region.
  • the notification section issues a warning to the occupant of the vehicle when the vehicle is positioned at a position that is within a predetermined distance from a manual driving region.
  • in a state in which the vehicle is positioned on the route, the control section can display, on a windshield glass of the vehicle, three-dimensional objects that enable an occupant of the vehicle to identify the manual driving regions and the at least one type of region, such that the three-dimensional objects run along the route and overlap a scene that can be seen from the windshield glass.
  • three-dimensional objects that enable an occupant of the vehicle to identify the manual driving regions and at least one type of region among the automatic driving regions and the remote driving regions are displayed on a windshield glass of the vehicle such that the three-dimensional objects run along the route and overlap a landscape that can be seen from the windshield glass.
  • the vehicle position sensing system of the first aspect has the excellent effect that an occupant of a vehicle can perceive regions in which manual driving is needed, and at least one type of region among regions in which automatic driving of the vehicle is possible and regions in which remote operation driving of the vehicle is possible.
  • the vehicle position sensing system of the second aspect has the excellent effect that an occupant of the vehicle can perceive regions where manual driving is needed, regions where automatic driving of the vehicle is possible, and regions where remote operation driving of the vehicle is possible.
  • the vehicle position sensing system of the third aspect has the excellent effect that an occupant of the vehicle can easily understand the relative positional relationships among the position of the vehicle, the manual driving regions, and at least one type of region among the automatic driving regions and the remote driving regions.
  • the vehicle position sensing system of the fourth aspect has the excellent effect that an occupant of the vehicle can perceive the manual driving regions, and at least one type of region among the automatic driving regions and the remote driving regions, on the route of the vehicle to the destination.
  • the vehicle position sensing system of the fifth aspect has the excellent effect that an occupant of the vehicle can, without looking at the display portion, know that the vehicle is approaching a manual driving region.
  • the vehicle position sensing system of the sixth aspect has the excellent effect that an occupant of the vehicle can, without looking away from the vehicle front side, perceive the manual driving regions and at least one type of region among the automatic driving regions and the remote driving regions.
  • FIG. 1 is a schematic drawing that schematically shows the structure of a vehicle position sensing system relating to a first embodiment
  • FIG. 2 is a functional block drawing showing the structure of the vehicle position sensing system relating to the first embodiment
  • FIG. 3 is a block drawing showing hardware structures of a vehicle in the vehicle position sensing system relating to the first embodiment
  • FIG. 4 is a conceptual drawing that schematically shows an example of a display screen of a display portion in the vehicle position sensing system relating to the first embodiment
  • FIG. 5 is a flowchart showing the flow of processings by the vehicle position sensing system relating to the first embodiment
  • FIG. 6 is a flowchart showing the flow of processings by the vehicle position sensing system relating to the first embodiment
  • FIG. 7 is a flowchart showing the flow of processings by the vehicle position sensing system relating to the first embodiment
  • FIG. 8 is a flowchart showing the flow of processings by the vehicle position sensing system relating to the first embodiment.
  • FIG. 9 is a conceptual drawing that schematically shows an example of a display screen of a display portion in a vehicle position sensing system relating to a second embodiment.
  • a vehicle position sensing system 10 relating to a first embodiment is described hereinafter by using FIG. 1 through FIG. 8.
  • the vehicle position sensing system 10 is structured to include a vehicle control device 14 , which is installed in a “vehicle 12 ”, and a server 16 .
  • the vehicle 12 can be operated remotely by a remote operation device 20 that has a remote control device 18 .
  • the vehicle control device 14 , the remote control device 18 and the server 16 are connected via a network N so as to be able to communicate with one another.
  • the vehicle 12 is structured such that automatic driving by the vehicle control device 14 and manual driving that is based on the operations of a driver (vehicle occupant) 22 of the vehicle 12 can be carried out in addition to remote operation driving by the remote operation device 20 .
  • the vehicle control device 14 is structured to include a CPU (Central Processing Unit) 14 A, a ROM (Read Only Memory) 14 B, a RAM (Random Access Memory) 14 C, a storage 14 D, a communication I/F (Inter Face) 14 E and an input/output I/F 14 F.
  • the CPU 14 A, the ROM 14 B, the RAM 14 C, the storage 14 D, the communication I/F 14 E and the input/output I/F 14 F are connected so as to be able to communicate with one another via a bus 14 G.
  • the CPU 14 A is a central computing processing unit, and can execute various types of programs and can control the respective sections of the vehicle 12 . Concretely, the CPU 14 A reads-out programs from the ROM 14 B, and can execute the programs by using the RAM 14 C as a workspace. Due to an execution program that is stored in the ROM 14 B being read-out by and executed by the CPU 14 A, the vehicle control device 14 can exhibit various functions as will be described later.
  • the RAM 14 C can temporarily store programs and data as a workspace.
  • the storage 14 D is structured to include an HDD (Hard Disk Drive) or an SSD (Solid State Drive), and stores various types of programs, including the operating system, and various types of data. As described later, the storage 14 D can store environment information that is needed for automatic driving of the vehicle 12 , and the like.
  • the communication I/F 14 E is an interface that is used in connecting the vehicle control device 14 and the network N, and communication with the remote control device 18 and the server 16 and the like is made possible thereby. Communication standards such as, for example, the Internet, FDDI, Wi-Fi® and the like are used at the interface. Further, the communication I/F 14 E may have a wireless device.
  • the communication I/F 14 E can transmit and receive various information to and from the remote operation device 20 via the network N.
  • the communication I/F 14 E can receive environment information from the server 16 via the network N.
  • the environment information includes weather information such as air temperature, wind speed, amount of precipitation and the like; earthquake information such as magnitude, tsunami warnings and the like; traffic information such as traffic jams, accidents, road construction and the like; and map information and the like. This environment information is stored in the storage 14 D.
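  • as a minimal sketch (not part of the patent disclosure) of how such environment information could be represented on the vehicle control device side, the following Python dataclasses group the weather, earthquake, traffic and map information; all class and field names here are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class WeatherInfo:
    air_temperature_c: float      # air temperature
    wind_speed_mps: float         # wind speed
    precipitation_mm: float       # amount of precipitation

@dataclass
class EarthquakeInfo:
    magnitude: float
    tsunami_warning: bool

@dataclass
class TrafficInfo:
    traffic_jams: List[str]       # identifiers of congested road segments
    accidents: List[str]          # identifiers of segments with reported accidents
    road_construction: List[str]  # identifiers of segments under construction

@dataclass
class EnvironmentInfo:
    weather: WeatherInfo
    earthquake: EarthquakeInfo
    traffic: TrafficInfo
    # e.g. road name mapped to a polyline of (latitude, longitude) points
    map_data: Dict[str, List[Tuple[float, float]]] = field(default_factory=dict)

# Example: the vehicle control device stores the latest snapshot received from the server.
environment = EnvironmentInfo(
    weather=WeatherInfo(air_temperature_c=23.5, wind_speed_mps=3.2, precipitation_mm=0.0),
    earthquake=EarthquakeInfo(magnitude=0.0, tsunami_warning=False),
    traffic=TrafficInfo(traffic_jams=[], accidents=["route-1:km12"], road_construction=[]),
)
```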
  • the input/output I/F 14 F is an interface for the vehicle control device 14 to communicate with the respective devices that are installed in the vehicle 12 .
  • the vehicle control device 14 is connected via the input/output I/F 14 F so as to be able to communicate with respective devices that are described later. Note that these devices may be directly connected to the bus 14 G.
  • a GPS (Global Positioning System) device 24 , external sensors 26 , actuators 28 , internal sensors 30 , input devices 32 , a “head up display 34 (hereinafter called display 34 )” serving as a display portion, and an “alarm 36 ” serving as a notification section are examples of devices that are connected to the vehicle control device 14 .
  • the GPS device 24 has an antenna 24 A that receives signals from an artificial satellite (a GPS satellite) 38 , and can measure the current position of the vehicle 12 .
  • the position information of the vehicle 12 that is measured by the GPS device 24 is inputted to the storage 14 D and is stored temporarily in the storage 14 D.
  • the external sensors 26 are a group of sensors that are used in detecting the peripheral environment of the vehicle 12 .
  • the external sensors 26 include cameras (not illustrated) that capture images of predetermined ranges, millimeter wave radar (not illustrated) that transmits search waves in a predetermined range, and LIDAR (Laser Imaging Detection and Ranging) (not illustrated) that scans a predetermined range.
  • the data acquired by the external sensors 26 , such as the images captured by the cameras, is stored in the storage 14 D , and is transmitted from the communication I/F 14 E to the remote operation device 20 via the server 16 .
  • the internal sensors 30 are a group of sensors that are used in detecting the traveling state of the vehicle 12 , and include at least one of a vehicle speed sensor, an acceleration sensor and a yaw rate sensor.
  • the data acquired by the internal sensors 30 is stored in the storage 14 D.
  • the actuators 28 are devices that control the traveling of the vehicle 12 in accordance with control signals from the vehicle control device 14 , and include a throttle actuator (not illustrated), a brake actuator (not illustrated), and a steering actuator (not illustrated).
  • the throttle actuator controls the acceleration devices on the basis of control signals from the vehicle control device 14 , and, by controlling the amount of air that is supplied to the engine (not illustrated) of the vehicle 12 (i.e., by controlling the throttle opening), can control the driving force of the vehicle 12 .
  • the driving force of the vehicle 12 may be controlled by controlling the motor that is the power source in accordance with control signals of the vehicle control device 14 .
  • the brake actuator controls the brake devices on the basis of control signals from the vehicle control device 14 , and can control the braking force that is applied to the wheels (not illustrated) of the vehicle 12 .
  • the steering actuator controls the driving force of an assist motor (not illustrated) that controls the steering torque among the steering devices. Due thereto, the steering actuator can control the steering torque of the vehicle 12 .
  • the input devices 32 include the steering wheel (not illustrated), the brake pedal (not illustrated), and the acceleration pedal (not illustrated).
  • the amounts by which these are operated are detected by operation amount sensors (not illustrated), and are transmitted to the vehicle control device 14 . Further, at the time of manual driving of the vehicle 12 , the vehicle control device 14 transmits control signals that are based on the aforementioned operation amounts to the actuators 28 , and can control the acceleration devices, the braking devices and the steering devices.
  • the display 34 is a liquid crystal monitor for displaying various types of information relating to the vehicle 12 . Concretely, as is described later, position information of the vehicle 12 and map information of the periphery of the vehicle 12 and the like are displayed on the display 34 . Note that the display 34 can be operated on the basis of input from a touch panel (not illustrated) that is connected to the vehicle control device 14 such that communication therebetween is possible.
  • a selection screen for selecting an automatic driving mode, a remote operation mode, and a manual driving mode can be displayed on the touch panel.
  • the operator 22 can select the driving mode of the vehicle 12 by operating the touch panel.
  • when a driving mode is selected, a status signal, which expresses the selected mode, is transmitted from the touch panel to the vehicle control device 14 and the remote control device 18 .
  • the vehicle control device 14 and the remote control device 18 are set so as to detect the status signal each predetermined time period. Further, the destination of the vehicle 12 also can be inputted through the touch panel.
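  • the following Python sketch illustrates, under assumed names (DrivingMode, TouchPanel, poll_status), how a mode selection on the touch panel could be exposed as a status signal that the control devices poll once per predetermined time period; it is an illustration only, not the patented implementation.

```python
import time
from enum import Enum, auto

class DrivingMode(Enum):
    AUTOMATIC = auto()   # automatic driving by the vehicle control device
    REMOTE = auto()      # remote operation driving via the remote operation device
    MANUAL = auto()      # manual driving by the vehicle occupant

class TouchPanel:
    """Holds the mode most recently selected on the selection screen."""
    def __init__(self) -> None:
        self._selected = DrivingMode.MANUAL
        self.destination = None

    def select_mode(self, mode: DrivingMode) -> None:
        self._selected = mode

    def status_signal(self) -> DrivingMode:
        # The status signal expresses the currently selected driving mode.
        return self._selected

def poll_status(panel: TouchPanel, period_s: float = 0.5, cycles: int = 3) -> None:
    # The vehicle control device and the remote control device check the
    # status signal once every predetermined time period.
    for _ in range(cycles):
        mode = panel.status_signal()
        print(f"detected driving mode: {mode.name}")
        time.sleep(period_s)

panel = TouchPanel()
panel.select_mode(DrivingMode.AUTOMATIC)
panel.destination = (35.6812, 139.7671)   # hypothetical destination coordinates
poll_status(panel, period_s=0.0, cycles=2)
```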
  • the alarm 36 is disposed within a vehicle cabin 12 A of the vehicle 12 , and can warn the operator 22 on the basis of a control signal from the vehicle control device 14 .
  • the functional structures of the vehicle control device 14 are described by using FIG. 2 . Due to the CPU 14 A reading-out an execution program that is stored in the ROM 14 B and executing the program, the vehicle control device 14 functions as an aggregate of a “position information acquiring section 140 ”, a remote operation information acquiring section 141 , an automatic driving information acquiring section 142 , a vehicle occupant operation information acquiring section 143 , a “storage section 144 ”, a communication section 145 , and a “control section 146 ”.
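  • as an illustrative sketch of this "aggregate of sections" idea, the following Python classes compose a few of the functional sections into a single vehicle control device object; the class names mirror the section names above, but the structure and method names are assumptions made for the example, and the remote operation, automatic driving, vehicle occupant operation and communication sections are omitted for brevity.

```python
class PositionInformationAcquiringSection:
    def acquire(self, gps):
        # Returns (latitude, longitude) measured by the GPS device.
        return gps.measure()

class StorageSection:
    def __init__(self):
        self.environment_info = {}   # environment information received from the server

class ControlSection:
    def __init__(self, storage):
        self.storage = storage
    def update_display(self, position):
        # Decides what the display portion shows, based on position and region info.
        print(f"display update for position {position}")

class VehicleControlDevice:
    """Aggregate of functional sections, analogous to the CPU executing the stored program."""
    def __init__(self, gps):
        self.gps = gps
        self.position_section = PositionInformationAcquiringSection()
        self.storage_section = StorageSection()
        self.control_section = ControlSection(self.storage_section)

    def step(self):
        position = self.position_section.acquire(self.gps)
        self.control_section.update_display(position)

class FakeGps:
    def measure(self):
        return (35.0, 135.0)   # fixed coordinates used only for this sketch

VehicleControlDevice(FakeGps()).step()
```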
  • the position information acquiring section 140 acquires position information of the vehicle 12 that is measured by the GPS device 24 , and can transmit, to the control section 146 , a signal that is based on this position information.
  • the remote operation information acquiring section 141 acquires data relating to control of the actuators 28 , and transmits signals based on these data to the control section 146 . Further, the remote operation information acquiring section 141 also acquires data of captured images and the like that are acquired at the external sensors 26 , and transmits signals based on these data to the communication section 145 .
  • the vehicle occupant operation information acquiring section 143 acquires data relating to the amounts of operation by the operator 22 , and transmits signals based on these data to the control section 146 .
  • the above-described environment information is stored in the storage section 144 .
  • first region information, which expresses “automatic driving regions 40 ” in which the vehicle 12 can be driven automatically and “remote driving regions 42 ” in which the vehicle 12 can be driven by remote operation, and second region information, which expresses “manual driving regions 44 ” (see FIG. 4 ) where only manual driving of the vehicle 12 is permitted, are included in the environment information.
  • the various types of data that are stored in the storage section 144 are transmitted to the control section 146 .
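  • a minimal sketch of how the first region information and the second region information could be stored and queried is shown below; the Region dataclass, the polygon representation and the point-in-polygon test are assumptions chosen for illustration, not details taken from the patent.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import List, Tuple

class RegionType(Enum):
    AUTOMATIC = auto()    # automatic driving possible
    REMOTE = auto()       # remote operation driving possible
    MANUAL_ONLY = auto()  # only manual driving permitted

@dataclass
class Region:
    kind: RegionType
    polygon: List[Tuple[float, float]]   # boundary vertices as (x, y) map coordinates

def point_in_polygon(pt, polygon) -> bool:
    # Standard ray-casting test; adequate for the simple map polygons assumed here.
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside

def classify(position, regions: List[Region]) -> List[RegionType]:
    """Return every region type that contains the current vehicle position."""
    return [r.kind for r in regions if point_in_polygon(position, r.polygon)]

regions = [
    Region(RegionType.AUTOMATIC, [(0, 0), (10, 0), (10, 10), (0, 10)]),
    Region(RegionType.REMOTE, [(5, 0), (15, 0), (15, 10), (5, 10)]),
    Region(RegionType.MANUAL_ONLY, [(15, 0), (25, 0), (25, 10), (15, 10)]),
]
print(classify((7.0, 5.0), regions))   # inside both the automatic and the remote region
```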
  • the communication section 145 receives the signals S transmitted from the remote operation device 20 , and transmits them to the remote operation information acquiring section 141 . Further, the communication section 145 transmits, to the server 16 , the data acquired by the external sensors 26 .
  • the automatic driving information acquiring section 142 acquires automatic driving information, i.e., data that is needed for automatic driving of the vehicle 12 .
  • Position information of the vehicle 12 that is measured by the GPS device 24 , data relating to the peripheral environment of the vehicle 12 that is obtained by the external sensors 26 , data relating to the traveling state of the vehicle 12 that is obtained by the internal sensors 30 , environment information obtained from the server 16 , and the like are included in the information that is acquired by the automatic driving information acquiring section 142 .
  • the above-described data that are acquired by the automatic driving information acquiring section 142 are transmitted to the control section 146 .
  • on the basis of a signal from the touch panel, the control section 146 transmits the status signal to the remote operation device 20 via the communication section 145 . Further, in a case in which the automatic driving mode of the vehicle 12 is selected at the touch panel, a “route 46 ” (see FIG. 4 ) along which the vehicle 12 is to travel is set by the control section 146 on the basis of the destination inputted at the touch panel and the information acquired by the automatic driving information acquiring section 142 . Further, the control section 146 controls the actuators 28 , and automatic driving of the vehicle 12 can be carried out.
  • the control section 146 controls the actuators 28 on the basis of the signals S from the remote operation device 20 that are received by the communication section 145 , and can thereby control the traveling of the vehicle 12 during remote operation driving.
  • the control section 146 controls the actuators 28 on the basis of signals from the vehicle occupant operation information acquiring section 143 , and can thereby control the traveling of the vehicle 12 during manual driving.
  • the structure of the server 16 is described next.
  • the server 16 is structured to include a CPU, a ROM, a RAM, a storage and a communication I/F (not illustrated).
  • the CPU, the ROM, the RAM, the storage and the communication I/F are connected via a bus (not illustrated) so as to be able to communicate with one another.
  • the CPU, the ROM, the RAM, the storage and the communication I/F have functions that are basically similar to those of the corresponding components that structure the above-described vehicle control device 14 .
  • the server 16 can exhibit various functions due to an execution program that is stored in the ROM being read-out and executed by the CPU.
  • the server 16 functions as an aggregate of a server control section 160 and a communication section 161 .
  • the server control section 160 has the function of acquiring various information from outside the server 16 . Note that, in addition to the above-described environment information, news information and data that is based on the signals S transmitted from the remote operation device 20 are also included in the information acquired by the server control section 160 .
  • the communication section 161 receives the signals S transmitted from the remote operation device 20 . On the basis of the data acquired at the server control section 160 , the communication section 161 transmits the signals S and signals based on various data to the vehicle 12 , and transmits signals based on various data to the remote operation device 20 .
  • the remote operation device 20 has the remote control device 18 , a monitor 20 A and input devices 20 B.
  • the input devices 20 B of the remote operation device 20 have basically the same structures as the input devices 32 .
  • the hardware structures of the remote control device 18 are structures that are basically similar to those of the vehicle control device 14 .
  • the remote control device 18 functions as an aggregate of the communication section 180 and a remote operation terminal control section 181 . Further, the monitor 20 A and the input devices 20 B are connected to the remote control device 18 such that communication therebetween is possible.
  • the communication section 180 transmits, to the server 16 , the signals S that are based on the operation amounts of the input devices 20 B, and receives, from the server 16 , signals that are based on various data.
  • the environment information, and the data acquired by the external sensors 26 of the vehicle 12 such as the images captured by the cameras, are included in the data transmitted from the server 16 .
  • the remote operation terminal control section 181 acquires the data that is detected at the input devices 20 B, and transmits the data to the communication section 145 via the communication section 180 and the server 16 . Further, on the basis of the data acquired from the communication section 180 , the remote operation terminal control section 181 controls the monitor 20 A, and can display the images captured by the cameras of the vehicle 12 on the monitor 20 A.
  • the present embodiment is characterized in that the automatic driving regions 40 , the remote driving regions 42 and the manual driving regions 44 can be displayed on the display 34 , and by the conditions for operating the alarm 36 .
  • due to the control section 146 controlling the display 34 on the basis of the position information of the vehicle 12 , the environment information and the like, the relative positional relationships among the vehicle 12 , the automatic driving regions 40 , the remote driving regions 42 and the manual driving regions 44 are displayed on the display 34 .
  • the portions at which only automatic driving and manual driving are permitted and the portions, which run along these portions and are within a predetermined distance thereof, are set as the automatic driving regions 40 , and these are shown in FIG. 4 by hatching of a pattern in which a single diagonal line is repeated.
  • the automatic driving regions 40 are marked by a predetermined color on the display surface of the display 34 .
  • the portions at which only remote operation driving and manual driving are permitted and the portions, which run along these portions and are within a predetermined distance thereof, are set as the remote driving regions 42 , and these are shown in FIG. 4 by hatching of a pattern in which two diagonal lines are repeated. Note that, in actuality, the remote driving regions 42 are marked by a predetermined color, which is other than the color of the automatic driving regions 40 , on the display surface of the display 34 .
  • the portions at which automatic driving, remote operation driving and manual driving are permitted and the portions, which run along these portions and are within a predetermined distance thereof, are shown in FIG. 4 by hatching of a pattern in which three diagonal lines are repeated. Note that, in actuality, these regions are marked by a color, which is a mixture of the color of the automatic driving regions 40 and the color of the remote driving regions 42 , on the display surface of the display 34 .
  • the portions at which only manual driving is permitted and the portions, which run along these portions and are within a predetermined distance thereof, are set as the manual driving regions 44 , and these are shown by the hatching of the dot pattern.
  • the manual driving regions 44 are marked by a predetermined color, which is other than the colors of the automatic driving regions 40 and the remote driving regions 42 , on the display surface of the display 34 .
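  • beyond requiring that the colors of the region types be distinct, with the overlapping portions drawn in a mixture of the automatic color and the remote color, no concrete color scheme is specified; the following sketch shows one possible mapping, with the RGB values being purely illustrative assumptions.

```python
from enum import Enum, auto

class RegionKind(Enum):
    AUTOMATIC = auto()             # only automatic and manual driving permitted
    REMOTE = auto()                # only remote operation and manual driving permitted
    AUTOMATIC_AND_REMOTE = auto()  # automatic, remote operation and manual driving permitted
    MANUAL_ONLY = auto()           # only manual driving permitted

# Hypothetical color assignments chosen for this sketch.
BASE_COLORS = {
    RegionKind.AUTOMATIC: (0, 0, 255),     # blue
    RegionKind.REMOTE: (255, 255, 0),      # yellow
    RegionKind.MANUAL_ONLY: (255, 0, 0),   # red
}

def mix(c1, c2):
    # Simple average used as the "mixed color" for portions where both
    # automatic and remote operation driving are permitted.
    return tuple((a + b) // 2 for a, b in zip(c1, c2))

def color_for(kind: RegionKind):
    if kind is RegionKind.AUTOMATIC_AND_REMOTE:
        return mix(BASE_COLORS[RegionKind.AUTOMATIC], BASE_COLORS[RegionKind.REMOTE])
    return BASE_COLORS[kind]

for kind in RegionKind:
    print(kind.name, color_for(kind))
```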
  • in a case in which, in the midst of automatic driving, the vehicle 12 is positioned at a position that is within a predetermined distance of a region where automatic driving is impossible, the alarm 36 issues a warning to the operator 22 on the basis of a control signal from the vehicle control device 14 .
  • similarly, in a case in which, in the midst of remote operation driving, the vehicle 12 is positioned at a position that is within a predetermined distance of a region where remote operation driving is impossible, the alarm 36 issues a warning to the operator 22 on the basis of a control signal from the vehicle control device 14 .
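  • a simple sketch of this proximity check is shown below; the vertex-based distance approximation and the function names (distance_to_boundary, should_warn) are assumptions for illustration, and an actual system would more likely measure the remaining distance along the planned route.

```python
import math
from typing import Iterable, Tuple

Point = Tuple[float, float]

def distance_to_boundary(position: Point, boundary: Iterable[Point]) -> float:
    """Distance from the vehicle position to the nearest vertex of a region boundary.

    A vertex-based approximation keeps the sketch short; a production system would
    measure the distance to the boundary segments or along the planned route.
    """
    return min(math.dist(position, p) for p in boundary)

def should_warn(position: Point, restricted_boundary: Iterable[Point],
                predetermined_distance: float) -> bool:
    # True when the vehicle is within the predetermined distance of a region in
    # which the current driving mode (automatic or remote operation) is impossible.
    return distance_to_boundary(position, restricted_boundary) <= predetermined_distance

manual_only_boundary = [(100.0, 0.0), (100.0, 50.0), (150.0, 50.0), (150.0, 0.0)]
vehicle_position = (96.0, 10.0)
if should_warn(vehicle_position, manual_only_boundary, predetermined_distance=20.0):
    print("operate alarm 36: approaching a region where the current mode is impossible")
```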
  • an onboard camera (not illustrated) that monitors the state of the operator 22 is installed within the vehicle cabin 12 A. Data of the images captured by this onboard camera and the like are processed at the control section 146 . In a case in which the alarm 36 is operated in the midst of automatic driving of the vehicle 12 , if the control section 146 judges, from the data from the onboard camera and the like, that a return to manual driving by the operator 22 is impossible, the control section 146 moves the vehicle 12 off to a safe place.
  • further, in a case in which the alarm 36 is operated in the midst of remote operation driving of the vehicle 12 and the control section 146 judges that a return to manual driving by the operator 22 is impossible, the control section 146 transmits a warning from the communication section 145 to the remote operation device 20 .
  • then, the operator of the remote operation device 20 moves the vehicle 12 off to a safe place.
  • in step S 100 , the CPU 14 A of the vehicle control device 14 acquires the position of the vehicle 12 .
  • in step S 101 , on the basis of the results of detection in step S 100 , the CPU 14 A displays the position of the vehicle 12 on the display 34 .
  • in step S 102 , the CPU 14 A displays the route 46 to the destination on the display 34 .
  • in step S 103 , the CPU 14 A displays the automatic driving regions 40 , the remote driving regions 42 and the manual driving regions 44 on the display 34 .
  • in step S 104 , the CPU 14 A detects the driving mode of the vehicle 12 on the basis of the status signal.
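  • the following Python sketch walks through steps S 100 to S 104 with stand-in Display and Storage objects; the acquired position and the status signal are stubbed, and all names are illustrative assumptions rather than the patent's implementation.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Display:
    lines: List[str] = field(default_factory=list)
    def show(self, text: str) -> None:
        self.lines.append(text)
        print(text)

@dataclass
class Storage:
    automatic_regions: List[str] = field(default_factory=lambda: ["A1"])
    remote_regions: List[str] = field(default_factory=lambda: ["R1"])
    manual_regions: List[str] = field(default_factory=lambda: ["M1"])

def initialize_display(position: Tuple[float, float], route: List[Tuple[float, float]],
                       storage: Storage, display: Display) -> str:
    # S100: the position would be acquired from the GPS device 24; here it is passed in.
    display.show(f"vehicle position: {position}")                       # S101
    display.show(f"route 46 to destination: {route}")                   # S102
    display.show(f"regions: automatic={storage.automatic_regions}, "    # S103
                 f"remote={storage.remote_regions}, manual={storage.manual_regions}")
    return "AUTOMATIC"                                                  # S104: status signal (stubbed)

mode = initialize_display((35.0, 135.0), [(35.0, 135.0), (35.1, 135.2)], Storage(), Display())
print("detected mode:", mode)
```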
  • in step S 105 of FIG. 6 , on the basis of the results of detection in step S 104 , the CPU 14 A judges whether or not the driving mode of the vehicle 12 is the automatic driving mode. In a case in which the driving mode of the vehicle 12 is the automatic driving mode (step S 105 : YES), the CPU 14 A moves on to step S 106 . In a case in which the driving mode of the vehicle 12 is not the automatic driving mode (step S 105 : NO), the CPU 14 A moves on to step S 107 of FIG. 7 .
  • in step S 106 , the CPU 14 A functions as the automatic driving information acquiring section 142 , and acquires the automatic driving information.
  • in step S 108 , the CPU 14 A functions as the control section 146 , and, on the basis of the automatic driving information acquired in step S 106 , controls the actuators 28 and carries out automatic driving of the vehicle 12 .
  • in step S 109 , on the basis of the position information of the vehicle 12 and the environment information, the CPU 14 A judges whether or not the position of the vehicle 12 is within a predetermined distance from a region where automatic driving is impossible. If it is (step S 109 : YES), the CPU 14 A moves on to step S 110 . If it is not (step S 109 : NO), the CPU 14 A moves on to step S 111 .
  • in step S 110 , the CPU 14 A operates the alarm 36 .
  • in step S 112 , the CPU 14 A functions as the control section 146 , and judges whether or not the operator 22 can return to manual driving. If the operator 22 can return to manual driving (step S 112 : YES), the CPU 14 A moves on to step S 113 of FIG. 8 . If the operator 22 cannot return to manual driving (step S 112 : NO), the CPU 14 A moves on to step S 114 .
  • in step S 113 , the CPU 14 A functions as the control section 146 , and controls the actuators 28 on the basis of the operations of the operator 22 .
  • in step S 115 , the CPU 14 A detects the driving mode of the vehicle 12 on the basis of the status signal, and judges whether or not the manual driving mode has ended. If the manual driving mode is being continued (step S 115 : NO), the CPU 14 A returns to step S 113 . If the manual driving mode has ended (step S 115 : YES), the CPU 14 A ends the control flow.
  • in step S 114 , the CPU 14 A functions as the control section 146 and controls the actuators 28 , carries out automatic driving of the vehicle 12 to move the vehicle 12 off to a safe place, and ends the control flow.
  • in step S 111 , the CPU 14 A detects the driving mode of the vehicle 12 on the basis of the status signal, and judges whether or not the automatic driving mode has ended. If the automatic driving mode is being continued (step S 111 : NO), the CPU 14 A returns to step S 106 . If the automatic driving mode has ended (step S 111 : YES), the CPU 14 A ends the control flow.
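  • the automatic driving branch (steps S 105 to S 114) can be summarized by the loop sketched below; VehicleStub and its hard-coded distances exist only to exercise the flow, and all method names are assumptions for illustration.

```python
class VehicleStub:
    """Minimal stand-in used only to exercise the control flow below."""
    def __init__(self):
        self._mode = "AUTOMATIC"
        self._distances = [50.0, 30.0, 10.0]   # successive distances to a no-automatic-driving region
    def mode(self): return self._mode
    def acquire_automatic_driving_info(self): return {"speed": 40}
    def control_actuators(self, info): print("automatic driving with", info)
    def distance_to_no_automatic_region(self): return self._distances.pop(0)
    def operate_alarm(self): print("alarm 36 operated")
    def occupant_can_resume_manual(self): return False
    def switch_to_manual(self): self._mode = "MANUAL"
    def move_to_safe_place(self): print("moving vehicle 12 off to a safe place"); self._mode = "STOPPED"

def run_automatic_mode(vehicle, predetermined_distance=20.0):
    # Sketch of steps S105 to S114 of the flowchart.
    while vehicle.mode() == "AUTOMATIC":                                          # S111
        info = vehicle.acquire_automatic_driving_info()                           # S106
        vehicle.control_actuators(info)                                           # S108
        if vehicle.distance_to_no_automatic_region() <= predetermined_distance:   # S109
            vehicle.operate_alarm()                                               # S110
            if vehicle.occupant_can_resume_manual():                              # S112
                vehicle.switch_to_manual()                                        # hand over to S113
            else:
                vehicle.move_to_safe_place()                                      # S114
            return

run_automatic_mode(VehicleStub())
```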
  • in step S 107 , the CPU 14 A judges whether or not the driving mode of the vehicle 12 is the remote operation mode. If the driving mode of the vehicle 12 is the remote operation mode (step S 107 : YES), the CPU 14 A moves on to step S 116 . If the driving mode of the vehicle 12 is not the remote operation mode (step S 107 : NO), the CPU 14 A moves on to step S 113 of FIG. 8 .
  • in step S 116 , the CPU 14 A functions as the remote operation information acquiring section 141 , and acquires remote operation information from the communication section 145 .
  • in step S 117 , the CPU 14 A functions as the control section 146 , and controls the actuators 28 on the basis of the remote operation information.
  • in step S 118 , on the basis of the position information of the vehicle 12 and the environment information, the CPU 14 A judges whether or not the position of the vehicle 12 is within a predetermined distance from a region where remote operation driving is impossible. If it is (step S 118 : YES), the CPU 14 A moves on to step S 119 . If it is not (step S 118 : NO), the CPU 14 A moves on to step S 120 .
  • in step S 119 , the CPU 14 A operates the alarm 36 .
  • in step S 121 , the CPU 14 A carries out the same processing as in step S 112 . Then, if the operator 22 can return to manual driving (step S 121 : YES), the CPU 14 A moves on to step S 113 of FIG. 8 . If the operator 22 cannot return to manual driving (step S 121 : NO), the CPU 14 A moves on to step S 122 .
  • in step S 122 , the CPU 14 A functions as the control section 146 , and transmits a warning from the communication section 145 toward the remote operation device 20 . Then, the operator of the remote operation device 20 moves the vehicle 12 off to a safe place and ends the remote operation driving, and the control flow ends.
  • in step S 120 , the CPU 14 A detects the driving mode of the vehicle 12 on the basis of the status signal, and judges whether or not the remote operation mode has ended. If the remote operation mode is continuing (step S 120 : NO), the CPU 14 A returns to step S 116 . If the remote operation mode has ended (step S 120 : YES), the CPU 14 A ends the control flow.
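  • the remote operation branch (steps S 116 to S 122) mirrors the automatic branch, so the sketch below factors the shared structure into one parameterized function; the fallback in the remote case transmits a warning to the remote operation device instead of moving the vehicle itself. All names are illustrative assumptions, not the patent's implementation.

```python
def run_supervised_mode(vehicle, acquire, restricted_distance, on_cannot_resume,
                        mode_name, predetermined_distance=20.0):
    """Shared skeleton of the automatic (S105 to S114) and remote operation (S116 to S122) branches.

    `acquire` fetches the information driving the actuators, `restricted_distance`
    returns the distance to the nearest region where the current mode is impossible,
    and `on_cannot_resume` is the fallback when the occupant cannot return to manual driving.
    """
    while vehicle.mode() == mode_name:                        # S111 / S120
        vehicle.control_actuators(acquire())                  # S108 / S117
        if restricted_distance() <= predetermined_distance:   # S109 / S118
            vehicle.operate_alarm()                           # S110 / S119
            if vehicle.occupant_can_resume_manual():          # S112 / S121
                vehicle.switch_to_manual()                    # S113
            else:
                on_cannot_resume()                            # S114 / S122
            return

class _RemoteStub:
    """Minimal stand-in used only to exercise the remote operation branch."""
    def __init__(self):
        self._mode = "REMOTE"
        self._d = [40.0, 15.0]   # successive distances to a no-remote-driving region
    def mode(self): return self._mode
    def control_actuators(self, info): print("remote operation driving with", info)
    def operate_alarm(self): print("alarm 36 operated")
    def occupant_can_resume_manual(self): return False
    def switch_to_manual(self): self._mode = "MANUAL"

stub = _RemoteStub()
run_supervised_mode(stub,
                    acquire=lambda: {"signal": "S"},
                    restricted_distance=lambda: stub._d.pop(0),
                    on_cannot_resume=lambda: print("warning transmitted to the remote operation device"),
                    mode_name="REMOTE")
```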
  • the operator 22 of the vehicle 12 can perceive regions in which manual driving is required, regions in which automatic driving of the vehicle 12 is possible, and regions in which remote operation driving of the vehicle 12 is possible.
  • the operator 22 can perceive the manual driving regions 44 , the automatic driving regions 40 and the remote driving regions 42 .
  • the operator 22 can, without looking at the display 34 , know that the vehicle 12 is approaching the manual driving region 44 .
  • a “vehicle position sensing system 50 ” relating to a second embodiment of the present invention is described hereinafter by using FIG. 9 . Note that structural portions that are the same as those of the above-described first embodiment are denoted by the same reference numerals, and description thereof is omitted.
  • an “organic EL display 54 ” that serves as the display portion is affixed to the “windshield glass 52 ” of the vehicle 12 along a surface thereof on the vehicle cabin 12 A inner side.
  • This organic EL display 54 is transparent, and can display various images by being driven by signals outputted from the vehicle control device 14 .
  • a virtual wall portion 56 that runs along the automatic driving region 40 , a virtual wall portion 58 that runs along the remote driving region 42 , and a virtual wall portion 60 that runs along the manual driving region 44 can be displayed on the organic EL display 54 so as to overlap the scene that can be seen from the windshield glass 52 .
  • the operator 22 can see the manual driving regions 44 , the automatic driving regions 40 and the remote driving regions 42 without looking away from the vehicle front side.
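  • the following sketch illustrates one way the virtual wall portions 56 , 58 and 60 could be generated: route segments tagged with their driving region are projected onto the transparent display with a simplified pinhole model; the projection constants and the data layout are assumptions made for the example, not details from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class RouteSegment:
    start: Tuple[float, float]   # (forward distance m, lateral offset m) relative to the vehicle
    end: Tuple[float, float]
    region: str                  # "automatic", "remote" or "manual"

def project(point, focal=500.0, eye_height=1.2):
    """Very simplified pinhole projection of a ground point onto the windshield display plane."""
    forward, lateral = point
    if forward <= 0:
        return None                       # behind the vehicle; not drawn
    u = focal * lateral / forward         # horizontal pixel offset from the display centre
    v = focal * eye_height / forward      # vertical pixel offset below the horizon
    return (round(u), round(v))

def virtual_walls(segments: List[RouteSegment]):
    # Each segment becomes a coloured "virtual wall portion" overlapping the scene
    # seen through the windshield glass, keyed by the driving region it runs along.
    walls = []
    for seg in segments:
        p1, p2 = project(seg.start), project(seg.end)
        if p1 and p2:
            walls.append((seg.region, p1, p2))
    return walls

route = [
    RouteSegment((20.0, -1.5), (60.0, -1.5), "automatic"),
    RouteSegment((60.0, -1.5), (120.0, 2.0), "remote"),
    RouteSegment((120.0, 2.0), (200.0, 2.0), "manual"),
]
for region, p1, p2 in virtual_walls(route):
    print(f"draw {region} wall from {p1} to {p2} on the organic EL display 54")
```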

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)

Abstract

A vehicle position sensing system includes: a position information acquiring section that acquires position information of a vehicle; a storage section that stores first region information, which expresses automatic driving regions where automatic driving of the vehicle is possible and remote driving regions where remote operation driving of the vehicle is possible, and second region information that expresses manual driving regions where only manual driving of the vehicle is permitted; a head up display that can be viewed by a driver of the vehicle; and a control section that, on the basis of the position information of the vehicle, the first region information and the second region information, causes the head up display to display relative positional relationships among the vehicle, the automatic driving regions, the remote driving regions and the manual driving regions.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application is based on and claims priority under 35 USC 119 from Japanese Patent Application No. 2019-135662 filed on Jul. 23, 2019, the disclosure of which is incorporated by reference herein.
  • BACKGROUND Technical Field
  • The present disclosure relates to a vehicle position sensing system.
  • Related Art
  • U.S. Pat. No. 9,964,948 (Patent Document 1) discloses an invention relating to remote control for a group of autonomous transport vehicles.
  • In this remote control for a group of autonomous transport vehicles, if an event occurs, the autonomous vehicle notifies a control center of that event, and starts interacting with the control center, or the control center takes over the driving of the autonomous vehicle.
  • SUMMARY
  • Depending on the road situation and the structural state, in a state in which a vehicle is being driven automatically or is being operated remotely, there are cases in which it becomes difficult to continue the traveling of the vehicle. Further, in a place in which the vehicle cannot be driven by automatic driving or remote operation, an occupant of the vehicle (the driver) must drive the vehicle himself/herself. Further, taking into consideration the point of being able to ensure time for the vehicle occupant to prepare for manual driving, and the like, it is preferable that the vehicle occupant be able to perceive regions in which manual driving is needed, and at least one type of region among regions in which automatic driving is possible and regions in which remote operation driving is possible.
  • With regard to this point, in the prior art technique of aforementioned Patent Document 1, the vehicle occupant cannot perceive regions in which the vehicle can be driven automatically and regions in which the vehicle can be driven by remote operation. Accordingly, in the prior art of aforementioned Patent Document 1, there is room for improvement with regard to the point of the vehicle occupant perceiving regions in which manual driving is required, and at least one type of region among regions in which automatic driving of the vehicle is possible and regions in which remote operation driving of the vehicle is possible.
  • In view of the above-described circumstances, an object of the present disclosure is to provide a vehicle position sensing system that enables an occupant of a vehicle to perceive regions in which manual driving is needed, and at least one type of region among regions in which automatic driving of the vehicle is possible and regions in which remote operation driving of the vehicle is possible.
  • A vehicle position sensing system of a first aspect includes: a position information acquiring section that acquires position information of a vehicle; a storage section that stores first region information, which expresses at least one type of region among automatic driving regions where automatic driving of the vehicle is possible and remote driving regions where remote operation driving of the vehicle is possible, and second region information that expresses manual driving regions where only manual driving of the vehicle is permitted; a display portion that can be viewed by an occupant of the vehicle; and a control section that, based on the position information, the first region information and the second region information, causes the display portion to display relative positional relationships among the vehicle, the manual driving regions and the at least one type of region.
  • In accordance with the vehicle position sensing system of the first aspect, position information of the vehicle is acquired by the position information acquiring section. On the other hand, first region information, which expresses at least one type of region among automatic driving regions where automatic driving of the vehicle is possible and remote driving regions where remote operation driving of the vehicle is possible, and second region information that expresses manual driving regions where only manual driving of the vehicle is permitted, are stored in the storage section.
  • On the basis of the position information of the vehicle, the first region information and the second region information, the control section controls the display portion that can be seen by an occupant of the vehicle. Due thereto, the relative positional relationships among the vehicle, the manual driving regions, and at least one type of region among the automatic driving regions and the remote driving regions, are displayed on the display portion.
  • In a vehicle position sensing system of a second aspect, in the vehicle position sensing system of the first aspect, the automatic driving regions and the remote driving regions are included in the first region information, and the display portion can display the automatic driving regions and the remote driving regions.
  • In accordance with the vehicle position sensing system of the second aspect, the automatic driving regions and the remote driving regions are stored in the storage section as the first region information. The automatic driving regions and the remote driving regions are displayed on the display portion.
  • In a vehicle position sensing system of a third aspect, in the vehicle position sensing system of the first or second aspect, the relative positional relationships among a position of the vehicle, the manual driving regions and the at least one type of region, can be displayed in a planar form on the display portion.
  • In accordance with the vehicle position sensing system of the third aspect, the position of the vehicle, the manual driving regions, and at least one type of region among the automatic driving regions and the remote driving regions are displayed in a planar form on the display portion.
  • In a vehicle position sensing system of a fourth aspect, in the vehicle position sensing system of the third aspect, a route of the vehicle to a destination, the manual driving regions, and the at least one type of region can be displayed on the display portion.
  • In accordance with the vehicle position sensing system of the fourth aspect, the route of the vehicle to the destination, the manual driving regions, and at least one type of region among the automatic driving regions and the remote driving regions are displayed on the display portion.
  • In a vehicle position sensing system of a fifth aspect, in the vehicle position sensing system of the fourth aspect, the vehicle has a notification section that, in a state in which the vehicle is positioned in the at least one type of region on the route, issues a warning to an occupant of the vehicle when the vehicle is positioned at a position that is within a predetermined distance from a manual driving region.
  • In accordance with the vehicle position sensing system of the fifth aspect, in a state in which the vehicle is positioned on the route to the destination in at least one type of region among the automatic driving regions and the remote driving regions, the notification section issues a warning to the occupant of the vehicle when the vehicle is positioned at a position that is within a predetermined distance from a manual driving region.
  • In a vehicle position sensing system of a sixth aspect, in the vehicle position sensing system of the fourth or fifth aspect, in a state in which the vehicle is positioned on the route, the control section can display, on a windshield glass of the vehicle, three-dimensional objects that enable an occupant of the vehicle to identify the manual driving regions and the at least one type of region, such that the three-dimensional objects run along the route and overlap a scene that can be seen from the windshield glass.
  • In accordance with the vehicle position sensing system of the sixth aspect, due to control of the control section, in a state in which the vehicle is positioned on the route to the destination, three-dimensional objects that enable an occupant of the vehicle to identify the manual driving regions and at least one type of region among the automatic driving regions and the remote driving regions, are displayed on a windshield glass of the vehicle such that the three-dimensional objects run along the route and overlap a scene that can be seen from the windshield glass.
  • As described above, the vehicle position sensing system of the first aspect has the excellent effect that an occupant of a vehicle can perceive regions in which manual driving is needed, and at least one type of region among regions in which automatic driving of the vehicle is possible and regions in which remote operation driving of the vehicle is possible.
  • The vehicle position sensing system of the second aspect has the excellent effect that an occupant of the vehicle can perceive regions where manual driving is needed, regions where automatic driving of the vehicle is possible, and regions where remote operation driving of the vehicle is possible.
  • The vehicle position sensing system of the third aspect has the excellent effect that an occupant of the vehicle can easily understand the relative positional relationships among the position of the vehicle, the manual driving regions, and at least one type of region among the automatic driving regions and the remote driving regions.
  • The vehicle position sensing system of the fourth aspect has the excellent effect that an occupant of the vehicle can perceive the manual driving regions, and at least one type of region among the automatic driving regions and the remote driving regions, on the route of the vehicle to the destination.
  • The vehicle position sensing system of the fifth aspect has the excellent effect that an occupant of the vehicle can, without looking at the display portion, know that the vehicle is approaching a manual driving region.
  • The vehicle position sensing system of the sixth aspect has the excellent effect that an occupant of the vehicle can, without looking away from the vehicle front side, perceive the manual driving regions and at least one type of region among the automatic driving regions and the remote driving regions.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Preferred embodiments will be described in detail based on the following figures, wherein:
  • FIG. 1 is a schematic drawing that schematically shows the structure of a vehicle position sensing system relating to a first embodiment;
  • FIG. 2 is a functional block drawing showing the structure of the vehicle position sensing system relating to the first embodiment;
  • FIG. 3 is a block drawing showing hardware structures of a vehicle in the vehicle position sensing system relating to the first embodiment;
  • FIG. 4 is a conceptual drawing that schematically shows an example of a display screen of a display portion in the vehicle position sensing system relating to the first embodiment;
  • FIG. 5 is a flowchart showing the flow of processing by the vehicle position sensing system relating to the first embodiment;
  • FIG. 6 is a flowchart showing the flow of processing by the vehicle position sensing system relating to the first embodiment;
  • FIG. 7 is a flowchart showing the flow of processing by the vehicle position sensing system relating to the first embodiment;
  • FIG. 8 is a flowchart showing the flow of processing by the vehicle position sensing system relating to the first embodiment; and
  • FIG. 9 is a conceptual drawing that schematically shows an example of a display screen of a display portion in a vehicle position sensing system relating to a second embodiment.
  • DETAILED DESCRIPTION First Embodiment
  • A “vehicle position sensing system 10” relating to a first embodiment is described hereinafter by using FIG. 1 through FIG. 8. As shown in FIG. 1, the vehicle position sensing system 10 is structured to include a vehicle control device 14, which is installed in a “vehicle 12”, and a server 16.
  • The vehicle 12 can be operated remotely by a remote operation device 20 that has a remote control device 18. The vehicle control device 14, the remote control device 18 and the server 16 are connected via a network N so as to be able to communicate with one another. Note that, although details thereof are described later, the vehicle 12 is structured such that automatic driving by the vehicle control device 14 and manual driving that is based on the operations of a driver (vehicle occupant) 22 of the vehicle 12 can be carried out in addition to remote operation driving by the remote operation device 20.
  • As shown in FIG. 3, the vehicle control device 14 is structured to include a CPU (Central Processing Unit) 14A, a ROM (Read Only Memory) 14B, a RAM (Random Access Memory) 14C, a storage 14D, a communication I/F (Interface) 14E and an input/output I/F 14F. The CPU 14A, the ROM 14B, the RAM 14C, the storage 14D, the communication I/F 14E and the input/output I/F 14F are connected so as to be able to communicate with one another via a bus 14G.
  • The CPU 14A is a central processing unit that can execute various types of programs and control the respective sections of the vehicle 12. Concretely, the CPU 14A reads out programs from the ROM 14B and executes them by using the RAM 14C as a workspace. Due to an execution program stored in the ROM 14B being read out and executed by the CPU 14A, the vehicle control device 14 can exhibit various functions as will be described later.
  • More specifically, various types of programs and various types of data are stored in the ROM 14B. On the other hand, the RAM 14C can temporarily store programs and data as a workspace.
  • The storage 14D is structured to include an HDD (Hard Disk Drive) or an SSD (Solid State Drive), and stores various types of programs, including the operating system, and various types of data. As described later, the storage 14D can store environment information that is needed for automatic driving of the vehicle 12, and the like.
  • The communication I/F 14E is an interface that is used for connecting the vehicle control device 14 to the network N, and it enables communication with the remote control device 18, the server 16 and the like. Communication standards of, for example, the Internet, FDDI, Wi-Fi® and the like are used at this interface. Further, the communication I/F 14E may have a wireless device.
  • The communication I/F 14E can transmit and receive various information to and from the remote operation device 20 via the network N. In detail, the communication I/F 14E can receive environment information from the server 16 via the network N. Note that the environment information includes weather information such as air temperature, wind speed, amount of precipitation and the like, earthquake information such as magnitude, tsunami warnings and the like, traffic information such as traffic jams, accidents, road construction and the like, and map information and the like. This environment information is stored in the storage 14D.
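For illustration only, the following sketch shows one possible in-memory layout for the environment information described above (weather, earthquake, traffic and map information). Every type and field name here is a hypothetical choice of this sketch.

```python
# Illustrative layout for the environment information; all field names are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class WeatherInfo:
    air_temperature_c: float
    wind_speed_mps: float
    precipitation_mm: float

@dataclass
class EnvironmentInfo:
    weather: WeatherInfo
    earthquake_magnitude: Optional[float]   # earthquake information (None if no event)
    tsunami_warning: bool
    traffic_events: list                    # traffic jams, accidents, road construction
    map_data: dict                          # map information keyed by tile or road ID

# The communication I/F 14E would deserialize data received from the server 16
# into a structure of this kind before it is stored in the storage 14D.
```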
  • The input/output I/F 14F is an interface for the vehicle control device 14 to communicate with the respective devices that are installed in the vehicle 12. The vehicle control device 14 is connected via the input/output I/F 14F so as to be able to communicate with respective devices that are described later. Note that these devices may be directly connected to the bus 14G.
  • A GPS (Global Positioning System) device 24, external sensors 26, actuators 28, internal sensors 30, input devices 32, a “head up display 34 (hereinafter called display 34)” serving as a display portion, and an “alarm 36” serving as a notification section are examples of devices that are connected to the vehicle control device 14.
  • The GPS device 24 has an antenna 24A that receives signals from an artificial satellite (a GPS satellite) 38, and can measure the current position of the vehicle 12. The position information of the vehicle 12 that is measured by the GPS device 24 is inputted to the storage 14D and is stored there temporarily.
  • The external sensors 26 are a group of sensors that are used in detecting the peripheral environment of the vehicle 12. The external sensors 26 include cameras (not illustrated) that capture images of predetermined ranges, millimeter wave radar (not illustrated) that transmits search waves in a predetermined range, and LIDAR (Laser Imaging Detection and Ranging) (not illustrated) that scans a predetermined range. Data acquired by the external sensors 26, such as the images captured by the cameras, is stored in the storage 14D and is transmitted from the communication I/F 14E to the remote operation device 20 via the server 16.
  • The internal sensors 30 are a group of sensors that are used in detecting the traveling state of the vehicle 12, and include at least one of a vehicle speed sensor, an acceleration sensor and a yaw rate sensor. The data acquired by the internal sensors 30 is stored in the storage 14D.
  • The actuators 28 are devices that control the traveling of the vehicle 12 in accordance with control signals from the vehicle control device 14, and include a throttle actuator (not illustrated), a brake actuator (not illustrated), and a steering actuator (not illustrated).
  • The throttle actuator controls the acceleration devices on the basis of control signals from the vehicle control device 14, and can control the driving force of the vehicle 12 by controlling the amount of air that is supplied to the engine (not illustrated) of the vehicle 12, i.e., by controlling the throttle opening. Note that, in a case in which the vehicle 12 is a hybrid vehicle or an electric automobile, the driving force of the vehicle 12 may be controlled by controlling the motor that is the power source in accordance with control signals of the vehicle control device 14.
  • The brake actuator controls the brake devices on the basis of control signals from the vehicle control device 14, and can control the braking force that is applied to the wheels (not illustrated) of the vehicle 12.
  • On the basis of control signals from the vehicle control device 14, the steering actuator controls the driving force of an assist motor (not illustrated) that controls the steering torque among the steering devices. Due thereto, the steering actuator can control the steering torque of the vehicle 12.
  • On the other hand, the input devices 32 include the steering wheel (not illustrated), the brake pedal (not illustrated), and the acceleration pedal (not illustrated). The amounts by which these are operated are detected by operation amount sensors (not illustrated), and are transmitted to the vehicle control device 14. Further, at the time of manual driving of the vehicle 12, the vehicle control device 14 transmits control signals that are based on the aforementioned operation amounts to the actuators 28, and can control the acceleration devices, the braking devices and the steering devices.
  • The display 34 is a liquid crystal monitor for displaying various types of information relating to the vehicle 12. Concretely, as is described later, position information of the vehicle 12 and map information of the periphery of the vehicle 12 and the like are displayed on the display 34. Note that the display 34 can be operated on the basis of input from a touch panel (not illustrated) that is connected to the vehicle control device 14 such that communication therebetween is possible.
  • A selection screen for selecting an automatic driving mode, a remote operation mode, and a manual driving mode can be displayed on the touch panel. The operator 22 can select the driving mode of the vehicle 12 by operating the touch panel.
  • Concretely, when the operator 22 operates the touch panel and selects one mode among the above-described plural modes, a status signal expressing the selected mode is transmitted from the touch panel to the vehicle control device 14 and the remote control device 18. Note that the vehicle control device 14 and the remote control device 18 are set so as to detect the status signal every predetermined time period. Further, the destination of the vehicle 12 can also be inputted through the touch panel.
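The mode-selection flow just described can be pictured with the minimal sketch below, in which a touch-panel object holds the selected mode and both control devices sample it periodically. The enum values, the polling helper, and the sampling period are illustrative assumptions.

```python
# Illustrative sketch of the mode-selection status signal; names are hypothetical.
import enum
import time

class DrivingMode(enum.Enum):
    AUTOMATIC = "automatic"
    REMOTE = "remote"
    MANUAL = "manual"

class TouchPanel:
    def __init__(self) -> None:
        self._status = DrivingMode.MANUAL
    def select_mode(self, mode: DrivingMode) -> None:
        self._status = mode          # occupant taps the selection screen
    def status_signal(self) -> DrivingMode:
        return self._status          # read by the vehicle and remote control devices

def poll_status(panel: TouchPanel, period_s: float = 0.5, cycles: int = 3) -> None:
    """Both control devices are assumed to sample the status signal once per
    predetermined period; here it is simply printed a few times."""
    for _ in range(cycles):
        print("current driving mode:", panel.status_signal().value)
        time.sleep(period_s)

panel = TouchPanel()
panel.select_mode(DrivingMode.AUTOMATIC)
poll_status(panel, period_s=0.0)     # period shortened so the example runs instantly
```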
  • The alarm 36 is disposed within a vehicle cabin 12A of the vehicle 12, and can warn the operator 22 on the basis of a control signal from the vehicle control device 14.
  • The functional structures of the vehicle control device 14 are described by using FIG. 2. Due to the CPU 14A reading-out an execution program that is stored in the ROM 14B and executing the program, the vehicle control device 14 functions as an aggregate of a “position information acquiring section 140”, a remote operation information acquiring section 141, an automatic driving information acquiring section 142, a vehicle occupant operation information acquiring section 143, a “storage section 144”, a communication section 145, and a “control section 146”.
  • The position information acquiring section 140 acquires position information of the vehicle 12 that is measured by the GPS device 24, and can transmit, to the control section 146, a signal that is based on this position information.
  • On the basis of signals S transmitted from a communication section 180 that is described later of the remote operation device 20, the remote operation information acquiring section 141 acquires data relating to control of the actuators 28, and transmits signals based on these data to the control section 146. Further, the remote operation information acquiring section 141 also acquires data of captured images and the like that are acquired at the external sensors 26, and transmits signals based on these data to the communication section 145.
  • On the basis of signals inputted from the input devices 32, the vehicle occupant operation information acquiring section 143 acquires data relating to the amounts of operation by the operator 22, and transmits signals based on these data to the control section 146.
  • The above-described environment information is stored in the storage section 144. First region information, which expresses "automatic driving regions 40" in which the vehicle 12 can be driven automatically and "remote driving regions 42" in which the vehicle 12 can be driven by remote operation, and second region information, which expresses "manual driving regions 44" (see FIG. 4) where only manual driving of the vehicle 12 is permitted, are included in the environment information. Note that the various types of data that are stored in the storage section 144 are transmitted to the control section 146.
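For concreteness, the sketch below assumes each region in the first and second region information is stored as a polygon and shows how the stored regions could be queried for the vehicle's current position. The polygon representation and the rule that manual driving is permitted everywhere are assumptions of the sketch, not statements of the disclosure.

```python
# Illustrative query over the first/second region information; format is assumed.
from typing import List, Tuple

Point = Tuple[float, float]
Polygon = List[Point]

def point_in_polygon(p: Point, poly: Polygon) -> bool:
    """Standard ray-casting test: count crossings of a horizontal ray from p."""
    x, y = p
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside

def permitted_modes(pos: Point,
                    automatic: List[Polygon],
                    remote: List[Polygon],
                    manual: List[Polygon]) -> set:
    """Return which driving modes the stored region information permits at pos.
    Manual driving is assumed to be permitted everywhere; the second region
    information marks the portions where it is the only option."""
    if any(point_in_polygon(pos, r) for r in manual):
        return {"manual"}
    modes = {"manual"}
    if any(point_in_polygon(pos, r) for r in automatic):
        modes.add("automatic")
    if any(point_in_polygon(pos, r) for r in remote):
        modes.add("remote")
    return modes

# Example: a point inside a square automatic driving region permits manual and automatic.
print(permitted_modes((1.0, 1.0),
                      automatic=[[(0, 0), (0, 5), (5, 5), (5, 0)]],
                      remote=[], manual=[]))
```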
  • The communication section 145 receives the signals S transmitted from the remote operation device 20, and transmits them to the remote operation information acquiring section 141. Further, the communication section 145 transmits, to the server 16, the data acquired by the external sensors 26.
  • The automatic driving information acquiring section 142 acquires automatic driving information, i.e., data that is needed for automatic driving of the vehicle 12. Position information of the vehicle 12 that is measured by the GPS device 24, data relating to the peripheral environment of the vehicle 12 that is obtained by the external sensors 26, data relating to the traveling state of the vehicle 12 that is obtained by the internal sensors 30, environment information obtained from the server 16, and the like are included in the information that is acquired by the automatic driving information acquiring section 142. The above-described data that are acquired by the automatic driving information acquiring section 142 are transmitted to the control section 146.
  • On the basis of a signal from the touch panel, the control section 146 transmits the status signal to the remote operation device 20 via the communication section 145. Further, in a case in which the automatic driving mode of the vehicle 12 is selected at the touch panel, on the basis of the destination inputted at the touch panel and the information acquired by the automatic driving information acquiring section 142, a “route 46” (see FIG. 4) along which the vehicle 12 is to travel is set by the control section 146. Further, the control section 146 controls the actuators 28, and automatic driving of the vehicle 12 can be carried out.
  • In a case in which the remote operation mode of the vehicle 12 is selected at the touch panel, the control section 146 controls the actuators 28 on the basis of the signals S from the remote operation device 20 that are received by the communication section 145, and can control the traveling of the vehicle 12.
  • Moreover, in a case in which the manual driving mode of the vehicle 12 is selected at the touch panel, the control section 146 controls the actuators 28 on the basis of signals from the vehicle occupant operation information acquiring section 143, and can control the traveling of the vehicle 12.
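The three control paths just described (automatic driving, remote operation, and manual driving) amount to a dispatch on the selected mode. The sketch below illustrates that dispatch with placeholder command sources; the Actuators interface and the dictionaries standing in for the automatic driving information, the signals S, and the occupant operation amounts are hypothetical.

```python
# Illustrative mode dispatch for the control section 146; all names are assumptions.
from typing import Callable, Dict, Tuple

class Actuators:
    def apply(self, throttle: float, brake: float, steering: float) -> None:
        print(f"throttle={throttle:.2f} brake={brake:.2f} steering={steering:.2f}")

def automatic_command(auto_info: dict) -> Tuple[float, float, float]:
    # command derived from the automatic driving information
    return auto_info.get("throttle", 0.0), auto_info.get("brake", 0.0), auto_info.get("steering", 0.0)

def remote_command(signal_s: dict) -> Tuple[float, float, float]:
    # operation amounts carried by the signals S from the remote operation device 20
    return signal_s["throttle"], signal_s["brake"], signal_s["steering"]

def manual_command(occupant_ops: dict) -> Tuple[float, float, float]:
    # operation amounts from the input devices 32 (accelerator, brake, steering wheel)
    return occupant_ops["throttle"], occupant_ops["brake"], occupant_ops["steering"]

def control_step(mode: str, actuators: Actuators, sources: Dict[str, dict]) -> None:
    handlers: Dict[str, Callable[[dict], Tuple[float, float, float]]] = {
        "automatic": automatic_command,
        "remote": remote_command,
        "manual": manual_command,
    }
    throttle, brake, steering = handlers[mode](sources[mode])
    actuators.apply(throttle, brake, steering)

control_step("manual", Actuators(),
             {"manual": {"throttle": 0.2, "brake": 0.0, "steering": -0.1}})
```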
  • The structure of the server 16 is described next. The server 16 is structured to include a CPU, a ROM, a RAM, a storage and a communication I/F (not illustrated). The CPU, the ROM, the RAM, the storage and the communication I/F are connected via a bus (not illustrated) so as to be able to communicate with one another. Note that the CPU, the ROM, the RAM, the storage and the communication I/F have functions that are basically similar to those of the corresponding components that structure the above-described vehicle control device 14. Further, the server 16 can exhibit various functions due to an execution program that is stored in the ROM being read-out and executed by the CPU.
  • Concretely, the server 16 functions as an aggregate of a server control section 160 and a communication section 161. The server control section 160 has the function of acquiring various information from outside the server 16. Note that, in addition to the above-described environment information, news information and data that is based on the signals S transmitted from the remote operation device 20 are also included in the information acquired by the server control section 160.
  • On the other hand, the communication section 161 receives the signals S transmitted from the remote operation device 20. On the basis of the data acquired at the server control section 160, the communication section 161 transmits the signals S and signals based on various data to the vehicle 12, and transmits signals based on various data to the remote operation device 20.
  • The structure of the remote operation device 20 is described next. As shown in FIG. 1 as well, the remote operation device 20 has the remote control device 18, a monitor 20A and input devices 20B. Note that the input devices 20B of the remote operation device 20 have basically the same structures as the input devices 32.
  • The hardware structures of the remote control device 18 are structures that are basically similar to those of the vehicle control device 14. The remote control device 18 functions as an aggregate of the communication section 180 and a remote operation terminal control section 181. Further, the monitor 20A and the input devices 20B are connected to the remote control device 18 such that communication therebetween is possible.
  • On the basis of signals received from the remote operation terminal control section 181, the communication section 180 transmits, to the server 16, the signals S that are based on the operation amounts of the input devices 20B, and receives, from the server 16, signals that are based on various data. Note that the data transmitted from the server 16 includes the environment information and the data acquired by the external sensors 26 of the vehicle 12, such as the images captured by the cameras.
  • The remote operation terminal control section 181 acquires the data that is detected at the input devices 20B, and transmits the data to the communication section 145 via the communication section 180 and the server 16. Further, on the basis of the data acquired from the communication section 180, the remote operation terminal control section 181 controls the monitor 20A, and can display the images captured by the cameras of the vehicle 12 on the monitor 20A.
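A minimal sketch of the two data paths handled on the remote side follows, assuming a simple JSON encoding that is not specified in the disclosure: operation amounts of the input devices 20B travel toward the server 16 as the signals S, and camera frames coming back are rendered on the monitor 20A.

```python
# Illustrative remote-side data paths; the message format is an assumption.
import json

def encode_signal_s(operation_amounts: dict) -> bytes:
    """Signals S: operation amounts of the input devices 20B, serialized for
    transmission to the server 16."""
    return json.dumps({"type": "signal_s", "ops": operation_amounts}).encode()

def handle_downlink(message: bytes) -> None:
    """Data received from the server 16: camera images and environment information."""
    payload = json.loads(message.decode())
    if payload.get("type") == "camera_frame":
        print(f"monitor 20A: render frame {payload['frame_id']}")
    elif payload.get("type") == "environment":
        print("update environment display:", payload["summary"])

uplink = encode_signal_s({"throttle": 0.1, "brake": 0.0, "steering": 0.05})
handle_downlink(b'{"type": "camera_frame", "frame_id": 42}')
```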
  • Here, the present embodiment is characterized by the point that the automatic driving regions 40, the remote driving regions 42 and the manual driving regions 44 can be displayed on the display 34, and by the conditions for operating the alarm 36.
  • As shown in FIG. 4, due to the control section 146 controlling the display 34 on the basis of the position information of the vehicle 12, the environment information and the like, the relative positional relationships among the vehicle 12, the automatic driving regions 40, the remote driving regions 42 and the manual driving regions 44 are displayed on the display 34. In the present embodiment, as an example, the portions of a road 48 at which only automatic driving and manual driving are permitted, and the portions that run along these portions and are within a predetermined distance thereof, are set as the automatic driving regions 40, and these are shown in FIG. 4 by hatching of a pattern in which a single diagonal line is repeated. Note that, in actuality, the automatic driving regions 40 are marked by a predetermined color on the display surface of the display 34.
  • Further, of the road 48, the portions at which only remote operation driving and manual driving are permitted and the portions, which run along these portions and are within a predetermined distance thereof, are set as the remote driving regions 42, and these are shown in FIG. 4 by hatching of a pattern in which two diagonal lines are repeated. Note that, in actuality, the remote driving regions 42 are marked by a predetermined color, which is other than the color of the automatic driving regions 40, on the display surface of the display 34.
  • Moreover, of the road 48, the portions at which automatic driving, remote operation driving and manual driving are permitted and the portions, which run along these portions and are within a predetermined distance thereof, are shown in FIG. 4 by hatching of a pattern in which three diagonal lines are repeated. Note that, in actuality, these regions are marked by a color, which is a mixture of the color of the automatic driving regions 40 and the color of the remote driving regions 42, on the display surface of the display 34.
  • In addition, of the road 48, the portions at which only manual driving is permitted and the portions, which run along these portions and are within a predetermined distance thereof, are set as the manual driving regions 44, and these are shown by the hatching of the dot pattern. Note that, in actuality, the manual driving regions 44 are marked by a predetermined color, which is other than the colors of the automatic driving regions 40 and the remote driving regions 42, on the display surface of the display 34.
  • Note that, in FIG. 4, the position of the vehicle 12 is shown by pointer P, and the route 46 along which the vehicle 12 travels is the portion shown by the solid lines on the road 48.
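The color scheme described for FIG. 4 can be summarized as a small legend keyed by the driving modes permitted on a road portion. The sketch below uses placeholder colors, since the disclosure only requires that the region types be distinguishable from one another.

```python
# Illustrative color legend for the FIG. 4 style display; the concrete colors
# are placeholders, not colors specified by the disclosure.
REGION_COLORS = {
    frozenset({"automatic"}): "blue",                 # automatic driving regions 40
    frozenset({"remote"}): "green",                   # remote driving regions 42
    frozenset({"automatic", "remote"}): "cyan",       # mixture of the two colors
    frozenset({"manual"}): "red",                     # manual driving regions 44
}

def region_color(modes_other_than_manual: set) -> str:
    """Pick the marking color for a road portion from the non-manual modes
    permitted there; manual-only portions get their own color."""
    key = frozenset(modes_other_than_manual) or frozenset({"manual"})
    return REGION_COLORS.get(key, "red")

print(region_color({"automatic", "remote"}))   # portion where all three modes are permitted
print(region_color(set()))                     # manual-only portion
```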
  • In the present embodiment, when, while the vehicle 12 is in the midst of being driven automatically on the route 46, the vehicle 12 is positioned at a position that is within a predetermined distance of a region where automatic driving is impossible, the alarm 36 issues a warning to the operator 22 on the basis of a control signal from the vehicle control device 14. On the other hand, when, while the vehicle 12 is in the midst of being driven by remote operation on the route 46, the vehicle 12 is positioned at a position that is within a predetermined distance of a region where remote operation driving is impossible, the alarm 36 issues a warning to the operator 22 on the basis of a control signal from the vehicle control device 14.
  • Note that an onboard camera (not illustrated) that monitors the state of the operator 22 is installed within the vehicle cabin 12A. Data of the images captured by this onboard camera and the like are processed at the control section 146. In a case in which the alarm 36 is operated in the midst of automatic driving of the vehicle 12, if the control section 146 judges, from the data from the onboard camera and the like, that a return to manual driving by the operator 22 is impossible, the control section 146 moves the vehicle 12 off to a safe place.
  • On the other hand, in a case in which the alarm 36 is operated in the midst of remote operation driving of the vehicle 12, if it is judged, from the data from the onboard camera and the like, that a return to manual driving by the operator 22 is impossible, the control section 146 transmits a warning from the communication section 145 to the remote operation device 20. In this case, the operator of the remote operation device 20 (the remote operation driver of the vehicle 12) moves the vehicle 12 off to a safe place.
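The alarm condition and the subsequent fallback can be pictured as a distance check followed by a readiness branch, as in the sketch below. The distance threshold, the vertex-based distance approximation, and the readiness test are assumptions of this sketch.

```python
# Illustrative alarm condition and fallback branch; thresholds and helpers are assumed.
import math
from typing import Iterable, Tuple

Point = Tuple[float, float]

def distance_to_boundary(pos: Point, boundary: Iterable[Point]) -> float:
    """Distance from the vehicle to the closest vertex of a region boundary
    (a segment-accurate distance would be used in practice)."""
    return min(math.hypot(pos[0] - x, pos[1] - y) for x, y in boundary)

def should_warn(pos: Point, unavailable_region_boundaries, threshold_m: float = 300.0) -> bool:
    """Operate the alarm 36 when the vehicle comes within a predetermined
    distance of a region where the current driving mode is impossible."""
    return any(distance_to_boundary(pos, b) <= threshold_m
               for b in unavailable_region_boundaries)

def fallback(mode: str, driver_ready: bool) -> str:
    """Branching after the alarm: hand over to the occupant if possible,
    otherwise pull over (automatic) or warn the remote operator (remote)."""
    if driver_ready:
        return "switch to manual driving"
    return "move vehicle to a safe place" if mode == "automatic" else "warn remote operation device 20"

boundary = [(0.0, 0.0), (0.0, 500.0), (500.0, 500.0), (500.0, 0.0)]
print(should_warn((200.0, 700.0), [boundary]))   # True: within 300 m of the boundary
print(fallback("remote", driver_ready=False))
```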
  • Operation and Effects of Present Embodiment
  • Operation and effects of the present embodiment are described next.
  • An example of the flow of the control at the vehicle position sensing system 10 is described hereinafter mainly by using the flowcharts shown in FIG. 5 through FIG. 8.
  • As shown in FIG. 5, when this control flow is started, in step S100, the CPU 14A of the vehicle control device 14 acquires the position of the vehicle 12.
  • In step S101, on the basis of the results of detection in step S100, the CPU 14A displays the position of the vehicle 12 on the display 34.
  • In step S102, the CPU 14A displays the route 46 to the destination on the display 34.
  • In step S103, the CPU 14A displays the automatic driving regions 40, the remote driving regions 42 and the manual driving regions 44 on the display 34.
  • In step S104, the CPU 14A detects the driving mode of the vehicle 12 on the basis of the status signal.
  • In step S105 of FIG. 6, on the basis of the results of detection in step S104, the CPU 14A judges whether or not the driving mode of the vehicle 12 is the automatic driving mode. In a case in which the driving mode of the vehicle 12 is the automatic driving mode (step S105: YES), the CPU 14A moves on to step S106. In a case in which the driving mode of the vehicle 12 is not the automatic driving mode (step S105: NO), the CPU 14A moves on to step S107 of FIG. 7.
  • In step S106, the CPU 14A functions as the automatic driving information acquiring section 142, and acquires the automatic driving information.
  • In step S108, the CPU 14A functions as the control section 146, and, on the basis of the automatic driving information acquired in step S106, controls the actuators 28 and carries out automatic driving of the vehicle 12.
  • In step S109, on the basis of the position information of the vehicle 12 and the environment information, the CPU 14A judges whether or not the position of the vehicle 12 is within a predetermined distance from a region where automatic driving is impossible. If the position of the vehicle 12 is within a predetermined distance from a region where automatic driving is impossible (step S109: YES), the CPU 14A moves on to step S110. If the position of the vehicle 12 is not within a predetermined distance from a region where automatic driving is impossible (step S109: NO), the CPU 14A moves on to step S111.
  • In step S110, the CPU 14A operates the alarm 36.
  • In step S112, the CPU 14A functions as the control section 146, and judges whether or not the operator 22 can return to manual driving. If the operator 22 can return to manual driving (step S112: YES), the CPU 14A moves on to step S113 of FIG. 8. If the operator 22 cannot return to manual driving (step S112: NO), the CPU 14A moves on to step S114.
  • In step S113, the CPU 14A functions as the control section 146, and controls the actuators 28 on the basis of the operations of the operator 22.
  • In step S115, the CPU 14A detects the driving mode of the vehicle 12 on the basis of the status signal, and judges whether or not the manual driving mode has ended. If the manual driving mode is being continued (step S115: NO), the CPU 14A returns to step S113. If the manual driving mode has ended (step S115: YES), the CPU 14A ends the control flow.
  • On the other hand, in a case in which the CPU 14A moves on from step S112 to step S114, in step S114, the CPU 14A functions as the control section 146 and controls the actuators 28, and carries out automatic driving of the vehicle 12 and moves the vehicle 12 off to a safe place, and ends the control flow.
  • On the other hand, in a case in which the CPU 14A moves on from step S109 to step S111, in step S111, the CPU 14A detects the driving mode of the vehicle 12 on the basis of the status signal, and judges whether or not the automatic driving mode has ended. If the automatic driving mode is being continued (step S111: NO), the CPU 14A returns to step S106. If the automatic driving mode has ended (step S111: YES), the CPU 14A ends the control flow.
  • On the other hand, in a case in which the CPU 14A moves on from step S105 to step S107, in step S107, the CPU 14A judges whether or not the driving mode of the vehicle 12 is the remote operation mode. If the driving mode of the vehicle is the remote operation mode (step S107: YES), the CPU 14A moves on to step S116. If the driving mode of the vehicle 12 is not the remote operation mode (step S107: NO), the CPU 14A moves on to step S113 of FIG. 8.
  • In step S116, the CPU 14A functions as the remote operation information acquiring section 141, and acquires remote operation information from the communication section 145.
  • In step S117, the CPU 14A functions as the control section 146, and controls the actuators 28 on the basis of the remote operation information.
  • In step S118, on the basis of the position information of the vehicle 12 and the environment information, the CPU 14A judges whether or not the position of the vehicle 12 is within a predetermined distance from a region where remote operation driving is impossible. If the position of the vehicle 12 is within a predetermined distance from a region where remote operation driving is impossible (step S118: YES), the CPU 14A moves on to step S119. If the position of the vehicle 12 is not within a predetermined distance from a region where remote operation driving is impossible (step S118: NO), the CPU 14A moves on to step S120.
  • In step S119, the CPU 14A operates the alarm 36.
  • In step S121, the CPU 14A carries out the same processing as in step S112. Then, if the operator 22 can return to manual driving (step S121: YES), the CPU 14A moves on to step S113 of FIG. 8. If the operator 22 cannot return to manual driving (step S121: NO), the CPU 14A moves on to step S122.
  • In the case of moving on from step S121 to step S122, the CPU 14A functions as the control section 146, and transmits a warning from the communication section 145 toward the remote operation device 20. Then, the operator of the remote operation device 20 moves the vehicle 12 off to a safe place and ends the remote operation driving, and the control flow ends.
  • On the other hand, in the case of moving on from step S118 to step S120, in step S120, the CPU 14A detects the driving mode of the vehicle 12 on the basis of the status signal, and judges whether or not the remote operation mode has ended. If the remote operation mode is continuing (step S120: NO), the CPU 14A returns to step S116. If the remote operation mode has ended (step S120: YES), the CPU 14A ends the control flow.
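Read together, the flowcharts of FIG. 5 through FIG. 8 describe one control pass per selected mode with a shared hand-over branch. The skeleton below mirrors that structure with stubbed helpers and keeps the step numbers as comments; it is a reading aid under assumed helper names, not the on-board program.

```python
# Compact sketch of the control flow of FIG. 5 through FIG. 8. Every helper is a
# hypothetical stub so the skeleton runs on its own.

def acquire_position():            return (35.0, 139.0)                 # S100
def show_on_display(*items):       print("display:", items)             # S101-S103
def detect_driving_mode():         return "automatic"                   # S104 (status signal)
def acquire_automatic_info():      return {"plan": "follow route 46"}   # S106
def drive_automatically(info):     pass                                 # S108
def acquire_remote_info():         return {"ops": {}}                   # S116
def drive_by_remote(info):         pass                                 # S117
def near_unavailable_region():     return False                         # S109 / S118
def operate_alarm():               print("alarm 36")                    # S110 / S119
def driver_can_take_over():        return True                          # S112 / S121
def drive_manually():              pass                                 # S113
def mode_ended(mode):              return True                          # S111 / S115 / S120

def control_flow() -> None:
    pos = acquire_position()                                  # S100
    show_on_display(pos)                                      # S101
    show_on_display("route 46")                               # S102
    show_on_display("regions 40 / 42 / 44")                   # S103
    mode = detect_driving_mode()                              # S104

    if mode == "automatic":                                   # S105: YES
        while True:
            drive_automatically(acquire_automatic_info())     # S106, S108
            if near_unavailable_region():                     # S109
                operate_alarm()                               # S110
                if driver_can_take_over():                    # S112: YES
                    break                                     # -> manual branch (S113)
                print("pull over to a safe place")            # S114
                return
            if mode_ended("automatic"):                       # S111
                return
    elif mode == "remote":                                    # S107: YES
        while True:
            drive_by_remote(acquire_remote_info())            # S116, S117
            if near_unavailable_region():                     # S118
                operate_alarm()                               # S119
                if driver_can_take_over():                    # S121: YES
                    break                                     # -> manual branch (S113)
                print("warn remote operation device 20")      # S122
                return
            if mode_ended("remote"):                          # S120
                return

    while True:                                               # manual branch
        drive_manually()                                      # S113
        if mode_ended("manual"):                              # S115
            return

control_flow()
```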
  • As described above, in the present embodiment, the operator 22 of the vehicle 12 can perceive regions in which manual driving is required, regions in which automatic driving of the vehicle 12 is possible, and regions in which remote operation driving of the vehicle 12 is possible.
  • Further, in the present embodiment, on the route 46 to the destination of the vehicle 12, the operator 22 can perceive the manual driving regions 44, the automatic driving regions 40 and the remote driving regions 42.
  • In the present embodiment, the operator 22 can, without looking at the display 34, know that the vehicle 12 is approaching the manual driving region 44.
  • Second Embodiment
  • A “vehicle position sensing system 50” relating to a second embodiment of the present invention is described hereinafter by using FIG. 9. Note that structural portions that are the same as those of the above-described first embodiment are denoted by the same reference numerals, and description thereof is omitted.
  • In the present embodiment, an “organic EL display 54” that serves as the display portion is affixed to the “windshield glass 52” of the vehicle 12 along a surface thereof on the vehicle cabin 12A inner side. This organic EL display 54 is transparent, and can display various images by being driven by signals outputted from the vehicle control device 14.
  • Further, in the present embodiment, in a state in which the vehicle 12 is positioned on the route 46, due to the organic EL display 54 being controlled by the control section 146, a virtual wall portion 56 that runs along the automatic driving region 40, a virtual wall portion 58 that runs along the remote driving region 42, and a virtual wall portion 60 that runs along the manual driving region 44 can be displayed on the organic EL display 54 so as to overlap the scene that can be seen from the windshield glass 52.
  • In accordance with such a structure, the operator 22 can see the manual driving regions 44, the automatic driving regions 40 and the remote driving regions 42 without looking away from the vehicle front side.
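One way to picture the virtual wall portions 56, 58 and 60 is to tag each segment of the route 46 with the region type it passes through and let the organic EL display 54 draw a wall of the matching appearance alongside that segment. The sketch below stubs the region lookup and uses hypothetical names throughout.

```python
# Illustrative derivation of the virtual wall segments; the region lookup is a stub.
from typing import List, Tuple

Point = Tuple[float, float]

def region_type_at(p: Point) -> str:
    """Stub standing in for a lookup into the first/second region information."""
    return "automatic" if p[0] < 100.0 else "manual"

def wall_segments(route: List[Point]) -> List[Tuple[Point, Point, str]]:
    """Split the route 46 into segments, each tagged for wall portions 56/58/60."""
    segments = []
    for a, b in zip(route, route[1:]):
        segments.append((a, b, region_type_at(a)))
    return segments

route_46 = [(0.0, 0.0), (50.0, 10.0), (120.0, 20.0), (200.0, 30.0)]
for start, end, kind in wall_segments(route_46):
    print(f"wall along {start} -> {end}: {kind} region")
```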
  • <Supplementary Description of Above-Described Embodiments>
    • (1) In the above-described embodiments, the manual driving regions 44, the automatic driving regions 40 and the remote driving regions 42 can be displayed on the display portion. However, display may be carried out such that the manual driving regions 44, and either one of the automatic driving regions 40 and the remote driving regions 42, are displayed on the display portion. Namely, the automatic driving regions 40 or the remote driving regions 42 may be included in the first region information.
    • (2) The warning that the alarm 36 issues may be a voice to the operator 22, or light emitted toward the operator 22, or vibrations to the operator 22.

Claims (6)

What is claimed is:
1. A vehicle position sensing system comprising:
a position information acquiring section that acquires position information of a vehicle;
a storage section that stores first region information, which expresses at least one type of region among automatic driving regions where automatic driving of the vehicle is possible and remote driving regions where remote operation driving of the vehicle is possible, and second region information that expresses manual driving regions where only manual driving of the vehicle is permitted;
a display portion that can be viewed by an occupant of the vehicle; and
a control section that, based on the position information, the first region information and the second region information, causes the display portion to display relative positional relationships among the vehicle, the manual driving regions and the at least one type of region.
2. The vehicle position sensing system of claim 1, wherein:
the automatic driving regions and the remote driving regions are included in the first region information, and
the display portion can display the automatic driving regions and the remote driving regions.
3. The vehicle position sensing system of claim 1, wherein the relative positional relationships among a position of the vehicle, the manual driving regions and the at least one type of region can be displayed in a planar form on the display portion.
4. The vehicle position sensing system of claim 3, wherein a route of the vehicle to a destination, the manual driving regions, and the at least one type of region, can be displayed on the display portion.
5. The vehicle position sensing system of claim 4, wherein the vehicle has a notification section that, in a state in which the vehicle is positioned in the at least one type of region on the route, issues a warning to an occupant of the vehicle when the vehicle is positioned at a position that is within a predetermined distance from the manual driving regions.
6. The vehicle position sensing system of claim 4, wherein, in a state in which the vehicle is positioned on the route, the control section can display, on a windshield glass of the vehicle, three-dimensional objects that enable an occupant of the vehicle to identify the manual driving regions and the at least one type of region, such that the three-dimensional objects run along the route and overlap a scene that can be seen from the windshield glass.
US16/889,169 2019-07-23 2020-06-01 Vehicle position sensing system Active 2041-01-08 US11535279B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2019-135662 2019-07-23
JPJP2019-135662 2019-07-23
JP2019135662A JP7293945B2 (en) 2019-07-23 2019-07-23 Vehicle position detection system

Publications (2)

Publication Number Publication Date
US20210024098A1 true US20210024098A1 (en) 2021-01-28
US11535279B2 US11535279B2 (en) 2022-12-27

Family

ID=74189193

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/889,169 Active 2041-01-08 US11535279B2 (en) 2019-07-23 2020-06-01 Vehicle position sensing system

Country Status (3)

Country Link
US (1) US11535279B2 (en)
JP (1) JP7293945B2 (en)
CN (1) CN112298198A (en)

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3281188B2 (en) * 1994-08-09 2002-05-13 ヤマハ発動機株式会社 Unmanned car
JP2008051644A (en) * 2006-08-24 2008-03-06 Mazda Motor Corp Navigation system for vehicle
WO2009122372A2 (en) * 2008-04-04 2009-10-08 Koninklijke Philips Electronics N.V. A display managing system
JP5622818B2 (en) * 2012-09-28 2014-11-12 富士重工業株式会社 Gaze guidance system
JP6318741B2 (en) 2014-03-18 2018-05-09 アイシン・エィ・ダブリュ株式会社 Automatic driving support system, automatic driving support method, and computer program
JP6273976B2 (en) * 2014-03-31 2018-02-07 株式会社デンソー Display control device for vehicle
JP6201916B2 (en) 2014-07-04 2017-09-27 株式会社デンソー Vehicle operation mode control device
WO2016151750A1 (en) * 2015-03-24 2016-09-29 パイオニア株式会社 Map information storage device, automatic drive control device, control method, program, and storage medium
JP6474307B2 (en) 2015-04-27 2019-02-27 アイシン・エィ・ダブリュ株式会社 Automatic driving support system, automatic driving support method, and computer program
JP6381826B2 (en) * 2015-10-30 2018-08-29 三菱電機株式会社 Vehicle information display control device
US9964948B2 (en) 2016-04-20 2018-05-08 The Florida International University Board Of Trustees Remote control and concierge service for an autonomous transit vehicle fleet
JP6375568B2 (en) 2016-04-28 2018-08-22 本田技研工業株式会社 Vehicle control system, vehicle control method, and vehicle control program
JP6771566B2 (en) * 2016-08-05 2020-10-21 三菱電機株式会社 Operation authority management device
JP6923306B2 (en) 2016-11-09 2021-08-18 株式会社野村総合研究所 Vehicle driving support system
CN109891473A (en) 2016-11-11 2019-06-14 本田技研工业株式会社 Controller of vehicle, vehicle control system, control method for vehicle and vehicle control program
JP2018076027A (en) * 2016-11-11 2018-05-17 本田技研工業株式会社 Vehicle control system, vehicle control method, and vehicle control program
CN110249374B (en) * 2017-02-09 2022-07-26 索尼半导体解决方案公司 Driving assistance device, method for the same, and driving assistance system
JP6729530B2 (en) 2017-10-13 2020-07-22 株式会社デンソー Route setting device
WO2019077739A1 (en) * 2017-10-20 2019-04-25 株式会社日立製作所 Moving body control system
JP2019093879A (en) * 2017-11-22 2019-06-20 クラリオン株式会社 Driving support device and driving support method
US10401182B2 (en) * 2017-12-13 2019-09-03 Google Llc Systems and methods for avoiding location-dependent driving restrictions
JP6991053B2 (en) 2017-12-14 2022-01-12 フォルシアクラリオン・エレクトロニクス株式会社 In-vehicle device, information presentation method
JP2019113520A (en) 2017-12-22 2019-07-11 株式会社デンソー Onboard information display device and used language estimating device
CN108613683A (en) * 2018-06-26 2018-10-02 威马智慧出行科技(上海)有限公司 On-vehicle navigation apparatus, method and automobile
JP7224998B2 (en) 2019-03-28 2023-02-20 日産自動車株式会社 Information processing method, information processing device, and information processing system

Also Published As

Publication number Publication date
JP2021018209A (en) 2021-02-15
CN112298198A (en) 2021-02-02
JP7293945B2 (en) 2023-06-20
US11535279B2 (en) 2022-12-27

Legal Events

Date Code Title Description
AS Assignment

Owner name: TOYOTA JIDOSHA KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ITO, YUKI;NAKANISHI, TSUKASA;MORIKAWA, YUTA;AND OTHERS;SIGNING DATES FROM 20200304 TO 20200316;REEL/FRAME:052802/0086

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE