US20200265250A1 - Driving support system and server device - Google Patents
- Publication number
- US20200265250A1 (application US16/706,898)
- Authority
- US
- United States
- Prior art keywords
- vehicle
- traffic light
- position information
- visual
- recognition
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G06K9/00825
- G06K9/00818
- B60W50/14 — Means for informing the driver, warning the driver or prompting a driver intervention
- B60W30/18154 — Approaching an intersection
- B60W2050/143 — Alarm means
- B60W2420/403 — Image sensing, e.g. optical camera
- B60W2555/60 — Traffic rules, e.g. speed limits or right of way
- B60W2556/25 — Data precision
- B60W2556/45 — External transmission of data to or from the vehicle
- B60W2556/50 — External transmission of positioning data, e.g. GPS data
- G05D1/0246 — Control of position or course in two dimensions specially adapted to land vehicles, using a video camera in combination with image processing means
- G06V20/582 — Recognition of traffic objects: traffic signs
- G06V20/584 — Recognition of traffic objects: vehicle lights or traffic lights
- G08G1/123 — Traffic control systems for road vehicles indicating the position of vehicles
- G08G1/09623 — Systems involving the acquisition of information from passive traffic signs by means mounted on the vehicle
- G08G1/09626 — Variable traffic instructions where the origin of the information is within the own vehicle, e.g. a local storage device, digital map
- G08G1/0967 — Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/0968 — Systems involving transmission of navigation instructions to the vehicle
- H04W4/40 — Services specially adapted for vehicles, e.g. vehicle-to-pedestrians [V2P]
Definitions
- the present disclosure relates to a driving support system and a server device.
- JP 2018-5827 A describes a driving support system configured to execute lane-changing without burdening the driver. More specifically, JP 2018-5827 A describes a technique in which the state the host vehicle would be in if lane-changing were performed is predicted in advance, and when it is determined, based on the height of a traffic light, the size of a forward vehicle, or the like, that it would be difficult to visually recognize the traffic light, the driver is warned so as to prompt cancellation of the lane-changing.
- the difficulty in visual recognition of a traffic light also occurs in cases other than lane-changing.
- in a case where a road winds or changes in height as in a mountainous area, where a large-size vehicle temporarily travels ahead of the host vehicle, or where a power line or the like is under construction, the field of view ahead of the host vehicle decreases, so that it might become difficult to visually recognize a traffic light.
- an object of the present disclosure is to provide a driving support system and a server device, each of which can accurately grasp a position where a traffic light is visually recognizable and accurately notify a driver or the like that a traffic light that should originally be visually recognizable is not visually recognizable.
- the driving support system includes an acquisition portion, an image acquisition portion, a traffic-light recognition portion, and a notification portion.
- the acquisition portion is configured to acquire visual-recognition position information on a position where a driver of a vehicle visually recognizes a traffic light.
- the image acquisition portion is configured to acquire a forward image ahead of the vehicle.
- the traffic-light recognition portion is configured to recognize a traffic light included in the forward image.
- the notification portion is configured to notify the driver with a warning when the traffic light is not recognized from the forward image while the vehicle is present at a position based on the visual-recognition position information.
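The interplay among these portions reduces to a single condition, sketched below as a minimal illustration (the function and parameter names are hypothetical, not taken from the disclosure):

```python
def should_warn(at_visual_recognition_position: bool,
                traffic_light_recognized: bool) -> bool:
    """Warn only when the vehicle stands where the traffic light should be
    visible, yet no traffic light is recognized in the forward image."""
    return at_visual_recognition_position and not traffic_light_recognized

# At the visual-recognition position, light not recognized -> warn.
assert should_warn(True, False)
# Light recognized, or vehicle elsewhere -> no warning.
assert not should_warn(True, True)
assert not should_warn(False, False)
```

The condition is deliberately one-sided: a light that is visible but unrecognized triggers the warning, while the mere absence of a light elsewhere on the route does not.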
- the driving support system may further include an output portion configured to output, to an external device, route information on a route where the vehicle is planned to travel, and a second acquisition portion configured to acquire pieces of visual-recognition position information of a plurality of traffic lights on the route indicated by the route information. Further, the driving support system may further include a second output portion configured to output, to an external device, position information on a position of the vehicle where the traffic light is recognized by the traffic-light recognition portion. Further, the present disclosure may be applied to a road sign instead of the traffic light.
- the visual-recognition position information may be latitude information and longitude information indicative of a position where the traffic light is visually recognizable, or it may be relative position information based on a predetermined intersection or the like. Further, the visual-recognition position information may be information indicating that the present vehicle position is a visual-recognition position where the traffic light is visually recognizable. In this case, the information to be received may be a single bit. In the driving support system, when this information is received, the position information of the vehicle at that time can be acquired as the position where the traffic light is visually recognizable.
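The alternative encodings above can be illustrated as follows; this is a hypothetical sketch, and the field and function names do not come from the disclosure:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class VisualRecognitionInfo:
    """One of the encodings discussed above (all field names hypothetical)."""
    lat: Optional[float] = None   # absolute latitude of the visible position
    lon: Optional[float] = None   # absolute longitude of the visible position
    here_flag: bool = False       # one-bit "current position is a visible position"

def resolve_position(info: VisualRecognitionInfo,
                     current_position: Tuple[float, float]) -> Tuple[float, float]:
    """If absolute coordinates were sent, use them; if only the one-bit flag
    arrived, adopt the vehicle's own position at reception time."""
    if info.lat is not None and info.lon is not None:
        return (info.lat, info.lon)
    if info.here_flag:
        return current_position
    raise ValueError("no usable visual-recognition information")

assert resolve_position(VisualRecognitionInfo(lat=35.68, lon=139.76),
                        (0.0, 0.0)) == (35.68, 139.76)
assert resolve_position(VisualRecognitionInfo(here_flag=True),
                        (35.0, 135.0)) == (35.0, 135.0)
```

The one-bit variant keeps the transmission minimal because the receiving side already knows its own GPS position.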
- the server device includes an acquisition portion and a receiving portion.
- the acquisition portion is configured to acquire, from a plurality of vehicles, pieces of visual-recognition position information on respective positions where drivers of the vehicles visually recognize a traffic light.
- the receiving portion is configured to receive, from the vehicles, pieces of position information on respective positions of the vehicles where the traffic light is recognized.
- the server device may transmit visual-recognition position information to the vehicle.
- the server device may acquire vehicle-type information of the vehicle and transmit, to the vehicle, visual-recognition position information corrected in accordance with the vehicle type.
- the server device may calculate the visual-recognition position information based on the pieces of position information of the vehicles that are received from the vehicles.
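One way such a calculation might work, purely as an illustration, is to aggregate the per-vehicle reports of the distance from the intersection at which the traffic light was first recognized and take a robust statistic such as the median:

```python
from statistics import median

def estimate_visible_distance(reported_distances_m):
    """Aggregate per-vehicle reports of the distance (from the intersection
    node) at which the traffic light was first recognized. The median keeps
    the estimate robust against occasional occlusions or misdetections."""
    if not reported_distances_m:
        raise ValueError("no reports for this traffic light and route")
    return median(reported_distances_m)

# One vehicle was temporarily blocked (64 m); the median ignores the outlier.
assert estimate_visible_distance([118.0, 120.0, 64.0, 122.0, 119.0]) == 119.0
```

An average would be pulled down by a single occluded report, which is why a median (or a trimmed mean) is the more natural choice here.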
- the server device may be associated with a predetermined traffic light and placed under a road surface or the like near the traffic light. Accordingly, a respective server device may be provided for each of a plurality of traffic lights. In this case, transmission and reception of information may be performed between the vehicle and the server device by road-to-vehicle communication.
- FIG. 1 is a block diagram illustrating a schematic hardware configuration of a vehicle 100 ;
- FIG. 2 is a schematic view of a server device 200 and a plurality of vehicles 100 communicable with the server device 200 ;
- FIG. 3 is a flowchart illustrating a driving support method according to the present embodiment;
- FIG. 4 is a conceptual diagram to describe visual-recognition position information;
- FIG. 5A is a schematic view of a forward image;
- FIG. 5B is a schematic view of a forward image; and
- FIG. 6 is a flowchart illustrating a driving support method according to a second embodiment.
- FIG. 1 is a block diagram illustrating a schematic hardware configuration of a vehicle 100 .
- FIG. 2 illustrates a system including a server device 200 connected to a plurality of vehicles 100 via a network N. Note that, when a specific vehicle 100 is mentioned, it is referred to as a vehicle 100 A, a vehicle 100 B, or the like, and when the vehicles 100 are generally mentioned, they are just referred to as the vehicles 100 .
- the vehicle 100 includes a control device 110 and a communications device 120 , a sensor device 130 , a radar device 140 , a camera device 150 , a navigation device 160 , a driving device 170 , and an input-output device 180 that are connected to the control device 110 via a bus or the like.
- the control device 110 receives predetermined signals from the devices connected thereto, performs a computing process or the like, and outputs control signals to drive the devices.
- the control device 110 includes a processor 110 A and a memory 110 B.
- the control device 110 can function as a driving support system according to the present embodiment by the processor 110 A executing a computer program stored in the memory 110 B.
- the processor 110 A executes a predetermined computing process in accordance with a computer program such as firmware stored in the memory 110 B.
- the processor 110 A is implemented by one or more of a central processing unit (CPU), a micro processing unit (MPU), a graphics processing unit (GPU), a microprocessor, a processor core, a multiprocessor, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), and so on.
- the memory 110 B includes a nonvolatile memory such as an MRAM, a NAND flash memory, a NOR flash memory, an SSD, or a hard disk drive, and a volatile memory such as an SRAM or a DRAM.
- the nonvolatile memory corresponds to a non-transitory tangible medium.
- the volatile memory provides a working area in which a computer program loaded from the nonvolatile memory and various pieces of data generated while the processor 110 A executes a computer program are temporarily stored. Note that a computer program or data acquired from the communications device 120 may be stored in the nonvolatile memory.
- the communications device 120 includes means for transmitting and receiving information to and from an external device such as the server device 200 , and includes one or more communication means such as Wi-Fi (a wireless communication method based on the IEEE 802.11 standard).
- the external device may be other vehicles 100 or may be infrastructure equipment provided under a road surface or in a power pole, a building, or the like. Further, the communications device 120 receives a GPS signal and outputs position information of the vehicle 100 to the control device 110 .
- the sensor device 130 is a sensor configured to detect the behavior of the vehicle 100 and includes a rotary encoder configured to detect the vehicle speed of the vehicle and a gyro sensor configured to detect the inclination of the vehicle. Further, the sensor device 130 may include a magnetic sensor configured to detect a marker or the like embedded in a road.
- the radar device 140 includes ranging systems such as a LiDAR and a millimeter-wave radar so as to avoid a collision with a pedestrian or the like.
- the camera device 150 includes a plurality of cameras including an imaging sensor such as a CCD or CMOS image sensor so as to capture images ahead of and behind the vehicle 100 .
- the control device 110 can receive signals acquired by the sensor device 130 , the radar device 140 , and the camera device 150 and output a control signal based on them to a corresponding device.
- the control device 110 can acquire an imaging signal of an image captured by the camera device 150 and execute image recognition to recognize an obstacle or the like included in the captured image; based on the result, the control device 110 can output, to the driving device 170 , a control signal to stop the vehicle 100 , for example.
- the camera device 150 may be equipped with a semiconductor IC for image processing, such as a GPU, that enables image recognition or the like. In that case, the camera device 150 itself recognizes a driving lane where the vehicle 100 should travel, or an obstacle such as a pedestrian, based on an image captured by a camera of the camera device 150 , and outputs information on the driving lane or the obstacle to the control device 110 .
- the navigation device 160 calculates a route to a predetermined destination based on an input from a driver or the like and performs route guidance.
- the navigation device 160 may include a nonvolatile memory (not shown) and store map data in the nonvolatile memory.
- the navigation device 160 may acquire map data stored in the memory 110 B or may acquire map data from the communications device 120 .
- the map data includes information on road types and information about road signs, traffic lights, and the like. Further, the map data includes position information on specific points, called nodes, such as a facility, an address, or an intersection of roads, and information on links, each corresponding to a road that connects nodes to each other.
- the position information is indicated by latitude, longitude, and altitude, for example.
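The node-and-link model described above might be represented, for illustration only (the field names are not from the disclosure), as:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Node:
    """A specific point in the map data: a facility, address, or intersection."""
    node_id: str
    lat: float    # latitude
    lon: float    # longitude
    alt_m: float  # altitude in meters

@dataclass(frozen=True)
class Link:
    """A road segment connecting two nodes."""
    link_id: str
    start_node: str  # Node.node_id at one end
    end_node: str    # Node.node_id at the other end
    road_type: str   # e.g. "local", "highway"

# A route is then a sequence of links chained through shared nodes.
route = [Link("L1", "N1", "N2", "local"), Link("L2", "N2", "N3", "local")]
assert route[0].end_node == route[1].start_node
```

This chained-link representation is what the route information of step S 302 would naturally carry from vehicle to server.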
- a processor configured to calculate a route may be provided in the navigation device 160 , or the processor 110 A may execute the calculation instead.
- current position information of the vehicle 100 may be acquired either from the control device 110 , based on a GPS signal received by the communications device 120 , or by the navigation device 160 itself receiving a GPS signal.
- the navigation device 160 may be constituted by an information processing terminal owned by the driver or the like.
- the information processing terminal may be connected to the vehicle 100 by a Bluetooth (registered trademark) device or the like of the communications device 120 so that route guidance information or the like to give route guidance is output from the input-output device 180 of the vehicle 100 .
- the driving device 170 includes a motor and an actuator for operations of an engine, a brake, and a steering wheel of the vehicle 100 and operates based on a control signal received from the control device 110 .
- the vehicle 100 may be configured such that the control device 110 outputs a control signal to the driving device 170 or the like based on operations of an accelerator pedal, a brake pedal, the steering wheel, and the like by the driver or the like. Alternatively, the vehicle 100 may have an automatic driving function that outputs, from the control device 110 to the driving device 170 or the like, a control signal to autonomously drive the vehicle 100 based on signals acquired from the radar device 140 , the camera device 150 , and the like.
- the vehicle 100 may be an electric vehicle including a battery and an electric motor.
- the input-output device 180 includes an input device, such as a touch panel or a microphone, via which the driver or the like inputs information into the vehicle 100 , and speech recognition software.
- the input-output device 180 is configured to receive information necessary to control the vehicle 100 based on a pressing operation on the touch panel by the driver or an utterance made by the driver.
- the input-output device 180 includes an output device such as a liquid crystal display, a HUD, or other displays configured to output image information and one or more speakers configured to output voice information.
- FIG. 2 illustrates the server device 200 of the present embodiment and many vehicles from a vehicle 100 A to a vehicle 100 N connected to the server device 200 via the communication network N.
- the server device 200 includes a processor 200 A and a memory 200 B.
- the configurations of the processor 200 A and the memory 200 B can be achieved by configurations similar to those of the processor 110 A and the memory 110 B, so detailed descriptions thereof are omitted herein.
- the memory 200 B stores computer programs for the various computing processes, described in this disclosure, that are executed by the server device 200 .
- each vehicle 100 is provided with the driving support system of the present embodiment.
- the communication network N may be the Internet, a LAN, a mobile communication network, Bluetooth (registered trademark), wireless fidelity (Wi-Fi), other communication lines, or a combination of any of them.
- At least a part of the server device 200 may be implemented by cloud computing constituted by one or more computers.
- at least some of the processes in the control device 110 may be executed by the server device 200 .
- the following describes the driving support method according to the present embodiment, with reference to FIG. 3 .
- the navigation device 160 of the vehicle 100 calculates and determines a route from a predetermined start point to a predetermined goal point (step S 301 ).
- the start point may be input by the driver or the like by use of the input-output device 180 , or a current position of the vehicle 100 may be set as the start point based on the GPS signal received by the communications device 120 .
- the navigation device 160 may calculate a plurality of candidate routes from the start point to the goal point and show the candidate routes to the driver or the like, and the driver or the like may determine the route.
- when the route is determined, the control device 110 outputs route information to the server device 200 by use of the communications device 120 (step S 302 ).
- the route information includes information on a plurality of links that connect nodes from a departure place to a destination.
- the processor 200 A of the server device 200 receives the route information from the vehicle 100 (step S 303 ).
- the processor 200 A of the server device 200 reads, from the memory 200 B, pieces of visual-recognition position information indicative of positions where the traffic lights provided on the route can be visually recognized (step S 304 ).
- the processor 200 A of the server device 200 stores visual-recognition position information per traffic light and per route to the traffic light in the nonvolatile memory of the memory 200 B in advance.
- the processor 200 A and the memory 200 B may not necessarily be placed at the same position.
- the processor 200 A may read visual-recognition position information from the memory 200 B provided in an area near the route.
- a plurality of memories 200 B may be provided in a dispersed manner.
- FIG. 4 is a conceptual diagram to describe visual-recognition position information stored in the memory 200 B of the server device 200 .
- a traffic light S is provided at an intersection where three routes R 1 to R 3 intersect with each other
- a position where the driver can visually recognize the traffic light S is a position P 1 distanced from a node corresponding to the intersection by a distance D 1 .
- the position where the driver can visually recognize the traffic light S is a position P 2 distanced from the intersection by a distance D 2 .
- the position where the driver can see the traffic light S is a position P 3 distanced from the intersection by a distance D 3 .
- the position where the traffic light can be visually recognized may be different depending on the route where the vehicle 100 A travels. For example, in a case where a route from the position P 2 to the intersection where the traffic light S is provided is a downhill slope, the distance D 2 becomes long in comparison with a case where the route is not a downhill slope. Meanwhile, in a case where a route from the position P 1 to the same intersection is an uphill slope, the distance D 1 becomes short in comparison with a case where the route is not an uphill slope.
- in a case where a structure such as a railroad bridge is present on the route R 3 , the traffic light S may be visually recognizable only when the vehicle 100 A passes through the railroad bridge. In such a situation, the distance D 3 may become extremely short. Note that the traffic light S refers to a traffic light that the vehicle 100 traveling on the route should visually recognize. Because of this, even at the same intersection, the traffic light that should be visually recognized may differ depending on the route where the vehicle 100 travels.
- the processor 200 A of the server device 200 can determine that the vehicle 100 A approaches the intersection where the traffic light S is provided from the route R 1 among the route R 1 to the route R 3 , for example, based on route information received from the vehicle 100 A. Accordingly, the processor 200 A of the server device 200 can transmit, to the vehicle 100 A, information including latitude and longitude information of the position P 1 as visual-recognition position information indicative of a position where the traffic light S is visually recognizable. Note that the visual-recognition position information is not limited only to the information indicative of the position P 1 such as the latitude and longitude information of the position P 1 .
- information indicative of one or more links included between the position P 1 and the node of the intersection may be transmitted as the visual-recognition position information.
- information indicative of the distance D 1 based on the node of the intersection may be transmitted as the visual-recognition position information.
- the processor 200 A of the server device 200 transmits, to the vehicle 100 A, a plurality of pieces of visual-recognition position information corresponding to a plurality of traffic lights through which the vehicle 100 A is to pass.
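Since visibility differs per approach route (as FIG. 4 illustrates), the server's lookup is plausibly keyed by both traffic light and route. A hypothetical sketch, with made-up identifiers and distances:

```python
# Hypothetical store: visual-recognition info keyed per traffic light AND per
# approach route, since the visible position differs by route (FIG. 4).
STORE = {
    ("S", "R1"): {"distance_m": 80.0},
    ("S", "R2"): {"distance_m": 150.0},  # downhill approach: visible earlier
    ("S", "R3"): {"distance_m": 20.0},   # railroad bridge: visible very late
}

def info_for_route(route_links, store=STORE):
    """Collect visual-recognition info for every traffic light the route
    passes, selecting only entries that match the approach route."""
    return [store[key] for key in store if key[1] in route_links]

assert info_for_route(["R1"]) == [{"distance_m": 80.0}]
```

In the flow of FIG. 3, the result of such a lookup is the bundle of visual-recognition position information transmitted to the vehicle in step S 304 and following.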
- the control device 110 causes the communications device 120 to receive, from the server device 200 , the pieces of visual-recognition position information corresponding to the traffic lights provided on the route (step S 305 ) and stores the pieces of visual-recognition position information in the memory 110 B.
- the vehicle 100 A starts traveling.
- the camera device 150 takes a forward image ahead of the vehicle 100 A at a predetermined cycle and outputs it to the control device 110 .
- the control device 110 recognizes a traffic light from the received forward image by use of a technique such as image recognition. Accordingly, the control device 110 of the vehicle 100 A can repeatedly execute, at the predetermined cycle, a step of determining whether a traffic light is present based on the forward image. Note that the presence of a pedestrian, an obstacle, or the like other than a traffic light can also be recognized in a similar manner, and a control signal can be output to the driving device 170 as needed.
- the control device 110 can output a control signal to stop the vehicle 100 A to the driving device 170 .
- the camera device 150 may acquire a moving image as the forward image.
- the camera device 150 may include a GPU or the like for image recognition, so that the camera device 150 itself can recognize a traffic light or the like.
- the vehicle 100 A does not need to always recognize a traffic light or the like and may be configured to recognize a traffic light or the like only in a predetermined case.
- the control device 110 is configured to cyclically determine whether the vehicle 100 A is placed at a visual-recognition position or not, while the vehicle 100 A is traveling (step S 306 ).
- latitude and longitude information of a position where a traffic light is visually recognizable is received as the visual-recognition position information
- when the vehicle 100 A is present between this position and the position of the node of the corresponding intersection, or a position sufficiently close to the node of the corresponding intersection (e.g., several meters before the intersection), it is determined that the vehicle 100 A is placed at the visual-recognition position.
- link information is received as the visual-recognition position information
- the control device 110 determines whether the vehicle 100 A is present at the visual-recognition position or not, based on whether the vehicle 100 A travels on the link or not.
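The two determination variants above (latitude/longitude-based and link-based) can be sketched, for the coordinate case, roughly as follows. This is a minimal illustration under stated assumptions, not the disclosed implementation: the function names and the 5-meter stop margin are invented, and a real system would also confirm that the vehicle is on the relevant route rather than merely within range of the node.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def at_visual_recognition_position(vehicle, p1, node, stop_margin_m=5.0):
    """True when the vehicle lies between the visual-recognition
    position P1 and a point just before the intersection node.
    vehicle, p1, node are (lat, lon) tuples; P1 and the node come
    from the visual-recognition position information of step S305.
    """
    d1 = haversine_m(*p1, *node)               # distance D1 in FIG. 4
    d_vehicle = haversine_m(*vehicle, *node)   # vehicle's remaining distance
    # The vehicle is "at the visual-recognition position" while its
    # distance to the node is at most D1 but still beyond the margin.
    return stop_margin_m <= d_vehicle <= d1
```

The link-based variant reduces to a set-membership test: the vehicle is at the visual-recognition position while its current link is one of the links received in step S305.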
- control device 110 determines whether the traffic light is recognized from a forward image or not (step S 307 ).
- FIGS. 5A and 5B are views schematically illustrating forward images ahead of the vehicle 100 A, taken by the camera device 150 .
- the driver can visually recognize the traffic light S and a stop-line L.
- the control device 110 can recognize the traffic light S and the stop-line L from the forward image.
- the field of view is blocked by the large-size vehicle, so that the driver and the control device 110 cannot recognize the traffic light S and the stop-line L from the forward image.
- the driver falls into such a state that, although the traffic light S is present ahead of the vehicle 100 A, the driver does not recognize the presence of the traffic light S.
- when the traffic light S changes from yellow light to red light and the large-size vehicle passes through the intersection at the last minute, for example, the driver of the vehicle 100 A might follow the large-size vehicle and attempt to pass through the intersection even though the traffic light S is changing to red.
- the control device 110 of the vehicle 100 A provided with the driving support system of the present embodiment is configured to output, to the input-output device 180 , a control signal to warn the driver or the like that the traffic light is present (step S 308 ).
- the input-output device 180 may be caused to output a voice message indicative of the presence of the traffic light, e.g., “There is a traffic light. Please be careful.”
- a text indicative of the presence of the traffic light or an illustration of the traffic light may be displayed on the HUD.
- a warning sound to promote attention may be just output.
- the control device 110 transmits, to the server device 200 , position information of a position where the traffic light is recognized (step S 309 ). In this case, the control device 110 does not cause the input-output device 180 to notify the driver or the like of the warning indicating that the traffic light is present.
- the processor 200 A of the server device 200 receives, from the vehicle 100 A, the position information of the position where the traffic light is recognized (step S 310 ) and stores the position information in the memory 200 B. Similarly, the processor 200 A of the server device 200 can receive, from a plurality of vehicles 100 , pieces of recognition position information indicative of a position where a predetermined traffic light is recognized. The processor 200 A of the server device 200 can determine traffic-light visual-recognition position information based on these pieces of recognition position information and store the traffic-light visual-recognition position information in the memory 200 B.
- After the warning is notified in step S 308 , or after the recognition position information indicative of the position where the traffic light is recognized is transmitted in step S 309 , when the vehicle 100 A passes through the traffic light S, the process returns to step S 306 . Accordingly, when the vehicle 100 A approaches a next traffic light, the process from step S 306 onward is repeated.
- In step S 307 , when the control device 110 recognizes the traffic light a predetermined number of times or more, or for a predetermined period of time or more, it may be determined that the traffic light is recognized; when the control device 110 recognizes the traffic light only for an instant, it may be determined that the traffic light is not recognized.
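The multi-frame criterion just described (treat the light as recognized only after it has been detected for several consecutive cycles, and ignore an instantaneous detection) can be sketched as a small debouncer; the class name and the threshold of three frames are hypothetical, not values from the disclosure.

```python
class TrafficLightDebouncer:
    """Treats a traffic light as 'recognized' only after it has been
    detected in N consecutive forward images, so that a single-frame
    detection (e.g. a reflection) does not count as recognition.
    """
    def __init__(self, required_consecutive=3):
        self.required = required_consecutive
        self.count = 0

    def update(self, detected_this_frame):
        """Feed one per-cycle detection result; returns the debounced state."""
        self.count = self.count + 1 if detected_this_frame else 0
        return self.count >= self.required
```

Fed once per imaging cycle of the camera device 150, this yields the step-S307 judgment while filtering out momentary detections.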
- the control device 110 may be configured to execute step S 306 only when the vehicle 100 A approaches a target traffic light by a predetermined distance or less. For example, on a straight road with a good view, a traffic light placed 200 meters ahead of the vehicle 100 A may be visually recognizable. In such a case, the control device 110 may be configured to execute step S 306 when the vehicle 100 A approaches the target traffic light by 50 meters or less, for example.
- In step S 304 , in a case where the vehicle 100 A is distanced by a predetermined distance or more from a position where a traffic light is visually recognizable, the server device 200 may transmit, as visual-recognition position information about the traffic light, information just indicating that the vehicle 100 A is distanced from the traffic light by the predetermined distance or more, or information indicating that the traffic light is visually recognizable from a sufficiently distant place.
- the control device 110 can target only a traffic light close to the vehicle 100 A for the determination in step S 307 . More specifically, the sizes and the like of the traffic lights in the image are determined based on the number of pixels or the like by image recognition, so that a traffic light in a distant place can be excluded from the target for determination.
- When only a traffic light within a predetermined distance from the vehicle 100 A is targeted for the process in step S 307 by employing such a configuration, it is possible to reduce the possibility that no warning is notified because a traffic light in a distant place is recognized while the traffic light close to the vehicle 100 A is not visually recognized due to a large-size vehicle or the like ahead of the vehicle 100 A or due to a winding road.
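Excluding distant lights by their apparent size in the image, as described above, might look like the following sketch; the 2% height threshold and the detection-record layout are illustrative assumptions, not values from the disclosure.

```python
def nearby_traffic_lights(detections, image_height_px, min_height_frac=0.02):
    """Keep only detections whose bounding box is tall enough to
    plausibly be a nearby traffic light; a light far down the road
    spans only a few pixels and is excluded from the step-S307
    judgment.

    detections: list of dicts with a 'bbox' = (x, y, width, height)
    in pixels.
    """
    min_px = image_height_px * min_height_frac
    return [d for d in detections if d["bbox"][3] >= min_px]
```

With a 1080-pixel-high forward image, the default keeps only boxes at least about 22 pixels tall, so a light recognized far beyond the blocked nearby one does not suppress the warning.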
- In step S 310 , the processor 200 A of the server device 200 receives position information of a position where a traffic light is recognized from the vehicle 100 that actually recognizes the traffic light. Accordingly, it is possible to acquire accurate traffic-light visual-recognition position information based on the position information.
- the processor 200 A of the server device 200 can acquire visual-recognition position information that takes into account changes in road height and the like. By updating the visual-recognition position information based on more recently acquired recognition position information, it is possible to acquire more accurate visual-recognition position information.
- the processor 200 A of the server device 200 may acquire statistically accurate visual-recognition position information based on pieces of recognition position information acquired from many vehicles 100 .
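Server-side aggregation of recognition positions reported by many vehicles (steps S309 to S310) could be sketched as below. The median is used here as one example of a statistically robust estimate, so that a few outliers (e.g., a view blocked by a truck) do not dominate; the class name and identifiers are hypothetical.

```python
import statistics
from collections import defaultdict

class RecognitionPositionStore:
    """Aggregates, per (traffic light, approach route), the distances
    from the intersection node at which vehicles first recognized the
    light, and derives a visual-recognition distance from them.
    """
    def __init__(self):
        self._samples = defaultdict(list)

    def add_recognition(self, light_id, route_id, distance_m):
        # Called when a vehicle reports recognition position information.
        self._samples[(light_id, route_id)].append(distance_m)

    def visual_recognition_distance(self, light_id, route_id):
        # Median of the reported distances; None when nothing is stored.
        samples = self._samples.get((light_id, route_id))
        return statistics.median(samples) if samples else None
```

Because new reports keep arriving, the stored estimate automatically tracks changes in the road environment over time, as the passage above notes.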
- identification information indicative of a vehicle type of the vehicle 100 may be received in steps S 302 and S 310 , and different visual-recognition position information may be output to the vehicle 100 depending on the vehicle type in step S 304 .
- the recognition position information received in step S 310 may be associated with information indicative of the vehicle type or the vehicle height of the vehicle 100 and stored in the memory 200 B, and based on identification information indicative of the vehicle type of the vehicle 100 that is received in step S 302 , recognition position information corresponding to the vehicle height or the vehicle type may be transmitted to the vehicle 100 in step S 304 .
- visual-recognition position information corrected in consideration of the vehicle height or the like may be acquired.
- the server device 200 may store recognition position information of the same vehicle type as the vehicle 100 B in association with the vehicle type of the vehicle 100 B and transmit, to the vehicle 100 B, visual-recognition position information based on one or more pieces of recognition position information acquired from the same vehicle type.
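Storing recognition positions keyed additionally by vehicle type, so that the visual-recognition information returned in step S304 matches the requesting vehicle's type, might be sketched as follows; all identifiers and the use of the median are assumptions for illustration.

```python
import statistics
from collections import defaultdict

# Recognition positions keyed by (traffic light, route, vehicle type),
# so that a tall truck and a low sedan, whose viewpoints differ, each
# receive visual-recognition information matching their own sight line.
_samples = defaultdict(list)

def add_recognition(light_id, route_id, vehicle_type, distance_m):
    # Step S310: store the report together with the reporting vehicle's type.
    _samples[(light_id, route_id, vehicle_type)].append(distance_m)

def distance_for(light_id, route_id, vehicle_type):
    # Step S304: look up the estimate for the requesting vehicle's type.
    key = (light_id, route_id, vehicle_type)
    return statistics.median(_samples[key]) if _samples[key] else None
```

A fallback (e.g., a type-independent estimate corrected by vehicle height) would be needed for types with no stored samples; that refinement is omitted here.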
- a position where the driver can view a traffic light, confirmed by visual inspection or the like, may be acquired initially as the visual-recognition position information.
- the height of the viewpoint of the driver may be calculated with a camera that captures an image of the driver, and a visual-recognition position may be corrected.
- the vehicle 100 receives, from the server device 200 , specific position information indicative of a position where a traffic light is visually recognizable.
- the server device 200 may output, to the vehicle 100 , information indicating that the vehicle 100 has entered a position where the traffic light is visually recognizable. Note that, in the following description, descriptions of parts overlapping with the first embodiment are omitted or simplified. Further, except for processes peculiar to the second embodiment, constituent components that perform processes similar to the processes in the first embodiment have the same reference signs as in the first embodiment, and detailed descriptions thereof are omitted.
- the server device 200 is provided in a facility around the traffic light S. Note that the server device 200 may be buried under a road surface around the traffic light S.
- the processor 200 A of the server device 200 acquires pieces of recognition position information of the traffic light S from the vehicles 100 traveling through an intersection where the traffic light S is present, acquires statistically accurate visual-recognition position information based on the pieces of recognition position information, and stores it in the memory 200 B.
- the vehicle 100 is configured to acquire the visual-recognition position information from the server device 200 during traveling. More specifically, when the vehicle 100 approaches a predetermined intersection after the vehicle 100 starts traveling (step S 601 ), the vehicle 100 transmits its own position information to the server device 200 directly or indirectly (step S 602 ).
- the processor 200 A of the server device 200 receives the position information from the vehicle 100 (step S 603 )
- the processor 200 A reads out visual-recognition position information stored in the memory 200 B and compares it with the position information received from the vehicle 100 , so as to determine whether the vehicle 100 has entered a visual-recognition position on the route where the vehicle 100 is traveling (step S 604 ).
- the server device 200 transmits, to the vehicle 100 , information indicating that the vehicle 100 has entered a position where the traffic light is visually recognizable (step S 605 ).
- This information may be information of one bit.
- When the control device 110 of the vehicle 100 receives this information from the server device 200 , the control device 110 determines that the vehicle 100 is placed at the position where the traffic light is visually recognizable, and determines whether the traffic light is recognized or not from a forward image (step S 607 ). Steps S 607 to S 610 are similar to steps S 307 to S 310 , and therefore, detailed descriptions thereof are omitted.
- a protocol is set in advance between the vehicle 100 and the server device 200 such that, when the vehicle 100 receives a predetermined signal, it can be determined that the vehicle 100 is placed at the visual-recognition position where the traffic light is visually recognizable.
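The second embodiment's division of labor — the roadside server decides entry into the visual-recognition region and sends a one-bit flag (steps S603 to S605), while the vehicle combines that flag with its own camera result (steps S606 to S608) — can be sketched as below; the function names, distances, and margin are all assumptions.

```python
VISUAL_RECOGNITION_DISTANCE_M = 100.0  # learned by the roadside server
STOP_MARGIN_M = 5.0

def server_entry_flag(vehicle_distance_to_node_m):
    """Roadside server side (steps S603-S605): compare the reported
    vehicle position with the stored visual-recognition region and
    return the one-bit 'entered' flag sent in step S605."""
    return STOP_MARGIN_M <= vehicle_distance_to_node_m <= VISUAL_RECOGNITION_DISTANCE_M

def vehicle_should_warn(entry_flag, light_recognized_in_image):
    """Vehicle side (steps S606-S608): warn only when the server says
    the light should be visible but the forward image does not show it."""
    return entry_flag and not light_recognized_in_image
```

Since only the flag crosses the road-to-vehicle link, the vehicle needs no per-traffic-light position data, which matches the reduction in received information noted below.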
- it is also possible to accurately notify the driver or the like in a case where the traffic light that should be visually recognizable originally is not visually recognized, similarly to the first embodiment.
- the amount of information received from the server device 200 can be also reduced.
- pieces of traffic-light position information on a plurality of traffic lights around the vehicle 100 may be received from the server device 200 .
- the vehicle 100 may be configured to execute a step of determining whether the vehicle 100 is placed at the visual-recognition position, similarly to the first embodiment.
- embodiments of the present disclosure can be modified variously without deviating from the gist of the present disclosure.
- some constituents in a given embodiment or modification can be added to other embodiments.
- some constituents in a given embodiment or modification can be substituted with corresponding constituents in other embodiments.
- the first embodiment and the second embodiment deal with the driving support system targeted for a traffic light.
- the present disclosure can be also applied to a road sign instead of the traffic light.
- a road sign is a display board provided beside a road or in the air space above a road so as to provide information necessary for road users. Even in the case of such a road sign, the size and the shape of the road sign are defined, similarly to the traffic light, so that the control device 110 can recognize the road sign with accuracy. Further, the adverse effect caused when a road sign that should originally be visually recognizable is not visually recognized is not small.
- the present disclosure can be applied to a road sign providing guidance on an expressway or the like.
Description
- The disclosure of Japanese Patent Application No. 2019-026866 filed on Feb. 18, 2019 including the specification, drawings and abstract is incorporated herein by reference in its entirety.
- The present disclosure relates to a driving support system and a server device.
- Japanese Unexamined Patent Application Publication No. 2018-5827 (JP 2018-5827 A) describes a driving support system configured to execute lane-changing without any burden on a driver. More specifically, JP 2018-5827 A describes a technique in which a state of a host vehicle to be obtained if lane-changing is performed is predicted in advance, and when it is determined that it is difficult to visually recognize a traffic light, based on the height of the traffic light, the size of a forward vehicle, or the like, the driver is warned so as to prompt cancellation of the lane-changing.
- However, even when visual recognizability is calculated only based on the height of the traffic light, it is not always possible to perform the prediction with high accuracy. For example, in a case where the vehicle travels on a three-dimensional road, even if the height of a traffic light is the same, the visibility of the traffic light changes depending on a traveling position.
- Further, the difficulty in visual recognition of a traffic light also occurs in cases other than lane-changing. In a case where a road winds or a road changes in height like a mountain area, in a case where a large-size vehicle temporarily travels ahead of a host vehicle, or in a case where a power line or the like is under construction, the field of view ahead of the host vehicle decreases, so that it might become difficult to visually recognize a traffic light.
- In view of this, an object of the present disclosure is to provide a driving support system and a server device each of which can accurately grasp a position where a traffic light is visually recognizable and accurately notify a driver or the like that a traffic light that should be visually recognizable originally is not visually recognizable.
- This disclosure relates to a driving support system. The driving support system includes an acquisition portion, an image acquisition portion, a traffic-light recognition portion, and a notification portion. The acquisition portion is configured to acquire visual-recognition position information on a position where a driver of a vehicle visually recognizes a traffic light. The image acquisition portion is configured to acquire a forward image ahead of the vehicle. The traffic-light recognition portion is configured to recognize a traffic light included in the forward image. The notification portion is configured to notify the driver of warning when the traffic light is not recognized from the forward image, in a case where the vehicle is present at a position based on the visual-recognition position information.
- The driving support system may further include an output portion configured to output, to an outer part, route information on a route where the vehicle is planned to travel; and a second acquisition portion configured to acquire pieces of visual-recognition position information of a plurality of traffic lights on the route indicated by the route information. Further, the driving support system may further include a second output portion configured to output, to an outer part, position information on a position of the vehicle where the traffic light is recognized by the traffic-light recognition portion. Further, the present disclosure may be applied to a road sign instead of the traffic light.
- Here, the visual-recognition position information may be latitude information and longitude information indicative of a position where the traffic light is visually recognizable, or the visual-recognition position information may be relative position information based on a predetermined intersection or the like. Further, the visual-recognition position information may be information indicating that a present vehicle position is a visual-recognition position where the traffic light is visually recognizable. In this case, information to be received may be information of one bit. In the driving support system, when this information is received, position information of the vehicle at that time can be acquired as the position where the traffic light is visually recognizable.
- Further, this disclosure provides a server device. The server device includes an acquisition portion and a receiving portion. The acquisition portion is configured to acquire, from a plurality of vehicles, pieces of visual-recognition position information on respective positions where drivers of the vehicles visually recognize a traffic light. The receiving portion is configured to receive, from the vehicles, pieces of position information on respective positions of the vehicles where the traffic light is recognized.
- More specifically, the server device may transmit visual-recognition position information to the vehicle. Note that the server device may acquire vehicle-type information of the vehicle and transmit, to the vehicle, visual-recognition position information corrected in accordance with the vehicle type. Further, the server device may calculate the visual-recognition position information based on the pieces of position information of the vehicles that are received from the vehicles. Note that the server device may be associated with a predetermined traffic light and placed under a road surface or the like near the traffic light. Accordingly, a plurality of traffic lights may be provided such that respective server devices are provided to the traffic lights. In this case, transmission and reception of information may be performed between the vehicle and the server device by road-to-vehicle communication.
- Features, advantages, and technical and industrial significance of exemplary embodiments of the present disclosure will be described below with reference to the accompanying drawings, in which like numerals denote like elements, and wherein:
- FIG. 1 is a block diagram illustrating a schematic hardware configuration of a vehicle 100;
- FIG. 2 is a schematic view of a server device 200 and a plurality of vehicles 100 communicable with the server device 200;
- FIG. 3 is a flowchart illustrating a driving support method according to the present embodiment;
- FIG. 4 is a conception diagram to describe visual-recognition position information;
- FIG. 5A is a schematic view of a forward image;
- FIG. 5B is a schematic view of a forward image; and
- FIG. 6 is a flowchart illustrating a driving support method according to a second embodiment.
- The following describes embodiments of the present disclosure with reference to the drawings. The following embodiments are examples to describe the present disclosure and are not intended to limit the scope of the present disclosure to the embodiments.
- FIG. 1 is a block diagram illustrating a schematic hardware configuration of a vehicle 100. FIG. 2 illustrates a system including a server device 200 connected to a plurality of vehicles 100 via a network N. Note that, when a specific vehicle 100 is mentioned, it is referred to as a vehicle 100A, a vehicle 100B, or the like, and when the vehicles 100 are generally mentioned, they are just referred to as the vehicles 100. - As illustrated in
FIG. 1 , the vehicle 100 includes a control device 110 and a communications device 120, a sensor device 130, a radar device 140, a camera device 150, a navigation device 160, a driving device 170, and an input-output device 180 that are connected to the control device 110 via a bus or the like. - The
control device 110 receives predetermined signals from the devices connected thereto, performs a computing process or the like, and outputs control signals to drive the devices. The control device 110 includes a processor 110A and a memory 110B. The control device 110 can function as a driving support system according to the present embodiment by the processor 110A executing a computer program stored in the memory 110B. - The
processor 110A executes a predetermined computing process in accordance with a computer program such as firmware stored in the memory 110B. The processor 110A is implemented by one or more central processing units (CPUs), a micro processing unit (MPU), a GPU, a microprocessor, a processor core, a multiprocessor, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), and so on. - The
memory 110B includes a nonvolatile memory such as an MRAM, a NAND flash memory, a NOR flash memory, an SSD, or a hard disk drive, and a volatile memory such as an SRAM or a DRAM. In the nonvolatile memory, computer programs to execute the various computing processes illustrated in the flowcharts and elsewhere in this disclosure, map data, and various other pieces of data necessary in this disclosure are stored. The nonvolatile memory corresponds to a non-transitory tangible medium. The volatile memory provides a working area in which a computer program loaded from the nonvolatile memory and various pieces of data generated while the processor 110A executes a computer program are temporarily stored. Note that a computer program or data acquired from the communications device 120 may be stored in the nonvolatile memory. - The
communications device 120 includes means that transmits and receives information to and from an external device such as the server device 200 and includes one or more communication means such as WiFi (a wireless communication method based on the 802.11 standard defined by IEEE). The external device may be other vehicles 100 or may be infrastructure equipment provided under a road surface or in a power pole, a building, or the like. Further, the communications device 120 receives a GPS signal and outputs position information of the vehicle 100 to the control device 110. - The
sensor device 130 is a sensor configured to detect the behavior of the vehicle 100 and includes a rotary encoder configured to detect the vehicle speed of the vehicle and a gyro sensor configured to detect the inclination of the vehicle. Further, the sensor device 130 may include a magnetic sensor configured to detect a marker or the like embedded in a road. The radar device 140 includes a LiDAR ranging system including a millimeter wave radar to avoid a collision with a pedestrian or the like. The camera device 150 includes a plurality of cameras including an imaging sensor such as a CCD or CMOS image sensor so as to capture images ahead of and behind the vehicle 100. The control device 110 can receive signals acquired by the sensor device 130, the radar device 140, and the camera device 150 and output a control signal based on them to a corresponding device. For example, the control device 110 can acquire an imaging signal of an image captured by the camera device 150 and execute image recognition so as to recognize an obstacle or the like included in the image thus captured, and the control device 110 can accordingly output, to the driving device 170, a control signal to stop the vehicle 100, for example. Note that the camera device 150 may be equipped with a semiconductor IC for image processing, such as a GPU, that enables image recognition or the like, so that the camera device 150 recognizes a driving lane where the vehicle 100 should travel or an obstacle such as a pedestrian based on an image captured by a camera or the like of the camera device 150 and outputs information on the driving lane or the obstacle to the control device 110. - The
navigation device 160 calculates a route to a predetermined destination based on an input from a driver or the like and provides guidance. The navigation device 160 may include a nonvolatile memory (not shown) and store map data in the nonvolatile memory. Alternatively, the navigation device 160 may acquire map data stored in the memory 110B or may acquire map data from the communications device 120. The map data includes information on road types and information about road signs, traffic lights, and the like. Further, the map data includes position information on a specific point called a node, such as a facility, an address, or an intersection of a road, and information on a link corresponding to a road that connects nodes to each other. The position information is indicated by latitude, longitude, and altitude, for example. - Further, a processor configured to calculate a route may be provided in the
navigation device 160, but the processor 110A may execute the calculation. Further, the current position information of the vehicle 100 may be acquired from the control device 110 based on a GPS signal received by the communications device 120, or the navigation device 160 itself may receive a GPS signal. Note that the navigation device 160 may be constituted by an information processing terminal owned by the driver or the like. In this case, the information processing terminal may be connected to the vehicle 100 by a Bluetooth (registered trademark) device or the like of the communications device 120 so that route guidance information or the like to give route guidance is output from the input-output device 180 of the vehicle 100. - The
driving device 170 includes a motor and an actuator for operations of an engine, a brake, and a steering wheel of the vehicle 100 and operates based on a control signal received from the control device 110. Note that the vehicle 100 may be configured such that the control device 110 outputs a control signal to the driving device 170 or the like based on the operations of an accelerator pedal, a brake pedal, the steering wheel, and the like by the driver or the like, or the vehicle 100 may have an automatic driving function to output, from the control device 110 to the driving device 170 or the like, a control signal to autonomously drive the vehicle 100 based on signals acquired from the radar device 140, the camera device 150, and the like. Further, the vehicle 100 may be an electric vehicle including a battery and an electric motor. - The input-output device 180 includes an input device, such as a touch panel or a microphone, via which the driver or the like inputs information into the vehicle 100, and voice recognition software. The input-output device 180 is configured to receive information necessary to control the vehicle 100 based on a pressing operation on the touch panel by the driver or an utterance made by the driver. Further, the input-output device 180 includes an output device such as a liquid crystal display, a HUD, or other displays configured to output image information and one or more speakers configured to output voice information. -
FIG. 2 illustrates the server device 200 of the present embodiment and many vehicles, from a vehicle 100A to a vehicle 100N, connected to the server device 200 via the communication network N. The server device 200 includes a processor 200A and a memory 200B. The configurations of the processor 200A and the memory 200B can be achieved by configurations similar to those of the processor 110A and the memory 110B, so detailed descriptions thereof are omitted herein. Further, computer programs to execute the various computing processes executed by the server device 200 described in this disclosure are stored in the memory 200B. - In
FIG. 2 , each vehicle 100 is provided with the driving support system of the present embodiment. The communication network N may be the Internet, a LAN, a mobile communication network, Bluetooth (registered trademark), wireless fidelity (WiFi), other communication lines, or a combination of any of them. At least a part of the server device 200 may be implemented by cloud computing constituted by one or more computers. In addition, at least some of the processes in the control device 110 may be executed by the server device 200. - The following describes the driving support method according to the present embodiment, with reference to
FIG. 3 . - The
navigation device 160 of the vehicle 100 calculates and determines a route from a predetermined start point to a predetermined goal point (step S301). The start point may be input by the driver or the like by use of the input-output device 180, or a current position of the vehicle 100 may be set as the start point based on the GPS signal received by the communications device 120. The navigation device 160 may calculate a plurality of candidate routes from the start point to the goal point and show the candidate routes to the driver or the like, and the driver or the like may determine the route. - When the route is determined, the
control device 110 outputs route information to the server device 200 by use of the communications device 120 (step S302). The route information includes information on a plurality of links that connect nodes from a departure place to a destination. - Subsequently, the
processor 200A of the server device 200 receives the route information from the vehicle 100 (step S303). In terms of a plurality of traffic lights (devices that show a signal indicative of an advance permission, a stop instruction, or the like for a vehicle on a road) present on the route for the vehicle 100, the processor 200A of the server device 200 reads, from the memory 200B, pieces of visual-recognition position information indicative of positions where the traffic lights provided on the route can be visually recognized (step S304). The processor 200A of the server device 200 stores the visual-recognition position information per traffic light and per route to the traffic light in the nonvolatile memory of the memory 200B in advance. Note that the processor 200A and the memory 200B may not necessarily be placed at the same position. For example, the processor 200A may read visual-recognition position information from a memory 200B provided in an area near the route. Further, a plurality of memories 200B may be provided in a dispersed manner. -
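The per-traffic-light, per-route storage described in this step can be sketched as a small lookup table. This is an illustrative sketch only, not the patent's implementation; the identifiers and distances below are invented for the example.

```python
# Illustrative sketch of the memory 200B contents described above: the
# visual-recognition position is stored per traffic light AND per route
# leading to that traffic light. All identifiers and distances here are
# invented for the example.
VISUAL_RECOGNITION_TABLE = {
    # (traffic_light_id, route_id) -> distance in meters from the
    # intersection node at which the light becomes visually recognizable
    ("S", "R1"): 60.0,   # uphill approach: the light appears late
    ("S", "R2"): 180.0,  # downhill approach: visible from farther away
    ("S", "R3"): 15.0,   # a railroad bridge hides the light until the end
}

def lookup_visual_recognition(light_id, route_id):
    """Step S304 sketch: read the stored visual-recognition position for
    one (traffic light, approach route) pair, or None if unknown."""
    return VISUAL_RECOGNITION_TABLE.get((light_id, route_id))
```

Keying the table on the pair rather than on the traffic light alone is what lets the same intersection answer differently for each approach route.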
FIG. 4 is a conceptual diagram to describe the visual-recognition position information stored in the memory 200B of the server device 200. For example, in a case where a traffic light S is provided at an intersection where three routes R1 to R3 intersect with each other, when the vehicle 100A moves from the route R1 toward the intersection, a position where the driver can visually recognize the traffic light S is a position P1 distanced from a node corresponding to the intersection by a distance D1. Further, when the vehicle 100A moves from the route R2 toward the intersection, the position where the driver can visually recognize the traffic light S is a position P2 distanced from the intersection by a distance D2. When the vehicle 100A moves from the route R3 toward the intersection, the position where the driver can see the traffic light S is a position P3 distanced from the intersection by a distance D3. Even for the traffic light S at the same intersection, the position where the traffic light can be visually recognized may thus differ depending on the route where the vehicle 100A travels. For example, in a case where the route from the position P2 to the intersection where the traffic light S is provided is a downhill slope, the distance D2 becomes long in comparison with a case where the route is not a downhill slope. Meanwhile, in a case where the route from the position P1 to the same intersection is an uphill slope, the distance D1 becomes short in comparison with a case where the route is not an uphill slope. Further, in a case where there is a railroad bridge where trains or the like pass over the route R3, the traffic light S may be visually recognizable only after the vehicle 100A passes under the railroad bridge. In such a situation, the distance D3 may become extremely short. Note that the traffic light S refers to a traffic light that the vehicle 100 traveling on the route should visually recognize.
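The route dependence illustrated in FIG. 4 is what the server resolves in steps S303 to S305: for each traffic light on the received route, it selects the entry matching the vehicle's approach. A hedged sketch of that selection (identifiers and coordinates are invented for the example):

```python
# Sketch of steps S303-S305: given the vehicle's reported route, the server
# collects one visual-recognition entry per traffic light on that route.
STORE = {
    ("S", "R1"): {"latlon": (35.6810, 139.7670), "distance_m": 60.0},
    ("S", "R2"): {"latlon": (35.6815, 139.7655), "distance_m": 180.0},
}

def build_payload(route):
    """route: ordered list of (traffic_light_id, approach_route_id) pairs
    the vehicle will pass. Returns the pieces of visual-recognition
    position information to transmit to the vehicle."""
    payload = []
    for light_id, route_id in route:
        entry = STORE.get((light_id, route_id))
        if entry is not None:
            payload.append({"light": light_id, **entry})
    return payload
```

As the text notes, each entry could equally be encoded as latitude/longitude, as a list of links, or as a distance from the intersection node; the dictionary above simply carries two of those forms side by side.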
Because of this, even at the same intersection, a traffic light that should be recognized visually may be different depending on a route where the vehicle 100 travels. - The
processor 200A of the server device 200 can determine that the vehicle 100A approaches the intersection where the traffic light S is provided from the route R1, among the routes R1 to R3, for example, based on the route information received from the vehicle 100A. Accordingly, the processor 200A of the server device 200 can transmit, to the vehicle 100A, information including latitude and longitude information of the position P1 as visual-recognition position information indicative of a position where the traffic light S is visually recognizable. Note that the visual-recognition position information is not limited to information indicative of the position P1 itself, such as the latitude and longitude information of the position P1. For example, information indicative of one or more links included between the position P1 and the node of the intersection may be transmitted as the visual-recognition position information. Further, information indicative of the distance D1 based on the node of the intersection may be transmitted as the visual-recognition position information. - Thus, the
processor 200A of the server device 200 transmits, to the vehicle 100A, a plurality of pieces of visual-recognition position information corresponding to a plurality of traffic lights through which the vehicle 100A is to pass. The control device 110 causes the communications device 120 to receive, from the server device 200, the pieces of visual-recognition position information corresponding to the traffic lights provided on the route (step S305) and stores the pieces of visual-recognition position information in the memory 110B. - After that, the
vehicle 100A starts traveling. During the traveling, the camera device 150 takes a forward image ahead of the vehicle 100A at a predetermined cycle and outputs it to the control device 110. The control device 110 recognizes a traffic light from the received forward image by use of a technique such as image recognition. Accordingly, the control device 110 of the vehicle 100A can repeatedly execute a step of determining whether a traffic light is present or not, based on the forward image, at the predetermined cycle. Note that the presence of a pedestrian, an obstacle, or the like other than a traffic light can be also recognized in a similar manner, and a control signal can be output to the driving device 170 as needed. For example, in a case where the control device 110 recognizes the presence of a pedestrian ahead of the vehicle based on a forward image taken by the camera device 150, the control device 110 can output, to the driving device 170, a control signal to stop the vehicle 100A. Note that the camera device 150 may acquire a moving image as the forward image. Further, the camera device 150 may include a GPU or the like for image recognition, so that the camera device 150 itself can recognize a traffic light or the like. Further, the vehicle 100A does not need to always recognize a traffic light or the like and may be configured to recognize a traffic light or the like only in a predetermined case. - The
control device 110 is configured to cyclically determine whether the vehicle 100A is placed at a visual-recognition position or not while the vehicle 100A is traveling (step S306). In a case where latitude and longitude information of a position where a traffic light is visually recognizable is received as the visual-recognition position information, when the vehicle 100A is present between this position and the position of the node of the corresponding intersection, or at a position sufficiently close to the node of the corresponding intersection (e.g., several meters before the intersection), it is determined that the vehicle 100A is placed at the visual-recognition position. In a case where link information is received as the visual-recognition position information, the control device 110 determines whether the vehicle 100A is present at the visual-recognition position or not based on whether the vehicle 100A travels on the link or not. - When it is determined that the
vehicle 100A is present at the visual-recognition position in step S306, the control device 110 determines whether the traffic light is recognized from a forward image or not (step S307). -
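Steps S306 to S309 amount to a small decision rule: warn only when the vehicle is inside the region where the light should be visible but image recognition does not find it. A minimal sketch under that reading (the function names and return labels are ours, not the patent's):

```python
def at_visual_recognition_position(dist_to_node_m, visible_from_m):
    # Step S306 sketch: the vehicle counts as being at the visual-recognition
    # position anywhere between the point where the light becomes visible
    # and the intersection node itself.
    return 0.0 <= dist_to_node_m <= visible_from_m

def decide(dist_to_node_m, visible_from_m, light_recognized):
    """Combine steps S306-S309: 'warn' when the light should be visible but
    is not recognized (e.g., hidden by a large vehicle ahead), 'report'
    when it is recognized (upload the recognition position), 'none' when
    the vehicle is outside the visual-recognition region."""
    if not at_visual_recognition_position(dist_to_node_m, visible_from_m):
        return "none"
    return "report" if light_recognized else "warn"
```

The link-based variant described above would simply replace the distance test with a check that the vehicle's current link is one of the transmitted links.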
FIGS. 5A and 5B are views schematically illustrating forward images ahead of the vehicle 100A, taken by the camera device 150. In a case where other vehicles or the like are not present ahead of the vehicle 100A, as illustrated in FIG. 5A, the driver can visually recognize the traffic light S and a stop line L. Similarly, the control device 110 can recognize the traffic light S and the stop line L from the forward image. Meanwhile, in a case where a large-size vehicle is present ahead of the vehicle 100A, as illustrated in FIG. 5B, the field of view is blocked by the large-size vehicle, so that the driver and the control device 110 cannot recognize the traffic light S and the stop line L from the forward image. On this account, the driver falls into such a state that, although the traffic light S is present ahead of the vehicle 100A, the driver does not recognize the presence of the traffic light S. In such a state, in a case where the traffic light S changes from a yellow light to a red light and the large-size vehicle passes through the intersection at the last minute, for example, the driver of the vehicle 100A might follow the large-size vehicle and attempt to pass through the intersection even though the traffic light S is changing to red. - However, in a case where the traffic light is not recognized from the forward image though the
vehicle 100A is present at a position where the traffic light is visually recognizable, the control device 110 of the vehicle 100A provided with the driving support system of the present embodiment is configured to output, to the input-output device 180, a control signal to warn the driver or the like that the traffic light is present (step S308). More specifically, the input-output device 180 may be caused to output a voice message indicative of the presence of the traffic light, e.g., "There is a traffic light. Please be careful." Alternatively, a text indicative of the presence of the traffic light or an illustration of the traffic light may be displayed on the HUD. Alternatively, just a warning sound to draw attention may be output. - Meanwhile, when the traffic light is recognized in step S307, the
control device 110 transmits, to the server device 200, position information of the position where the traffic light is recognized (step S309). In this case, the control device 110 does not cause the input-output device 180 to notify the driver or the like of the warning indicating that the traffic light is present. - The
processor 200A of the server device 200 receives, from the vehicle 100A, the position information of the position where the traffic light is recognized (step S310) and stores the position information in the memory 200B. Similarly, the processor 200A of the server device 200 can receive, from a plurality of vehicles 100, pieces of recognition position information indicative of positions where a predetermined traffic light is recognized. The processor 200A of the server device 200 can determine traffic-light visual-recognition position information based on these pieces of recognition position information and store the traffic-light visual-recognition position information in the memory 200B. - After the warning is notified in step S308, or after the recognition position information indicative of the position where the traffic light is recognized is transmitted in step S309, when the
vehicle 100A passes through the traffic light S, the process returns to step S306 again. Accordingly, when the vehicle 100A approaches a next traffic light, the process after step S306 is repeated. - Note that, in step S307, when the control device 110 recognizes the traffic light a predetermined number of times or more, or for a predetermined period of time or more, it may be determined that the traffic light is recognized, and when the control device 110 recognizes the traffic light only for an instant, it may be determined that the traffic light is not recognized. - Further, the
control device 110 may be configured to execute step S306 only when the vehicle 100A approaches a target traffic light to within a predetermined distance. For example, a traffic light placed 200 meters ahead of the vehicle 100A may be visually recognizable on a straight road with a good view. In such a case, the control device 110 may be configured to execute step S306 when the vehicle 100A approaches the target traffic light to within 50 meters, for example. - With such a configuration, it is possible to limit the traffic lights targeted for the process to nearby traffic lights with a high necessity for safe driving.
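The two refinements just described, counting the light as recognized only after sustained detection and running step S306 only near the target light, can be sketched together. The frame threshold is our assumption; the 50 m figure is the example given above.

```python
class TrafficLightGate:
    """Sketch of the refinements above: a recognition counter (a light
    glimpsed for a single frame does not count) plus a distance gate that
    activates step S306 only near the target light. required_frames is an
    assumption for illustration; 50 m comes from the example in the text."""

    def __init__(self, required_frames=3, activate_within_m=50.0):
        self.required = required_frames
        self.within = activate_within_m
        self.streak = 0

    def update(self, dist_to_light_m, seen_in_frame):
        """Return True only when the gate is active and the light has been
        seen in enough consecutive frames."""
        if dist_to_light_m > self.within:
            self.streak = 0
            return False  # too far away: step S306 is not executed
        self.streak = self.streak + 1 if seen_in_frame else 0
        return self.streak >= self.required
```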
- Note that, instead of the above configuration, in step S304, in a case where the vehicle 100A is distanced by a predetermined distance or more from a position where a traffic light is visually recognizable, the server device 200 may transmit, as the visual-recognition position information about the traffic light, information merely indicating that the vehicle 100A is distanced from the traffic light by the predetermined distance or more, or information indicating that the traffic light is visually recognizable from a sufficiently distant place. With such a configuration, the amount of information stored in the server device 200 and transmitted to the vehicle 100 can be reduced. - Further, there is such a case where a plurality of traffic lights may be visually recognizable. For example, there is such a case where a traffic light close to the
vehicle 100A and a traffic light far from the vehicle 100A may be both visually recognizable on a straight road with a good view. In such a case, the control device 110 can target only the traffic light close to the vehicle 100A for the determination in step S307. More specifically, the sizes and the like of the traffic lights are determined based on the number of pixels or the like by image recognition or the like, so that a traffic light in a distant place can be excluded from the target for determination. When only a traffic light within a predetermined distance from the vehicle 100A is targeted for the process in step S307 by employing such a configuration, it is possible to reduce the possibility that a warning is not notified because a traffic light in a distant place is recognized while the traffic light close to the vehicle 100A is not visually recognized due to a large-size vehicle or the like ahead of the vehicle 100A or due to a winding road. - With the driving support system described above, in a case where a traffic light that should be visually recognizable originally is not visually recognized, it is possible to accurately notify the driver or the like of the presence of the traffic light.
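The size-based exclusion just described can be sketched as a filter over image-recognition detections. The bounding-box field name and the pixel threshold are assumptions for illustration, not values from the patent.

```python
def nearest_light_candidate(detections, min_pixel_height=24):
    """Keep only detections whose apparent size (pixel height) is large
    enough to plausibly be the nearby light, then return the largest one;
    distant lights image small and are excluded from the step S307 target."""
    near = [d for d in detections if d["pixel_height"] >= min_pixel_height]
    return max(near, key=lambda d: d["pixel_height"]) if near else None
```

Returning None when only small detections remain is what triggers the warning path: the nearby light should be visible but was not recognized.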
- Further, in step S310, the
processor 200A of the server device 200 receives position information of a position where a traffic light is recognized, from a vehicle 100 that actually recognizes the traffic light. Accordingly, it is possible to acquire accurate traffic-light visual-recognition position information based on the position information. For example, the processor 200A of the server device 200 can acquire visual-recognition position information that takes changes in road height or the like into consideration. By updating the visual-recognition position information based on recognition position information newly acquired over time, it is possible to acquire more accurate visual-recognition position information. For example, it is possible to acquire, as a visual-recognition position, a point where a traffic light becomes visually recognizable because a building or the like that previously disturbed visual recognition of the traffic light has been demolished. Conversely, it is possible to prevent a situation in which a point where visual recognition of a traffic light is disturbed by a newly built building or the like is wrongly acquired as a visual-recognition position. This makes it possible to increase the accuracy of the warning to be notified. Further, in a case where a traffic light is recognized, it is preferable not to notify that the traffic light is recognized. With such a configuration, it is possible to reduce the frequency of notification, thereby making it possible to restrain a situation in which the driver disregards the notification of the warning in step S308. However, this does not prevent a configuration in which the driver or the like sets the notification of the warning to be performed even when a traffic light is recognized, for example. - Note that it is preferable for the
processor 200A of the server device 200 to acquire statistically accurate visual-recognition position information based on pieces of recognition position information acquired from many vehicles 100. Further, identification information indicative of the vehicle type of the vehicle 100 may be received in steps S302 and S310, and different visual-recognition position information may be output to the vehicle 100 depending on the vehicle type in step S304. For example, the recognition position information received in step S310 may be associated with information indicative of the vehicle type or the vehicle height of the vehicle 100 and stored in the memory 200B, and based on the identification information indicative of the vehicle type of the vehicle 100 that is received in step S302, recognition position information corresponding to the vehicle height or the vehicle type may be transmitted to the vehicle 100 in step S304. Alternatively, visual-recognition position information corrected in consideration of the vehicle height or the like may be acquired. For example, in the case of a trailer bus or the like, illustrated as the vehicle 100B in FIG. 2 , the position of the driver seat as well as the vehicle height differs from that of a normal passenger car. Accordingly, the server device 200 may store recognition position information from vehicles of the same vehicle type as the vehicle 100B in association with that vehicle type and transmit, to the vehicle 100B, visual-recognition position information based on one or more pieces of recognition position information acquired from the same vehicle type. Note that a position where the driver can view a traffic light may first be acquired by visual inspection or the like as the visual-recognition position information. Further, the height of the viewpoint of the driver may be calculated with a camera that captures an image of the driver, and the visual-recognition position may be corrected accordingly. - In the first embodiment, the
vehicle 100 receives, from the server device 200, specific position information indicative of a position where a traffic light is visually recognizable. However, as in the second embodiment described below, when the vehicle 100 enters a region where a traffic light is visually recognizable, the server device 200 may output, to the vehicle 100, information indicating that the vehicle 100 has entered a position where the traffic light is visually recognizable. Note that, in the following description, descriptions of parts overlapping with the first embodiment are omitted or simplified. Further, except for processes peculiar to the second embodiment, constituent components that perform processes similar to those in the first embodiment have the same reference signs as in the first embodiment, and detailed descriptions thereof are omitted. - In the second embodiment, the
server device 200 is provided in a facility around the traffic light S. Note that the server device 200 may be buried under a road surface around the traffic light S. Similarly to the first embodiment, the processor 200A of the server device 200 acquires pieces of recognition position information of the traffic light S from the vehicles 100 traveling through the intersection where the traffic light S is present, acquires statistically accurate visual-recognition position information based on the pieces of recognition position information, and stores it in the memory 200B. - In the second embodiment, the
vehicle 100 is configured to acquire the visual-recognition position information from the server device 200 during traveling. More specifically, when the vehicle 100 approaches a predetermined intersection after the vehicle 100 starts traveling (step S601), the vehicle 100 transmits its own position information to the server device 200 directly or indirectly (step S602). When the processor 200A of the server device 200 receives the position information from the vehicle 100 (step S603), the processor 200A reads out the visual-recognition position information stored in the memory 200B and compares it with the position information received from the vehicle 100, so as to determine whether the vehicle 100 has entered a visual-recognition position on the route where the vehicle 100 is traveling (step S604). When it is determined that the vehicle 100 has entered the visual-recognition position, the server device 200 transmits, to the vehicle 100, information indicating that the vehicle 100 has entered a position where the traffic light is visually recognizable (step S605). This information may be as small as one bit. - When the
control device 110 of the vehicle 100 receives the information from the server device 200, the control device 110 determines that the vehicle 100 is placed at the position where the traffic light is visually recognizable, and the control device 110 determines whether the traffic light is recognized or not from a forward image (step S607). Steps S607 to S610 are similar to steps S307 to S310, and therefore, detailed descriptions thereof are omitted. - As described above, a protocol is set in advance such that, when the
vehicle 100 and the server device 200 exchange a predetermined signal, it can be determined that the vehicle 100 is placed at the visual-recognition position where the traffic light is visually recognizable. With such a configuration, it is also possible to accurately notify the driver or the like in a case where a traffic light that should be visually recognizable originally is not visually recognized, similarly to the first embodiment. - Further, the amount of information received from the
server device 200 can be also reduced. Note that pieces of traffic-light position information on a plurality of traffic lights around the vehicle 100 may be received from the server device 200. In this case, the vehicle 100 may be configured to execute a step of determining whether the vehicle 100 is placed at the visual-recognition position, similarly to the first embodiment. - Note that embodiments of the present disclosure can be modified variously without deviating from the gist of the present disclosure. For example, within the range of normal creativity of a person skilled in the art, some constituents in a given embodiment or modification can be added to other embodiments. Further, some constituents in a given embodiment or modification can be substituted with corresponding constituents in other embodiments.
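The second embodiment's exchange (steps S602 to S605), together with the statistical update that feeds it, can be sketched as a small roadside service. The median as the aggregate and all names are our assumptions; the patent only says the position is determined statistically and the reply may be one bit.

```python
from statistics import median

class RoadsideServer:
    """Sketch of the second embodiment: the server near the traffic light S
    aggregates recognition positions reported by passing vehicles and
    answers position queries with a one-bit flag (step S605)."""

    def __init__(self):
        self._reports = {}  # route_id -> list of distances [m] from node

    def report_recognition(self, route_id, dist_to_node_m):
        # Step S610-style upload from a vehicle that recognized the light.
        self._reports.setdefault(route_id, []).append(dist_to_node_m)

    def visual_recognition_distance(self, route_id):
        reports = self._reports.get(route_id)
        return median(reports) if reports else None

    def entered_visual_region(self, route_id, vehicle_dist_to_node_m):
        """Steps S603-S605: 1 if the vehicle has entered the region where
        the light should be visible on its route, else 0."""
        threshold = self.visual_recognition_distance(route_id)
        if threshold is None:
            return 0  # no statistics yet: never claim visibility
        return 1 if vehicle_dist_to_node_m <= threshold else 0
```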
- The first embodiment and the second embodiment deal with the driving support system targeted for a traffic light. However, the present disclosure can also be applied to a road sign instead of a traffic light. Here, a road sign is a display board provided beside a road or in the air space above the road so as to provide information necessary for a road user. Even in the case of such a road sign, the size and the shape of the road sign are defined, similarly to the traffic light, so that it is possible to recognize the road sign with accuracy by the
control device 110. Further, the adverse effect caused when a traffic light that should be visually recognizable originally is not visually recognized is not small. For example, in addition to the traffic light, the present disclosure can be applied to a road sign indicating guidance on an expressway or the like.
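Extending the decision of steps S306 to S308 from traffic lights to road signs only changes the target being looked up and recognized; a hedged sketch of that generalization (the "kind" field is our invention for illustration):

```python
def visibility_warning(target, at_visual_position, recognized):
    """Same rule as steps S306-S308, applied to any fixed roadside target.
    target: e.g. {"kind": "traffic_light"} or {"kind": "road_sign"}."""
    if at_visual_position and not recognized:
        return "warn: %s ahead is not visible" % target["kind"]
    return None
```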
Claims (7)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019026866A JP7205695B2 (en) | 2019-02-18 | 2019-02-18 | driving support system |
JP2019-026866 | |
JP2019-026866 | 2019-12-03 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20200265250A1 true US20200265250A1 (en) | 2020-08-20 |
US11508161B2 US11508161B2 (en) | 2022-11-22 |
Family
ID=72043222
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/706,898 Active 2040-04-08 US11508161B2 (en) | 2019-02-18 | 2019-12-09 | Driving support system and server device |
Country Status (3)
Country | Link |
---|---|
US (1) | US11508161B2 (en) |
JP (1) | JP7205695B2 (en) |
CN (1) | CN111583697B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114333289A (en) * | 2020-09-28 | 2022-04-12 | 沃尔沃汽车公司 | Vehicle starting reminding device, system and method |
CN115083205A (en) * | 2022-04-27 | 2022-09-20 | 一汽奔腾轿车有限公司 | AEB with traffic light identification function |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112270830B (en) * | 2020-10-15 | 2022-08-19 | 北京小马慧行科技有限公司 | Method and device for determining parking position of vehicle and automatic driving vehicle |
CN113034946B (en) * | 2021-02-26 | 2022-04-01 | 广汽本田汽车有限公司 | Vehicle movement control system |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8892356B1 (en) * | 2003-06-19 | 2014-11-18 | Here Global B.V. | Method and system for representing traffic signals in a road network database |
JP4631750B2 (en) * | 2006-03-06 | 2011-02-16 | トヨタ自動車株式会社 | Image processing system |
JP2011108175A (en) * | 2009-11-20 | 2011-06-02 | Alpine Electronics Inc | Driving support system, driving support method and driving support program |
JP5551236B2 (en) | 2010-03-03 | 2014-07-16 | パナソニック株式会社 | Road condition management system and road condition management method |
JP5729176B2 (en) * | 2011-07-01 | 2015-06-03 | アイシン・エィ・ダブリュ株式会社 | Movement guidance system, movement guidance apparatus, movement guidance method, and computer program |
US9158980B1 (en) * | 2012-09-19 | 2015-10-13 | Google Inc. | Use of relationship between activities of different traffic signals in a network to improve traffic signal state estimation |
JP6325806B2 (en) * | 2013-12-06 | 2018-05-16 | 日立オートモティブシステムズ株式会社 | Vehicle position estimation system |
JP2015146076A (en) * | 2014-01-31 | 2015-08-13 | キヤノンマーケティングジャパン株式会社 | Navigation system, and processing method and program of the same |
JP2016112984A (en) * | 2014-12-12 | 2016-06-23 | 日本精機株式会社 | Virtual image display system for vehicle, and head up display |
MX367068B (en) * | 2015-07-13 | 2019-08-05 | Nissan Motor | Traffic light recognition device and traffic light recognition method. |
CN105185140B (en) * | 2015-09-30 | 2018-07-06 | 上海修源网络科技有限公司 | A kind of auxiliary driving method and system |
US9990548B2 (en) * | 2016-03-09 | 2018-06-05 | Uber Technologies, Inc. | Traffic signal analysis system |
JP2018005827A (en) | 2016-07-08 | 2018-01-11 | 株式会社デンソーテン | Lane change support device and lane change support method |
JP6971020B2 (en) * | 2016-11-14 | 2021-11-24 | 株式会社日立製作所 | Anomaly detection device and anomaly detection method |
CN107316488B (en) * | 2017-08-23 | 2021-01-12 | 苏州豪米波技术有限公司 | Signal lamp identification method, device and system |
US10381212B1 (en) * | 2018-02-08 | 2019-08-13 | Shimadzu Corporation | Time-of-flight mass spectrometer |
US20190282004A1 (en) * | 2018-03-16 | 2019-09-19 | E & E Co., Ltd. | Comforter divided into sections with differentiated properties |
2019
- 2019-02-18: JP application JP2019026866A, granted as JP7205695B2 (active)
- 2019-12-09: US application US16/706,898, granted as US11508161B2 (active)
- 2019-12-12: CN application CN201911288803.0A, granted as CN111583697B (active)
Also Published As
Publication number | Publication date |
---|---|
CN111583697A (en) | 2020-08-25 |
CN111583697B (en) | 2022-08-05 |
JP2020135321A (en) | 2020-08-31 |
JP7205695B2 (en) | 2023-01-17 |
US11508161B2 (en) | 2022-11-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11269352B2 (en) | System for building a vehicle-to-cloud real-time traffic map for autonomous driving vehicles (ADVS) | |
US11156474B2 (en) | ADAS horizon and vision supplemental V2X | |
US11508161B2 (en) | Driving support system and server device | |
US10943133B2 (en) | Vehicle control device, vehicle control method, and storage medium | |
US20190276027A1 (en) | Vehicle control device, vehicle control method, and storage medium | |
JP6520687B2 (en) | Driving support device | |
US20200211379A1 (en) | Roundabout assist | |
US20200296334A1 (en) | Information processing device and automatic traveling control system including information processing device | |
CN111766866B (en) | Information processing apparatus and automatic travel control system including the same | |
JP7502047B2 (en) | COMMUNICATION DEVICE, VEHICLE, PROGRAM, AND COMMUNICATION METHOD | |
JP2022139009A (en) | Drive support device, drive support method, and program | |
JP7203907B1 (en) | CONTROL DEVICE, MOBILE BODY, CONTROL METHOD, AND TERMINAL | |
CN115176296A (en) | Travel assist device, travel assist method, and travel assist program | |
JP7449206B2 (en) | Communication control device, vehicle, program, and communication control method | |
JP2015114931A (en) | Vehicle warning device, server device and vehicle warning system | |
US11548521B2 (en) | Systems, methods and vehicles for determining wrong direction driving | |
JP7203123B2 (en) | Communication system, communication terminal, control method, program, and storage medium for storing program | |
CN112567427B (en) | Image processing device, image processing method, and program | |
JP2022048829A (en) | Communication control device, vehicle, program, and communication control method | |
CN111381592A (en) | Vehicle control method and device and vehicle | |
JP6962685B2 (en) | Information display device and relay device | |
JP2010146334A (en) | On-vehicle apparatus and information providing system | |
KR20220155530A (en) | Apparatus and Method for Controlling Advanced Driver Assistance System | |
JP2022187709A (en) | Control device, moving vehicle, control method, and program |
Legal Events
Date | Code | Title | Description
---|---|---|---
| AS | Assignment | Owner name: TOYOTA JIDOSHA KABUSHIKI KAISHA, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: OE, YOSHIHIRO; KAMIMARU, HIROFUMI; SIGNING DATES FROM 20191016 TO 20191024; REEL/FRAME: 051213/0089
| FEPP | Fee payment procedure | ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
| STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION
| STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED
| STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION
| STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS
| STCF | Information on status: patent grant | PATENTED CASE