CN110709912A - Travel assist device, travel assist system, and travel assist method - Google Patents


Info

Publication number: CN110709912A
Application number: CN201780091216.5A
Authority: CN (China)
Legal status: Granted (Active)
Other versions: CN110709912B
Other languages: Chinese (zh)
Inventors: 门田尚树, 荒井兼秀, 副岛嘉人
Current Assignee: Mitsubishi Corp; Mitsubishi Electric Corp
Original Assignee: Mitsubishi Corp

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 - Traffic control systems for road vehicles
    • G08G 1/16 - Anti-collision systems

Abstract

The driving assistance device according to the present invention includes: a wireless communication unit (107) that establishes wireless communication with a flying object (200) on which a flying object camera (202) is mounted; an emission control unit (106) that controls the launch of the flying object (200); an oncoming lane state determination unit (108) that determines, on the basis of a first image captured by the flying object camera (202) and acquired via the wireless communication unit (107), the state of the oncoming lane that the host vehicle crosses when turning left or right at an intersection; and a notification unit (109) that notifies the determined state of the oncoming lane.

Description

Travel assist device, travel assist system, and travel assist method
Technical Field
The present invention relates to a technique for assisting vehicle travel.
Background
Various technologies have been proposed for assisting a vehicle that crosses an oncoming lane to turn right or left at an intersection. Whether crossing the oncoming lane corresponds to a right turn or a left turn differs depending on the country or region.
For example, Patent Document 1 discloses a driving assistance device that, when it determines that the host vehicle is located in an intersection, obtains by image processing, from the shape and color of a traffic signal image captured by a rear camera mounted on the vehicle, display information indicating what the traffic signal installed for the oncoming lane at the intersection is displaying, and notifies the driver of the vehicle of the obtained display information.
Documents of the prior art
Patent document
Patent Document 1: Japanese Patent Laid-Open No. 2006-155319
Disclosure of Invention
Technical problem to be solved by the invention
The travel support device described in Patent Document 1 has the following problem: since the lighting state of the traffic signal installed for the oncoming lane at the intersection is captured by the rear camera of the vehicle, the lighting state of the traffic signal cannot be determined unless the vehicle has entered the intersection.
The present invention has been made to solve the above-described problem, and an object thereof is to enable information relating to the oncoming lane to be acquired even when the host vehicle has not entered the intersection.
Means for solving the problem
The driving assistance device according to the present invention includes: a wireless communication unit that establishes wireless communication with a flying object having a flying object camera mounted thereon; an emission control unit that controls the launch of the flying object; an oncoming lane state determination unit that determines, on the basis of a first image captured by the flying object camera and acquired via the wireless communication unit, the state of the oncoming lane that the host vehicle crosses when turning left or right at an intersection; and a notification unit that notifies the state of the oncoming lane determined by the oncoming lane state determination unit.
Effects of the invention
According to the present invention, information relating to the oncoming lane can be acquired even when the host vehicle has not entered the intersection.
Drawings
Fig. 1 is a block diagram showing a configuration of a driving assistance device according to embodiment 1.
Fig. 2A and 2B are diagrams showing an example of the hardware configuration of the driving assistance device.
Fig. 3 is a flowchart showing an operation of the driving assistance device according to embodiment 1.
Fig. 4 is a flowchart showing an operation of the emission determination unit of the travel assistance device according to embodiment 1.
Fig. 5A, 5B, and 5C are explanatory diagrams illustrating flight control by the emission control unit of the driving assistance device according to embodiment 1.
Fig. 6 is a diagram showing an example of display of a notification unit of the driving assistance device according to embodiment 1.
Fig. 7 is a diagram showing an example of display of a notification unit of the driving assistance device according to embodiment 1.
Fig. 8 is a diagram for explaining the effects of the driving assistance device according to embodiment 1.
Detailed Description
Hereinafter, embodiments for carrying out the present invention will be described with reference to the drawings in order to explain the present invention in more detail.
Embodiment 1.
Fig. 1 is a block diagram showing a configuration of a driving assistance device 100 according to embodiment 1.
First, when the host vehicle is about to cross the oncoming lane at an intersection, the travel support device 100 determines whether or not to launch a flying object, such as an unmanned aerial vehicle, mounted on the host vehicle. When it determines that the flying object is to be launched, the travel support device 100 controls the launch and flight of the flying object and determines the lighting state of the traffic signal for the straight-ahead direction of the oncoming lane from an image captured by the flying object camera mounted on the flying object. The driving assistance device 100 then notifies the driver of the determination result.
Whether crossing the oncoming lane at an intersection corresponds to a right turn or a left turn depends on the country or region. In the following, the case where the host vehicle crosses the oncoming lane to turn right at an intersection is described as an example, but the driving assistance apparatus 100 can equally be applied to the case where the host vehicle crosses the oncoming lane to turn left.
The travel support device 100 includes a position information acquisition unit 101, a vehicle information acquisition unit 102, a travel direction detection unit 103, an oncoming lane information detection unit 104, an emission determination unit 105, an emission control unit 106, a wireless communication unit 107, an oncoming lane state determination unit 108, a notification unit 109, and a collection determination unit 110.
The travel support device 100 is communicatively connected to the flying object 200 via the wireless communication unit 107 and the wireless communication device 201. The flying object 200 is mounted on the roof, a rack, or the like of the host vehicle and is launched under the control of the driving assistance device 100. The flying object camera 202 mounted on the flying object 200 captures images both while mounted on the host vehicle and while flying after being launched from the host vehicle. As shown in fig. 1, the driving assistance system 300 is composed of the driving assistance device 100 and the flying object 200.
The driving assistance device 100 is connected to at least one of the display 401, the speaker 402, the notification lamp 403, and the lamp body (notification device) 203. The display 401, the speaker 402, and the notification lamp 403 are mounted on the host vehicle, and the lamp body 203 is mounted on the flying object 200. The display 401 is, for example, a navigation system display, a head-up display, or a meter display of the host vehicle. The notification lamp 403 and the lamp body 203 emit light of a color that is the same as, or similar to, the lighting color of the traffic signal, thereby notifying the driver of the lighting state of the traffic signal. The notification lamp 403 is provided at a position visible to the driver, such as near the display 401 at the front of the vehicle. The lamp body 203 is disposed at a position where the driver can see the light emitted while the flying object 200 is flying.
The position information acquisition unit 101 acquires map information, the current position of the host vehicle on which the travel support device 100 is mounted, the heading of the host vehicle, and route information of the host vehicle from a navigation system (not shown). The position information acquisition unit 101 outputs the acquired information to the emission determination unit 105 as position information.
The vehicle information acquisition unit 102 acquires vehicle speed information of the host vehicle from a vehicle-mounted radar (not shown) mounted on the host vehicle, and acquires an image of the surroundings of the host vehicle (hereinafter referred to as a captured image of the vicinity of the host vehicle) from an in-vehicle camera (not shown) mounted on the host vehicle. The vehicle information acquisition unit 102 outputs the traveling direction of the host vehicle, the vehicle speed information, and the captured image to the emission determination unit 105 as vehicle information.
The traveling direction detection unit 103 acquires information indicating the operation state of the direction indicators from a sensor (not shown) that detects operation of the direction indicators of the host vehicle. When the acquired information indicates, for example, that the right direction indicator is on, the traveling direction detection unit 103 outputs lighting information of the right direction indicator to the emission determination unit 105; when it indicates that the left direction indicator is on, the traveling direction detection unit 103 outputs lighting information of the left direction indicator to the emission determination unit 105.
The oncoming lane information detection unit 104 acquires a second image captured by at least one of an in-vehicle camera mounted on the host vehicle and the flying object camera 202 mounted on the flying object 200. The second image acquired from the flying object camera 202 is an image captured while the flying object 200 is still mounted on the host vehicle. From the acquired second image, the oncoming lane information detection unit 104 detects oncoming lane information indicating whether or not there is an oncoming vehicle traveling straight in the oncoming lane at the intersection, and outputs the detected oncoming lane information to the emission determination unit 105.
The emission determination unit 105 determines whether or not to launch the flying object 200 based on the position information input from the position information acquisition unit 101, the vehicle information input from the vehicle information acquisition unit 102, the lighting information of the direction indicator input from the traveling direction detection unit 103, and the oncoming lane information input from the oncoming lane information detection unit 104.
Specifically, based on the position information and the vehicle information, the emission determination unit 105 determines whether the host vehicle is approaching an intersection at which it is scheduled to cross the oncoming lane and turn (hereinafter referred to as a right turn), or whether the host vehicle is waiting to turn right at the intersection. The emission determination unit 105 also determines whether the right direction indicator is on, with reference to the lighting information of the direction indicator, and whether there is an oncoming vehicle traveling straight in the oncoming lane, with reference to the oncoming lane information.
When it determines that the host vehicle is scheduled to turn right at the intersection, that the right direction indicator is on, and that there is an oncoming vehicle traveling straight in the oncoming lane, the emission determination unit 105 determines to launch the flying object 200 and issues a launch instruction for the flying object 200 to the emission control unit 106.
The emission determination unit 105 performs the above determination before the host vehicle turns right.
In addition to the position information, the vehicle information, the lighting information of the direction indicator, and the oncoming lane information, the emission determination unit 105 may also refer to at least one of weather information and user operation information when determining whether or not to launch the flying object 200. In that case, the emission determination unit 105 refers to weather information acquired from the outside and determines to launch the flying object 200 when the weather is suitable for the launch, and refers to the user operation information and determines to launch the flying object 200 when the user has permitted the launch.
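As a rough illustration only (not part of the original disclosure), the launch decision described above can be sketched in Python as follows; the data classes, field names, approach threshold, and the handling of an undetectable oncoming lane are assumptions of this sketch.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative input records; the field names are assumptions for this sketch.
@dataclass
class PositionInfo:
    distance_to_intersection_m: float   # from map + current position
    route_turns_right_here: bool        # route information says a right turn is planned

@dataclass
class VehicleInfo:
    speed_kmh: float
    in_right_turn_lane: bool            # derived from the captured image of the vicinity

@dataclass
class OncomingLaneInfo:
    straight_oncoming_vehicle: Optional[bool]  # None = could not be detected

def should_launch(pos: PositionInfo,
                  veh: VehicleInfo,
                  right_indicator_on: bool,
                  lane: OncomingLaneInfo,
                  weather_ok: bool = True,
                  user_permits: bool = True,
                  approach_threshold_m: float = 100.0) -> bool:
    """Combine the position, vehicle, indicator, and oncoming lane information."""
    # Approaching the intersection where a right turn is planned,
    # or already waiting in the right-turn lane.
    approaching = (pos.route_turns_right_here
                   and pos.distance_to_intersection_m <= approach_threshold_m)
    if not (approaching or veh.in_right_turn_lane):
        return False
    # The right direction indicator must be on.
    if not right_indicator_on:
        return False
    # Launch if an oncoming vehicle is traveling straight; a value of None
    # (state undetectable) is treated as a possible oncoming vehicle.
    if lane.straight_oncoming_vehicle is False:
        return False
    # Optional additional conditions (weather, user permission).
    return weather_ok and user_permits

# Example: 60 m before the intersection, in the right-turn lane, indicator on.
print(should_launch(PositionInfo(60.0, True), VehicleInfo(25.0, True),
                    right_indicator_on=True, lane=OncomingLaneInfo(True)))
```

The three basic checks correspond to steps ST21 to ST23 of fig. 4 described later.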
The emission control unit 106 controls the launch of the flying object 200 based on the launch instruction input from the emission determination unit 105, and controls the flight of the flying object 200 so that the flying object camera 202 mounted on the flying object 200 captures the traffic signal of the oncoming lane. The emission control unit 106 also controls the flight of the flying object 200 so that, in addition to the traffic signal of the oncoming lane, at least one of the oncoming lane itself and the road ahead of the host vehicle after the right turn is imaged. The emission control unit 106 outputs the launch control information and the flight control information for the flying object 200 to the wireless communication unit 107.
The wireless communication unit 107 establishes wireless communication with the wireless communication device 201 of the flying object 200, and transmits the launch control information and the flight control information to the flying object 200. The wireless communication unit 107 also receives the first image captured by the flying object camera 202 and transmitted from the wireless communication device 201 of the flying object 200, and outputs the received first image to the oncoming lane state determination unit 108.
The oncoming lane state determination unit 108 determines the state of the oncoming lane based on the input first image. As the state of the oncoming lane, the oncoming lane state determination unit 108 determines the lighting state of the traffic signal for the straight-ahead direction of the oncoming lane at the intersection.
The oncoming lane state determination unit 108 may also determine the traveling state of vehicles in the oncoming lane as the state of the oncoming lane. Specifically, as the traveling state, it determines the congestion state of the oncoming lane, the presence or absence of an oncoming vehicle traveling straight in the oncoming lane, the possibility that an oncoming vehicle traveling straight in the oncoming lane will stop, and the possibility that a stopped oncoming vehicle will start traveling straight in the oncoming lane. The oncoming lane state determination unit 108 may further determine, as the state of the oncoming lane, the congestion state of the traveling lane ahead of the host vehicle after the right turn. Whether, in addition to the lighting state of the traffic signal for the straight-ahead direction of the oncoming lane at the intersection, the oncoming lane state determination unit 108 determines the traveling state of vehicles in the oncoming lane or the congestion state of the lane ahead of the right turn can be set arbitrarily. The oncoming lane state determination unit 108 outputs the determination result to the notification unit 109.
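As a highly simplified illustration (and not the method claimed), the lighting state could be estimated from the first image by counting pixels in rough HSV color ranges within a region assumed to contain the oncoming lane's traffic signal; the thresholds and the use of OpenCV here are assumptions of this sketch, and a practical implementation would also need to locate the signal and handle exposure and night-time conditions.

```python
import cv2
import numpy as np

# Rough HSV ranges for signal colors (OpenCV hue is 0-179); purely illustrative.
HSV_RANGES = {
    "green": [((40, 80, 80), (90, 255, 255))],
    "yellow": [((15, 80, 80), (35, 255, 255))],
    "red": [((0, 80, 80), (10, 255, 255)), ((170, 80, 80), (179, 255, 255))],
}

def signal_lighting_state(first_image_bgr: np.ndarray, signal_roi) -> str:
    """Return the dominant signal color inside a region of interest
    (x, y, w, h) assumed to contain the oncoming lane's traffic signal."""
    x, y, w, h = signal_roi
    roi = first_image_bgr[y:y + h, x:x + w]
    hsv = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
    counts = {}
    for color, ranges in HSV_RANGES.items():
        mask_total = np.zeros(hsv.shape[:2], dtype=np.uint8)
        for lo, hi in ranges:
            mask_total |= cv2.inRange(hsv,
                                      np.array(lo, dtype=np.uint8),
                                      np.array(hi, dtype=np.uint8))
        counts[color] = int(cv2.countNonZero(mask_total))
    best = max(counts, key=counts.get)
    return best if counts[best] > 0 else "unknown"
```

In this sketch the region of interest is simply assumed to be given; in practice it would have to be obtained by detecting the traffic signal in the first image.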
The notification unit 109 generates support information for the host vehicle's right turn at the intersection based on the determination result input from the oncoming lane state determination unit 108. The support information is, for example, information indicating the lighting color of the traffic signal of the oncoming lane, information indicating whether or not the host vehicle can turn right, or information indicating the predicted movement of an oncoming vehicle. The notification unit 109 performs processing for displaying the generated support information on the display 401, outputting it as sound from the speaker 402, or lighting the notification lamp 403 or the lamp body 203.
The wireless communication unit 107 may also output the first image received from the flying object camera 202 mounted on the flying object 200 directly to the notification unit 109, without passing through the oncoming lane state determination unit 108. In this case, the notification unit 109 performs processing for displaying the input first image on the display 401.
The collection determination unit 110 acquires the position information input from the position information acquisition unit 101 and the captured image of the vicinity of the host vehicle input from the vehicle information acquisition unit 102, and determines, based on them, whether or not to collect the launched flying object 200. Specifically, the collection determination unit 110 determines to collect the flying object 200 when the host vehicle has passed through the intersection at which the right turn was planned. Here, passing through the intersection includes both the case where the host vehicle turns right at the intersection and the case where it does not turn right but travels straight or turns left there. When it determines that the flying object 200 is to be collected, the collection determination unit 110 outputs a collection instruction for the flying object 200 to the emission control unit 106.
Based on the collection instruction for the flying object 200 input from the collection determination unit 110, the emission control unit 106 controls the flight of the flying object 200 so that the flying object 200 is remounted on the host vehicle, and outputs the corresponding flight control information to the wireless communication unit 107. The wireless communication unit 107 transmits this flight control information for collecting the flying object 200 to the flying object 200.
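A minimal sketch (illustrative only) of the collection decision, assuming the passage check of the collection determination unit 110 can be reduced to comparing the host vehicle's position with the registered intersection position in a local metric frame; the margin and coordinate convention are assumptions, and the captured image of the vicinity is not used here.

```python
import math

def has_passed_intersection(vehicle_pos, intersection_pos, heading_deg, passed_margin_m=20.0):
    """Very rough passage test: the vehicle is considered to have passed the
    intersection once it is more than passed_margin_m beyond it along its heading.
    Positions are (east_m, north_m) in a local metric frame; all values illustrative."""
    dx = vehicle_pos[0] - intersection_pos[0]
    dy = vehicle_pos[1] - intersection_pos[1]
    heading = math.radians(heading_deg)
    # Project the offset onto the heading direction (north = 0 deg, east = 90 deg).
    along_track = dx * math.sin(heading) + dy * math.cos(heading)
    return along_track > passed_margin_m

def collection_decision(vehicle_pos, intersection_pos, heading_deg) -> bool:
    """Return True when the flying object should be collected."""
    return has_passed_intersection(vehicle_pos, intersection_pos, heading_deg)

# Example: vehicle 30 m past the intersection, heading due north.
print(collection_decision((0.0, 30.0), (0.0, 0.0), heading_deg=0.0))  # True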
Next, an example of the hardware configuration of the driving assistance device 100 will be described.
Fig. 2A and 2B are diagrams showing an example of the hardware configuration of the driving assistance device 100.
The wireless communication unit 107 in the travel support device 100 is a transmission/reception device 100a that performs wireless communication with the flying object 200. Each function of the position information acquisition unit 101, the vehicle information acquisition unit 102, the traveling direction detection unit 103, the oncoming lane information detection unit 104, the emission determination unit 105, the emission control unit 106, the oncoming lane state determination unit 108, the notification unit 109, and the collection determination unit 110 in the travel support device 100 is realized by a processing circuit. That is, the driving assistance device 100 includes a processing circuit for realizing each of the above functions. The processing circuit may be dedicated hardware, namely the processing circuit 100b shown in fig. 2A, or may be the processor 100c that executes a program stored in the memory 100d, as shown in fig. 2B.
As shown in fig. 2A, when the position information acquisition unit 101, the vehicle information acquisition unit 102, the traveling direction detection unit 103, the oncoming lane information detection unit 104, the emission determination unit 105, the emission control unit 106, the oncoming lane state determination unit 108, the notification unit 109, and the collection determination unit 110 are dedicated hardware, the processing circuit 100b corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination thereof. The function of each of these units may be realized by its own processing circuit, or the functions of the units may be combined and realized by a single processing circuit.
As shown in fig. 2B, when the functions of the position information acquisition unit 101, the vehicle information acquisition unit 102, the traveling direction detection unit 103, the oncoming lane information detection unit 104, the emission determination unit 105, the emission control unit 106, the oncoming lane state determination unit 108, the notification unit 109, and the collection determination unit 110 are realized by the processor 100c, the functions of the respective units are realized by software, firmware, or a combination of software and firmware. The software or firmware is described as programs and stored in the memory 100d. The processor 100c reads and executes the programs stored in the memory 100d, thereby realizing the function of each of these units. That is, the driving assistance device 100 includes the memory 100d for storing programs which, when executed by the processor 100c, result in execution of the steps shown in fig. 3 and fig. 4 described later. It can also be said that these programs cause a computer to execute the procedures or methods of the above units.
Here, the processor 100c is, for example, a CPU (Central Processing Unit), a processing device, an arithmetic device, a processor, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor).
The memory 100d may be a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable ROM), or an EEPROM (Electrically Erasable EPROM), a magnetic disk such as a hard disk or a flexible disk, or an optical disk such as a CD (Compact Disc) or a DVD (Digital Versatile Disc).
The functions of the position information acquisition unit 101, the vehicle information acquisition unit 102, the traveling direction detection unit 103, the oncoming lane information detection unit 104, the emission determination unit 105, the emission control unit 106, the oncoming lane state determination unit 108, the notification unit 109, and the collection determination unit 110 may be partially implemented by dedicated hardware, and partially implemented by software or firmware. Thus, the processing circuit 100b in the driving assistance device 100 can realize the above-described functions by hardware, software, firmware, or a combination thereof.
Next, the operation of the driving assistance device 100 will be described.
Fig. 3 is a flowchart showing the operation of the driving assistance device 100 according to embodiment 1.
The position information acquisition unit 101 acquires map information, the current position of the host vehicle, and route information of the host vehicle from the navigation system as position information (step ST1), and outputs the acquired position information to the emission determination unit 105. The vehicle information acquisition unit 102 acquires, as vehicle information, the traveling direction of the host vehicle, the vehicle speed information of the host vehicle, and a captured image of the vicinity of the host vehicle from the vehicle-mounted radar and the in-vehicle camera (step ST2), and outputs the acquired vehicle information to the emission determination unit 105. The traveling direction detection unit 103 acquires the direction indicator operation information from the sensor and detects the lighting information of the direction indicator (step ST3), and outputs the detected lighting information of the direction indicator to the emission determination unit 105.
The oncoming lane information detection unit 104 acquires the second image from at least one of the in-vehicle camera of the host vehicle and the flying object camera 202 of the flying object 200, and detects the oncoming lane information (step ST4), which it outputs to the emission determination unit 105. The emission determination unit 105 determines whether or not to launch the flying object 200 with reference to the information input in steps ST1 to ST4 (step ST5). If it determines that the flying object 200 is not to be launched (step ST5: NO), the processing ends.
On the other hand, when determining to launch the flying object 200 (step ST5: YES), the emission determination unit 105 outputs a launch instruction for the flying object 200 to the emission control unit 106 (step ST6). The emission control unit 106 performs launch control and flight control of the flying object 200 based on the launch instruction input at step ST6 (step ST7), and transmits the control information based on the control at step ST7 to the flying object 200 via the wireless communication unit 107 and the wireless communication device 201 (step ST8).
The wireless communication unit 107 receives the first image transmitted from the wireless communication device 201 of the flying object 200 (step ST9) and outputs the received first image to the oncoming lane state determination unit 108. The oncoming lane state determination unit 108 determines the state of the oncoming lane from the first image (step ST10). The notification unit 109 generates the support information for the host vehicle's right turn at the intersection based on the determination result at step ST10 (step ST11), and performs processing for notifying the driver of the support information generated at step ST11 (step ST12).
The collection determination unit 110 determines whether or not the host vehicle has passed through the intersection at which the right turn is planned, based on the position information input from the position information acquisition unit 101 and the captured image of the vicinity of the host vehicle input from the vehicle information acquisition unit 102 (step ST13). When the host vehicle has not yet passed through the intersection (step ST13: NO), the flow returns to the process of step ST9.
On the other hand, when the host vehicle has passed through the intersection (step ST13: YES), the collection determination unit 110 outputs a collection instruction for the flying object 200 to the emission control unit 106 (step ST14). The emission control unit 106 performs flight control of the flying object 200 and control for collecting the flying object 200 based on the collection instruction input at step ST14 (step ST15), and transmits the control information based on the control at step ST15 to the flying object 200 via the wireless communication unit 107 and the wireless communication device 201 (step ST16). The processing then ends.
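For orientation only, the overall flow of fig. 3 can be summarized in code roughly as follows; the method names stand in for the units described above and are assumptions of this sketch, not interfaces of the actual device.

```python
def run_travel_assist_cycle(units):
    """One pass through the flow of fig. 3 (steps ST1 to ST16).
    `units` is any object exposing the methods used below."""
    position_info = units.acquire_position_info()          # ST1
    vehicle_info = units.acquire_vehicle_info()            # ST2
    indicator_info = units.detect_indicator_lighting()     # ST3
    lane_info = units.detect_oncoming_lane_info()          # ST4

    if not units.decide_launch(position_info, vehicle_info,
                               indicator_info, lane_info):  # ST5
        return                                              # no launch, end

    units.send_launch_and_flight_control()                  # ST6 to ST8

    # Repeat determination and notification until the intersection is passed.
    while True:
        first_image = units.receive_first_image()                       # ST9
        lane_state = units.determine_oncoming_lane_state(first_image)   # ST10
        support_info = units.generate_support_info(lane_state)          # ST11
        units.notify_driver(support_info)                               # ST12
        if units.vehicle_passed_intersection():                         # ST13
            break

    units.send_collection_control()                         # ST14 to ST16


class DemoUnits:
    """Trivial stand-in so the flow above can be executed; every return value is fake."""
    def acquire_position_info(self): return {"near_right_turn": True}
    def acquire_vehicle_info(self): return {"in_right_turn_lane": True}
    def detect_indicator_lighting(self): return "right"
    def detect_oncoming_lane_info(self): return {"straight_oncoming_vehicle": True}
    def decide_launch(self, *args): return True
    def send_launch_and_flight_control(self): print("launch + flight control sent")
    def receive_first_image(self): return object()
    def determine_oncoming_lane_state(self, img): return {"signal": "red", "oncoming_stopped": True}
    def generate_support_info(self, state): return "Oncoming signal is red; oncoming vehicle stopped"
    def notify_driver(self, info): print("notify:", info)
    def vehicle_passed_intersection(self): return True
    def send_collection_control(self): print("collection control sent")


run_travel_assist_cycle(DemoUnits())
```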
Next, the determination process of step ST5 in the flow of fig. 3 will be described in more detail.
Fig. 4 is a flowchart showing the operation of the emission determination unit 105 of the travel assistance device 100 according to embodiment 1.
With reference to the position information input at step ST1 and the vehicle information input at step ST2, the emission determination unit 105 determines whether the host vehicle is approaching the intersection at which a right turn is planned, or whether the host vehicle is waiting to turn right at the intersection (step ST21). Here, the emission determination unit 105 performs the determination of step ST21 by determining, based on the position information and the vehicle information, whether the host vehicle is approaching the intersection at which the right turn is planned or is located in the right-turn lane.
When the host vehicle is approaching the intersection or is waiting to turn right at the intersection (step ST21: YES), the emission determination unit 105 determines whether the right direction indicator of the host vehicle is on, with reference to the lighting information of the direction indicator input at step ST3 (step ST22). When the right direction indicator of the host vehicle is on (step ST22: YES), the emission determination unit 105 determines whether there is an oncoming vehicle traveling straight in the oncoming lane at the intersection, with reference to the oncoming lane information input at step ST4 (step ST23). If there is an oncoming vehicle traveling straight in the oncoming lane (step ST23: YES), the emission determination unit 105 determines to launch the flying object 200 (step ST24), and the flow proceeds to the process of step ST6 in fig. 3.
On the other hand, when the host vehicle is neither approaching the intersection at which the right turn is planned nor waiting to turn right at the intersection (step ST21: NO), when the right direction indicator of the host vehicle is not on (step ST22: NO), or when there is no oncoming vehicle traveling straight in the oncoming lane (step ST23: NO), the emission determination unit 105 determines not to launch the flying object 200 (step ST25) and ends the processing.
In the flowchart of fig. 4, whether or not to launch the flying object 200 may also be determined with further reference to at least one of weather information and user operation information. In this case, following a YES determination at step ST23, the emission determination unit 105 refers to weather information acquired from the outside and determines to launch the flying object 200 when the weather is suitable for the launch, and refers to the user operation information and determines to launch the flying object 200 when the user has permitted the launch.
Next, the flight control of the flying object 200 in step ST7 of the flow of fig. 3 will be described in more detail.
Various configurations are possible for the position to which the emission control unit 106 causes the flying object 200 to fly. Three control methods are described below. The emission control unit 106 may use any one of the three control methods, or may use two or more of them; when two or more control methods are used, the emission control unit 106 may select the method to apply according to a priority set in advance for each type of control information.
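As an illustration of combining the control methods by priority, the sketch below tries each position source in a fixed order and uses the first one that yields a target position; the representation of positions and the callable interface are assumptions of this sketch.

```python
from typing import Callable, Optional, Sequence, Tuple

Position = Tuple[float, float]  # (latitude, longitude); representation is an assumption

def select_flight_target(
        methods: Sequence[Tuple[int, Callable[[], Optional[Position]]]]) -> Optional[Position]:
    """Try position sources in ascending priority value and return the first hit.
    `methods` is a sequence of (priority, callable) pairs, e.g. a driver-designated
    position (fig. 5C), an obstruction-avoiding position (fig. 5B), and the signal
    position derived from the navigation map (fig. 5A)."""
    for _, method in sorted(methods, key=lambda item: item[0]):
        position = method()
        if position is not None:
            return position
    return None

# Example with dummy sources: the driver has not designated a point (None),
# so the obstruction-avoiding position is used.
target = select_flight_target([
    (1, lambda: None),                    # driver-designated position Pc
    (2, lambda: (35.6581, 139.7414)),     # position Pb avoiding a tall oncoming vehicle
    (3, lambda: (35.6580, 139.7410)),     # position Pa derived from the signal's map position
])
print(target)
```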
Fig. 5A to 5C are explanatory diagrams illustrating the flight control of the flying object 200 by the emission control unit 106 of the driving assistance device 100 according to embodiment 1. Fig. 5A to 5C each show an intersection B at which the host vehicle A makes a right turn.
Fig. 5A shows an example in which the driving assistance device 100 acquires position information of the traffic signal C of the oncoming lane at the intersection B from the navigation system. The emission control unit 106 acquires the position information of the traffic signal C and the map information via the position information acquisition unit 101 and the emission determination unit 105, and from them obtains position information of a position Pa from which the lighting state of the traffic signal C can be imaged. The emission control unit 106 transmits the acquired position information of the position Pa and control information instructing flight to the position Pa to the flying object 200 via the wireless communication unit 107 and the wireless communication device 201.
Next, the example of fig. 5B shows a case in which the emission control unit 106 identifies a factor that obstructs imaging (another vehicle, another flying object, an electric wire, a sign, bad weather, or the like) and that cannot be determined from the position information provided by the navigation system. The emission control unit 106 acquires a captured image from the vehicle camera D mounted on the host vehicle A via the vehicle information acquisition unit 102 and the emission determination unit 105, and determines from the captured image that the oncoming vehicle E, which has a high vehicle height, is a factor obstructing imaging. The emission control unit 106 then calculates a position Pb suitable for imaging while avoiding the oncoming vehicle E.
The emission control unit 106 acquires position information of the calculated position Pb using the map information acquired via the position information acquisition unit 101 and the emission determination unit 105, and transmits the acquired position information of the position Pb and control information instructing flight to the position Pb to the flying object 200 via the wireless communication unit 107 and the wireless communication device 201. In the example of fig. 5B, the position Pb is a position on the traveling-lane side of the host vehicle A, closer than the position Pa of fig. 5A, from which the traffic signal C of the oncoming lane can be imaged while avoiding the oncoming vehicle E.
Next, fig. 5C shows an example in which the emission control unit 106 uses position information designated by the driver or another occupant via an input device (not shown) such as a touch panel. When the driver designates a position Pc on the display 401 having a touch panel function, the emission control unit 106 acquires position information of the position Pc using the map information acquired via the position information acquisition unit 101 and the emission determination unit 105, and transmits the acquired position information of the position Pc and control information instructing flight to the position Pc to the flying object 200 via the wireless communication unit 107 and the wireless communication device 201.
The emission control unit 106 may also register the position information of the position Pc designated by the driver or another occupant in a storage area (not shown) in advance together with the position information of the intersection B and, the next time the host vehicle A travels to the same intersection B, perform flight control so that the flying object 200 flies to the registered position Pc.
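A minimal sketch, assuming the registered position Pc is stored in an in-memory table keyed by a quantized intersection position; the key scheme and grid size are assumptions, since the description only mentions a storage area.

```python
class DesignatedPositionStore:
    """Remembers a designated flight position per intersection (in-memory for this
    sketch; the description leaves the form of the storage area open)."""

    def __init__(self, grid_m: float = 10.0):
        self._store = {}
        self._grid_m = grid_m  # how coarsely intersection positions are matched

    def _key(self, intersection_pos):
        # Quantize the intersection position so nearby readings map to the same key.
        return (round(intersection_pos[0] / self._grid_m),
                round(intersection_pos[1] / self._grid_m))

    def register(self, intersection_pos, designated_pos):
        self._store[self._key(intersection_pos)] = designated_pos

    def lookup(self, intersection_pos):
        return self._store.get(self._key(intersection_pos))

store = DesignatedPositionStore()
store.register(intersection_pos=(1200.0, 340.0), designated_pos=(1195.0, 355.0))  # driver taps Pc
print(store.lookup((1203.0, 338.0)))  # same intersection B next time -> (1195.0, 355.0)
```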
Next, the generation of the support information and the notification of the support information in step ST11 and step ST12 of the flow of fig. 3 will be described in detail.
The notification unit 109 can generate the support information and notify the driver of it in various ways. Four generation and notification methods are described below. The notification unit 109 may use any one of the four methods, or may use two or more of them; when two or more methods are used, the notification unit 109 may select the method to apply according to a priority set in advance for each method.
1st generation and notification method
Based on the determination result of the state of the oncoming lane, the notification unit 109 generates support information for causing the notification lamp 403 or the display 401 to light in the lighting color of the traffic signal for the straight-ahead direction of the oncoming lane at the intersection, and causes the notification lamp 403 or the display 401 to light accordingly.
When the notification lamp 403 or the display 401 lights in the blue color of the traffic signal, the driver can determine that oncoming vehicles may enter the intersection. When the notification lamp 403 or the display 401 shows the yellow or red color of the traffic signal, the driver can determine that oncoming vehicles are in a state in which they cannot enter the intersection.
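Expressed as a small sketch (illustrative only), the first method maps the determined signal color to the color lit on the notification lamp 403 and to whether oncoming vehicles may enter the intersection; treating green the same as the blue signal of the description is an assumption.

```python
# Blue is used here as in the description (a "go" signal); green is treated the same way.
LAMP_COLOR = {"blue": "blue", "green": "blue", "yellow": "yellow", "red": "red"}
ONCOMING_MAY_ENTER = {"blue": True, "green": True, "yellow": False, "red": False}

def lamp_notification(signal_color: str):
    """Return (lamp color to light, whether oncoming vehicles may enter the intersection)."""
    color = LAMP_COLOR.get(signal_color, "off")
    return color, ONCOMING_MAY_ENTER.get(signal_color, None)

print(lamp_notification("red"))     # ('red', False) -> oncoming vehicles cannot enter
print(lamp_notification("blue"))    # ('blue', True) -> oncoming vehicles may enter
```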
2nd generation and notification method
Based on the determination result of the state of the oncoming lane, the notification unit 109 generates support information that expresses, as text, the information to be notified to the driver: the lighting color of the traffic signal for the straight-ahead direction of the oncoming lane at the intersection, the presence or absence of an oncoming vehicle traveling straight in the oncoming lane, the possibility that an oncoming vehicle traveling straight in the oncoming lane will stop, and the possibility that a stopped oncoming vehicle will start traveling straight. The notification unit 109 causes the display 401 to display this support information.
Fig. 6 and 7 are diagrams showing display examples of the notification unit 109 of the driving assistance device 100 according to embodiment 1.
For example, when the traffic signal for the straight-ahead direction of the oncoming lane at the intersection is yellow and the driver is to be notified that an oncoming vehicle traveling straight in the oncoming lane is likely to stop, the notification unit 109 displays, on the map shown on the display 401, support information 401a such as "The signal of the oncoming lane has changed to yellow; the oncoming vehicle is about to stop", as shown in fig. 6. The lighting state of the traffic signal C of the oncoming lane may also be represented by color.
Although not shown, for example, when the traffic signal for the straight-ahead direction of the oncoming lane at the intersection is red and the driver is to be notified that an oncoming vehicle traveling straight in the oncoming lane has stopped, the notification unit 109 displays support information such as "The signal of the oncoming lane has changed to red; the oncoming vehicle has stopped".
Further, for example, when the traffic signal for the straight-ahead direction of the oncoming lane at the intersection is blue and the driver is to be notified that a stopped oncoming vehicle may start traveling straight in the oncoming lane, the notification unit 109 displays, on the map shown on the display 401, support information 401b such as "The oncoming vehicle is stopped, but may start traveling straight", as shown in fig. 7. The lighting state of the traffic signal C of the oncoming lane may also be represented by color.
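The text messages of figs. 6 and 7 could be composed from the determination result roughly as follows; the message wording and the flat structure of the determination result are assumptions of this sketch.

```python
def build_support_text(signal_color: str,
                       oncoming_present: bool,
                       oncoming_likely_to_stop: bool,
                       stopped_oncoming_may_start: bool) -> str:
    """Compose a display message from the oncoming lane determination result."""
    if signal_color == "yellow" and oncoming_present and oncoming_likely_to_stop:
        return "The signal of the oncoming lane has changed to yellow; the oncoming vehicle is about to stop."
    if signal_color == "red" and oncoming_present:
        return "The signal of the oncoming lane has changed to red; the oncoming vehicle has stopped."
    if signal_color in ("blue", "green") and stopped_oncoming_may_start:
        return "The oncoming vehicle is stopped, but may start traveling straight."
    return f"Oncoming lane signal: {signal_color}."

print(build_support_text("yellow", True, True, False))  # message of fig. 6
print(build_support_text("blue", True, False, True))    # message of fig. 7
```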
3rd generation and notification method
Based on the determination result of the state of the oncoming lane, the notification unit 109 generates support information that expresses, by voice, the information to be notified to the driver: the lighting color of the traffic signal for the straight-ahead direction of the oncoming lane at the intersection, the presence or absence of an oncoming vehicle traveling straight in the oncoming lane, the possibility that an oncoming vehicle traveling straight in the oncoming lane will stop, and the possibility that a stopped oncoming vehicle will start traveling straight. The notification unit 109 outputs this support information as voice from the speaker 402.
Specifically, for example, when the traffic signal for the straight-ahead direction of the oncoming lane at the intersection is yellow and the driver is to be notified that an oncoming vehicle traveling straight in the oncoming lane is likely to stop, the notification unit 109 outputs from the speaker 402 a voice message such as "The signal of the oncoming lane has changed to yellow; the oncoming vehicle is about to stop".
For example, when the traffic signal for the straight-ahead direction of the oncoming lane at the intersection is red and the driver is to be notified that an oncoming vehicle traveling straight in the oncoming lane has stopped, the notification unit 109 outputs from the speaker 402 a voice message such as "The signal of the oncoming lane is red; the oncoming vehicle has stopped".
For example, when the traffic signal for the straight-ahead direction of the oncoming lane at the intersection is blue and the driver is to be notified that a stopped oncoming vehicle may start traveling straight in the oncoming lane, the notification unit 109 outputs from the speaker 402 a voice message such as "The oncoming vehicle is stopped, but may start traveling straight".
4th generation and notification method
Based on the determination result of the state of the oncoming lane, the notification unit 109 generates support information for causing the lamp body 203 to light in the lighting color of the traffic signal for the straight-ahead direction of the oncoming lane at the intersection, and causes the lamp body 203 to light accordingly.
When the lamp body 203 lights in the blue color of the traffic signal, the driver can determine that oncoming vehicles may enter the intersection. When the lamp body 203 shows the yellow or red color of the traffic signal, the driver can determine that oncoming vehicles are in a state in which they cannot enter the intersection.
In addition, when a display (not shown) is provided on the flying object 200, the notification unit 109 may generate support information for causing that display to show a right arrow indicating that the host vehicle can turn right.
Next, the effect of the travel assistance device 100 having the above-described configuration will be described with reference to fig. 8.
Fig. 8 is a diagram for explaining the effects of the driving assistance device 100 according to embodiment 1.
In the example of fig. 8, the traffic signal F of the traveling lane of the host vehicle A shows a blue signal, and the host vehicle A is traveling toward the intersection B. A wall portion G is provided on the median strip between the traveling lane and the oncoming lane, so the driver of the host vehicle A cannot grasp the situation H of the oncoming lane.
When the oncoming vehicle E decelerates, the driver of the host vehicle A conventionally cannot determine whether the oncoming vehicle E is stopping because the traffic signal C of the oncoming lane has turned red or because the oncoming lane is congested (situation H). The driver therefore has to enter the intersection B and then wait until the stop of the oncoming vehicle E or the signal C can be confirmed, so the host vehicle A must decelerate.
In contrast, when the configuration of the driving assistance device 100 according to embodiment 1 is applied, the driver of the host vehicle A can determine from the support information J displayed on the display 401 that, for example, the oncoming vehicle E is stopping because the traffic signal C has turned red. The driver can therefore pass through the intersection B without decelerating the host vehicle A before entering it, which suppresses unnecessary deceleration of the host vehicle A. By displaying the support information J, the driver can also grasp detailed information about the oncoming lane.
As described above, according to embodiment 1, the device includes the wireless communication unit 107 that establishes wireless communication with the flying object 200 on which the flying object camera 202 is mounted, the emission control unit 106 that controls the launch of the flying object 200, the oncoming lane state determination unit 108 that determines, from the first image captured by the flying object camera 202, the state of the oncoming lane that the host vehicle crosses when turning left or right at the intersection, and the notification unit 109 that notifies the determined state of the oncoming lane. Therefore, information on the oncoming lane can be acquired even when the host vehicle has not entered the intersection, and even when an obstacle or the like is present. This suppresses unnecessary deceleration of the host vehicle.
Further, according to embodiment 1, the device includes the emission determination unit 105, which determines, before the host vehicle enters the intersection, whether or not to launch the flying object 200 based on the position information (the map information, the current position of the host vehicle, and the route information of the host vehicle), the vehicle information of the host vehicle, and the lighting information of the direction indicator indicating in which direction the host vehicle is expected to travel. Therefore, information on the oncoming lane can be acquired even when the host vehicle has not entered the intersection, and unnecessary deceleration of the host vehicle can be suppressed.
Further, according to embodiment 1, the device includes the oncoming lane information detection unit 104, which detects the oncoming lane information from the second image captured by the in-vehicle camera mounted on the host vehicle or by the flying object camera 202 of the flying object before launch, and the emission determination unit 105 determines not to launch the flying object 200 when it is detected that there is no oncoming vehicle traveling straight in the oncoming lane at the intersection. Unnecessary launches of the flying object can therefore be suppressed.
Further, according to embodiment 1, the oncoming lane state determination unit 108 determines, from the first image captured by the flying object camera 202 of the flying object launched under the control of the emission control unit 106, the lighting state of the traffic signal for the straight-ahead direction of the oncoming lane at the intersection and the traveling state of oncoming vehicles traveling straight in the oncoming lane. Therefore, in addition to the lighting color of the traffic signal, it is possible to determine the presence or absence of an oncoming vehicle traveling straight in the oncoming lane, the possibility that such a vehicle will stop, and the possibility that a stopped oncoming vehicle will start traveling straight.
Further, according to embodiment 1, since the notification unit is configured to generate support information that conveys the information to be notified to the driver by voice, the driver can determine whether or not the host vehicle can cross the oncoming lane at the intersection without shifting his or her line of sight and attention from the traveling direction of the host vehicle.
Further, according to embodiment 1, the lamp body 203 of the flying object 200 is configured to light in the lighting color of the traffic signal for the straight-ahead direction of the oncoming lane at the intersection. The flying object can therefore convey, on its own and without communicating with the travel support device, information indicating whether oncoming vehicles may enter the intersection. Moreover, since the lamp body 203 of the flying object 200 is positioned above, in the traveling direction of the host vehicle, the driver can determine whether or not the host vehicle can cross the oncoming lane at the intersection without shifting his or her line of sight and attention from the traveling direction.
Embodiment 2.
In embodiment 1 described above, whether or not to launch the flying object 200 is determined by the processing of steps ST21 to ST23 in the flow of fig. 4. Embodiment 2 describes configurations in which whether or not to launch the flying object 200 is determined based on more detailed conditions.
In embodiment 2, as in embodiment 1, the case where the host vehicle crosses the oncoming lane to turn right at an intersection is described as an example, but the driving assistance device 100 can also be applied to the case where the host vehicle crosses the oncoming lane to turn left.
A plurality of determination conditions for the emission determination unit 105 are described below; any one of them may be applied alone, or a plurality of them may be applied in combination. The determination conditions applied to the emission determination unit 105 can be set arbitrarily.
1st determination condition
The emission determination unit 105 determines whether or not the host vehicle has entered a right-turn-only lane, with reference to the captured image of the vicinity of the host vehicle acquired by the vehicle information acquisition unit 102. When the host vehicle has entered the right-turn-only lane, the emission determination unit 105 determines to launch the flying object 200.
On the other hand, when the host vehicle has not entered the right-turn-only lane, the emission determination unit 105 determines not to launch the flying object 200 even if it determines that the right direction indicator of the host vehicle is on (step ST22: YES).
By determining whether or not the host vehicle has entered the right-turn-only lane, the emission determination unit 105 can reliably determine whether the host vehicle is going to turn right.
2nd determination condition
When the emission determination unit 105 determines that the host vehicle is scheduled to turn right at the intersection (step ST21: YES) but the right direction indicator of the host vehicle is not on (step ST22: NO), it refers to the travel history of the host vehicle and determines whether the number of right turns made at that intersection, as recorded in the travel history, is equal to or greater than a threshold value. When the number of right turns at the intersection is equal to or greater than the threshold value, the emission determination unit 105 determines to launch the flying object 200; when it is less than the threshold value, the emission determination unit 105 determines not to launch the flying object 200.
The emission determination unit 105 refers to the travel history of the host vehicle stored in a recording area (not shown) of the travel support apparatus 100 or of an external apparatus. The travel history records, for each intersection, the number of times the host vehicle has traveled through it and the number of times the host vehicle has turned right there.
Thus, even when it is determined that the host vehicle is scheduled to turn right at the intersection but the driver has forgotten to operate the direction indicator, the emission determination unit 105 can determine from the travel history that the host vehicle is highly likely to turn right at the intersection, and can launch the flying object 200.
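As an illustration of this condition, the sketch below keeps a per-intersection count of past right turns and launches when the count reaches a threshold even though the indicator is off; the history structure, intersection IDs, and threshold value are assumptions.

```python
from collections import defaultdict

class TravelHistory:
    """Counts, per intersection ID, how often the host vehicle passed it and turned right there."""
    def __init__(self):
        self._passes = defaultdict(int)
        self._right_turns = defaultdict(int)

    def record(self, intersection_id: str, turned_right: bool):
        self._passes[intersection_id] += 1
        if turned_right:
            self._right_turns[intersection_id] += 1

    def right_turn_count(self, intersection_id: str) -> int:
        return self._right_turns[intersection_id]

def launch_despite_indicator_off(history: TravelHistory,
                                 intersection_id: str,
                                 threshold: int = 3) -> bool:
    """2nd condition: a right turn is planned (ST21: YES) but the indicator is off (ST22: NO);
    launch anyway if the vehicle has turned right here at least `threshold` times before."""
    return history.right_turn_count(intersection_id) >= threshold

history = TravelHistory()
for _ in range(4):
    history.record("intersection-B", turned_right=True)
print(launch_despite_indicator_off(history, "intersection-B"))  # True
```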
3rd determination condition
The emission determination unit 105 refers to information, set in advance by the driver or another user, indicating whether or not the flying object 200 should be launched, and determines whether such a launch requirement has been set. When information indicating that launch of the flying object 200 is required has been input, the emission determination unit 105 determines to launch the flying object without performing the determinations of steps ST21 to ST23. When information indicating that launch of the flying object 200 is not required has been input, the emission determination unit 105 determines not to launch the flying object without performing the determinations of steps ST21 to ST23.
The information indicating whether the flying object 200 should be launched is set, for example, as "[○○ intersection] launch: required (○○ m ahead)" or "[○○ intersection] launch: not required".
When launch is required, the emission determination unit 105 can also acquire information indicating the location at which the flying object 200 is to be launched.
By having the driver set in advance whether the flying object 200 should be launched at specific intersections, the flying object 200 can reliably be launched at the right timing, or not launched, at the intersections the driver has in mind.
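The pre-registered settings of this condition could be held, for example, in a table keyed by intersection with an optional launch distance; the field names and example entries below are hypothetical, and the ○○ placeholders of the description stand for names chosen by the user.

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class LaunchSetting:
    required: bool                              # "launch: required" / "launch: not required"
    launch_distance_m: Optional[float] = None   # "(... m ahead)" when launch is required

# Hypothetical registrations made in advance by the driver.
launch_settings: Dict[str, LaunchSetting] = {
    "intersection-B": LaunchSetting(required=True, launch_distance_m=100.0),
    "intersection-C": LaunchSetting(required=False),
}

def decide_from_setting(intersection_id: str) -> Optional[LaunchSetting]:
    """If a setting exists, it overrides steps ST21 to ST23; otherwise return None
    and fall back to the normal determination."""
    return launch_settings.get(intersection_id)

setting = decide_from_setting("intersection-B")
if setting is not None and setting.required:
    print(f"Launch {setting.launch_distance_m} m before the intersection.")
```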
4th determination condition
When the driver himself inputs whether or not the flying object 200 needs to be launched via an input device (not shown) before the own vehicle enters the intersection for the next passage, the launch determination unit 105 determines whether or not the flying object 200 needs to be launched based on the input information without performing the determination of steps ST21 to ST 23.
Examples of an input method of whether the flight object 200 is to be radiated include an input operation using a touch panel, an input operation using a radiation button provided in the vehicle, and an input operation using voice.
Thus, the driver can specify, as desired, whether the flying object 200 is to be launched and the launch timing, even before operating the direction indicator.
5th judgment condition
When the oncoming lane information detection unit 104 notifies the emission determination unit 105 that information on the oncoming lane cannot be detected, the emission determination unit 105 determines to launch the flying object 200.
Embodiment 1 shows a configuration in which the oncoming lane information detection unit 104 detects from the second image whether an oncoming vehicle traveling straight in the oncoming lane at the intersection is present. However, when a large vehicle is stopped in the right-turn lane of the oncoming lane at the intersection, or when an obstacle is present in the oncoming lane, the oncoming lane information detection unit 104 may be unable to detect from the second image whether an oncoming vehicle traveling straight in the oncoming lane is present. In this case, the oncoming lane information detection unit 104 notifies the emission determination unit 105 that the information on the oncoming lane cannot be detected.
When notified that the information on the oncoming lane cannot be detected, the emission determination unit 105 determines in step ST23 of the flowchart of fig. 4 in embodiment 1 that an oncoming vehicle traveling straight in the oncoming lane at the intersection may be present (step ST23: yes), and determines to launch the flying object 200.
Thus, even when it cannot be detected whether an oncoming vehicle traveling straight in the oncoming lane at the intersection is present, the emission determination unit 105 determines to launch the flying object 200, and the oncoming lane state determination unit 108 can acquire the first image for determining whether an oncoming vehicle traveling straight in the oncoming lane is present.
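The fallback behavior of this judgment condition amounts to treating a failed detection as the unsafe case. A minimal Python sketch under that reading is shown below; the enum values and function name are illustrative assumptions rather than part of the disclosure.

```python
from enum import Enum
from typing import Optional


class OncomingLaneInfo(Enum):
    STRAIGHT_VEHICLE_PRESENT = 1
    NO_STRAIGHT_VEHICLE = 2


def decide_at_st23(detected: Optional[OncomingLaneInfo]) -> bool:
    """5th judgment condition: when the second image yields no usable result (None),
    treat the case as 'an oncoming vehicle may be present' (ST23: yes) and launch."""
    if detected is None:
        return True  # detection failed (large vehicle, obstacle, ...): launch to obtain the first image
    return detected is OncomingLaneInfo.STRAIGHT_VEHICLE_PRESENT


print(decide_at_st23(None))                                       # True
print(decide_at_st23(OncomingLaneInfo.NO_STRAIGHT_VEHICLE))       # False
print(decide_at_st23(OncomingLaneInfo.STRAIGHT_VEHICLE_PRESENT))  # True
```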
6th judgment condition
The emission determination unit 105 determines whether the intersection that the host vehicle is scheduled to enter is congested, with reference to the captured image of the periphery of the host vehicle acquired by the vehicle information acquisition unit 102. When the intersection that the host vehicle is scheduled to enter is congested, the emission determination unit 105 determines not to launch the flying object 200 without performing the determinations of steps ST21 to ST23.
When the intersection that the host vehicle is scheduled to enter is congested, the host vehicle cannot turn right even if the traffic signal in the straight traveling direction of the oncoming lane at that intersection is red, so unnecessary launching of the flying object 200 can be suppressed.
7th judgment condition
When the lighting information of the traffic signal in the straight traveling direction of the oncoming lane at the intersection that the host vehicle is scheduled to enter can be acquired from the traffic infrastructure information, the emission determination unit 105 determines not to launch the flying object 200. On the other hand, when the lighting information of the traffic signal in the straight traveling direction of the oncoming lane cannot be acquired from the traffic infrastructure information, the emission determination unit 105 performs the processing shown in embodiment 1.
Note that even in this case, when the oncoming lane state determination unit 108 is to determine, as the state of the oncoming lane, the traveling state of vehicles in the oncoming lane or the congestion state of the travel lane ahead of the host vehicle after the right turn, the emission determination unit 105 may determine to launch the flying object 200.
The traffic infrastructure information can be used, for example, by means of TSPS (a driving support system that uses traffic signal information) via an optical beacon. By using the traffic infrastructure information, the lighting information of the traffic signal in the straight traveling direction of the oncoming lane at the intersection that the host vehicle is scheduled to enter can be acquired, so unnecessary launching of the flying object 200 can be suppressed.
In addition, if the travel assist device is mounted on a vehicle that can receive signals from an optical beacon, the traffic infrastructure information can be used relatively easily.
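Illustratively, this judgment condition reduces to "launch only when the infrastructure does not already supply the signal state". The sketch below assumes a hypothetical beacon payload laid out as a dictionary; the field names are not taken from the description.

```python
from typing import Optional


def signal_info_from_beacon(beacon_payload: dict, intersection_id: str) -> Optional[str]:
    """Return the lighting state of the oncoming lane's straight-direction signal,
    if the infrastructure (e.g. an optical-beacon payload) provides it."""
    return beacon_payload.get(intersection_id, {}).get("oncoming_straight_signal")


def should_launch_with_infrastructure(beacon_payload: dict, intersection_id: str) -> bool:
    """7th judgment condition: do not launch when the lighting information is available."""
    return signal_info_from_beacon(beacon_payload, intersection_id) is None


payload = {"intersection_42": {"oncoming_straight_signal": "red"}}
print(should_launch_with_infrastructure(payload, "intersection_42"))  # False: info available
print(should_launch_with_infrastructure(payload, "intersection_7"))   # True: fall back to embodiment 1
```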
8th judgment condition
The emission determination unit 105 determines not to launch the flying object 200 when information for determining the state of the oncoming lane can be received from a flying object other than the flying object 200 whose flight is controlled by the travel assist device 100.
The travel assist device 100 receives the information for determining the state of the oncoming lane from the other flying object through wireless communication between travel assist devices, vehicle-to-vehicle communication between the host vehicle and another vehicle, or communication between the flying object 200 and the other flying object.
Thus, when another flying object is already in flight, it is possible to avoid launching the flying object controlled by the travel assist device 100 of the host vehicle unnecessarily, to prevent the flying objects from contacting each other, and to suppress the energy consumed by the flight of the flying object.
In addition, since an image already acquired by the other flying object can be received, the emission determination unit 105 can confirm the state of the oncoming lane at an earlier timing.
9th judgment condition
When, with reference to the captured image of the periphery of the host vehicle acquired by the vehicle information acquisition unit 102, the traffic signal of the travel lane at the intersection that the host vehicle is scheduled to enter is yellow or red, the emission determination unit 105 determines not to launch the flying object 200 without performing the determinations of steps ST21 to ST23.
This can suppress unnecessary launching of the flying object 200.
As described above, according to embodiment 2, the emission determination unit 105 determines whether to launch the flying object 200 by referring to the travel history of the host vehicle, determining whether the host vehicle is traveling in a right-turn lane, determining whether information on the oncoming lane can be detected, acquiring the congestion status of the intersection that the host vehicle is scheduled to enter, acquiring the traffic infrastructure information, acquiring information on the oncoming lane from another vehicle, or determining whether the traffic signal of the travel lane of the host vehicle is yellow or red. The flying object 200 can therefore be launched reliably at a timing when information on the oncoming lane is required.
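Putting the conditions of this embodiment together, a possible overall decision flow is sketched below. The description does not fix a precedence among the conditions, so the ordering, the context fields, and the threshold in this Python sketch are all assumptions made only for illustration.

```python
def decide_launch(ctx: dict) -> bool:
    """Combined sketch of the embodiment 2 judgment conditions.
    Every key of `ctx` is illustrative; the description does not define a field layout."""
    if ctx.get("preset") is not None:                # 3rd condition: pre-registered setting
        return ctx["preset"]
    if ctx.get("driver_input") is not None:          # 4th condition: on-the-spot driver input
        return ctx["driver_input"]
    if ctx.get("intersection_congested"):            # 6th condition
        return False
    if ctx.get("own_signal") in ("yellow", "red"):   # 9th condition
        return False
    if ctx.get("infrastructure_signal_info"):        # 7th condition
        return False
    if ctx.get("other_flying_object_info"):          # 8th condition
        return False
    if ctx.get("oncoming_lane_undetectable"):        # 5th condition
        return True
    # remaining checks correspond to the right-turn lane and travel-history conditions
    return ctx.get("in_right_turn_lane", False) or ctx.get("history_right_turns", 0) >= 3


print(decide_launch({"intersection_congested": True}))      # False
print(decide_launch({"oncoming_lane_undetectable": True}))  # True
print(decide_launch({"in_right_turn_lane": True}))          # True
```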
In embodiments 1 and 2 described above, instead of the travel assist device 100 including the oncoming lane state determination unit 108 and the notification unit 109, the flying object 200 may be configured to include the oncoming lane state determination unit 108 and the notification unit 109. Alternatively, both the travel assist device 100 and the flying object 200 may include the oncoming lane state determination unit 108 and the notification unit 109.
Within the scope of the present invention, the embodiments may be freely combined, any component of each embodiment may be modified, and any component of each embodiment may be omitted.
Industrial Applicability
The driving assistance device according to the present invention can be applied to a navigation system or the like, and is suitable for providing driving assistance information based on information acquired by a flying object.
Description of the reference symbols
100 travel assist device,
101 position information acquisition unit,
102 vehicle information acquisition unit,
103 traveling direction detection unit,
104 oncoming lane information detection unit,
105 emission determination unit,
106 emission control unit,
107 wireless communication unit,
108 oncoming lane state determination unit,
109 notification unit,
110 recovery determination unit,
200 flying object,
201 wireless communication device,
202 flying object camera,
203 lamp,
300 travel assist system.

Claims (7)

1. A driving assistance apparatus characterized by comprising:
a wireless communication unit that establishes wireless communication with a flying object having a flying object camera mounted thereon;
an emission control unit that controls launch of the flying object;
an oncoming lane state determination unit that determines a state of an oncoming lane that the vehicle has traversed by turning left or right at an intersection, based on a first image captured by the flying object camera and acquired via the wireless communication unit; and
a notification unit that notifies the state of the oncoming lane determined by the oncoming lane state determination unit.
2. The driving assistance apparatus according to claim 1,
the vehicle information processing apparatus includes an emission determination unit that determines whether or not the flying object is emitted before the host vehicle enters the intersection based on map information, a current position of the host vehicle, route information of the host vehicle, vehicle information of the host vehicle, and lighting information of a direction indicator of the host vehicle,
the emission control unit controls emission of the flying object based on a determination result of the emission determination unit.
3. The driving assistance apparatus according to claim 2,
further comprising an oncoming lane information detection unit that detects information on the oncoming lane based on a second image captured, before the launch, by an in-vehicle camera mounted on the host vehicle or by the flying object camera of the flying object,
wherein the emission determination unit determines not to launch the flying object when the oncoming lane information detection unit detects that there is no oncoming vehicle traveling straight in the oncoming lane at the intersection.
4. The driving assistance apparatus according to claim 1,
wherein the oncoming lane state determination unit determines a lighting state of a traffic signal in a straight traveling direction of the oncoming lane at the intersection, based on the first image captured by the flying object camera of the flying object launched by the emission control unit.
5. The driving assistance apparatus according to claim 4,
wherein the oncoming lane state determination unit determines a traveling state of an oncoming vehicle traveling straight in the oncoming lane at the intersection, based on the first image captured by the flying object camera of the flying object launched by the emission control unit.
6. A travel assist system characterized by comprising:
the driving assistance apparatus according to claim 1; and
the flying object, which includes the flying object camera, a wireless communication device that establishes wireless communication with the driving assistance apparatus, and a notification device that provides notification of the state of the oncoming lane notified by the notification unit.
7. A driving assistance method characterized by comprising the steps of:
a step in which a wireless communication unit establishes wireless communication with a flying object on which a flying object camera is mounted;
a step in which an emission control unit controls emission of the flying object;
a step in which an oncoming lane state determination unit determines a state of an oncoming lane that the vehicle has traversed by turning left or right at an intersection, based on the first image captured by the flying object camera; and
a step in which the notification unit notifies the determined state of the oncoming lane.
CN201780091216.5A 2017-06-02 2017-06-02 Travel assist device, travel assist system, and travel assist method Active CN110709912B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/020621 WO2018220822A1 (en) 2017-06-02 2017-06-02 Travel assistance device, travel assistance system, and travel assistance method

Publications (2)

Publication Number Publication Date
CN110709912A true CN110709912A (en) 2020-01-17
CN110709912B CN110709912B (en) 2022-02-22

Family

ID=64455238

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780091216.5A Active CN110709912B (en) 2017-06-02 2017-06-02 Travel assist device, travel assist system, and travel assist method

Country Status (3)

Country Link
JP (1) JP6741360B2 (en)
CN (1) CN110709912B (en)
WO (1) WO2018220822A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008074275A (en) * 2006-09-21 2008-04-03 Aisin Aw Co Ltd Operation assistant device, operation assistant system and operation assistant method
CN103000035A (en) * 2012-11-22 2013-03-27 北京交通大学 Information acquisition release system and method for guiding left-hand turning vehicle to pass through intersection
JP5819555B1 (en) * 2015-04-01 2015-11-24 ライトブレインラボ合同会社 Vehicle driving support system
JP2017021756A (en) * 2015-07-15 2017-01-26 三菱自動車工業株式会社 Vehicular operation support apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6711565B2 (en) * 2015-07-07 2020-06-17 キヤノン株式会社 Communication device, control method thereof, and program

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008074275A (en) * 2006-09-21 2008-04-03 Aisin Aw Co Ltd Operation assistant device, operation assistant system and operation assistant method
CN103000035A (en) * 2012-11-22 2013-03-27 北京交通大学 Information acquisition release system and method for guiding left-hand turning vehicle to pass through intersection
JP5819555B1 (en) * 2015-04-01 2015-11-24 ライトブレインラボ合同会社 Vehicle driving support system
JP2017021756A (en) * 2015-07-15 2017-01-26 三菱自動車工業株式会社 Vehicular operation support apparatus

Also Published As

Publication number Publication date
CN110709912B (en) 2022-02-22
JPWO2018220822A1 (en) 2019-11-07
WO2018220822A1 (en) 2018-12-06
JP6741360B2 (en) 2020-08-19

Similar Documents

Publication Publication Date Title
CN106816021B (en) System for influencing a vehicle system by taking into account an associated signal transmitter
CN108883777B (en) Driving support system and display device
CN109791735B (en) Vehicle control device
JP6573594B2 (en) Automatic operation control device
US11347233B2 (en) Vehicle control apparatus, vehicle control system, and image sensor
CN110276986B (en) Vehicle control device and vehicle control method
CN110497919B (en) Object position history playback for automatic vehicle transition from autonomous mode to manual mode
JP2008250503A (en) Operation support device
JP4777805B2 (en) Vehicle warning guidance device
CN113874267B (en) Vehicle control method and vehicle control device
US11279352B2 (en) Vehicle control device
JP2007276733A (en) Driving operation auxiliary device for vehicle, and vehicle with driving operation assisting device for vehicle
CN111727468A (en) Vehicle control device and vehicle control method
CN112334372A (en) Vehicle control device
JP7064357B2 (en) Vehicle control unit
JP2019144689A (en) Vehicle control device
JP2019049812A (en) Traveling position evaluation system
JP2009129290A (en) Traffic signal detection apparatus, traffic signal detection method and program
US20150227801A1 (en) Method and device for determining a distance of a vehicle from a traffic-regulating object
JP2011025862A (en) Drive control apparatus
CN110709912B (en) Travel assist device, travel assist system, and travel assist method
CN110320905B (en) Vehicle control device
JP6699593B2 (en) Glass heating equipment
CN110869992B (en) Right/left turn determination method and right/left turn determination device for driving support vehicle
JP2009075767A (en) Drive support device and drive support method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant