WO2008038376A1 - Signal recognition device, signal recognition method, signal recognition program, and recording medium - Google Patents

Signal recognition device, signal recognition method, signal recognition program, and recording medium

Info

Publication number
WO2008038376A1
WO2008038376A1 (PCT application PCT/JP2006/319342)
Authority
WO
WIPO (PCT)
Prior art keywords
information
signal
display state
traffic
traffic light
Prior art date
Application number
PCT/JP2006/319342
Other languages
English (en)
Japanese (ja)
Inventor
Hiroaki Shibasaki
Original Assignee
Pioneer Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Corporation filed Critical Pioneer Corporation
Priority to PCT/JP2006/319342 priority Critical patent/WO2008038376A1/fr
Priority to JP2008536257A priority patent/JP4926182B2/ja
Publication of WO2008038376A1 publication Critical patent/WO2008038376A1/fr


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K28/00Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
    • B60K28/02Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
    • B60K28/06Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver
    • B60K28/066Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver actuating a signalling device
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B60W2040/0818Inactivity or incapacity of driver
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/143Alarm means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146Display means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/18Propelling the vehicle
    • B60W30/18172Preventing, or responsive to skidding of wheels
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/04Monitoring the functioning of the control system

Definitions

  • Signal recognition apparatus, signal recognition method, signal recognition program, and recording medium
  • the present invention relates to a signal recognition device that recognizes a display state of a traffic light, a signal recognition method, a signal recognition program, and a recording medium.
  • a signal recognition device that recognizes a display state of a traffic light
  • a signal recognition method that recognizes a display state of a traffic light
  • a signal recognition program that causes a computer to execute the signal recognition method
  • a computer-readable recording medium on which the signal recognition program is recorded
  • the use of the present invention is not limited to the signal recognition device, the signal recognition method, the signal recognition program, and the recording medium described above.
  • Background Art
  • Conventionally, there is an in-vehicle display device that captures an image of the area ahead of a vehicle with a camera and converts the portions other than red in the captured image to monochrome.
  • With this in-vehicle display device, when the lighting color of a traffic light in the image displayed on the display screen is red, the red is highlighted so that it stands out visually, and the lighting color of the traffic light can easily be determined to be red (for example, see Patent Document 1 below).
  • Patent Document 1 Japanese Unexamined Patent Publication No. 2006-115376
  • In the conventional technique described above, however, although the lighting color of the traffic light ahead can be displayed in real time, there is the problem that, for example, when the lighting color of the traffic light ahead is not captured, or when the lighting color is expected to change after a predetermined time has elapsed, such anticipated information cannot be conveyed to the passengers to prompt them to prepare to start the vehicle.
  • The signal recognition apparatus according to the invention includes: a signal information acquisition unit that acquires information on a display state of a traffic light; a determination unit that determines, based on the information on the display state of the traffic light acquired by the signal information acquisition unit, an output form of notification information to be notified to a passenger of a moving body; and a notification unit that notifies the notification information in the output form determined by the determination unit.
  • The signal recognition method according to the invention includes: a signal information acquisition step of acquiring information on a display state of a traffic light; a determination step of determining, based on the information on the display state of the traffic light acquired in the signal information acquisition step, an output form of notification information to be notified to a passenger of a moving body; and a notification step of notifying the notification information in the output form determined in the determination step.
  • a signal recognition program according to the invention of claim 8 causes a computer to execute the signal recognition method according to claim 7.
  • a recording medium according to the invention of claim 9 is characterized in that the signal recognition program according to claim 8 is recorded in a computer-readable state.
  • FIG. 1 is a block diagram showing a functional configuration of a signal recognition apparatus according to the present embodiment.
  • FIG. 2 is a flowchart showing a signal recognition processing procedure of the signal recognition apparatus according to the present embodiment.
  • FIG. 3 is an explanatory view showing an example of the area around the dashboard of a vehicle in which a navigation device according to the present embodiment is installed.
  • FIG. 4 is a block diagram showing an example of a hardware configuration of a navigation device according to the present embodiment.
  • FIG. 5 is a flowchart showing the contents of processing of a navigation device according to the present embodiment.
  • FIG. 6 is an explanatory diagram showing the confirmation order of traffic lights at an intersection.
  • FIG. 1 is a block diagram showing a functional configuration of the signal recognition apparatus 100 according to the present embodiment.
  • the signal recognition apparatus 100 includes a signal information acquisition unit 101, a behavior information acquisition unit 102, a prediction unit 103, a determination unit 104, and a notification unit 105.
  • the signal information acquisition unit 101 acquires information on the display state of the traffic light.
  • the display state of the traffic light is, for example, information on the color of the light that is lit on the traffic light and information on whether or not it is blinking.
  • The information on the lighting color indicates, for example, which of blue, yellow, and red is lit, or which of blue lighting, blue blinking, and red lighting is in effect.
  • The signal information acquisition unit 101 may determine the display state by, for example, acquiring image information of the traffic light, or it may acquire information on the display state of the traffic light transmitted from a communication unit (not shown) provided in a management server or in the traffic light itself.
  • the behavior information acquisition unit 102 acquires the behavior information of the driver of the moving object.
  • The behavior information is, for example, information on actions performed by the driver to move the moving body: specifically, for example, information on depression of the accelerator pedal or brake pedal, information on operation of the side brake or steering wheel, current position information of the vehicle, information on output values of a G sensor, and the like.
  • The prediction unit 103 calculates prediction information on the display state of the traffic light located in the traveling direction of the moving body, for example from the information on the display state of a traffic light located to the side of the moving body.
  • The traffic light located to the side of the moving body is, for example, a traffic light other than the traffic light that the moving body carrying the passenger is to follow.
  • The prediction of the display state is a prediction over a relatively short time, for example one that can be expressed as "the lighting color of the traffic light in the traveling direction will soon change to blue."
  • The prediction unit 103 may also calculate prediction information on the display state of the traffic light located in the traveling direction of the moving body from the behavior information of a moving body ahead, an oncoming moving body, or a moving body traveling on a route that intersects the traveling route.
  • The determination unit 104 determines the output form of the notification information to be notified to the passenger of the moving body. Specifically, for example, when the lighting color of the traffic light located in the traveling direction of the moving body changes to blue, a signal for performing an action that makes the passenger of the moving body aware of this is output to the notification unit 105 described later. The determination unit 104 also determines the output form of the notification information to be notified to the passenger of the moving body according to the prediction information on the display state of the traffic light located in the traveling direction of the moving body calculated by the prediction unit 103; specifically, for example, a signal for performing a foreseeing action for the passenger of the moving body is output to the notification unit 105.
  • Furthermore, the determination unit 104 determines the output form of the notification information to be notified based on the traffic light display state information acquired by the signal information acquisition unit 101 and the driver behavior information acquired by the behavior information acquisition unit 102. Specifically, for example, if the driver shows no behavior to move the moving body even after a predetermined time has elapsed since the lighting color of the traffic light located in the traveling direction turned blue, a signal for performing a prompting action for the passenger of the moving body is output.
  • The volume, light intensity, and amount of movement may be increased each time a predetermined time elapses, or the output may be changed in stages, for example from action only, to action and sound, to action, sound, and light.
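  • A minimal sketch of such a staged escalation is shown below (Python). The stage list, the callback signatures, and the interval are assumptions introduced for illustration only, not part of the disclosed implementation.

```python
import time

# Hypothetical escalation ladder following the idea above: start with an
# action only, then add sound, then add light, and raise the intensity each
# time a predetermined period elapses without a start action.
STAGES = [
    ("action",),
    ("action", "sound"),
    ("action", "sound", "light"),
]

def prompt_until_started(has_started, notify, interval_s=5.0, max_rounds=10):
    """Repeat and strengthen the prompting output until the driver starts."""
    for round_no in range(max_rounds):
        if has_started():
            return True
        outputs = STAGES[min(round_no, len(STAGES) - 1)]
        notify(outputs=outputs, intensity=round_no + 1)
        time.sleep(interval_s)
    return False

# Example with stub callbacks:
# prompt_until_started(lambda: False, lambda **kw: print(kw), interval_s=0.1)
```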
  • When the signal information acquisition unit 101 has not acquired information on the display state of the traffic light, the determination unit 104 determines the output form of notification information for notifying the passenger of the moving body that the display state information has not been acquired. More specifically, for example, a signal for performing an action that indicates an intention to search for the traffic light is output. This searching action is, for example, an action in which the notification unit 105 appears to turn left and right in small increments, as if looking around.
  • the notification information may be stored in a storage unit (not shown).
  • In that case, the determination unit 104 selects the optimum output form of the notification information from the stored notification information.
  • The output form of the notification information to be determined may be a combination of a plurality of output forms. Specifically, for example, the notification to the passenger of the moving body may combine light output, sound output, and action output.
  • the notification unit 105 notifies the notification information determined by the determination unit 104.
  • the notification unit 105 notifies the notification information determined by the determination unit 104 by at least one output of light output, operation output, and sound output.
  • the notification unit 105 may be a vehicle-mounted robot that imitates the shape of a person or animal.
  • The vehicle-mounted robot may express a pseudo-intention or pseudo-emotion by combining light output, motion output, and sound output.
  • the pseudo-intention and the pseudo-emotion are, for example, an intention to notify something or an emotion of joy.
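  • The relationships among the functional units of FIG. 1 described above can be summarized in a minimal sketch like the one below (Python). All type and function names are hypothetical and only illustrate the data flow, not the claimed implementation.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class SignalState:
    """Hypothetical display-state record (output of the signal information acquisition unit 101)."""
    color: Optional[str]        # "blue", "yellow", "red", or None if not acquired
    blinking: bool = False

@dataclass
class Notification:
    """Hypothetical output form chosen by the determination unit 104."""
    kind: str                   # "foresee", "notice", "prompt", or "searching"
    outputs: Tuple[str, ...]    # any of "light", "action", "sound"

def determine_output(ahead: Optional[SignalState],
                     blue_expected_soon: bool,
                     driver_started: bool) -> Optional[Notification]:
    """Rough analogue of the determination unit 104."""
    if ahead is None:
        # Display state not acquired: show the intention to look for the light.
        return Notification("searching", ("action",))
    if ahead.color == "blue" and not driver_started:
        return Notification("notice", ("action", "sound"))
    if blue_expected_soon:
        return Notification("foresee", ("action", "sound"))
    return None                 # nothing worth notifying
```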
  • FIG. 2 is a flowchart showing a signal recognition processing procedure of the signal recognition apparatus 100 according to the present embodiment.
  • the information on the display state of the traffic light is acquired by the signal information acquisition unit 101 (step S201).
  • In step S201, as the information on the display state of the traffic light, image information of the area around the moving body is acquired, for example, by a camera mounted on the moving body.
  • Next, based on the information on the display state of the traffic light acquired in step S201, the prediction unit 103 predicts the display state of the traffic light in the traveling direction (step S202). Step S202 may be omitted if the display state information acquired in step S201 concerns only the traffic light located in the traveling direction. Further, the behavior information acquisition unit 102 acquires the behavior information of the driver (step S203).
  • Next, based on either or both of the display state information acquired in step S201 and the traffic light display state information predicted in step S202, together with the driver behavior information acquired in step S203, the output form of the notification information to be notified to the passenger of the moving body is determined (step S204).
  • When the display state information acquired in step S201 concerns only the traffic light located in the traveling direction, the output form of the notification information to be notified to the passenger of the moving body is determined based on the display state information of the traffic light acquired in step S201 and the driver behavior information acquired in step S203. The notification unit 105 then notifies the notification information in the output form determined in step S204 (step S205), and the series of processing ends.
  • The case where the display state information acquired in step S201 concerns only the traffic light located in the traveling direction includes, for example, the case where the display state information of the traffic light located to the side of the moving body cannot be acquired, or the case where the information on the display state of the traffic light located in the traveling direction is acquired before that of the traffic light located to the side of the moving body, so that the output form can already be determined. In these cases, since the prediction unit 103 cannot predict the display state of the traffic light located in the traveling direction, the output form of the notification information is determined based on the display state information acquired in step S201 and the behavior information acquired in step S203.
  • On the other hand, if the display state information acquired in step S201 is only the display state information of the traffic light located to the side of the moving body, the output form of the notification information is determined based on the traffic light display state information predicted in step S202 and the behavior information acquired in step S203. A case where the acquired display state information is only that of the traffic light located to the side of the moving body is conceivable, for example, when the display state information of the traffic light located in the traveling direction cannot be obtained.
  • When the display state information acquired in step S201 includes both the display state information of the traffic light located in the traveling direction and that of the traffic light located to the side of the moving body, for example when the information on the side traffic light is obtained first and the information on the traveling-direction traffic light is obtained later, the output form of the notification information is determined based on the display state information of the traffic light located in the traveling direction (the display state information acquired in step S201), the traffic light display state information predicted in step S202, and the behavior information acquired in step S203.
  • If the information on the display state of the traffic light cannot be acquired in step S201, the searching action described later may be performed, for example upon detecting that the moving body has traveled.
  • Although the driver behavior information is acquired in step S203, this is not a limitation; the driver behavior information need not be acquired.
  • In that case, in step S204, the output form of the notification information to be notified to the passenger of the moving body is determined based on either or both of the information on the display state of the traffic light acquired in step S201 and the information on the display state of the traffic light predicted in step S202.
  • Although the notification information is notified in step S205, this is not a limitation. Specifically, for example, after a predetermined time elapses, the driver behavior information may be acquired again, and the notification information may be notified continuously until the driver shows a starting behavior. Further, the notification information may be changed each time a predetermined time elapses.
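  • Read as pseudocode, one pass through steps S201 to S205 might look like the sketch below (Python). The five callables stand in for the functional units 101 to 105 and are assumptions introduced for illustration, not part of the disclosure.

```python
def signal_recognition_cycle(acquire_signal_info, predict_ahead,
                             acquire_behavior, determine, notify):
    """One pass through the FIG. 2 procedure (steps S201-S205)."""
    observed = acquire_signal_info()        # S201: display state(s) of nearby traffic lights
    predicted = predict_ahead(observed)     # S202: may be skipped when only the
                                            #       traveling-direction light was observed
    behavior = acquire_behavior()           # S203: driver behavior information (optional)
    form = determine(observed, predicted, behavior)   # S204: choose the output form
    return notify(form)                     # S205: notify the passenger
```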
  • The signal recognition apparatus, signal recognition method, signal recognition program, and computer-readable recording medium according to the present invention realize the functions of the signal recognition apparatus 100 shown in FIG. 1.
  • The configuration is not limited to a single signal recognition apparatus 100; a plurality of devices may be used as long as the configuration as a whole includes the functional units shown in FIG. 1.
  • In that case, the devices may be connected by wire or wirelessly, for example by communicating via Bluetooth (registered trademark).
  • As described above, with the signal recognition apparatus 100 of the present embodiment, the determination unit 104 can determine the output form of the notification information to be notified to the passenger of the moving body according to the information on the display state of the traffic light acquired by the signal information acquisition unit 101, and the notification information can be notified in the determined output form. Therefore, a change in the lighting color of the traffic light can be recognized and brought to the passenger's attention. As a result, even if the passenger takes their eyes off the traffic light while waiting, for example, they can notice when the lighting color of the traffic light changes to blue, and can drive smoothly and comfortably.
  • In addition, the lighting color of the traffic light located in the traveling direction may be difficult for the passenger to see because of reflected sunlight, and even when the passenger does check the lighting color, the signal recognition apparatus 100 checks the lighting color together with the passenger or predicts and announces the change, so the passenger can feel reassured. Also, when the waiting time until the display state of the traffic light changes (so-called signal waiting) is long, the signal recognition apparatus 100 shares the display state of the traffic light with the passenger; by having the apparatus express, for example, the feelings generally felt in that situation and include them in the notification information, the effect of reducing the passenger's frustration can be obtained.
  • Further, with the signal recognition apparatus 100 of the present embodiment, the determination unit 104 can determine the output form of the notification information to be notified to the passenger of the moving body according to the information on the display state of the traffic light acquired by the signal information acquisition unit 101 and the driver behavior information acquired by the behavior information acquisition unit 102. Therefore, when the driver starts the moving body promptly (for example, within a predetermined time), the apparatus can refrain from notifying the passenger that the lighting color of the traffic light has changed. As a result, the passenger can ride comfortably without unnecessary notifications.
  • Further, with the signal recognition apparatus 100 of the present embodiment, the prediction unit 103 calculates prediction information on the display state of the traffic light located in the traveling direction of the moving body from the information on the display state of the traffic light located to the side of the moving body, and the determination unit 104 can determine the output form of the notification information to be notified to the passenger of the moving body according to that prediction information. Therefore, before the lighting color of the traffic light in the traveling direction changes to blue, it is possible to notify the passenger of the prediction that it will change to blue. Thus, the passenger can prepare to start the moving body as soon as the lighting color of the traffic light changes to blue, and can drive smoothly.
  • Further, with the signal recognition apparatus 100 of the present embodiment, when the signal information acquisition unit 101 has not acquired the information on the display state of the traffic light, the determination unit 104 can determine the output form of notification information for notifying the passenger of the moving body to that effect. Therefore, it is possible to convey whether or not the traffic light is being recognized. As a result, the passenger watches the surroundings carefully and can immediately notice when the lighting color of the traffic light changes to blue.
  • Further, with the signal recognition apparatus 100 of the present embodiment, the signal information acquisition unit 101 can determine the display state of the traffic light by acquiring image information of the traffic light. The image information can therefore be acquired by a camera provided in a drive recorder or a navigation device. As a result, the user can obtain information on the display state of the traffic light without installing a new camera, so there is no need to add extra parts and parts costs do not increase.
  • Further, with the signal recognition apparatus 100 of the present embodiment, the notification information whose output form is determined by the determination unit 104 can be notified by at least one of light output, action output, and sound output. By combining light output, action output, and sound output, pseudo-intentions and pseudo-emotions can be conveyed to the passenger. This makes it possible to convey the nuances of the various kinds of notification information, such as foreseeing, awareness, and prompting, in an easy-to-understand manner. It also lets the passenger feel as if a fellow passenger or a pet is sharing the wait at the traffic light, reducing the frustration of waiting and allowing comfortable, enjoyable driving.
  • FIG. 3 is an explanatory view showing an example of the area around the dashboard of a vehicle in which a navigation device according to this embodiment is installed.
  • the navigation apparatus 300 is installed on the dashboard of the vehicle.
  • the navigation device 300 includes a main body M and a display unit (display) D.
  • the display unit D displays the current location of the vehicle, map information, and the current time.
  • the navigation apparatus 300 is connected to an in-vehicle robot 310 installed on the dashboard.
  • the in-vehicle robot 310 includes a camera, a lamp, a microphone, a speaker, and the like (not shown), and performs various outputs according to control of a control signal output from the navigation device 300.
  • The vehicle-mounted robot 310 has, for example, a shape imitating a person or an animal, and includes drive units such as arms and legs on its left and right sides, which perform action output under the control of control signals output from the navigation device 300. Furthermore, the vehicle-mounted robot 310 may include a drive unit that swings its head up, down, left, and right, and a drive unit that turns its body left and right, and these may also perform action output under the control of control signals output from the navigation device 300.
  • The head of the vehicle-mounted robot 310 may function as a camera, which can rotate horizontally and vertically to capture images inside and outside the vehicle.
  • The vehicle-mounted robot 310 may also have a lamp, a microphone, and a speaker on its body, and may emit light and collect and output sound with them.
  • a display unit for displaying images, characters, or the like may be provided, or a function for speaking words may be provided.
  • The vehicle-mounted robot 310 may emit light entirely or partially, and may be lit or blinked in various colors in accordance with the control of control signals output from the navigation device 300. The blinking pattern can also be changed.
  • In this way, the vehicle-mounted robot 310 performs various outputs under the control of control signals output from the navigation device 300, and is thereby configured to show a pseudo-intention or pseudo-emotion to the driver.
  • Specifically, for example, the navigation device 300 determines, according to the information on the display state of the traffic lights and the driver behavior information, the operation program used by the vehicle-mounted robot 310 to indicate a pseudo-intention or pseudo-emotion. The vehicle-mounted robot 310, operating according to the determined operation program, can then notify the passenger of a change in the lighting color of the traffic light.
  • In this way, the vehicle-mounted robot 310 serves as a companion-like presence for the driver, like a child or a pet riding along, waiting together with the driver for the lighting color to change and expressing an intention to notify and a feeling of pleasure according to the driver's behavior. This makes it possible for the driver to notice changes in the lighting color and reduces the frustration caused by waiting at a signal.
  • FIG. 4 is a block diagram showing an example of a hardware configuration of a navigation device according to the present embodiment.
  • The navigation device 300 is mounted on a moving body such as a vehicle and includes a CPU 401, a ROM 402, a RAM 403, a magnetic disk drive 404, a magnetic disk 405, an optical disk drive 406, an optical disk 407, an audio I/F (interface) 408, a microphone 409, a speaker 410, an input device 411, a video I/F 412, a display 413, a communication I/F 414, a GPS unit 415, various sensors 416, a camera 417, and a drive unit 418. The components 401 to 418 are connected by a bus 420.
  • the CPU 401 governs overall control of the navigation device 300.
  • the ROM 402 stores programs such as a boot program, a route search program, a route guidance program, a voice generation program, a map information display program, an operation program, and a signal lighting color prediction program.
  • the RAM 403 is used as a work area for the CPU 401.
  • the route search program searches for an optimum route from the departure point to the destination point using map information or the like recorded on the optical disc 407 described later.
  • the optimal route is the shortest (or fastest) route to the destination or the route that best meets the conditions specified by the user.
  • The guidance route found by executing the route search program is output to the audio I/F 408 and the video I/F 412 via the CPU 401.
  • The route guidance program generates real-time route guidance information based on the guidance route information found by executing the route search program, the current position information of the navigation device 300 acquired via the communication I/F 414, and the map information read from the optical disk 407. The route guidance information generated by executing the route guidance program is output to the audio I/F 408 and the video I/F 412 via the CPU 401.
  • The voice generation program generates tone and voice information corresponding to a pattern; specifically, it sets a virtual sound source corresponding to a guidance point, generates voice guidance information, and outputs it to the audio I/F 408 via the CPU 401.
  • the map information display program determines the display format of the map information displayed on the display 413 by the video I / F 412 and displays the map information on the display 413 according to the determined display format.
  • the operation program drives a drive unit 418, which will be described later, in accordance with the information on the display state of the traffic light and the behavior information of the driver. Details will be described with reference to FIGS. 5 and 6.
  • More specifically, for example, the operation program to be executed is selected so as to cause the drive unit 418 to indicate the pseudo-intention or pseudo-emotion chosen according to the information on the display state of the traffic lights and the driver behavior information.
  • The signal lighting color prediction program calculates prediction information on the display state of the traffic light in the traveling direction of the vehicle based on the display state information of the traffic light to the side of the vehicle. Details will be described with reference to FIGS. 5 and 6; for example, by analyzing images captured by the camera 417 described later, the signal lighting color prediction program predicts the display state information of the traffic light in the traveling direction of the vehicle after a predetermined time.
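  • As a rough illustration of this prediction, the rule below assumes the typical intersection cycle of FIG. 6 (pedestrian light blue, then blinking, then red; side vehicle light blue, then yellow, then red; then the traveling-direction light turns blue). The function name and return values are hypothetical, not part of the disclosed program.

```python
def predict_ahead_turns_blue(side_pedestrian=None, side_vehicle=None):
    """Estimate how close the traveling-direction light is to turning blue.

    side_pedestrian / side_vehicle: (color, blinking) tuples observed for the
    side traffic lights, or None when the corresponding light is not visible.
    """
    if side_vehicle and side_vehicle[0] in ("yellow", "red"):
        return "imminent"       # cross traffic is stopping or has stopped
    if side_pedestrian and side_pedestrian == ("blue", True):
        return "soon"           # pedestrian light already blinking
    return "not yet"

# Example: predict_ahead_turns_blue(side_vehicle=("yellow", False)) -> "imminent"
```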
  • the magnetic disk drive 404 controls reading and writing of data to the magnetic disk 405 according to the control of the CPU 401.
  • the magnetic disk 405 records data written under the control of the magnetic disk drive 404.
  • As the magnetic disk 405, for example, an HD (hard disk) or an FD (flexible disk) can be used.
  • the optical disk drive 406 controls reading and writing of data with respect to the optical disk 407 according to the control of the CPU 401.
  • The optical disk 407 is a removable recording medium from which data is read under the control of the optical disk drive 406. A writable recording medium can also be used as the optical disk 407. Besides an optical disk, the removable recording medium may be an MO, a memory card, or the like.
  • Examples of the recorded information include the map information used for route search and route guidance.
  • The map information includes background data representing features such as buildings, rivers, and the ground surface, and road shape data representing the shape of roads, and is drawn in two or three dimensions on the display screen of the display 413.
  • When the navigation device 300 is providing route guidance, the map information and the current position of the host vehicle acquired by the GPS unit 415 described later are displayed overlapping each other.
  • the map information is recorded on the magnetic disk 405 and the optical disk 407.
  • The map information need not be recorded only on media integrated with the hardware of the navigation device 300; it may be provided outside the navigation device 300. In that case, the navigation device 300 acquires the map information via a network through the communication I/F 414, for example.
  • the acquired map information is stored in the RAM 403 or the like.
  • The audio I/F 408 is connected to the microphone 409 for audio input and the speaker 410 for audio output. Audio received by the microphone 409 is A/D-converted in the audio I/F 408, and audio is output from the speaker 410. The audio input from the microphone 409 can be recorded on the magnetic disk 405 or the optical disk 407 as audio data.
  • The microphone 409 and the speaker 410 may be installed in the drive unit 418 described later (for example, the vehicle-mounted robot 310 in FIG. 3), and may be driven rotationally according to the control of the operation program described above while inputting and outputting audio.
  • examples of the input device 411 include a remote controller, a keyboard, a mouse, a touch panel, and the like provided with a plurality of keys for inputting characters, numerical values, various instructions, and the like.
  • the video I / F 412 is connected to the display 413 and the camera 417.
  • the video I / F 412 includes, for example, a graphic controller that controls the entire display 413, a buffer memory such as VRAM (Video RAM) that temporarily records image information that can be displayed immediately, and a graphic Based on image data output from the controller And a control IC for controlling the display of the display 413.
  • the display 413 displays icons, cursors, menus, windows, or various data such as characters and images.
  • As the display 413, for example, a CRT, a TFT liquid crystal display, a plasma display, or the like can be adopted.
  • The display 413 is installed, for example, in the manner of the display unit D in FIG. 3.
  • The display 413 and the camera 417 may also be installed in the drive unit 418 described later (for example, the vehicle-mounted robot 310 in FIG. 3), and may perform imaging or light output according to the control of the operation program described above.
  • the camera 417 captures an image inside or outside the vehicle.
  • the image may be either a still image or a moving image.
  • the behavior of the passenger inside the vehicle may be imaged by the camera 417, or the display state of the traffic signal outside the vehicle may be imaged.
  • The captured video is output via the video I/F 412 to a recording medium such as the magnetic disk 405 or the optical disk 407, and is also used to acquire the information on the display state of traffic lights and the driver behavior information described later.
  • The communication I/F 414 is wirelessly connected to a network and functions as an interface between the network and the CPU 401.
  • The communication I/F 414 is also wirelessly connected to a communication network such as the Internet and functions as an interface between that communication network and the CPU 401.
  • Communication networks include LANs, WANs, public line networks, and mobile phone networks.
  • The communication I/F 414 is composed of, for example, an FM tuner, a VICS (Vehicle Information and Communication System)/beacon receiver, a wireless navigation device, and other navigation devices, and acquires road traffic information such as traffic regulations. VICS is a registered trademark.
  • The GPS unit 415 calculates information indicating the current position of the navigation device 300 using signals received from GPS satellites and the output values of the various sensors 416 described later (for example, an angular velocity sensor, an acceleration sensor, and a tire rotation count).
  • The information indicating the current position is information that identifies one point on the map information, such as latitude, longitude, and altitude.
  • The GPS unit 415 also uses the output values of the various sensors 416 to output changes in position, speed, and direction, which makes it possible to analyze behavior such as sudden braking and sudden steering.
  • the various sensors 416 are a vehicle speed sensor, an acceleration sensor, an angular velocity sensor, and the like, and their output values are used for calculation of the current position by the GPS unit 415 and measurement of changes in speed and direction.
  • The various sensors 416 also include sensors that detect operations of the vehicle by the driver; for example, they may be configured to detect steering wheel operation, turn-signal input, side brake operation, accelerator pedal depression, or brake pedal depression.
  • The signal information acquisition unit 101, behavior information acquisition unit 102, prediction unit 103, determination unit 104, and notification unit 105 of the signal recognition apparatus 100 shown in FIG. 1 realize their functions in the navigation device 300 shown in FIG. 4 by having the CPU 401 execute predetermined programs, using the programs and data recorded in the ROM 402, RAM 403, magnetic disk 405, optical disk 407, and the like, to control each part of the navigation device 300.
  • That is, by executing the signal recognition program recorded in the ROM 402, which serves as a recording medium in the navigation device 300, the navigation device 300 of the embodiment can carry out the functions of the signal recognition apparatus 100 shown in FIG. 1 according to the signal recognition processing procedure shown in FIG. 2.
  • FIG. 5 is a flowchart showing the contents of processing of the navigation device according to this embodiment.
  • In the flowchart of FIG. 5, the system first waits until the vehicle stops (step S501: No loop), and when the vehicle stops (step S501: Yes), it confirms whether there is a traffic light around the vehicle (step S502).
  • In step S502, whether or not there is a traffic light is checked, for example, by determining whether a traffic light appears in an image of the area around the vehicle captured by the camera mounted on the vehicle-mounted robot 310.
  • Next, in step S503, it is determined whether or not the lighting color of the traffic light in the traveling direction is blue.
  • The traffic light in the traveling direction is the traffic light that is located on the road on which the vehicle is traveling and that the vehicle is to follow. If it is determined in step S503 that the lighting color is blue (step S503: Yes), the process proceeds to step S510 described later. If the lighting color of the traffic light in the traveling direction is not blue (step S503: No), or if it cannot be confirmed, the side traffic lights are monitored (step S504).
  • The side traffic lights, described in detail with reference to FIG. 6, are, for example, the traffic lights other than the one located in the traveling direction of the vehicle when the vehicle stops at an intersection.
  • The system then waits until the blue light of the side pedestrian traffic light blinks (step S505: No loop).
  • In general, the lighting color of a pedestrian traffic light changes in the order of blue, blinking blue, and red.
  • Likewise, the lighting color of a vehicle traffic light changes in the order of blue, yellow, and red. If the side pedestrian traffic light blinks in step S505 (step S505: Yes), the system waits until the side vehicle traffic light turns yellow (step S506: No loop).
  • When the side vehicle traffic light turns yellow in step S506 (step S506: Yes), the vehicle-mounted robot 310 performs the foreseeing action (step S507).
  • the foreseeing action is an action that indicates to the vehicle passenger that the lighting color of the signal in the direction of travel will soon turn blue.
  • Specifically, it is an action that conveys the intention of giving advance notice and a feeling of anticipation of resuming travel, for example an action in which the vehicle-mounted robot 310 appears to be preparing to run.
  • In addition to the action, voice information may be output from a speaker or the like provided in the vehicle-mounted robot 310; the voice information is, for example, a phrase such as "It will soon be blue."
  • Next, in step S508, the traffic light in the traveling direction is monitored. If the lighting color of the traffic light in the traveling direction could not be confirmed at step S503, the process proceeds to step S510 after step S507. The system waits until the traffic light in the traveling direction turns blue (step S509: No loop), and when the lighting color of the traffic light in the traveling direction turns blue (step S509: Yes), the vehicle-mounted robot 310 performs the awareness action (step S510). The awareness action is an action that indicates to the vehicle occupants that the lighting color of the traffic light has changed to blue.
  • Specifically, it is an action that conveys the intention to notify and a feeling of anticipation of resuming travel, for example an action in which the vehicle-mounted robot 310 begins to move.
  • More specifically, a drive unit such as an arm of the vehicle-mounted robot 310 is raised to point straight toward the traffic light, and a drive unit such as the legs is bent slightly and moved up and down slowly.
  • In addition to the action, voice information may be output from a speaker or the like provided in the vehicle-mounted robot 310; the voice information is, for example, a phrase such as "It turned blue."
  • Next, the driver behavior information is acquired (step S511), and based on the behavior information acquired in step S511, it is determined whether or not the vehicle has made a start action (step S512). Specifically, in step S512, whether or not there is a start action is determined from information detected by the GPS unit 415 and the various sensors 416, for example whether the driver has released the side brake that was engaged while the vehicle was stopped, whether the driver has released the brake pedal, whether the driver has depressed the accelerator pedal, whether the driver has operated the steering wheel, whether the vehicle has started to travel, and whether the current position of the vehicle has changed.
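  • A compact sketch of the start-action check in step S512 is shown below (Python). The field names of the sensor readings are hypothetical stand-ins for the values detected by the GPS unit 415 and the various sensors 416.

```python
def has_start_action(readings, min_move_m=1.0):
    """True if any of the start-related operations listed above was detected."""
    return (
        readings.get("side_brake_released", False)
        or readings.get("brake_pedal_released", False)
        or readings.get("accelerator_pressed", False)
        or readings.get("steering_operated", False)
        or readings.get("moved_distance_m", 0.0) >= min_move_m
    )

# Example: has_start_action({"accelerator_pressed": True})  # -> True
```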
  • If there is no start action in step S512 (step S512: No), the process waits until a predetermined time has elapsed (step S513: No loop).
  • The predetermined time is, for example, the time generally required for a driver to perform a start operation after visually confirming that the lighting color of the traffic light in the traveling direction is blue.
  • The predetermined time may also be obtained for each driver by calculating, from a history of past behavior information, an average value or a maximum value excluding outliers.
  • In step S513, it may also be determined from the image around the vehicle captured by the camera whether or not the vehicle is in a state where it can start traveling after the lighting color of the traffic light turns blue. For example, it may be recognized from the image around the vehicle that the road ahead is congested (for example, the vehicle ahead is not moving) even though the lighting color of the traffic light has changed to blue, and that the vehicle therefore cannot start traveling. In this case, since travel may not be able to start even after the predetermined time has elapsed, the restriction of the predetermined time is removed, and when travel does start, the joy action of step S515 described later is performed.
  • When the predetermined time has elapsed in step S513 (step S513: Yes), the vehicle-mounted robot 310 performs the prompting action (step S514).
  • The prompting action is an action that urges the vehicle occupant to start the vehicle, for example a behavior like a child impatiently saying "Let's go." Specifically, a drive unit such as an arm provided on the vehicle-mounted robot 310 is raised and directed toward the driver, and a drive unit such as the legs is bent slightly and moved back and forth. Voice information expressing a pseudo-intention such as "Let's hurry" or "Don't panic" may also be output.
  • On the other hand, if there is a start action in step S512 (step S512: Yes), the vehicle-mounted robot 310 performs the joy action (step S515), and the series of processing ends.
  • The joy action is an action that expresses joy that the vehicle has started. Specifically, a drive unit such as the torso of the vehicle-mounted robot 310 is turned to the right so that it faces the driver, a drive unit such as the head is slowly moved up and down, and the torso is then turned back to the left to its original direction. Voice information such as "Let's go" may also be output. Note that if the lighting color of the traffic light in the traveling direction is blue in step S503 (step S503: Yes), the process proceeds directly to step S510 and the subsequent processing is performed.
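  • Condensing the main path of FIG. 5 into code gives something like the sketch below (Python). The robot, signals, and driver objects and their methods are hypothetical stand-ins for the vehicle-mounted robot 310, the camera-based monitoring, and the driver behavior information, respectively.

```python
def intersection_stop_cycle(robot, signals, driver, wait_limit_s=3.0):
    """Condensed sketch of steps S501-S515 for one stop at a traffic light."""
    if signals.ahead_color() != "blue":                  # S503
        if signals.side_pedestrian_blinking():           # S504-S505
            signals.wait_until_side_vehicle("yellow")    # S506
            robot.foresee()                              # S507: "it will soon be blue"
        signals.wait_until_ahead("blue")                 # S508-S509
    robot.notice()                                       # S510: "it turned blue"
    if driver.started_within(wait_limit_s):              # S511-S513
        robot.rejoice()                                  # S515: express joy at starting
    else:
        robot.prompt()                                   # S514: urge the driver to start
```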
  • In the flowchart of FIG. 5, the present invention is not limited to the processing described above; for example, while a traffic light cannot be confirmed, the vehicle-mounted robot 310 may perform a searching action.
  • The searching action is an action that indicates the intention to convey that the robot is trying to find a traffic light.
  • Specifically, for example, a drive unit such as the head of the vehicle-mounted robot 310 is moved briskly left and right so that the robot appears to be looking around for the traffic light and its surroundings.
  • Voice information such as "I can't see the traffic light" may also be output.
  • Although the side traffic lights are monitored in step S504, the present invention is not limited to this. Specifically, for example, when a side traffic light cannot be monitored from the host vehicle, such as when it is hidden by a vehicle ahead or an oncoming vehicle, the process may proceed to step S508 to monitor the traffic light in the traveling direction.
  • Also, although the traffic light in the traveling direction is confirmed and monitored in step S502 and step S508, the present invention is not limited to this. Specifically, for example, when the traffic light in the traveling direction cannot be confirmed or monitored because of a large vehicle such as a truck ahead, the process may proceed from step S501 to step S504 to monitor the side traffic lights, and then proceed from step S507 to step S510 to perform the awareness action.
  • Further, although the foreseeing action is performed in step S507, the present invention is not limited to this.
  • For example, a first foreseeing action may be performed when the side pedestrian traffic light blinks, a second foreseeing action when the side pedestrian traffic light turns red, a third foreseeing action when the side vehicle traffic light turns yellow, and a fourth foreseeing action when the side vehicle traffic light turns red. In this case, the action may be continued from the first foreseeing action through the fourth foreseeing action, with the movement gradually becoming larger or faster.
  • The actions from the first through the fourth foreseeing action may be set by the passenger, or may be determined from a history of the passenger's past behavior information.
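  • The staged foreseeing described above could be realized, for example, by mapping each side-light event to a stage number and letting the motion grow with that number; the event names and scaling factors below are hypothetical and only illustrate the idea (Python).

```python
FORESEE_STAGES = {
    "side_pedestrian_blinking": 1,
    "side_pedestrian_red": 2,
    "side_vehicle_yellow": 3,
    "side_vehicle_red": 4,
}

def foresee_action_for(event, base_amplitude=0.1, base_speed=1.0):
    """Return motion parameters that grow larger and faster with the stage."""
    stage = FORESEE_STAGES.get(event)
    if stage is None:
        return None
    return {
        "stage": stage,
        "amplitude": base_amplitude * stage,               # movement gradually larger
        "speed": base_speed * (1.0 + 0.25 * (stage - 1)),  # and gradually faster
    }

# Example: foresee_action_for("side_vehicle_yellow") -> stage 3 parameters
```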
  • Further, although the prompting action is performed in step S514, the present invention is not limited to this; the action, the volume, and the amount of light may be increased each time a predetermined time elapses, and the light color and blinking pattern may be changed.
  • Further, although the prompting action is performed in step S514, the present invention is not limited to this. Specifically, for example, when the driver's line of sight is detected and it is determined that the driver is looking at the traffic light, the prompting action may be omitted. Further, in the flowchart of FIG. 5, the lighting color and lighting state of each traffic light may be output as voice information or the like in response to an operation by the passenger; the passenger's operation may be performed via a touch panel or by voice input.
  • FIG. 6 is an explanatory diagram showing the confirmation order of traffic lights at the intersection.
  • FIG. 6 shows the host vehicle 601, the vehicle 602 ahead, the vehicle traffic signal 610 in the traveling direction, the side pedestrian traffic signal 620, and the side vehicle traffic signal 630.
  • The lighting color of the vehicle traffic light 610 in the traveling direction changes in the order of blue 611, yellow 612, and red 613.
  • While it is red, the lighting colors of the side vehicle traffic light 630 and the side pedestrian traffic light 620 are blue.
  • The blue 621 light of the side pedestrian traffic light 620 then blinks and changes to red 622.
  • Next, the lighting color of the side vehicle traffic light 630 changes from blue 631 to yellow 632 and then to red 633.
  • After that, the lighting color of the vehicle traffic light 610 in the traveling direction changes to blue 611.
  • In other words, while the lighting color of the vehicle traffic light 610 in the traveling direction is yellow 612 or red 613, the traffic lights change color in the order of the side pedestrian traffic light 620 and then the side vehicle traffic light 630.
  • Therefore, by acquiring the display state information of the side pedestrian traffic light 620 and the side vehicle traffic light 630, it is possible to predict that the lighting color of the vehicle traffic light 610 in the traveling direction will turn blue.
  • As described above, according to the navigation device 300 of the present embodiment, the determination unit 104 can determine the output form of the notification information to be notified to the vehicle passenger according to the information on the display state of the traffic light acquired by the signal information acquisition unit 101, and the notification information can be notified in the determined output form. Therefore, a change in the lighting color of the traffic light can be recognized and brought to the passenger's attention. As a result, even if the passenger takes their eyes off the traffic light, for example, they can notice when the lighting color changes to blue.
  • In addition, the lighting color of the traffic light located in the traveling direction may be difficult for the passenger to see because of reflected sunlight, and even when the passenger does check the lighting color, the lighting color of the traffic light is confirmed together with the passenger or the change is predicted and announced, so the passenger can feel reassured.
  • Also, when the waiting time until the display state of the traffic light changes is long, the display state of the traffic light is shared with the passenger; by expressing, for example, the feelings generally felt in that situation and including them in the notification information, the effect of reducing the passenger's frustration can be obtained.
  • Further, the determination unit 104 can determine the output form of the notification information to be notified to the passenger of the moving body also according to the driver behavior information. Therefore, when the driver starts the moving body promptly (for example, within a predetermined time), the device can refrain from notifying the passenger that the lighting color of the traffic light has changed. As a result, the passenger can ride comfortably without unnecessary notifications.
  • Further, the prediction unit 103 can calculate prediction information on the display state of the traffic light located in the traveling direction of the moving body from the information on the display state of the traffic light located to the side of the moving body, and the determination unit 104 can determine the output form of the notification information to be notified to the passenger of the moving body according to that prediction information. Therefore, before the lighting color of the traffic light in the traveling direction changes to blue, it is possible to notify the passenger of the prediction that it will change to blue. Thus, the passenger can prepare to start the moving body as soon as the lighting color changes to blue, and can drive smoothly. In addition, even when a traffic light is not visible because of the presence of a large vehicle such as a truck, for example, it is possible to prepare to start smoothly following the movement of the large vehicle.
  • Furthermore, the determination unit 104 can determine the output form of the notification information for the passenger of the moving body according to the acquisition state of the information on the display state of the traffic signal. It is therefore possible to notify the passenger whether or not the traffic light is currently being recognized. As a result, the user watches the surroundings with due caution and can immediately notice when the color of the signal changes to blue.
  • The signal information acquisition unit 101 can determine the display state of the traffic signal by acquiring image information of the traffic signal. The image information can be acquired by a camera already provided in a drive recorder or a navigation device, so the user can obtain information on the display state of the traffic light without installing a new camera; no extra parts are needed and parts costs can be kept down (see the image-classification sketch at the end of this description).
  • The notification information in the output form determined by the determination unit 104 can be notified by at least one of light output, motion output, and sound output. By combining light output, motion output, and sound output, pseudo-intention and emotion can be conveyed to the passenger, so that advance notice is easy to notice and the nuances of various notification information, such as prompts, can be conveyed in an easy-to-understand manner (see the output sketch at the end of this description). This also lets the passenger feel as if another passenger or a pet were present, reducing the frustration of waiting at traffic lights and allowing enjoyable, proper driving.
  • As described above, the navigation device 300 recognizes the traffic signal from images captured by the camera 417 mounted on the vehicle, and can notify the vehicle occupant of a change in the traffic signal located in the traveling direction by at least one of light output, motion output, and sound output from the drive unit. The passenger can thus share emotions with the drive unit and, when the lighting color of the traffic light changes, can easily notice the change through the notification, while the frustration caused by waiting at traffic lights is reduced and the passenger can drive enjoyably and properly.
  • The signal recognition method described in the present embodiment can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation. The program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, or a DVD, and is executed by the computer reading it from the recording medium. The program may also be a transmission medium that can be distributed through a network such as the Internet.
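  • As an illustrative supplement to the program mentioned above, the following Python sketches show how individual pieces of such a program might look. Every class, function, threshold, and colour range in these sketches is an assumption introduced for illustration only and is not defined by the embodiment. The first sketch shows a prediction of the kind described for the signals 610, 620, and 630: once the side pedestrian light 620 shows red and the side vehicle light 630 shows yellow or red, the vehicle traffic signal 610 in the traveling direction is assumed to turn blue next.

    from dataclasses import dataclass

    @dataclass
    class SideSignalStates:
        """Hypothetical snapshot of the side signals' display states."""
        pedestrian: str  # "blue", "flashing", or "red"
        vehicle: str     # "blue", "yellow", or "red"

    def predict_ahead_turns_blue(states: SideSignalStates) -> bool:
        """Predict that the traveling-direction signal is about to turn blue,
        assuming the usual phase order of the intersection."""
        return states.pedestrian == "red" and states.vehicle in ("yellow", "red")

    # Example: side pedestrian light already red, side vehicle light now yellow.
    print(predict_ahead_turns_blue(SideSignalStates("red", "yellow")))  # True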
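  • The next sketch illustrates the behaviour-dependent determination of the output form: when the driver starts the moving body promptly (within an assumed predetermined time) after the light turns blue, no notification is issued; otherwise a reminder is produced. The threshold value and the function name are illustrative assumptions.

    from typing import Optional

    PROMPT_START_SECONDS = 3.0  # assumed "predetermined time" for a prompt start

    def decide_output_form(ahead_is_blue: bool, seconds_since_blue: float,
                           vehicle_started: bool) -> Optional[str]:
        """Return an output form such as "sound", or None to stay silent."""
        if not ahead_is_blue:
            return None
        if vehicle_started and seconds_since_blue <= PROMPT_START_SECONDS:
            return None   # prompt start: suppress the unnecessary notification
        return "sound"    # e.g. a spoken reminder that the light has changed

    print(decide_output_form(True, 1.2, True))   # None (driver already moving)
    print(decide_output_form(True, 5.0, False))  # sound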
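  • Determining the display state from the image information of the traffic signal could, for example, be approximated by colour thresholding. The sketch below uses OpenCV, assumes that the traffic-light region has already been cropped from the camera frame, and uses rough HSV ranges that would need tuning for a real camera.

    import cv2
    import numpy as np

    # Rough HSV ranges per lamp colour (illustrative values only).
    HSV_RANGES = {
        "red":    [((0, 100, 100), (10, 255, 255)), ((170, 100, 100), (180, 255, 255))],
        "yellow": [((20, 100, 100), (35, 255, 255))],
        "blue":   [((45, 100, 100), (95, 255, 255))],  # the green lamp, called blue here
    }

    def classify_lamp(bgr_roi: np.ndarray) -> str:
        """Return the lamp colour with the most matching pixels, or "unknown"."""
        hsv = cv2.cvtColor(bgr_roi, cv2.COLOR_BGR2HSV)
        counts = {}
        for colour, ranges in HSV_RANGES.items():
            mask = np.zeros(hsv.shape[:2], dtype=np.uint8)
            for lo, hi in ranges:
                mask |= cv2.inRange(hsv, np.array(lo), np.array(hi))
            counts[colour] = int(cv2.countNonZero(mask))
        best = max(counts, key=counts.get)
        return best if counts[best] > 0 else "unknown"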
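  • Finally, a notification in the determined output form could be dispatched over any combination of light output, motion output, and sound output; the three branches below are placeholders for the drive unit's actual light, actuator, and speaker interfaces.

    from typing import Iterable

    def notify(message: str, forms: Iterable[str]) -> None:
        """Send one notification over each requested output form."""
        for form in forms:
            if form == "light":
                print(f"[LIGHT] blink to draw attention: {message}")
            elif form == "motion":
                print(f"[MOTION] nod toward the windshield: {message}")
            elif form == "sound":
                print(f"[SOUND] speak: {message}")

    notify("The signal ahead is about to turn blue.", ("light", "sound"))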

Landscapes

  • Engineering & Computer Science (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

According to the invention, a prediction section (103) predicts the display state of the signal in the traveling direction on the basis of signal display state information acquired by a signal information acquisition section (101). A behavior information acquisition section (102) acquires the behavior of the driver. On the basis of the predicted signal display state information and the driver behavior information, a determination section (104) determines the output form of the information to be notified to the driver of a mobile body, and a notification section (105) issues the notification information in the determined output form.
PCT/JP2006/319342 2006-09-28 2006-09-28 Dispositif de reconnaissance de signal, procédé de reconnaissance de signal, programme de reconnaissance de signal, et support d'enregistrement WO2008038376A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2006/319342 WO2008038376A1 (fr) 2006-09-28 2006-09-28 Dispositif de reconnaissance de signal, procédé de reconnaissance de signal, programme de reconnaissance de signal, et support d'enregistrement
JP2008536257A JP4926182B2 (ja) 2006-09-28 2006-09-28 信号認識装置、信号認識方法、信号認識プログラム、および記録媒体

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2006/319342 WO2008038376A1 (fr) 2006-09-28 2006-09-28 Dispositif de reconnaissance de signal, procédé de reconnaissance de signal, programme de reconnaissance de signal, et support d'enregistrement

Publications (1)

Publication Number Publication Date
WO2008038376A1 true WO2008038376A1 (fr) 2008-04-03

Family

ID=39229824

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/319342 WO2008038376A1 (fr) 2006-09-28 2006-09-28 Dispositif de reconnaissance de signal, procédé de reconnaissance de signal, programme de reconnaissance de signal, et support d'enregistrement

Country Status (2)

Country Link
JP (1) JP4926182B2 (fr)
WO (1) WO2008038376A1 (fr)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102252913B1 (ko) * 2015-12-11 2021-05-18 현대자동차주식회사 신호등 인식 시스템 및 그 방법

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09274700A (ja) * 1996-04-08 1997-10-21 Toyota Motor Corp 車両誘導制御装置
JP2006120137A (ja) * 2001-02-19 2006-05-11 Hitachi Kokusai Electric Inc 画像情報通報システム
JP2002331890A (ja) * 2001-05-10 2002-11-19 Toyota Motor Corp 乗物の推奨操作表現システム
JP2004205389A (ja) * 2002-12-26 2004-07-22 Alpine Electronics Inc ナビゲーション装置、信号待ち回数予測方法、所要時間予測方法、信号待ち回数予測プログラム、及び所要時間予測プログラム
JP2005316889A (ja) * 2004-04-30 2005-11-10 Fujitsu Ten Ltd 運転支援装置
JP2006048624A (ja) * 2004-07-09 2006-02-16 Aisin Aw Co Ltd 信号情報作成方法、信号案内情報提供方法及びナビゲーション装置
JP2006155319A (ja) * 2004-11-30 2006-06-15 Equos Research Co Ltd 走行支援装置
JP2006224754A (ja) * 2005-02-16 2006-08-31 Denso Corp 運転支援装置

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016207065A (ja) * 2015-04-27 2016-12-08 住友電気工業株式会社 運転支援装置及び運転支援方法
CN108349219A (zh) * 2016-08-05 2018-07-31 法国圣戈班玻璃厂 具有显示装置的复合玻璃板
JP2019530604A (ja) * 2016-08-05 2019-10-24 サン−ゴバン グラス フランス 表示装置を有している複合ペイン
JP2021120270A (ja) * 2016-08-05 2021-08-19 サン−ゴバン グラス フランス 表示装置を有している複合ペイン
CN108349219B (zh) * 2016-08-05 2021-11-30 法国圣戈班玻璃厂 具有显示装置的复合玻璃板
US11220090B2 (en) 2016-08-05 2022-01-11 Saint-Gobain Glass France Composite pane with a display device
JP7202412B2 (ja) 2016-08-05 2023-01-11 サン-ゴバン グラス フランス 表示装置を有している複合ペイン
CN107264394A (zh) * 2017-05-19 2017-10-20 上海集成电路研发中心有限公司 一种智能识别前方车辆灯光的系统及其识别方法
JP2020021400A (ja) * 2018-08-03 2020-02-06 パイオニア株式会社 情報処理装置
JP2020179844A (ja) * 2020-06-23 2020-11-05 株式会社ユピテル 運転支援システムおよび運転支援プログラム
JP6998619B2 (ja) 2020-06-23 2022-01-18 株式会社ユピテル 運転支援システムおよび運転支援プログラム

Also Published As

Publication number Publication date
JP4926182B2 (ja) 2012-05-09
JPWO2008038376A1 (ja) 2010-01-28

Similar Documents

Publication Publication Date Title
EP3232289B1 (fr) Methode de commande de présentation d'informations et véhicule autonome
US9723243B2 (en) User interface method for terminal for vehicle and apparatus thereof
CN108240819B (zh) 驾驶辅助装置和驾驶辅助方法
CN110050301A (zh) 车辆控制装置
JP2023058521A (ja) 車載装置
JP6604577B2 (ja) 運転支援方法およびそれを利用した運転支援装置、運転支援システム、自動運転制御装置、車両、プログラム
US11460309B2 (en) Control apparatus, control method, and storage medium storing program
JP4926182B2 (ja) 信号認識装置、信号認識方法、信号認識プログラム、および記録媒体
JP7119846B2 (ja) 車両の走行制御方法及び走行制御装置
JP4790020B2 (ja) 疑似感情出力装置、疑似感情出力方法、疑似感情出力プログラムおよびコンピュータに読み取り可能な記録媒体
JPWO2008038375A1 (ja) 情報処理装置、情報処理方法、情報処理プログラムおよびコンピュータに読み取り可能な記録媒体
WO2020065892A1 (fr) Procédé et dispositif de commande de déplacement pour véhicule
CN114207685B (zh) 自主车辆交互系统
CN115384545A (zh) 控制方法和装置
JP7469359B2 (ja) 交通安全支援システム
JP7469358B2 (ja) 交通安全支援システム
JP7543082B2 (ja) 情報提示装置、情報提示方法及び情報提示プログラム
US20230311922A1 (en) Traffic safety support system
JP2024052612A (ja) 交通安全支援システム及びコンピュータプログラム
CN116895182A (zh) 交通安全辅助系统
CN116895184A (zh) 交通安全辅助系统
JP2022118053A (ja) 出力装置
JP2023045586A (ja) 周囲監視装置、周囲監視方法、および周囲監視プログラム
CN116895176A (zh) 交通安全辅助系统
CN116895179A (zh) 交通安全辅助系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 06810789

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2008536257

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06810789

Country of ref document: EP

Kind code of ref document: A1