US20170120798A1 - Apparatus and method to visually communicate with a vehicle - Google Patents
- Publication number
- US20170120798A1 (application Ser. No. 14/930,365)
- Authority
- US
- United States
- Prior art keywords
- user
- vehicle
- visual
- communication apparatus
- gestures
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
- B60Q1/0017—Devices integrating an element dedicated to another function
- B60Q1/0023—Devices integrating an element dedicated to another function the element being a sensor, e.g. distance sensor, camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
- B60Q1/02—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
- B60Q1/22—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments for reverse drive
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
- B60Q1/26—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
- B60Q1/34—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating change of drive direction
- B60Q1/346—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating change of drive direction with automatic actuation
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
- B60Q1/26—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
- B60Q1/50—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
- B60Q1/543—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking for indicating other states or conditions of the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R25/00—Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
- B60R25/20—Means to switch the anti-theft system on or off
- B60R25/2054—Means to switch the anti-theft system on or off by foot gestures
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R25/00—Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
- B60R25/20—Means to switch the anti-theft system on or off
- B60R25/24—Means to switch the anti-theft system on or off using electronic identifiers containing a code not memorised by the user
- B60R25/246—Means to switch the anti-theft system on or off using electronic identifiers containing a code not memorised by the user characterised by the challenge triggering
-
- G07C9/00007—
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/00174—Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys
- G07C9/00309—Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys operated with bidirectional data transmission between data carrier and locks
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/00174—Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys
- G07C9/00658—Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys operated by passive electrical keys
- G07C9/00674—Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys operated by passive electrical keys with switch-buttons
- G07C9/00698—Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys operated by passive electrical keys with switch-buttons actuated in function of displayed informations
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/20—Individual registration on entry or exit involving the use of a pass
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q2400/00—Special features or arrangements of exterior signal lamps for vehicles
- B60Q2400/40—Welcome lights, i.e. specific or existing exterior lamps to assist leaving or approaching the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q2400/00—Special features or arrangements of exterior signal lamps for vehicles
- B60Q2400/50—Projected symbol or information, e.g. onto the road or car body
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/00174—Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys
- G07C2009/00968—Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys shape of the data carrier
- G07C2009/00984—Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys shape of the data carrier fob
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C2209/00—Indexing scheme relating to groups G07C9/00 - G07C9/38
- G07C2209/14—With a sequence of inputs of different identification information
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C2209/00—Indexing scheme relating to groups G07C9/00 - G07C9/38
- G07C2209/60—Indexing scheme relating to groups G07C9/00174 - G07C9/00944
- G07C2209/62—Comprising means for indicating the status of the lock
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C2209/00—Indexing scheme relating to groups G07C9/00 - G07C9/38
- G07C2209/60—Indexing scheme relating to groups G07C9/00174 - G07C9/00944
- G07C2209/63—Comprising locating means for detecting the position of the data carrier, i.e. within the vehicle or within a certain distance from the vehicle
- G07C2209/64—Comprising locating means for detecting the position of the data carrier, i.e. within the vehicle or within a certain distance from the vehicle using a proximity sensor
Definitions
- the present disclosure relates to an apparatus and method to visually communicate with a vehicle. More precisely, embodiments give the possibility to request commands for the vehicle as well as to obtain information about the vehicle.
- the object of the present disclosure is to provide an apparatus and a method to visually communicate with a vehicle which overcomes the above-mentioned limitations.
- the apparatus ensures better communication by providing two-way communication between the user and the vehicle through visual indications projected on a ground surface. These visual indications provide a way for the user to input commands for the vehicle (e.g., opening/closing door, turning on/off light) as well as for the vehicle to display information about the conditions of the vehicle to the user or other person near the vehicle (e.g., door opened/closed, light on/off, fuel/battery level, vehicle is backing up).
- the visual indications act in a similar way as the keys of a computer keyboard on which the user performs a gesture (e.g., stepping, foot tapping, heel raising, standing still) to activate a specific command symbolized by a corresponding visual indication. Once a corresponding visual indication has been activated, the vehicle knows unmistakably and precisely which command is requested by the user.
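The "keyboard key" analogy above can be sketched as a lookup from a (projected indication, recognized gesture) pair to a vehicle command. The indication names, gesture names, and command strings below are illustrative assumptions, not identifiers from the disclosure.

```python
# Hypothetical mapping of (indication, gesture) pairs to vehicle commands.
COMMAND_MAP = {
    ("open_door_symbol", "foot_tap"): "OPEN_DOOR",
    ("close_door_symbol", "foot_tap"): "CLOSE_DOOR",
    ("light_symbol", "step"): "TOGGLE_LIGHTS",
}

def resolve_command(indication, gesture):
    """Return the command activated by a gesture on a visual indication,
    or None if the combination is not a recognized activation."""
    return COMMAND_MAP.get((indication, gesture))
```

Because each indication acts like a dedicated key, an unrecognized combination simply resolves to no command rather than being misinterpreted.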
- the visual indications act similarly as the screen of a computer by displaying information about the conditions of the vehicle in a clear and understandable way for the user.
- the visual indications have the ability to follow/track the position of the user and to always be placed at a convenient location for the user.
- a visual communication apparatus of a vehicle includes a smart key system that identifies the user by detecting a key fob of a user within a keyless detection zone, a projector that projects visual indications on a projecting zone on a ground surface, a sensor that optically captures gestures of a user, and a control unit that actuates important elements of the vehicle in response to the gestures.
- a method to communicate with a vehicle through a visual communication apparatus includes a projector that projects visual indications on a ground surface, and a sensor that optically captures gestures of a user.
- the method to communicate with a vehicle through a visual communication apparatus includes detecting the presence of a key fob of the user inside a keyless detection zone through a smart key system, detecting the presence of the user inside the projecting zone, computing a location and a normal direction of the user inside the projecting zone, projecting visual indications at a predetermined distance from the location of the user and at a predetermined direction from the normal of the user, detecting gestures of the user in response to the visual indications, actuating key elements of the vehicle through an electrical control unit in response to the gestures, and indicating to the user that the key elements are being executed.
- FIG. 1 is a perspective view of a visual communication apparatus affixed to a vehicle, according to certain aspects of the disclosure.
- FIG. 2 is a top view of the visual communication apparatus affixed to the vehicle, according to certain aspects of the disclosure.
- FIG. 3A is a schematic view of a hardware diagram of the visual communication apparatus in an integrated model, according to certain aspects of the disclosure.
- FIG. 3B is a schematic view of a hardware diagram of the visual communication apparatus in a separated model, according to certain aspects of the disclosure.
- FIG. 4 is a top view of the visual communication apparatus projecting active visual indications to a user, according to certain aspects of the disclosure.
- FIG. 5 is a top view of the visual communication apparatus projecting passive visual indications to the user and other persons, according to certain aspects of the disclosure.
- FIG. 6 is a flow chart of a method for the user to communicate with the vehicle through the visual communication apparatus, according to certain aspects of the disclosure.
- FIGS. 1-2 are a perspective view and a top view of a visual communication apparatus 100, respectively, according to certain aspects of the disclosure.
- FIGS. 3A-3B are schematic views of hardware diagrams of the visual communication apparatus in an integrated model and in a separated model, respectively, according to certain aspects of the disclosure.
- the visual communication apparatus establishes a visual communication between the vehicle 300 and a user 200 having a key fob 202 through a projection module 101 .
- the projection module 101 may be mounted on a backside 302 of the vehicle 300 for power back door (PBD) vehicles (e.g., vehicles having a motorized trunk door 302 a, as illustrated in FIGS. 1-2 ), or mounted on a side of the vehicle 300 for power slide door (PSD) vehicles (e.g., vehicles having a motorized door on a side).
- the projection module 101 is linked to main circuitry 400 including different control elements of the vehicle 300 .
- these different control elements may be a smart ECU 410 controlling a smart key system 306 that identifies the key fob 202 of the user 200 within a keyless detection zone 208 , or a body ECU 440 with a PBD/PSD ECU 430 that operates the PBD and/or the PSD via a PBD/PSD actuator 320 .
- the smart key system 306 may include an antenna 308 a affixed to the vehicle 300 , a Low Frequency (LF) receiver 210 a and a Radio Frequency (RF) transmitter 210 b both integrated into the key fob 202 , and a RF receiver 308 b affixed to the vehicle 300 .
- the antenna 308 a broadcasts a first group of radio waves, e.g. 120-135 kHz, over the keyless detection zone 208 , wherein the keyless detection zone 208 may be within a predetermined distance d from the antenna 308 a.
- the LF receiver 210 a receives the first group of radio waves broadcasted by the antenna 308 a and the RF transmitter 210 b broadcasts back to the vehicle 300 a second group of radio waves, e.g. 300-400 MHz. Then, the second group of radio waves is received by the RF receiver 308 b.
- a smart ECU 410 of the main circuitry 400 controls the broadcasting of the first group of radio waves and detects the reception of the second group of radio waves in order to identify the presence of the key fob 202 , and of the user 200 , inside the keyless detection zone 208 .
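The LF challenge and RF answer described above can be sketched as a minimal challenge-response exchange: the fob only hears the LF challenge when it is inside the detection zone, and a valid RF response therefore implies presence. The hashing scheme and shared secret are illustrative assumptions; real smart key systems use dedicated cryptographic protocols.

```python
import hashlib

SHARED_SECRET = b"fob-secret"  # hypothetical secret paired between fob and smart ECU

def fob_response(challenge: bytes, in_zone: bool):
    """Key fob side: the LF receiver only picks up the challenge inside
    the keyless detection zone; outside, no RF response is transmitted."""
    if not in_zone:
        return None
    return hashlib.sha256(SHARED_SECRET + challenge).digest()

def fob_detected(challenge: bytes, response) -> bool:
    """Smart ECU side: a correct response received by the RF receiver
    indicates that the paired fob is present in the zone."""
    expected = hashlib.sha256(SHARED_SECRET + challenge).digest()
    return response == expected
```

In this sketch the detection decision collapses to a single comparison, matching the role the smart ECU 410 plays in identifying the key fob 202.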
- the smart key system 306 may be in communication with the body ECU 440 , or other elements of the main circuitry 400 , via a bus such as a Controller Area Network (CAN) bus or Local Interconnect Network (LIN) bus.
- the visual communication apparatus 100 may also include a steering wheel angle sensor 310 placed on a steering wheel 311 of the vehicle 300 .
- the steering wheel angle sensor 310 measures a steering angle to estimate a direction in which the vehicle 300 is displaced.
- the steering wheel angle sensor 310 may rely on electromagnetic sensors or potentiometers to detect the steering angle.
- the projection module 101 includes a projector 102 projecting visual indications 106 (e.g., symbols or text messages) on a projecting zone 108 , and a sensor 104 optically capturing important information about the user 200 .
- the projection module 101 includes circuitry 101 a having a detection circuit 105 to operate the sensor 104 and a projection circuit 103 to operate the projector 102 .
- the main information about the user 200 may include the presence of the user 200 inside the projecting zone 108 , a location X of the user 200 inside the projecting zone 108 , a normal direction N of the user 200 , and gestures of the user 200 (e.g., stepping, foot tapping, heel raising or standing still).
- the optically captured main information about the user 200 is analyzed and converted into specific commands for the vehicle 300 by software instructions executed by the circuitry 101 a including the detection circuit 105 and the projection circuit 103 .
- the specific commands may include opening/closing the back door or the side door of the vehicle 300 .
- such a command may be performed via a control load sent from the circuitry 101 a to the PBD/PSD ECU 430 of the main circuitry 400 .
- Such an analysis may be performed through optical detections using the orientation of the light reflected, through digital image processing using tools such as color intensity differences, image segmentations, edge detections, or through any technique known by someone having ordinary skill in the art.
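One of the image-processing tools named above, color intensity differences, can be sketched as comparing two grayscale frames pixel by pixel and flagging positions whose change exceeds a threshold. The frame representation (lists of pixel rows) and the threshold value are illustrative assumptions.

```python
def changed_pixels(prev, curr, threshold=30):
    """Compare two same-size grayscale frames (lists of rows of 0-255
    intensities) and return the (row, col) positions where the intensity
    changed by more than the threshold, i.e. where motion occurred."""
    hits = []
    for r, (row_p, row_c) in enumerate(zip(prev, curr)):
        for c, (p, q) in enumerate(zip(row_p, row_c)):
            if abs(p - q) > threshold:
                hits.append((r, c))
    return hits
```

Clusters of changed pixels in the projecting zone would then be interpreted as the user's foot moving over an indication; segmentation and edge detection would refine that region, per the techniques the passage lists.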
- the projecting zone 108 may lie on a ground surface adjacent to the vehicle 300 and have different shapes such as a square, a triangle with an apex below the visual communication apparatus 100 , or a slice of a disc with a center below the visual communication apparatus 100 .
- the projecting zone 108 may be included inside the keyless detection zone 208 .
- the projector 102 may be any kind of light emitting device capable of projecting symbols and text messages visible to the naked eye and with a resolution and contrast sufficiently high to be viewed or read by the user 200 without any difficulty and independently of the luminosity surrounding the vehicle 300 .
- the sensor 104 may be any kind of light capturing device capable of detecting light reflected on the projecting zone 108 and the user 200 with sufficient precision and speed to be analyzed by the circuitry 101 a and provide the main information about the user 200 .
- the projector 102 may include a first laser diode 120 projecting symbols and text visible to the naked eye and may also include a second laser diode 122 projecting a plurality of invisible beams of light 122 a on the projecting zone 108 .
- the user 200 on the projecting zone 108 breaks some of the invisible beams of light 122 a and reflects light back to the sensor 104 in the form of reflected infrared beams.
- the sensor 104 detects and captures images of the reflected infrared beams.
- the captured images of the reflected infrared beams are then analyzed via software instructions to obtain the main information about the user 200 such as the presence of the user 200 in the projecting zone 108 , the location X of the user 200 , the normal direction N of the user 200 , and the gestures of the user 200 .
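The beam-break detection above can be sketched as follows: each invisible beam lands at a known spot on the projecting zone, a user standing there interrupts it, and averaging the interrupted beams' spots gives a rough estimate of the location X. The beam coordinates are illustrative assumptions.

```python
def estimate_location(beam_spots, broken):
    """beam_spots: list of (x, y) ground positions where the invisible
    beams land; broken: list of bools, one per beam, True if the beam's
    reflection indicates interruption by the user. Returns the centroid
    of the interrupted beams as an estimate of X, or None if no beam is
    broken (user not present in the projecting zone)."""
    pts = [p for p, b in zip(beam_spots, broken) if b]
    if not pts:
        return None
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)
```

With a dense grid of beams, the shape of the broken subset could additionally suggest the user's facing direction (the normal N), which the passage attributes to further software analysis.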
- the projector 102 may rely on Light-Emitting Diodes (LED) technology, Digital Light Processing (DLP) technology, liquid-crystal display (LCD) technology, liquid crystal on silicon technology or any other technologies known by a person having ordinary skill in the art.
- the projection module 101 may include an analog or digital camera 104 a for detecting the gestures of the user 200 in the projecting zone 108 in the form of images and/or videos.
- the camera 104 a may include an electronic image pickup device (e.g., photo electric elements) and an optical system having an optical lens with a variable diaphragm and a shutter to control the amount of light entering the image pickup device.
- the gestures of the user 200 are obtained by analyzing images and/or videos of the user 200 , captured by the camera 104 a. This analysis may be performed by the circuitry 101 a via image processing tools such as color intensity differences, image segmentations, edge detections.
- the camera 104 a may be directly integrated into the projection module 101 , see FIG. 3A , or separated from the projection module 101 and linked to the main circuitry 400 via a navigation ECU 420 of the vehicle 300 , see FIG. 3B .
- the images and/or video captured by the camera 104 a may be sent to the circuitry 101 a via standard video protocols such as low voltage differential system (LVDS) protocol and/or national television system committee (NTSC) protocol.
- FIG. 4 is a top view of the visual communication apparatus 100 projecting active visual indications 106 a to the user 200 , according to certain aspects of the disclosure.
- the visual indications 106 may include the active visual indications 106 a requiring an input from the user 200 .
- the active visual indications 106 a may be symbols (e.g., V-shaped lines, rectangles, squares, circles, arrows) or text placed at a predetermined distance D from the location X and at a predetermined direction U from the normal direction N of the user 200 , such as in front of or to the side of the user 200 , as illustrated in FIG. 4 .
- the input from the user 200 is given by the gestures of the user 200 on the active visual indications 106 a.
- the gestures of the user 200 are performed on the active visual indications 106 a to request the vehicle 300 to perform specific commands (e.g., turning on/off lights of the vehicle 300 , opening/closing a door of the vehicle 300 , as illustrated in FIG. 4 ).
- the active visual indications 106 a can be modified (e.g., changing the color, shape or text of the active visual indications 106 a ) to indicate to the user 200 that the gestures have been recognized and the commands are being executed by the vehicle 300 .
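The placement rule above, a predetermined distance D from the location X along a direction derived from the normal N, can be sketched as a small 2D rotation: rotate the user's facing direction by an offset (0 for "in front", a quarter turn for "to the side") and step D along the result. The coordinate conventions and offset values are illustrative assumptions.

```python
import math

def placement(X, N, D, offset_rad=0.0):
    """X: (x, y) user location on the ground; N: unit vector of the
    user's normal (facing) direction; D: predetermined distance.
    Returns the ground point where the indication should be projected."""
    nx, ny = N
    c, s = math.cos(offset_rad), math.sin(offset_rad)
    # rotate N by offset_rad to get the predetermined direction U ...
    ux, uy = nx * c - ny * s, nx * s + ny * c
    # ... then step D along U from the user's location X
    return (X[0] + D * ux, X[1] + D * uy)
```

Recomputing this point as X and N are re-tracked is what lets the indications follow the user and remain at a convenient spot.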
- FIG. 5 is a top view of the visual communication apparatus 100 projecting passive visual indications 106 b to the user 200 and other persons, according to certain aspects of the disclosure.
- the visual indications 106 may also include passive visual indications 106 b requiring no input from the user 200 .
- the passive visual indications 106 b may be symbols (e.g., arrows, gauge indicators, warning signals) or text providing information about the vehicle 300 (e.g., vehicle moving in the direction of the arrows, low fuel, low battery, low oil pressure, or low tire pressure) to the user 200 and/or other persons.
- the passive visual indications 106 b may be placed at the predetermined distance D from the location X and at the predetermined direction U from the normal N when the user 200 is present in the projecting zone 108 , as illustrated in FIG. 4 , or centered at a second predetermined location Y inside the projecting zone 108 when the user 200 is not present in the projecting zone 108 , as illustrated in FIG. 5 .
- the passive visual indication 106 b may be used to provide information on the motion of the vehicle 300 as soon as the vehicle 300 is ready to be put in motion (e.g., a reverse gear is engaged or a parking brake is released by the user 200 ).
- the passive visual indication 106 b indicating the motion of the vehicle 300 may be at least one arrow pointing in the direction of the motion of the vehicle 300 .
- the direction of the motion of the vehicle 300 may be detected by the steering wheel angle sensor 310 and used to orient the at least one arrow projected on the projecting zone 108 , as illustrated in FIG. 5 .
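Orienting the projected arrow from the steering wheel angle can be sketched as follows: convert the steering wheel angle to a road-wheel angle via the steering ratio and rotate the vehicle's travel axis accordingly (the reverse axis when backing up). The steering ratio and sign conventions are illustrative assumptions; real values depend on the vehicle.

```python
import math

STEERING_RATIO = 15.0  # hypothetical steering-wheel-to-road-wheel ratio

def arrow_direction(steering_deg, reverse=True):
    """Return a unit vector (x lateral, y longitudinal) for the arrow
    projected on the ground, given the steering wheel angle in degrees."""
    wheel = math.radians(steering_deg / STEERING_RATIO)
    base = math.pi if reverse else 0.0  # reverse points along -y
    ang = base + wheel
    return (math.sin(ang), math.cos(ang))
```

With the wheel centered in reverse, the arrow points straight back; turning the wheel sweeps the arrow toward the side the vehicle will actually move, which is the cue FIG. 5 projects for bystanders.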
- FIG. 6 is a flow chart of a method to communicate with the vehicle 300 through the visual communication apparatus 100 , according to certain aspects of the disclosure.
- in a step S 400 , it is determined whether the key fob 202 is present inside the keyless detection zone 208 through the smart key system 306 .
- the antenna 308 a may broadcast the first group of radio waves and the RF receiver 308 b may detect the second group of radio waves emitted back from the RF transmitter 210 b, after the first group of radio waves has been received by the LF receiver 210 a. If it is determined that the key fob 202 is present inside the keyless detection zone 208 , the process goes to a step S 402 . Otherwise, the process ends.
- in the step S 402 , it is determined whether the user 200 is present inside the projecting zone 108 through the projector 102 .
- the presence detection of the user 200 inside the projecting zone 108 may be performed by detecting through the sensor 104 the reflection of infrared beams on the user 200 , wherein the infrared beams are initially generated by the second laser diode 122 .
- the presence detection of the user 200 inside the projecting zone 108 may also be performed by analyzing digital images of the projecting zone 108 captured by the camera 104 a, using image processing tools such as color intensity differences, image segmentations, edge detections, or any other image processing tools known by a person having ordinary skill in the art.
- the location X and the normal direction N of the user 200 (as illustrated in FIG. 2 ) inside the projecting zone 108 are computed.
- the computation of the location X can be performed by analyzing the light reflected on the projecting zone 108 and the user 200 and captured by the sensor 104 via software instructions executed by the circuitry 101 a.
- the visual indications 106 (i.e., the active visual indications 106 a and/or the passive visual indications 106 b ) are projected by the projector 102 at the predetermined distance from the location X and at the predetermined direction U from the normal direction N of the user 200 (e.g., the front or sides).
- the passive visual indications 106 b may be symbols (e.g., arrows, gauge indicators, warning signals) or text providing information about the vehicle 300 (e.g., “vehicle moving in the direction of the arrows”, “low fuel”, “low battery”, “low oil pressure”, or “low tire pressure”) to the user 200 and/or other persons and requiring no activation from the user 200 .
- the active visual indications 106 a may be symbols (e.g., V-shaped lines, rectangles, squares, circles, arrows) or text requiring activation from the user 200 to perform specific commands (e.g., turning on/off lights of the vehicle 300 , opening/closing a door of the vehicle 300 ) while the passive visual indications 106 b may be symbols or text providing information about the vehicle 300 (e.g., “vehicle moving in the arrows direction”, “low fuel”, “low battery”, “low oil pressure” or “low tire pressure”) without any activation from the user 200 .
- gestures (e.g., stepping, foot tapping, heel raising or standing still) of the user 200 on the active visual indications 106 a are detected and identified by the sensor 104 via software instructions executed by the circuitry 101 a.
- in a step S 410 , once the gestures of the user 200 on the active visual indications 106 a have been detected and identified, the circuitry 101 a sends a signal to the main circuitry 400 to actuate elements of the vehicle 300 (e.g., light switches or door locks) to perform the specific commands (e.g., turning on/off lights of the vehicle 300 , opening/closing a door of the vehicle 300 ) represented by the active visual indications 106 a.
- the circuitry 101 a can send a signal to the PBD/PSD ECU 430 of the main circuitry 400 to actuate the PBD/PSD actuator 320 in order to operate the back door and/or the slide door of the vehicle 300 .
- the active visual indications 106 a are modified to indicate to the user 200 that the gestures have been recognized and the commands are being executed by the vehicle 300 .
- the color, the shape or the text of the active visual indications 106 a can be modified to let the user 200 know that the commands are being executed. For example, by having the projector 102 project a flashing light in a different color as well as additional text (e.g., “Caution: door opening”), the user 200 can be made aware that the vehicle 300 is executing the commands.
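The overall flow of FIG. 6 can be sketched end to end as a short pipeline with early exits at each check. The callables are stand-ins (illustrative assumptions) for the smart key system, the sensor analysis, and the ECU actuation described in the steps above.

```python
def run(fob_in_zone, user_in_zone, locate, capture_gesture, actuate, feedback):
    """Sketch of the FIG. 6 flow. Each argument is a callable standing in
    for one subsystem; the return value names where the process stopped."""
    if not fob_in_zone():            # S 400: smart key system check
        return "no_fob"
    if not user_in_zone():           # S 402: user inside the projecting zone?
        return "no_user"
    X, N = locate()                  # compute location X and normal N
    gesture = capture_gesture(X, N)  # project indications, read the gesture
    if gesture is None:
        return "no_gesture"
    actuate(gesture)                 # S 410: actuate key elements via the ECUs
    feedback(gesture)                # modify the indication as confirmation
    return "done"
```

Structuring the flow as guarded steps mirrors the flow chart: each failed check ends the process, and only a recognized gesture reaches actuation and visual confirmation.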
Abstract
Description
- Field of the Disclosure
- The present disclosure relates to an apparatus and method to visually communicate with a vehicle. More precisely, embodiments enable a user to request commands for the vehicle as well as to obtain information about the vehicle.
- Description of the Related Art
- Vehicles providing various remote services such as remote starting, door locking or light switching by actuating buttons on a key fob have been known and widely used. However, such systems require the action of at least one hand and can be inconvenient when the hands of the user are full.
- Recently, there have been technical attempts to detect the intentions of the user without having the user use her/his hands. Such technical attempts have notably been focused on automatic trunk opening via a smart key device. For example, such systems require the user to stand on one foot while moving the second foot below a motion detector placed on the vehicle, or require the user to stand for a relatively long period of time in front of a presence detector placed on the vehicle.
- Though such systems provide certain ease and comfort to the user by enabling the opening of the backside door while the hands are not free, they have numerous drawbacks. Such systems might generate balance issues for the user, which could result in an uncomfortable position. Above all, in such systems the communication between the user and the vehicle is extremely limited, leading to misinterpretation by the vehicle and resulting in inadvertent actuation or an unknown condition of the vehicle. For example, the backside door may be opened unintentionally by the system while the user is standing behind the trunk, or the backside door may be merely unlatched but not open enough for the user to clearly see it. Thus, an apparatus and method to visually communicate with a vehicle solving the aforementioned problems of limited communication is desired. (Related Publication JP2015-021237A)
- Accordingly, the object of the present disclosure is to provide an apparatus and a method to visually communicate with a vehicle which overcomes the above-mentioned limitations.
- The apparatus ensures better communication by providing two-way communication between the user and the vehicle through visual indications projected on a ground surface. These visual indications provide a way for the user to input commands for the vehicle (e.g., opening/closing door, turning on/off light) as well as for the vehicle to display information about the conditions of the vehicle to the user or other person near the vehicle (e.g., door opened/closed, light on/off, fuel/battery level, vehicle is backing up).
- The visual indications act in a similar way to the keys of a computer keyboard, on which the user performs a gesture (e.g., stepping, foot tapping, heel raising, standing still) to activate a specific command symbolized by a corresponding visual indication. Once a corresponding visual indication has been activated, the vehicle knows unmistakably and precisely which command is requested by the user. In addition, the visual indications act similarly to the screen of a computer by displaying information about the conditions of the vehicle in a clear and understandable way for the user.
- Furthermore, the visual indications have the ability to follow/track the position of the user and be always placed at a convenient location for the user.
- In one non-limiting illustrative example, a visual communication apparatus of a vehicle is presented. The visual communication apparatus of a vehicle includes a smart key system that identifies the user by detecting a key fob of a user within a keyless detection zone, a projector that projects visual indications on a projecting zone on a ground surface, a sensor that optically captures gestures of a user, and a control unit that actuates important elements of the vehicle in response to the gestures.
- In another non-limiting illustrative example, a method to communicate with a vehicle through a visual communication apparatus is presented. The visual communication apparatus includes a projector that projects visual indications on a ground surface, and a sensor that optically captures gestures of a user. The method to communicate with a vehicle through a visual communication apparatus includes detecting the presence of a key fob of the user inside a keyless detection zone through a smart key system, detecting the presence of the user inside the projecting zone, computing a location and a normal direction of the user inside the projecting zone, projecting visual indications at a predetermined distance from the location of the user and at a predetermined direction from the normal of the user, detecting gestures of the user in response to the visual indications, actuating key elements of the vehicle through an electrical control unit in response to the gestures, and indicating to the user that the key elements are being executed.
- To easily identify the discussion of any particular element or act, the most significant digit or digits in a reference number refer to the figure number in which that element is first introduced.
-
FIG. 1 is a perspective view of a visual communication apparatus affixed to a vehicle, according to certain aspects of the disclosure; -
FIG. 2 is a top view of the visual communication apparatus affixed to the vehicle, according to certain aspects of the disclosure; -
FIG. 3A is a schematic view of a hardware diagram of the visual communication apparatus in an integrated model, according to certain aspects of the disclosure; -
FIG. 3B is a schematic view of a hardware diagram of the visual communication apparatus in a separated model, according to certain aspects of the disclosure; -
FIG. 4 is a top view of the visual communication apparatus projecting active visual indications to a user, according to certain aspects of the disclosure; -
FIG. 5 is a top view of the visual communication apparatus projecting passive visual indications to the user and other persons, according to certain aspects of the disclosure; and -
FIG. 6 is a flow chart of a method for the user to communicate with the vehicle through the visual communication apparatus, according to certain aspects of the disclosure. - All publications, patent applications, patents, and other references mentioned herein are incorporated by reference in their entirety. Further, the materials, methods, and examples discussed herein are illustrative only and are not intended to be limiting.
- In the drawings, like reference numerals designate identical or corresponding parts throughout the several views. Further, as used herein, the words “a”, “an”, and the like include a meaning of “one or more”, unless stated otherwise. The drawings are generally drawn not to scale unless specified otherwise or illustrating schematic structures or flowcharts.
-
FIGS. 1-2 are a perspective view and a top view, respectively, of a visual communication apparatus 100, according to certain aspects of the disclosure. FIGS. 3A-3B are schematic views of hardware diagrams of the visual communication apparatus in an integrated model and in a separated model, respectively, according to certain aspects of the disclosure. The visual communication apparatus establishes a visual communication between the vehicle 300 and a user 200 having a key fob 202 through a projection module 101. - The
projection module 101 may be mounted on a backside 302 of the vehicle 300 for power back door (PBD) vehicles (e.g., vehicles having a motorized trunk door 302 a, as illustrated in FIGS. 1-2), or mounted on a side of the vehicle 300 for power slide door (PSD) vehicles (e.g., vehicles having a motorized door on a side). - The
projection module 101 is linked to main circuitry 400 including different control elements of the vehicle 300. For example, these different control elements may be a smart ECU 410 controlling a smart key system 306 that identifies the key fob 202 of the user 200 within a keyless detection zone 208, or a body ECU 440 with a PBD/PSD ECU 430 that operates the PBD and/or the PSD via a PBD/PSD actuator 320. - The
smart key system 306 may include an antenna 308 a affixed to the vehicle 300, a Low Frequency (LF) receiver 210 a and a Radio Frequency (RF) transmitter 210 b both integrated into the key fob 202, and an RF receiver 308 b affixed to the vehicle 300. The antenna 308 a broadcasts a first group of radio waves, e.g., 120-135 kHz, over the keyless detection zone 208, wherein the keyless detection zone 208 may be within a predetermined distance d from the antenna 308 a. Once the key fob 202 is inside the keyless detection zone 208, the LF receiver 210 a receives the first group of radio waves broadcast by the antenna 308 a, and the RF transmitter 210 b broadcasts back to the vehicle 300 a second group of radio waves, e.g., 300-400 MHz. Then, the second group of radio waves is received by the RF receiver 308 b. In addition, the smart ECU 410 of the main circuitry 400 controls the broadcasting of the first group of radio waves and detects the reception of the second group of radio waves in order to identify the presence of the key fob 202, and of the user 200, inside the keyless detection zone 208. - The
smart key system 306 may be in communication with the body ECU 440, or other elements of the main circuitry 400, via a bus such as a Controller Area Network (CAN) bus or a Local Interconnect Network (LIN) bus. - The
visual communication apparatus 100 may also include a steering wheel angle sensor 310 placed on a steering wheel 311 of the vehicle 300. The steering wheel angle sensor 310 measures a steering angle to estimate the direction in which the vehicle 300 is displaced. - The steering
wheel angle sensor 310 may rely on electromagnetic sensors or potentiometers to detect the steering angle. - The
projection module 101 includes a projector 102 projecting visual indications 106 (e.g., symbols or text messages) on a projecting zone 108, and a sensor 104 optically capturing important information about the user 200. The projection module 101 includes circuitry 101 a having a detection circuit 105 to operate the sensor 104 and a projection circuit 103 to operate the projector 102. The main information about the user 200 may include the presence of the user 200 inside the projecting zone 108, a location X of the user 200 inside the projecting zone 108, a normal direction N of the user 200, and gestures of the user 200 (e.g., stepping, foot tapping, heel raising or standing still). - The optically captured main information about the
user 200 is analyzed and converted into specific commands for the vehicle 300 by software instructions executed by the circuitry 101 a including the detection circuit 105 and the projection circuit 103. For example, the specific commands may include opening/closing the back door or the side door of the vehicle 300. For instance, such a command may be performed via a control load sent from the circuitry 101 a to the PBD/PSD ECU 430 of the main circuitry 400.
- The projecting
zone 108 may be lying on a ground surface of thevehicle 300 and have different shapes such as a square, a triangle with an apex below thevisual communication apparatus 100, or a slice of a disc with a center below thevisual communication apparatus 100. In addition, the projectingzone 108 may be included inside thekeyless detection zone 208. - The
projector 102 may be any kind of light-emitting device capable of projecting symbols and text messages visible to the naked eye, with a resolution and contrast sufficiently high to be viewed or read by the user 200 without any difficulty and independently of the luminosity surrounding the vehicle 300. - The
sensor 104 may be any kind of light-capturing device capable of detecting light reflected on the projecting zone 108 and the user 200 with sufficient precision and speed to be analyzed by the circuitry 101 a and provide the main information about the user 200. - In an exemplary embodiment, the
projector 102 may include a first laser diode 120 projecting symbols and text visible to the naked eye and may also include a second laser diode 122 projecting a plurality of invisible beams of light 122 a on the projecting zone 108. The user 200 on the projecting zone 108 breaks some of the invisible beams of light 122 a and reflects light back to the sensor 104 in the form of reflected infrared beams. The sensor 104 detects and captures images of the reflected infrared beams. The captured images of the reflected infrared beams are then analyzed via software instructions to obtain the main information about the user 200, such as the presence of the user 200 in the projecting zone 108, the location X of the user 200, the normal direction N of the user 200, and the gestures of the user 200. - In another exemplary embodiment, the
projector 102 may rely on light-emitting diode (LED) technology, Digital Light Processing (DLP) technology, liquid-crystal display (LCD) technology, liquid-crystal-on-silicon technology, or any other technology known by a person having ordinary skill in the art. - In addition to the
sensor 104, the projection module 101 may include an analog or digital camera 104 a for detecting the gestures of the user 200 in the projecting zone 108 in the form of images and/or videos. The camera 104 a may include an electronic image pickup device (e.g., photoelectric elements) and an optical system having an optical lens with a variable diaphragm and a shutter to control the amount of light entering the image pickup device. - The gestures of the
user 200 are obtained by analyzing images and/or videos of the user 200 captured by the camera 104 a. This analysis may be performed by the circuitry 101 a via image processing tools such as color intensity differences, image segmentation, or edge detection. - The camera 104 a may be directly integrated into the
projection module 101, see FIG. 3A, or separated from the projection module 101 and linked to the main circuitry 400 via a navigation ECU 420 of the vehicle 300, see FIG. 3B. - When the camera 104 a is separated from the
projection module 101, the images and/or videos captured by the camera 104 a may be sent to the circuitry 101 a via standard video protocols such as the low-voltage differential signaling (LVDS) protocol and/or the National Television System Committee (NTSC) protocol. -
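The image analysis described above can be sketched as a frame-differencing check: compare a captured frame of the projecting zone against a reference frame and flag the user's presence when enough pixels change intensity. This is an assumed minimal sketch, not the patent's algorithm; the function name and thresholds are hypothetical.

```python
# Minimal sketch (assumption, not the patent's method) of presence
# detection by color intensity differences between two grayscale frames
# of the projecting zone 108.
def user_present(frame, reference, pixel_threshold=30, count_threshold=5):
    """frame/reference: equal-sized 2D lists of grayscale values 0-255.
    Returns True when at least count_threshold pixels differ by more
    than pixel_threshold in intensity."""
    changed = sum(
        1
        for row_f, row_r in zip(frame, reference)
        for f, r in zip(row_f, row_r)
        if abs(f - r) > pixel_threshold
    )
    return changed >= count_threshold
```

A production system would of course operate on real camera frames and add segmentation or edge detection on top of this intensity test, as the description suggests.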
FIG. 4 is a top view of the visual communication apparatus 100 projecting active visual indications 106 a to the user 200, according to certain aspects of the disclosure. - The
visual indications 106 may include the active visual indications 106 a requiring an input from the user 200. The active visual indications 106 a may be symbols (e.g., V-shaped lines, rectangles, squares, circles, arrows) or text placed at a predetermined distance D from the location X and at a predetermined direction U from the normal direction N of the user 200, such as in front of or to the side of the user 200, as illustrated in FIG. 4. The input from the user 200 is given by the gestures of the user 200 on the active visual indications 106 a. The gestures of the user 200 (e.g., stepping, tapping, heel raising or standing still) are performed on the active visual indications 106 a to request the vehicle 300 to perform specific commands (e.g., turning on/off lights of the vehicle 300, opening/closing a door of the vehicle 300, as illustrated in FIG. 4). - In addition, the active
visual indications 106 a can be modified (e.g., by changing the color, shape or text of the active visual indications 106 a) to indicate to the user 200 that the gestures have been recognized and the commands are being executed by the vehicle 300. -
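The placement rule for an indication, at a predetermined distance D from the user's location X, offset by a predetermined direction U from the normal direction N, reduces to simple planar geometry. The sketch below is illustrative; the coordinate convention (angles in degrees, 2D ground plane) and the function name are assumptions, not from the patent.

```python
import math

# Hedged sketch: ground-plane position for a projected indication at
# distance d from the user's location (x, y), along the user's normal
# direction rotated by u_deg. All conventions here are assumed.
def indication_position(x, y, normal_deg, d, u_deg=0.0):
    """Return the (x, y) point where the indication is projected."""
    theta = math.radians(normal_deg + u_deg)
    return (x + d * math.cos(theta), y + d * math.sin(theta))
```

With u_deg = 0 the indication lands directly in front of the user; a nonzero u_deg places it to the side, matching the "front or side" placements named in the description.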
FIG. 5 is a top view of the visual communication apparatus 100 projecting passive visual indications 106 b to the user 200 and other persons, according to certain aspects of the disclosure. - The
visual indications 106 may also include passive visual indications 106 b requiring no input from the user 200. The passive visual indications 106 b may be symbols (e.g., arrows, gauge indicators, warning signals) or text providing information about the vehicle 300 (e.g., vehicle moving in the direction of the arrows, low fuel, low battery, low oil pressure, or low tire pressure) to the user 200 and/or other persons. The passive visual indications 106 b may be placed at the predetermined distance D from the location X and at the predetermined direction U from the normal N when the user 200 is present in the projecting zone 108, as illustrated in FIG. 4, or centered at a second predetermined location Y inside the projecting zone 108 when the user 200 is not present in the projecting zone 108, as illustrated in FIG. 5. - In another exemplary embodiment, the passive
visual indication 106 b may be used to give information on the motion of the vehicle 300 as soon as the vehicle 300 is ready to be put in motion (e.g., a reverse gear is engaged or a parking brake is released by the user 200). The passive visual indication 106 b indicating the motion of the vehicle 300 may be at least one arrow pointing in the direction of the motion of the vehicle 300. The direction of the motion of the vehicle 300 may be detected by the steering wheel angle sensor 310 and used to orient the at least one arrow projected on the projecting zone 108, as illustrated in FIG. 5. -
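Orienting the projected motion arrow from the steering wheel angle sensor 310 reading can be sketched as follows. The 1:1 steer-to-heading model, the degree convention (0 = straight ahead), and the function name are all assumptions for illustration, not the patent's method.

```python
# Sketch (assumed model): heading of the projected motion arrow from the
# steering angle, mirrored when reverse gear is engaged so the arrow
# points behind the vehicle.
def arrow_heading_deg(steering_deg: float, reverse: bool) -> float:
    heading = steering_deg          # illustrative 1:1 steer-to-heading model
    if reverse:
        heading = 180.0 - heading   # backing up: arrow points rearward
    return heading % 360.0
```

A real implementation would map steering angle to wheel angle through the steering ratio, but the mirroring logic for reverse drive would be the same.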
FIG. 6 is a flow chart of a method to communicate with the vehicle 300 through the visual communication apparatus 100, according to certain aspects of the disclosure. - In a step S400, it is determined if the
key fob 202 is present inside the keyless detection zone 208 through the smart key system 306. For example, the antenna 308 a may broadcast the first group of radio waves, and the RF receiver 308 b may detect the second group of radio waves emitted back from the RF transmitter 210 b after the first group of radio waves has been received by the LF receiver 210 a. If it is determined that the key fob 202 is present inside the keyless detection zone 208, the process goes to a step S402. Otherwise, the process ends. - In the step S402, it is determined if the
user 200 is present inside the projecting zone 108 through the projector 102. The presence detection of the user 200 inside the projecting zone 108 may be performed by detecting, through the sensor 104, the reflection of infrared beams on the user 200, wherein the infrared beams are initially generated by the second laser diode 122. The presence detection of the user 200 inside the projecting zone 108 may also be performed by analyzing digital images of the projecting zone 108 captured by the camera 104 a. The analysis of the digital images of the projecting zone 108 may be performed via image processing tools such as color intensity differences, image segmentation, edge detection, or any other image processing tool known by a person having ordinary skill in the art. If it is determined that the user 200 is present inside the projecting zone 108, the process goes to a step S404. Otherwise, the process ends. - In the step S404, the location X and the normal direction N of the user 200 (as illustrated in
FIG. 2) inside the projecting zone 108 are computed. The computation of the location X can be performed by analyzing the light reflected on the projecting zone 108 and the user 200 and captured by the sensor 104, via software instructions executed by the circuitry 101 a. - In a step S406, the visual indications 106 (i.e., the active
visual indications 106 a and/or passive visual indications 106 b) are projected by the projector 102 at the predetermined distance D from the location X and at the predetermined direction U from the normal direction N of the user 200 (e.g., to the front or sides). - The passive
visual indications 106 b may be symbols (e.g., arrows, gauge indicators, warning signals) or text providing information about the vehicle 300 (e.g., “vehicle moving in the direction of the arrows”, “low fuel”, “low battery”, “low oil pressure”, or “low tire pressure”) to the user 200 and/or other persons, requiring no activation from the user 200. - The active
visual indications 106 a may be symbols (e.g., V-shaped lines, rectangles, squares, circles, arrows) or text requiring activation from the user 200 to perform specific commands (e.g., turning on/off lights of the vehicle 300, opening/closing a door of the vehicle 300), while the passive visual indications 106 b may be symbols or text providing information about the vehicle 300 (e.g., “vehicle moving in the direction of the arrows”, “low fuel”, “low battery”, “low oil pressure” or “low tire pressure”) without any activation from the user 200. - In a step S408, gestures (e.g., stepping, foot tapping, heel raising or standing still) performed by the
user 200 on the active visual indications 106 a are detected and identified by the sensor 104 via software instructions executed by the circuitry 101 a. - In a step S410, once the gestures of the
user 200 on the active visual indications 106 a have been detected and identified, the circuitry 101 a sends a signal to the main circuitry 400 to actuate elements of the vehicle 300 (e.g., light switches, or door locks) to perform the specific commands (e.g., turning on/off lights of the vehicle 300, opening/closing a door of the vehicle 300) represented by the active visual indications 106 a. For example, the circuitry 101 a can send a signal to the PBD/PSD ECU 430 of the main circuitry 400 to actuate the PBD/PSD actuator 320 in order to operate the back and/or slide door of the vehicle 300. - In a step S412, the
active indications 106 a are modified to indicate to the user 200 that the gestures have been recognized and the commands are being executed by the vehicle 300. The color, the shape or the text of the active indications 106 a can be modified to let the user 200 know that the commands are being executed. For example, by having the projector 102 project a flashing light in a different color as well as additional text (e.g., “Caution: door opening”), the user 200 can be made aware that the vehicle 300 is executing the commands. - The foregoing discussion discloses and describes merely exemplary embodiments of an object of the present disclosure. As will be understood by those skilled in the art, an object of the present disclosure may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Accordingly, the present disclosure is intended to be illustrative, but not limiting of the scope of an object of the present disclosure as well as the claims.
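Steps S400 through S412 can be sketched end to end as a plain function. The callback parameters below stand in for the smart key system, sensor, and ECUs named in the description; their names and the step-label bookkeeping are hypothetical, for illustration only.

```python
# End-to-end sketch of the method of FIG. 6; callback names are assumed.
def run_visual_communication(fob_present, locate_user, detect_gesture, actuate):
    """Return the list of executed step labels, mirroring FIG. 6."""
    steps = []
    if not fob_present():               # S400: key fob inside keyless zone?
        return steps
    steps.append("S400")
    if locate_user() is None:           # S402/S404: presence, location X, normal N
        return steps
    steps += ["S402", "S404", "S406"]   # S406: project indications 106
    gesture = detect_gesture()          # S408: gesture on an active indication
    if gesture is None:
        return steps
    steps.append("S408")
    actuate(gesture)                    # S410: signal the main circuitry 400
    steps += ["S410", "S412"]           # S412: modify indications as feedback
    return steps
```

Each early return corresponds to an "Otherwise, the process ends" branch in the flow chart.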
- Numerous modifications and variations on the present disclosure are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the invention may be practiced otherwise than as specifically described herein.
Claims (9)
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/930,365 US9616802B1 (en) | 2015-11-02 | 2015-11-02 | Apparatus and method to visually communicate with a vehicle |
PCT/US2016/048845 WO2017078830A1 (en) | 2015-11-02 | 2016-08-26 | Apparatus and method to visually communicate with a vehicle |
JP2018516495A JP2018531332A (en) | 2015-11-02 | 2016-08-26 | Apparatus and method for visually communicating with a vehicle |
DE212016000220.1U DE212016000220U1 (en) | 2015-11-02 | 2016-08-26 | Device for communicating visually with a vehicle |
US15/457,583 US9862311B2 (en) | 2015-11-02 | 2017-03-13 | Apparatus and method to visually communicate with a vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/930,365 US9616802B1 (en) | 2015-11-02 | 2015-11-02 | Apparatus and method to visually communicate with a vehicle |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/457,583 Continuation US9862311B2 (en) | 2015-11-02 | 2017-03-13 | Apparatus and method to visually communicate with a vehicle |
Publications (2)
Publication Number | Publication Date |
---|---|
US9616802B1 US9616802B1 (en) | 2017-04-11 |
US20170120798A1 true US20170120798A1 (en) | 2017-05-04 |
Family
ID=58461658
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/930,365 Expired - Fee Related US9616802B1 (en) | 2015-11-02 | 2015-11-02 | Apparatus and method to visually communicate with a vehicle |
US15/457,583 Expired - Fee Related US9862311B2 (en) | 2015-11-02 | 2017-03-13 | Apparatus and method to visually communicate with a vehicle |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/457,583 Expired - Fee Related US9862311B2 (en) | 2015-11-02 | 2017-03-13 | Apparatus and method to visually communicate with a vehicle |
Country Status (4)
Country | Link |
---|---|
US (2) | US9616802B1 (en) |
JP (1) | JP2018531332A (en) |
DE (1) | DE212016000220U1 (en) |
WO (1) | WO2017078830A1 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3466772A1 (en) * | 2017-10-04 | 2019-04-10 | Huf Hülsbeck & Fürst GmbH & Co. KG | Mounting module with a display element |
DE102018207663B3 (en) * | 2018-05-16 | 2019-10-10 | Volkswagen Aktiengesellschaft | Method for detecting a user input and vehicle |
US11084418B2 (en) * | 2019-04-10 | 2021-08-10 | Hyundai Motor Company | Apparatus and method for outputting platooning information in vehicle |
CN113247007A (en) * | 2021-06-22 | 2021-08-13 | 肇庆小鹏新能源投资有限公司 | Vehicle control method and vehicle |
US11724637B2 (en) | 2021-02-17 | 2023-08-15 | Aisin Corporation | Vehicle opening and closing body control device that senses gestures |
Families Citing this family (43)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD731383S1 (en) | 2014-05-23 | 2015-06-09 | Google Inc. | Vehicle door |
FR3041110B1 (en) * | 2015-09-14 | 2018-03-16 | Valeo Vision | PROJECTION METHOD FOR A MOTOR VEHICLE OF AN IMAGE ON A PROJECTION SURFACE |
US10752165B2 (en) * | 2016-04-13 | 2020-08-25 | Panasonic Automative Systems Company of America, Division of Panasonic Corporation of North America | Vehicle signal |
WO2018006224A1 (en) * | 2016-07-04 | 2018-01-11 | SZ DJI Technology Co., Ltd. | System and method for automated tracking and navigation |
WO2018035484A1 (en) * | 2016-08-18 | 2018-02-22 | Apple Inc. | System and method for interactive scene projection |
KR101886504B1 (en) * | 2016-08-31 | 2018-08-07 | 현대자동차주식회사 | Method for controlling operation waiting time of driver convenience system |
JP1593572S (en) * | 2016-12-28 | 2021-01-18 | ||
USD835028S1 (en) | 2017-04-28 | 2018-12-04 | Waymo Llc | Roof sensor housing |
USD834971S1 (en) * | 2017-04-28 | 2018-12-04 | Waymo Llc | Front sensor housing |
USD858381S1 (en) | 2017-04-28 | 2019-09-03 | Waymo Llc | Fender sensor housing |
JP1603411S (en) * | 2017-06-12 | 2020-04-27 | ||
DE102017119919A1 (en) * | 2017-08-30 | 2019-02-28 | Automotive Lighting Reutlingen Gmbh | Luminaire arrangement of a motor vehicle and ambient lighting device with such a lamp arrangement |
DE112017008294T5 (en) * | 2017-12-22 | 2020-10-15 | Mitsubishi Electric Corporation | IN-VEHICLE DEVICE, AUTHENTICATION PROCEDURE AND AUTHENTICATION PROGRAM |
USD866368S1 (en) | 2018-03-15 | 2019-11-12 | Waymo Llc | Roof sensor housing |
CN108482386A (en) * | 2018-03-30 | 2018-09-04 | 北京新能源汽车股份有限公司 | Vehicular interaction system and method for vehicle control |
JP6915213B2 (en) * | 2018-08-06 | 2021-08-04 | 三井金属アクト株式会社 | Opening and closing system |
JP7298069B2 (en) * | 2018-09-28 | 2023-06-27 | 株式会社小糸製作所 | vehicle headlight |
US20200111336A1 (en) * | 2018-10-05 | 2020-04-09 | Ron CHATTERJEE | Security monitoring and communication system using projector for doors and vehicles |
DE102018126827A1 (en) * | 2018-10-26 | 2020-04-30 | Bayerische Motoren Werke Aktiengesellschaft | Motor vehicle |
DE102018220145A1 (en) * | 2018-11-23 | 2020-05-28 | Brose Fahrzeugteile Se & Co. Kommanditgesellschaft, Bamberg | Method and adjustment device for adjusting a vehicle adjustment part with output status information |
JP7208040B2 (en) * | 2019-01-31 | 2023-01-18 | 東洋電装株式会社 | vehicle controller |
USD902756S1 (en) | 2019-02-20 | 2020-11-24 | Waymo Llc | Sensor assembly |
USD915913S1 (en) | 2019-05-01 | 2021-04-13 | Waymo Llc | Roof pod housing |
USD957968S1 (en) | 2019-02-20 | 2022-07-19 | Waymo Llc | Sensor housing |
USD950404S1 (en) | 2019-05-01 | 2022-05-03 | Waymo Llc | Roof pod housing |
EP3708485B1 (en) | 2019-03-14 | 2022-12-21 | AIRBUS HELICOPTERS DEUTSCHLAND GmbH | An information projection and control system |
USD927998S1 (en) | 2019-04-25 | 2021-08-17 | Waymo Llc | Front sensor housing |
USD964249S1 (en) | 2019-04-25 | 2022-09-20 | Waymo Llc | Perimeter sensor housing |
USD965498S1 (en) | 2019-04-25 | 2022-10-04 | Waymo Llc | Perimeter sensor housing |
USD928639S1 (en) | 2019-04-25 | 2021-08-24 | Waymo Llc | Rear sensor housing |
USD964908S1 (en) | 2019-04-25 | 2022-09-27 | Waymo Llc | Perimeter sensor housing |
USD964909S1 (en) | 2019-04-25 | 2022-09-27 | Waymo Llc | Perimeter sensor housing |
USD954571S1 (en) | 2019-04-25 | 2022-06-14 | Waymo Llc | Front sensor housing |
USD956585S1 (en) | 2019-04-25 | 2022-07-05 | Waymo Llc | Rear sensor housing |
US11964616B2 (en) | 2019-05-01 | 2024-04-23 | Waymo Llc | Autonomous vehicle roof pod |
US11603048B2 (en) | 2019-05-01 | 2023-03-14 | Waymo Llc | Autonomous vehicle roof pod |
DE102019124513A1 (en) * | 2019-09-12 | 2021-03-18 | Bayerische Motoren Werke Aktiengesellschaft | Motor vehicle |
FR3102615B1 (en) * | 2019-10-23 | 2022-07-01 | Renault Sas | Vehicle comprising a box fitted with an antenna |
CN112874433B (en) * | 2020-06-11 | 2022-07-12 | 长城汽车股份有限公司 | Method, device and system for controlling vehicle light |
JP7474127B2 (en) | 2020-06-17 | 2024-04-24 | 株式会社ニチベイ | Shielding material control device and shielding material control method |
CN114643956A (en) * | 2021-08-16 | 2022-06-21 | 长城汽车股份有限公司 | Method for controlling vehicle back door, device for controlling vehicle back door, and vehicle |
CN114291034B (en) * | 2021-12-31 | 2023-08-08 | 佛山市安驾科技有限公司 | Skirting control method and control system for electric tail door of automobile |
CN115653442A (en) * | 2022-11-23 | 2023-01-31 | 中国第一汽车股份有限公司 | Vehicle door control method, device, equipment and storage medium |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2740501B1 (en) * | 1995-10-26 | 1998-06-19 | Valeo Securite Habitacle | HANDS-FREE SYSTEM FOR UNLOCKING AND / OR OPENING THE VEHICLE OPENING ELEMENT |
JP3575364B2 (en) * | 1999-12-28 | 2004-10-13 | 株式会社豊田自動織機 | Steering support device |
JP4207060B2 (en) * | 2006-05-31 | 2009-01-14 | アイシン・エィ・ダブリュ株式会社 | Drawing system |
JP4927514B2 (en) * | 2006-12-12 | 2012-05-09 | クラリオン株式会社 | Driving assistance device |
FR2920172B1 (en) * | 2007-08-21 | 2009-12-04 | Valeo Securite Habitacle | AUTOMATIC UNLOCKING METHOD OF A MOTOR VEHICLE OPENING FOR HANDS-FREE SYSTEM AND DEVICE FOR IMPLEMENTING THE METHOD |
DE102008063366B4 (en) * | 2008-12-30 | 2022-04-28 | Huf Hülsbeck & Fürst Gmbh & Co. Kg | Device for contactless actuation of a tailgate of a motor vehicle and method for actuating a tailgate of a motor vehicle and motor vehicle |
DE202009018205U1 (en) * | 2009-06-02 | 2011-05-05 | Volkswagen Ag | Device for actuating a closing element of a vehicle |
US20100321945A1 (en) * | 2009-06-19 | 2010-12-23 | Gm Global Technology Operations, Inc. | Vehicular graphics projection system |
DE102010034853A1 (en) * | 2010-08-18 | 2012-02-23 | Gm Global Technology Operations Llc (N.D.Ges.D. Staates Delaware) | Motor vehicle with digital projectors |
JP5382050B2 (en) * | 2011-04-06 | 2014-01-08 | アイシン精機株式会社 | Opening and closing body actuator for vehicle |
KR101312630B1 (en) * | 2011-07-08 | 2013-10-04 | 에스엘 주식회사 | Automotive lamp and cotrolling method for the same |
GB201119792D0 (en) * | 2011-11-16 | 2011-12-28 | Jaguar Cars | Vehicle access system |
JP2013123096A (en) * | 2011-12-09 | 2013-06-20 | Fujitsu Ten Ltd | Remote starter, information processor and remote start system |
KR101316873B1 (en) * | 2012-07-04 | 2013-10-08 | 현대자동차주식회사 | System and method for operating gate |
US8733939B2 (en) * | 2012-07-26 | 2014-05-27 | Cloudcar, Inc. | Vehicle content projection |
JP6186971B2 (en) | 2013-07-17 | 2017-08-30 | アイシン精機株式会社 | Vehicle door opening and closing device and control method thereof |
WO2015032795A2 (en) * | 2013-09-03 | 2015-03-12 | Jaguar Land Rover Limited | System for imaging |
EP2860704B1 (en) * | 2013-10-10 | 2016-04-27 | U-Shin France SAS | Method for opening a movable panel of the motor vehicle and corresponding opening control device |
EP2930071B1 (en) * | 2014-04-10 | 2018-11-14 | U-Shin France | Method for opening a movable panel of the motor vehicle and corresponding opening control device |
- 2015
  - 2015-11-02 US US14/930,365 patent/US9616802B1/en not_active Expired - Fee Related
- 2016
  - 2016-08-26 WO PCT/US2016/048845 patent/WO2017078830A1/en active Application Filing
  - 2016-08-26 JP JP2018516495A patent/JP2018531332A/en active Pending
  - 2016-08-26 DE DE212016000220.1U patent/DE212016000220U1/en not_active Expired - Lifetime
- 2017
  - 2017-03-13 US US15/457,583 patent/US9862311B2/en not_active Expired - Fee Related
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3466772A1 (en) * | 2017-10-04 | 2019-04-10 | Huf Hülsbeck & Fürst GmbH & Co. KG | Mounting module with a display element |
DE102018207663B3 (en) * | 2018-05-16 | 2019-10-10 | Volkswagen Aktiengesellschaft | Method for detecting a user input and vehicle |
US11084418B2 (en) * | 2019-04-10 | 2021-08-10 | Hyundai Motor Company | Apparatus and method for outputting platooning information in vehicle |
US11724637B2 (en) | 2021-02-17 | 2023-08-15 | Aisin Corporation | Vehicle opening and closing body control device that senses gestures |
CN113247007A (en) * | 2021-06-22 | 2021-08-13 | 肇庆小鹏新能源投资有限公司 | Vehicle control method and vehicle |
Also Published As
Publication number | Publication date |
---|---|
US9616802B1 (en) | 2017-04-11 |
JP2018531332A (en) | 2018-10-25 |
WO2017078830A1 (en) | 2017-05-11 |
DE212016000220U1 (en) | 2018-07-13 |
US9862311B2 (en) | 2018-01-09 |
US20170182933A1 (en) | 2017-06-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9862311B2 (en) | Apparatus and method to visually communicate with a vehicle | |
JP2018531332A6 (en) | Apparatus and method for visually communicating with a vehicle | |
US11560092B2 (en) | Vehicular vision system | |
US10229654B2 (en) | Vehicle and method for controlling the vehicle | |
EP3132436B1 (en) | Trainable transceiver and camera systems and methods | |
CN110419211B (en) | Information processing apparatus, information processing method, and computer-readable storage medium | |
US20160132126A1 (en) | System for information transmission in a motor vehicle | |
US20070244613A1 (en) | Noncontact Input Device For In-Vehicle Apparatus | |
US20140195096A1 (en) | Apparatus and method for contactlessly detecting objects and/or persons and gestures and/or operating procedures made and/or carried out thereby | |
WO2014156222A1 (en) | Obstacle detection device, and electric-powered vehicle provided therewith | |
CN102859568A (en) | Video based intelligent vehicle control system | |
US8184018B2 (en) | Image-based vehicle safety warning system | |
US11181909B2 (en) | Remote vehicle control device, remote vehicle control system, and remote vehicle control method | |
US20200189522A1 (en) | Apparatus for managing vehicle intrusion, system having the same and method thereof | |
KR101671993B1 (en) | Safety system for vehicle | |
CN111556281B (en) | Vehicle safety system and operation method thereof | |
JP2010179828A (en) | Operation input device for vehicle | |
US20190302755A1 (en) | Information processing apparatus and computer readable storage medium | |
CN106627149B (en) | Automobile instrument panel and control method | |
CN211543391U (en) | Safe driving auxiliary system | |
JP2011192070A (en) | Apparatus for monitoring surroundings of vehicle | |
JP2012142832A (en) | Imaging apparatus | |
EP1848611A1 (en) | A driver assistance system | |
EP2806414B1 (en) | Driver assistance in passing a narrow thoroughfare | |
US20230141584A1 (en) | Apparatus for displaying at least one virtual lane line based on environmental condition and method of controlling same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: AISIN TECHNICAL CENTER OF AMERICA, INC., MICHIGAN; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: KIRIYAMA, HIROSHI; KIMURA, SHOGO; KURITA, MINEO; REEL/FRAME: 036943/0136; Effective date: 20151029 |
STCF | Information on status: patent grant | Free format text: PATENTED CASE |
FEPP | Fee payment procedure | Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
LAPS | Lapse for failure to pay maintenance fees | Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STCH | Information on status: patent discontinuation | Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
FP | Lapsed due to failure to pay maintenance fee | Effective date: 20210411 |