EP2850506A2 - Projected virtual input system for a vehicle - Google Patents

Projected virtual input system for a vehicle

Info

Publication number
EP2850506A2
Authority
EP
European Patent Office
Prior art keywords
input
vehicle
detector
projector
virtual touchscreen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP13790166.6A
Other languages
English (en)
French (fr)
Other versions
EP2850506A4 (de)
Inventor
Malte ROTHHÄMEL
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Scania CV AB
Original Assignee
Scania CV AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Scania CV AB filed Critical Scania CV AB
Publication of EP2850506A2
Publication of EP2850506A4

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/22 Display screens
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/60 Instruments characterised by their location or relative disposition in or on vehicles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/143 Touch sensitive instrument input devices
    • B60K2360/1438 Touch screens
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/20 Optical features of instruments
    • B60K2360/21 Optical features of instruments using cameras
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/20 Optical features of instruments
    • B60K2360/33 Illumination features
    • B60K2360/334 Projection means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/26 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using acoustic output

Definitions

  • The present invention concerns an input system, and a method associated with such an input system, according to the preambles of the independent claims.
  • The invention concerns in particular an input system and method for a vehicle, which facilitate the control of systems and functions on the vehicle.
  • In a modern goods vehicle there is a plurality of systems or functions that the driver, or a passenger, wants to be able to use from the driver's seat, from the passenger seat or from other locations in and around the vehicle, such as from the bed or from outside the vehicle.
  • Such systems or functions include lighting, heating systems, radio, TV and multimedia as well as, for example, adjustment of the pneumatic suspension of the goods vehicle, for example in connection with the hitching of a trailer.
  • This is currently achieved by arranging buttons and controls in a number of locations in and in connection with the vehicle, so that the systems and functions can be used even if the user is not sitting in the driver's seat.
  • For example, the interior lighting can be controlled from a panel near the bed.
  • Another example is a remote control for adjusting the pneumatic suspension, which is kept accessible at the passenger seat so that it can be taken out when an inspection occurs.
  • Such a remote control is often connected by means of a cable.
  • US 7,248,151 concerns a virtual keyboard for a vehicle, which is used in connection with controlling various vehicle functions, such as unlocking the vehicle.
  • The keyboard is projected, for example, onto a side window of the vehicle.
  • A detector and a processor are arranged to detect and identify the gestures that are performed on the keyboard.
  • The detector can, for example, consist of a camera or an infrared detector.
  • US 7,050,606 concerns a system for detecting and identifying hand gestures that is particularly suited to controlling various functions of a vehicle. These functions pertain, for example, to heating, air conditioning, lighting, CD/radio settings, etc.
  • US 2011/0286676 describes systems and related methods intended for vehicles in order to detect and identify gestures in three dimensions.
  • The method includes the steps of receiving one or more unprocessed frames of image data from a sensor, processing and combining a plurality of frames in order to identify body parts of the user in the vehicle and to calculate the position of the tip of the user's hand, determining whether the hand has performed a dynamic or a static gesture, retrieving a command that corresponds to one of a number of stored gestures, and executing the command.
  • Microsoft Kinect is a system adapted primarily for the game industry, in which sensors are arranged in connection with a display; the sensors comprise a camera and a 3D sensor that can, with the help of special software, detect the motions of a user in three dimensions and identify, among other things, the face of the user.
  • The Kinect system is used in the Xbox game console and elsewhere.
  • The sensor for three dimensions consists of an infrared laser projector combined with a monochrome CMOS sensor, which can capture video data in three dimensions under daylight conditions.
  • The sensing distance of the 3D sensor can be adjusted, and the software can automatically calibrate the sensor depending upon which game is being played and upon the physical surroundings of the player, so that furniture and other obstacles can be taken into account.
  • The system enables advanced motion recognition and can track the movements of two active players simultaneously, whereupon motion analysis can occur by evaluating motion steps from up to two joints per player.
  • The Kinect sensor outputs a video signal with a frame rate of 30 Hz.
  • The video streams with the RGB signal use an 8-bit VGA resolution (640 × 480 pixels) with a color filter, while the monochrome video streams for 3D effects have an 11-bit VGA resolution (640 × 480 pixels), which offers sensitivity at 2,048 levels.
  • The Kinect sensor functions at distances within the range of 1.2 to 3.5 meters when used in conjunction with the Xbox software, but can be given an expanded range of 0.7 to 6.0 meters.
  • The sensor has a horizontal detection angle of 57° and a vertical detection angle of 43°. The sensor can also be pivoted 27° in the horizontal direction.
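  • As a rough illustration of what the detection angles above imply for a ceiling-mounted detector, the sketch below computes the approximate footprint that a 57° × 43° field of view covers on a flat surface at a given distance; the 1.5 m mounting distance is an assumed example value, not a figure from the patent.

```python
import math

def detection_footprint(distance_m: float,
                        h_fov_deg: float = 57.0,
                        v_fov_deg: float = 43.0) -> tuple:
    """Approximate width and height (in metres) that a detector with the given
    horizontal/vertical field of view covers on a flat surface distance_m away,
    assuming the sensor looks straight at the surface."""
    width = 2.0 * distance_m * math.tan(math.radians(h_fov_deg) / 2.0)
    height = 2.0 * distance_m * math.tan(math.radians(v_fov_deg) / 2.0)
    return width, height

# Example: detector in the cab ceiling, about 1.5 m from the projection surface (assumed)
w, h = detection_footprint(1.5)
print(f"Covered area at 1.5 m: {w:.2f} m x {h:.2f} m")
```

  • At 1.5 m this gives a footprint of roughly 1.6 m × 1.2 m.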
  • Today, buttons and controls are arranged in the vehicle not only in connection with the driver's seat but also, for example, by the bed and in the form of remote controls. These buttons and controls often require separate cable runs, which makes their installation complicated and entails high costs. In addition, it can sometimes be difficult for the driver of the vehicle to know where the buttons for a given function are located.
  • The object of the present invention is to provide an improved input interface that is both more user-friendly and offers cost savings for the vehicle manufacturer.
  • In the input system according to the invention, a projector system is combined with a detector system that is adapted to detect the body motions of a person.
  • Miniature projectors and camera systems are used of the kind that are also used in connection with, among other things, video games, such as Microsoft Kinect in the Xbox game console (as described above).
  • A virtual touchscreen is projected onto a desired surface in or outside the goods vehicle and used to input control instructions for systems and functions in the vehicle.
  • The system is arranged, for example, in the ceiling of the goods vehicle so that the virtual touchscreen can be projected in any conceivable position.
  • The touchscreen can, for example, be projected onto the wall next to the bed, onto the mattress, onto the floor or onto other walls. If the user is outside the vehicle, the virtual touchscreen can be projected onto a plate held in the hand or, for example, onto the inside of the door or onto the step.
  • The virtual touchscreen can thereby replace a number of buttons in the goods vehicle, for example the remote control for the pneumatic suspension, lighting buttons and the control unit for the heating system. It is also possible for the user to define which systems and functions are to be controllable by means of the virtual touchscreen.
  • A system is thus achieved that is more user-friendly, in part because access for controlling the various vehicle systems is improved, and that also offers cost savings for the vehicle manufacturer because, for example, fewer cable runs are required and fewer buttons and control units are needed.
  • Figure 1 is a block diagram that schematically illustrates the present invention.
  • Figure 2 shows a flow diagram that illustrates the present invention.

Detailed description of preferred embodiments of the invention
  • The present invention concerns an input system 2 for a vehicle, preferably a goods vehicle, a bus or a motor home, but also a car.
  • The input system comprises a projector system 4 and a detector system 6, which are adapted to communicate with a control unit 8.
  • The projector system 4 is adapted to generate a virtual touchscreen 10 comprising an adjustable input menu for inputting control instructions to control one or more systems of the vehicle, and to project the virtual touchscreen onto a display surface in or at the vehicle.
  • The projector system 4 is preferably arranged in the ceiling of the vehicle cab.
  • The projector system 4 comprises at least an image generator, for example a cathode ray tube (CRT) or a digital light processing unit, and an optical projecting unit that includes optical projection lenses for projecting the generated image.
  • The projector system can also consist of a laser device that "draws" the virtual touchscreen.
  • According to one embodiment, the projector system consists of three units that are disposed in various positions in and/or around the vehicle to enable projection in as many locations as possible.
  • A system with only one unit is also possible, in which case the unit can, for example, be arranged for projection near the bed.
  • The detector system 6 is also preferably arranged in the ceiling of the vehicle cab and can suitably be arranged in connection with the projector system.
  • The detector system comprises three units, each of which comprises at least one optical data-gathering unit, such as a camera or an infrared detector.
  • The detector system preferably also comprises a sound detector adapted to detect activation measures and input activities in the form of sound, for example voices or tapping sounds.
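  • To make the division of roles between the parts named above concrete, the following sketch models the projector system 4, the detector system 6 and the control unit 8 as plain classes; all class and method names, and the printed messages, are illustrative assumptions rather than anything specified in the patent.

```python
from dataclasses import dataclass, field

@dataclass
class ProjectorSystem:
    """Generates the virtual touchscreen image and projects it onto a chosen surface."""
    units: int = 3  # e.g. three projection units placed in and around the cab

    def project(self, surface: str, menu: str) -> None:
        print(f"projecting menu '{menu}' onto '{surface}'")

@dataclass
class DetectorSystem:
    """Optical units (camera / infrared detector) plus a sound detector for taps and voices."""
    optical_units: int = 3
    has_sound_detector: bool = True

    def next_event(self) -> str:
        # A real system would return classified events from the camera/IR/sound sensors.
        return "double_tap"

@dataclass
class ControlUnit:
    """Receives detected input activities and issues control signals on the vehicle bus."""
    projector: ProjectorSystem = field(default_factory=ProjectorSystem)
    detector: DetectorSystem = field(default_factory=DetectorSystem)

    def send_to_vehicle_bus(self, signal: str) -> None:
        print(f"vehicle bus <- {signal}")

# Minimal wiring example
unit = ControlUnit()
unit.projector.project("wall next to the bed", "lighting/radio")
unit.send_to_vehicle_bus("interior_lighting: dim")
```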
  • The input system 2 is adapted to be in at least two modes, a standby mode and a use mode.
  • The input system is, of course, also in an entirely passive mode when the system is turned off.
  • The input system 2 is thus adapted so as to:
  • a - detect a first activation measure by means of said detector system 6 and, if such a first activation measure is detected, to
  • According to one embodiment, the system waits for the user to confirm that the touchscreen is to be projected, which the user does by performing a second activation measure.
  • The system according to this embodiment is adapted so as to
  • The input system is then ready to receive control instructions via the virtual touchscreen that is projected on the selected display surface. This occurs via the steps of:
  • Control of the systems and functions of the vehicle is indicated by a double arrow from the control unit 8.
  • The control unit is preferably connected to the vehicle bus system, and the control signals that it generates are processed by the vehicle in the normal way, and consequently need not be described further here.
  • Examples of systems or functions that can suitably be controlled by means of the input system include the lighting, the heating system, radio, TV and multimedia as well as settings of the pneumatic suspension of the goods vehicle, for example in connection with the hitching of a trailer.
  • The first and, where applicable, second activation measures comprise one or more of: a predetermined pattern of motion (a gesture), a finger snap, or a tap.
  • The first activation measure can, for example, consist of tapping twice to initiate the use mode; according to one embodiment, the user then causes the system to display the virtual touchscreen by identifying the selected display surface, pointing distinctly at it with one finger, which constitutes the second activation measure.
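  • The standby/use behaviour and the two activation measures described above can be pictured as a small state machine. The sketch below is a minimal illustration under assumed event names ("double_tap" for the first activation measure, "point" for the second); it is not an implementation from the patent.

```python
from enum import Enum, auto
from typing import Optional

class Mode(Enum):
    STANDBY = auto()
    USE = auto()

class InputSystem:
    """Minimal state machine: the first activation measure switches the system to the
    use mode, the second selects the display surface and triggers the projection."""

    def __init__(self) -> None:
        self.mode = Mode.STANDBY
        self.display_surface: Optional[str] = None

    def on_event(self, event: str, surface: Optional[str] = None) -> None:
        if self.mode is Mode.STANDBY and event == "double_tap":       # first activation measure
            self.mode = Mode.USE
        elif self.mode is Mode.USE and event == "point" and surface:  # second activation measure
            self.display_surface = surface
            self.project_touchscreen(surface)

    def project_touchscreen(self, surface: str) -> None:
        print(f"virtual touchscreen projected onto: {surface}")

# Example: tap twice to wake the system, then point at the wall next to the bed
system = InputSystem()
system.on_event("double_tap")
system.on_event("point", surface="wall next to the bed")
```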
  • The input menu is displayed on the virtual touchscreen.
  • The start menu, i.e. the first input menu displayed, can be dependent upon where the virtual touchscreen is projected. If, for example, the touchscreen is to be projected near the bed, a menu for adjusting the lighting and the radio will be displayed.
  • The input menu preferably comprises a plurality of input menus arranged in a hierarchical system, wherein activation of a function occurs by means of a specified input activity.
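  • One way to picture a location-dependent start menu and a hierarchy of input menus is as a nested mapping keyed by display surface, as in the sketch below; the surfaces and menu entries are assumed examples, not a menu structure defined by the patent.

```python
# Hypothetical hierarchical menu structure: the start menu is chosen from the surface the
# touchscreen is projected onto, and submenus are reached through further input activities.
MENUS = {
    "wall next to the bed": {
        "lighting": {"on": None, "off": None, "dim": None},
        "radio": {"volume": None, "station": None},
    },
    "inside of the door": {
        "pneumatic suspension": {"raise rear axle": None, "lower rear axle": None},
    },
}

def start_menu(surface: str) -> list:
    """Return the top-level entries to display when projecting onto `surface`."""
    return list(MENUS.get(surface, {}).keys())

print(start_menu("wall next to the bed"))  # ['lighting', 'radio']
```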
  • An input activity consists of, for example, pressing a predefined area on the input menu by keeping the hand/finger in the input area for at least a predetermined time (on the order of fractions of a second up to several seconds).
  • When an input activity is registered, the system is adapted to generate a predetermined acknowledgement that consists of one or more of a change in the color or shape of the input area (the button) and the generation of an acoustic signal.
  • The present invention further comprises a method in connection with an input system for a vehicle, wherein the input system comprises a projector system and a detector system, which are arranged so as to communicate with a control unit.
  • The method will now be described briefly with reference to the flow diagram in Figure 2. Reference is also made to relevant parts of the foregoing description of the input system.
  • The projector system is thus adapted to generate a virtual touchscreen comprising an adjustable input menu for inputting control instructions to control one or more systems of the vehicle, whereupon the projector system is adapted to project the virtual touchscreen onto a display surface in or at the vehicle.
  • The input system is adapted to be in at least two modes, a standby mode and a use mode, wherein the method comprises the steps of:
  • According to one embodiment, the method further comprises detecting a second activation measure by means of said detector system, and performing step D, and the subsequent steps, only when said second activation measure has been detected.
  • This optional step is indicated with broken lines in Figure 2.
  • The first and, where applicable, second activation measures comprise, for example, one or more of a predetermined pattern of movement (a gesture) or a tap.
  • Various functions can be activated by generating various input activities, which comprise touching a predefined input area by keeping a hand/finger in the input area for at least a predetermined time.
  • The input menu preferably comprises a plurality of input menus arranged in a hierarchical system, wherein the activation of a function is achieved by means of a specified input activity.
  • An accepted input activity is confirmed by means of a predetermined acknowledgement that consists of one or more of a change in the color or shape of the input area (the button) and the generation of an acoustic signal.
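  • As a minimal sketch of how such an input activity and its acknowledgement could be detected, the code below accepts a press only if the finger stays within a predefined input area for a predetermined dwell time, and then signals the acknowledgement; the dwell time, the area coordinates and the stand-in acknowledgement actions are assumptions for illustration only.

```python
DWELL_TIME_S = 0.8  # assumed "predetermined time": fractions of a second up to several seconds

def acknowledge() -> None:
    # Acknowledgement: change the colour/shape of the projected button and give an acoustic signal.
    print("button highlighted")  # stand-in for re-rendering the projected input area
    print("beep")                # stand-in for the acoustic signal

def accept_press(samples, area, dwell_s: float = DWELL_TIME_S) -> bool:
    """samples: iterable of (timestamp_s, x, y) finger positions from the detector system.
    area: (x0, y0, x1, y1) bounds of the predefined input area.
    Returns True once the finger has stayed inside the area for at least dwell_s."""
    inside_since = None
    for t, x, y in samples:
        if area[0] <= x <= area[2] and area[1] <= y <= area[3]:
            if inside_since is None:
                inside_since = t
            if t - inside_since >= dwell_s:
                acknowledge()
                return True
        else:
            inside_since = None
    return False

# Example: finger held at the centre of a unit-square button for one second
samples = [(0.0, 0.5, 0.5), (0.5, 0.5, 0.5), (1.0, 0.5, 0.5)]
print(accept_press(samples, area=(0.0, 0.0, 1.0, 1.0)))  # True
```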
  • For use outside the vehicle, the projector system and the detector system must be disposed so that the virtual touchscreen is displayed in such a way that the operator can stand outside the vehicle and adjust, for example, the pneumatic suspension of the rear axle.
  • The projection can be adapted so that the touchscreen assumes a desired appearance. For example, consideration can be given to the

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
EP13790166.6A 2012-05-14 2013-05-08 Projiziertes virtuelles eingabesystem für ein fahrzeug Withdrawn EP2850506A4 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE1250488A SE537730C2 (sv) 2012-05-14 2012-05-14 Projicerat virtuellt inmatningssystem för fordon
PCT/SE2013/050519 WO2013172768A2 (en) 2012-05-14 2013-05-08 Input system

Publications (2)

Publication Number Publication Date
EP2850506A2 true EP2850506A2 (de) 2015-03-25
EP2850506A4 EP2850506A4 (de) 2016-09-28

Family

ID=49584416

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13790166.6A Withdrawn EP2850506A4 (de) 2012-05-14 2013-05-08 Projiziertes virtuelles eingabesystem für ein fahrzeug

Country Status (6)

Country Link
EP (1) EP2850506A4 (de)
CN (1) CN104508598A (de)
BR (1) BR112014028380A2 (de)
RU (1) RU2014150517A (de)
SE (1) SE537730C2 (de)
WO (1) WO2013172768A2 (de)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104020878A (zh) 2014-05-22 2014-09-03 小米科技有限责任公司 触摸输入控制方法及装置
DE102014226546A1 (de) * 2014-12-19 2016-06-23 Robert Bosch Gmbh Verfahren zum Betreiben einer Eingabevorrichtung, Eingabevorrichtung, Kraftfahrzeug
IT201700091628A1 (it) * 2017-08-08 2019-02-08 Automotive Lighting Italia Spa Sistema di interfaccia uomo-macchina virtuale e corrispondente procedimento di interfaccia uomo-macchina virtuale per un veicolo.
US11144153B2 (en) 2017-12-07 2021-10-12 Elliptic Laboratories As User interface with acoustic proximity and position sensing arrangements
IT201800003722A1 (it) * 2018-03-19 2019-09-19 Candy Spa Elettrodomestico con interfaccia utente
DE102018216662A1 (de) * 2018-09-27 2020-04-02 Continental Automotive Gmbh Armaturenbrettanordnung, Verfahren und Verwendung
DE102019200632B4 (de) * 2019-01-18 2021-11-25 Audi Ag Bediensystem mit portabler Schnittstelleneinheit sowie Kraftfahrzeug mit dem Bediensystem
DE102020201235A1 (de) 2020-01-31 2021-08-05 Ford Global Technologies, Llc Verfahren und System zur Steuerung von Kraftfahrzeugfunktionen

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7050606B2 (en) 1999-08-10 2006-05-23 Cybernet Systems Corporation Tracking and gesture recognition system particularly suited to vehicular control applications
AU2003291304A1 (en) * 2002-11-06 2004-06-03 Julius Lin Virtual workstation
US20060072009A1 (en) * 2004-10-01 2006-04-06 International Business Machines Corporation Flexible interaction-based computer interfacing using visible artifacts
US7248151B2 (en) 2005-01-05 2007-07-24 General Motors Corporation Virtual keypad for vehicle entry control
US20060158616A1 (en) * 2005-01-15 2006-07-20 International Business Machines Corporation Apparatus and method for interacting with a subject in an environment
DE102005059449A1 (de) * 2005-12-13 2007-06-14 GM Global Technology Operations, Inc., Detroit Bediensystem zum Bedienen von Funktionen in einem Fahrzeug
DE112008001396B4 (de) * 2007-06-05 2015-12-31 Mitsubishi Electric Corp. Fahrzeugbedienungsvorrichtung
US8396252B2 (en) 2010-05-20 2013-03-12 Edge 3 Technologies Systems and related methods for three dimensional gesture recognition in vehicles

Also Published As

Publication number Publication date
SE1250488A1 (sv) 2013-11-15
RU2014150517A (ru) 2016-07-10
WO2013172768A2 (en) 2013-11-21
SE537730C2 (sv) 2015-10-06
WO2013172768A3 (en) 2014-03-20
EP2850506A4 (de) 2016-09-28
CN104508598A (zh) 2015-04-08
BR112014028380A2 (pt) 2017-06-27

Similar Documents

Publication Publication Date Title
EP2850506A2 (de) Projiziertes virtuelles eingabesystem für ein fahrzeug
US9037354B2 (en) Controlling vehicle entertainment systems responsive to sensed passenger gestures
US8085243B2 (en) Input device and its method
KR101416378B1 (ko) 영상 이동이 가능한 디스플레이 장치 및 방법
US11006257B2 (en) Systems and methods for locating mobile devices within a vehicle
US10618773B2 (en) Elevator operation control device and method using monitor
EP1393591A2 (de) Automatische anpassung des audio-systems
CN104755308A (zh) 具有手势识别的机动车操作接口
WO2015092905A1 (ja) 投写型映像表示装置及び投写型映像表示方法
US20200023753A1 (en) Vehicle seat haptic system and method
JP2016038621A (ja) 空間入力システム
KR102590132B1 (ko) 디스플레이 장치, 및 디스플레이 장치의 제어방법
US20190258245A1 (en) Vehicle remote operation device, vehicle remote operation system and vehicle remote operation method
EP3980147A1 (de) Kontextuell signifikantes dreidimensionales modell
US20130176218A1 (en) Pointing Device, Operating Method Thereof and Relative Multimedia Interactive System
JP2016029532A (ja) ユーザインターフェース
JP6881004B2 (ja) 表示装置
US11630628B2 (en) Display system
CN102970460A (zh) 调校车用影像装置的方法及其系统
TW201926050A (zh) 車輛多螢幕控制系統及車輛多螢幕控制方法
KR101556520B1 (ko) 단말기, 그를 가지는 차량 및 그 제어 방법
US20210325983A1 (en) Information providing system and information providing method
JP4368233B2 (ja) 空間入力システム
KR20140046949A (ko) 차량 시트 이동 장치, 그리고 이의 차량 시트 제어 방법
US20190354167A1 (en) Analysing device for determining a latency time of an immersive virtual reality system

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20141215

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20160825

RIC1 Information provided on ipc code assigned before grant

Ipc: B60K 37/00 20060101ALI20160819BHEP

Ipc: G06F 3/01 20060101AFI20160819BHEP

Ipc: G06F 3/048 20060101ALI20160819BHEP

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

INTG Intention to grant announced

Effective date: 20191203

18W Application withdrawn

Effective date: 20191209