WO2020111308A1 - Intuitive interaction method and system for augmented reality display for vehicle - Google Patents

Intuitive interaction method and system for augmented reality display for vehicle

Info

Publication number
WO2020111308A1
WO2020111308A1 · PCT/KR2018/014824 · KR2018014824W
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
driver
display
touch pad
touch
Prior art date
Application number
PCT/KR2018/014824
Other languages
French (fr)
Korean (ko)
Inventor
신춘성
강훈종
홍성희
홍지수
김영민
Original Assignee
전자부품연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 전자부품연구원
Publication of WO2020111308A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0354 Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/03547 Touch pads, in which fingers can move on a surface
    • G06F 3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K 35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K 35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K 35/60 Instruments characterised by their location or relative disposition in or on vehicles
    • B60K 35/80 Arrangements for controlling instruments
    • B60K 35/81 Arrangements for controlling instruments for controlling displays

Definitions

  • the present invention relates to a human machine interface (HMI) related technology, and more particularly, to an intuitive interaction method and system for an augmented reality display for a vehicle.
  • HMI human machine interface
  • The present invention has been devised to solve the above problems, and an object of the present invention is to provide an intuitive interaction method and system that, by attaching touchpads around the steering wheel and gear stick, enables quick and accurate control of a vehicle AR display while driving.
  • A vehicle HMI system includes a first touch pad provided in a first area of a vehicle; a second touch pad provided in a second area of the vehicle; and a control unit that recognizes and processes a driver command based on at least one of a first driver operation input through the first touch pad and a second driver operation input through the second touch pad.
  • the first touch pad may be installed on the steering wheel.
  • the first touch pad may include a plurality of touch pads.
  • The plurality of touch pads may be installed at positions where the driver can perform touch operations with a thumb while holding the steering wheel.
  • the second touch pad may be installed around the gear box.
  • the controller may display a result of the driver's command execution on the display in augmented reality.
  • the augmented reality display may be the front glass of the vehicle.
  • A vehicle HMI method includes receiving a first driver operation through a first touch pad provided in a first area of a vehicle; receiving a second driver operation through a second touch pad provided in a second area of the vehicle; recognizing a driver command based on at least one of the input first driver operation and the input second driver operation; and processing the recognized driver command.
  • Services of a vehicle AR display can be controlled with the fingers while driving, so various application services on the vehicle AR display can be easily executed and operated.
  • FIG. 1 is a diagram illustrating a conventional button-based HMI system.
  • FIG. 2 is a diagram provided for a conceptual description of a vehicle HMI system according to an embodiment of the present invention.
  • FIG. 3 is an internal block diagram of a vehicle HMI system according to an embodiment of the present invention.
  • FIG. 5 is a diagram provided for a detailed explanation of a vehicle HMI method according to another embodiment of the present invention.
  • An HMI based on spatial gesture recognition suffers many driver-gesture recognition errors in outdoor environments because of infrared interference and frequent errors.
  • Touch pads 110 and 120 are provided on the steering wheel, and a touch pad 130 is also provided around the gear box, enabling quick and accurate command input through driver gestures while driving.
  • Steering wheel touch pad-1 (110) is installed at a position where the driver can perform touch operations with the thumb of the left hand while holding the steering wheel, and steering wheel touch pad-2 (120) is installed at a position where the driver can perform touch operations with the thumb of the right hand while holding the steering wheel.
  • Steering wheel touch pad-1 (110) and steering wheel touch pad-2 (120) are used by the driver to input simple operations quickly and easily while driving, for example, moving a highlight up, down, left, and right, and making selections.
  • The console touch pad 130 installed around the gear box allows a passenger as well as the driver to input commands through finger gestures; unlike the steering wheel touch pads, it also supports handwriting-style character input and multi-touch input.
  • Gesture recognition through the touch pads 110, 120, and 130 is subject to little external interference and error, so recognition errors can be minimized.
  • the front glass of the vehicle may function as the AR display 160, so that the recognition/processing result of the driver's command may be fed back through the AR display 160 and the speaker 170.
  • the AR display 160 enables driver feedback in the same way as a head-up display (HUD).
  • HUD head-up display
  • FIG. 3 is an internal block diagram of an HMI system capable of intuitive interaction for an AR display for a vehicle according to an embodiment of the present invention.
  • As shown in FIG. 3, the vehicle HMI system according to an embodiment of the present invention includes steering wheel touchpad-1 (110), steering wheel touchpad-2 (120), the console touchpad 130, the control unit 140, the vehicle network interface 150, the AR display 160, and the speaker 170.
  • Steering wheel touch pad-1 (110), steering wheel touch pad-2 (120), and the console touch pad 130 are means for receiving the driver's touch operations; since they have been described in detail above, further description is omitted.
  • The control unit 140 recognizes a driver command from the driver operations input through the touch pads 110, 120, and 130, processes the recognized command, and feeds back the processing result through the AR display 160 and the speaker 170.
  • When vehicle/driving information is required for command processing, the control unit 140 receives the corresponding information through the vehicle network interface 150. When vehicle control is required for command processing, the control unit 140 transmits a control request to the vehicle network side through the vehicle network interface 150.
  • FIG. 4 is a flowchart provided to explain the HMI method capable of intuitive interaction for a vehicle AR display according to another embodiment of the present invention.
  • The control unit 140 recognizes a driver command from the input operation (S220).
  • The control unit 140 processes the driver command recognized in step S220 (S230) and feeds back the processing result through the AR display 160 and the speaker 170 (S240).
  • control unit 140 may interwork with the vehicle network side through the vehicle network interface 150.
  • FIG. 5 is a view provided for a detailed description of an HMI method capable of intuitive interaction for a vehicle AR display according to another embodiment of the present invention.
  • touch/gesture manipulation is input from the driver through the steering wheel touch pads 110 and 120 installed on the steering wheel and the console touch pad 130 installed around the gear box (S310, S320, and S330).
  • In addition, voice from the user's speech may be input through a microphone (S340, S350).
  • a driver command is recognized (S360), and then the recognized driver command is processed, which may be performed with reference to vehicle/operation information (S370, S380).
  • driver command processing result is fed back through the AR display 160 and the speaker 170 (S390, S400).
  • A touch pad is attached around the steering wheel and the gear box so that the driver can control and configure the AR display more quickly and accurately while driving.
  • The two touch sensors installed on the steering wheel control the screen by recognizing gestures made with the driver's thumbs, and the touch sensor installed around the gear box enables a wider variety of gestures through more detailed manipulation.
  • the technical idea of the present invention can be applied to a computer-readable recording medium containing a computer program that performs functions of the apparatus and method according to the present embodiment. Further, the technical idea according to various embodiments of the present invention may be implemented in the form of computer-readable codes recorded on a computer-readable recording medium.
  • the computer-readable recording medium can be any data storage device that can be read by a computer and stores data.
  • the computer-readable recording medium can be a ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical disk, hard disk drive, and the like.
  • computer-readable codes or programs stored on a computer-readable recording medium may be transmitted through a network connected between computers.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Provided are an intuitive interaction method and system for enabling quick and accurate control of an AR display for a vehicle while driving, by means of touchpads attached around a steering wheel and a gear stick. An HMI system for a vehicle according to an embodiment of the present invention comprises: a first touchpad provided in a first area of the vehicle; a second touchpad provided in a second area of the vehicle; and a controller for recognizing and processing a driver command on the basis of at least one of a first driver manipulation input through the first touchpad and a second driver manipulation input through the second touchpad. Thereby, services of a vehicle AR display can be controlled using the fingers while driving, so that various application services on the vehicle AR display can be easily executed and manipulated.

Description

Intuitive interaction method and system for augmented reality display for vehicles

The present invention relates to human machine interface (HMI) technology, and more particularly, to an intuitive interaction method and system for an augmented reality display for a vehicle.

Currently, the most common form of interaction with a vehicle display is buttons mounted in the vehicle, as shown in FIG. 1. Recently, HMIs based on infrared-camera spatial gestures have also appeared.

However, these approaches are very limited for driving and controlling content in the new type of augmented reality environment.

In the case of the button method, the driver has to find and press the corresponding button while driving, which can interfere with driving. In addition, there are limits to controlling the various functions and services augmented on the display.

Spatial gestures are difficult to apply to vehicles because of infrared interference and frequent errors in outdoor environments.

Accordingly, a more natural and intuitive method for controlling an AR (augmented reality) display and its content is needed.

The present invention has been devised to solve the above problems. An object of the present invention is to provide an intuitive interaction method and system that, by attaching touchpads around the steering wheel and gear stick, enables quick and accurate control of a vehicle AR display while driving.

According to an embodiment of the present invention for achieving the above object, a vehicle HMI system includes a first touch pad provided in a first area of a vehicle; a second touch pad provided in a second area of the vehicle; and a control unit that recognizes and processes a driver command based on at least one of a first driver operation input through the first touch pad and a second driver operation input through the second touch pad.

The first touch pad may be installed on the steering wheel.

The first touch pad may include a plurality of touch pads.

The plurality of touch pads may be installed at positions where the driver can perform touch operations with a thumb while holding the steering wheel.

The second touch pad may be installed around the gear box.

The control unit may display the result of executing the driver command on the display in augmented reality.

The augmented reality display may be the front windshield of the vehicle.

Meanwhile, according to another embodiment of the present invention, a vehicle HMI method includes receiving a first driver operation through a first touch pad provided in a first area of a vehicle; receiving a second driver operation through a second touch pad provided in a second area of the vehicle; recognizing a driver command based on at least one of the input first driver operation and the input second driver operation; and processing the recognized driver command.

As described above, according to embodiments of the present invention, services of a vehicle AR display can be controlled with the fingers while driving, so that various application services on the vehicle AR display can be easily executed and operated.
FIG. 1 is a diagram illustrating a conventional button-based HMI system;

FIG. 2 is a diagram provided for a conceptual description of a vehicle HMI system according to an embodiment of the present invention;

FIG. 3 is an internal block diagram of a vehicle HMI system according to an embodiment of the present invention;

FIG. 4 is a flowchart provided for the description of a vehicle HMI method according to another embodiment of the present invention;

FIG. 5 is a diagram provided for a detailed explanation of a vehicle HMI method according to another embodiment of the present invention.
Hereinafter, the present invention will be described in more detail with reference to the drawings.

FIG. 2 is a diagram provided for a conceptual description of a human machine interface (HMI) system capable of intuitive interaction for a vehicle AR display according to an embodiment of the present invention.

Existing vehicle HMI systems have used touch input on the display screen. However, now that the front windshield of a vehicle can function as a transparent display providing AR (augmented reality) content, the existing method cannot be applied.

An HMI based on spatial gesture recognition suffers many driver-gesture recognition errors in outdoor environments because of infrared interference and frequent errors.

Accordingly, in an embodiment of the present invention, touch pads 110 and 120 are provided on the steering wheel and a touch pad 130 is also provided around the gear box, enabling quick and accurate command input through driver gestures while driving.

Steering wheel touch pad-1 (110) is installed at a position where the driver can perform touch operations with the thumb of the left hand while holding the steering wheel, and steering wheel touch pad-2 (120) is installed at a position where the driver can perform touch operations with the thumb of the right hand while holding the steering wheel.

Steering wheel touch pad-1 (110) and steering wheel touch pad-2 (120) are used by the driver to input simple operations quickly and easily while driving, for example, moving a highlight up, down, left, and right, and making selections.

The console touch pad 130 installed around the gear box allows a passenger as well as the driver to input commands through finger gestures; unlike the steering wheel touch pads 110 and 120, it also supports handwriting-style character input and multi-touch input.
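The simple gesture vocabulary described above (highlight moves up, down, left, and right, plus selection on the steering-wheel pads) can be sketched as a minimal stroke classifier. This is only an illustrative sketch: the function name, the swipe threshold, and the command labels are assumptions for clarity, not part of the patent.

```python
# Hypothetical sketch: classify a single thumb stroke on a steering-wheel
# touch pad into one of the simple commands described above.
# Thresholds and command names are illustrative assumptions.

def classify_thumb_stroke(dx, dy, tap=False, swipe_threshold=10.0):
    """Map one stroke (displacement dx, dy in pad units) to a command."""
    if tap:
        return "SELECT"                      # a tap selects the highlight
    if abs(dx) < swipe_threshold and abs(dy) < swipe_threshold:
        return None                          # too small to count as a swipe
    if abs(dx) >= abs(dy):                   # dominant axis decides direction
        return "RIGHT" if dx > 0 else "LEFT"
    return "DOWN" if dy > 0 else "UP"
```

A real implementation would additionally debounce strokes and distinguish the left and right pads, but the dominant-axis rule above captures the four-way highlight navigation the embodiment describes.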
Gesture recognition through the touch pads 110, 120, and 130 is subject to little external interference and error, so recognition errors can be minimized.

In addition, the front windshield of the vehicle can function as the AR display 160, so the recognition and processing results of driver commands can be fed back through the AR display 160 and the speaker 170.

In particular, the AR display 160 enables driver feedback in the same way as a head-up display (HUD).

FIG. 3 is an internal block diagram of an HMI system capable of intuitive interaction for a vehicle AR display according to an embodiment of the present invention.

As shown in FIG. 3, the vehicle HMI system according to an embodiment of the present invention includes steering wheel touchpad-1 (110), steering wheel touchpad-2 (120), the console touchpad 130, the control unit 140, the vehicle network interface 150, the AR display 160, and the speaker 170.

Steering wheel touchpad-1 (110), steering wheel touchpad-2 (120), and the console touchpad 130 are means for receiving the driver's touch operations; since they have been described in detail above, further description is omitted.

The control unit 140 recognizes a driver command from the driver operations input through the touch pads 110, 120, and 130, processes the recognized command, and feeds back the processing result through the AR display 160 and the speaker 170.

When vehicle/driving information is required for command processing, the control unit 140 receives the corresponding information through the vehicle network interface 150. When vehicle control is required for command processing, the control unit 140 transmits a control request to the vehicle network side through the vehicle network interface 150.
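The two directions of interworking just described, querying vehicle/driving information and sending control requests through the vehicle network interface, can be sketched as follows. The class names, command strings, and the in-memory network stub are hypothetical illustrations under stated assumptions, not the patent's actual implementation.

```python
# Hypothetical sketch of the control unit 140 and vehicle network
# interface 150 interworking. All names and commands are illustrative.

class VehicleNetworkInterface:
    """Stub standing in for the vehicle network side."""
    def __init__(self, vehicle_info):
        self._info = vehicle_info            # e.g. {"speed_kmh": 80}
        self.control_requests = []           # requests forwarded to the vehicle

    def query(self, key):
        return self._info.get(key)           # vehicle/driving information

    def request_control(self, request):
        self.control_requests.append(request)

class ControlUnit:
    def __init__(self, network):
        self.network = network

    def process(self, command):
        """Process a recognized driver command; return feedback payloads."""
        if command == "SHOW_SPEED":          # needs vehicle/driving info
            speed = self.network.query("speed_kmh")
            return {"ar_display": f"Speed: {speed} km/h", "speaker": "beep"}
        if command == "CRUISE_ON":           # needs vehicle control
            self.network.request_control("cruise_control_on")
            return {"ar_display": "Cruise control on", "speaker": "chime"}
        return {"ar_display": "Unknown command", "speaker": "error"}
```

The returned dictionary models the dual feedback path of the embodiment: one payload rendered on the AR display 160 and one played through the speaker 170.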
FIG. 4 is a flowchart provided to explain an HMI method capable of intuitive interaction for a vehicle AR display according to another embodiment of the present invention.

As illustrated in FIG. 4, when a driver operation is input through the touch pads 110, 120, and 130 (S210-Y), the control unit 140 recognizes a driver command from the input operation (S220).

The control unit 140 then processes the driver command recognized in step S220 (S230) and feeds back the processing result through the AR display 160 and the speaker 170 (S240). For command processing, the control unit 140 may interwork with the vehicle network side through the vehicle network interface 150.

FIG. 5 is a diagram provided for a detailed description of an HMI method capable of intuitive interaction for a vehicle AR display according to another embodiment of the present invention.

As illustrated in FIG. 5, touch/gesture operations are input from the driver through the steering wheel touch pads 110 and 120 installed on the steering wheel and the console touch pad 130 installed around the gear box (S310, S320, S330).

In addition, voice from the user's speech may be input through a microphone (S340, S350).

The input touch/gesture and voice are analyzed to recognize a driver command (S360). The recognized driver command is then processed, which may be performed with reference to vehicle/driving information (S370, S380).
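The fusion step S360, combining a touch/gesture input with an optional voice utterance into a single driver command, can be sketched minimally as below. The fusion rule (voice refines a concurrent gesture) and all labels are illustrative assumptions; the patent does not specify a particular fusion policy.

```python
# Hypothetical sketch of step S360: fuse touch/gesture and voice inputs
# into one driver command. The combination rule is an assumption.

def recognize_command(gesture=None, voice=None):
    """Return a single command from the available input modalities."""
    if gesture and voice:
        # e.g. touching a point of interest while saying "navigate"
        return f"{voice.upper()}:{gesture.upper()}"
    if gesture:
        return gesture.upper()               # touch/gesture alone
    if voice:
        return voice.upper()                 # voice alone
    return None                              # no input, no command
```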
Next, the driver command processing result is fed back through the AR display 160 and the speaker 170 (S390, S400).
So far, an intuitive interaction method and system for a vehicle augmented reality display has been described in detail through preferred embodiments.

In the above embodiments, touch pads are attached around the steering wheel and the gear box so that the driver can control and configure the AR display more quickly and accurately while driving.

The two touch sensors installed on the steering wheel control the screen by recognizing gestures made with the driver's thumbs, and the touch sensor installed around the gear box enables a wider variety of gestures through more detailed manipulation.

According to the above embodiments, services of the vehicle AR display can be controlled with the fingers while driving, various application services on the vehicle AR display can be easily executed and adjusted, and predefined applications can be launched based on finger gestures.
Meanwhile, the technical idea of the present invention can also be applied to a computer-readable recording medium containing a computer program that performs the functions of the apparatus and method according to the present embodiments. The technical idea according to various embodiments of the present invention may also be implemented in the form of computer-readable code recorded on a computer-readable recording medium. The computer-readable recording medium can be any data storage device that can be read by a computer and can store data. For example, the computer-readable recording medium can be a ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical disk, hard disk drive, and the like. Computer-readable code or programs stored on a computer-readable recording medium may also be transmitted over a network connecting computers.
Although preferred embodiments of the present invention have been shown and described above, the present invention is not limited to the specific embodiments described. Various modifications may be made by those of ordinary skill in the art to which the invention pertains without departing from the gist of the invention as claimed, and such modifications should not be understood separately from the technical idea or outlook of the present invention.

Claims (8)

  1. A vehicle HMI system comprising:
    a first touch pad provided in a first area of a vehicle;
    a second touch pad provided in a second area of the vehicle; and
    a control unit configured to recognize and process a driver command based on at least one of a first driver operation input through the first touch pad and a second driver operation input through the second touch pad.
  2. The vehicle HMI system of claim 1, wherein
    the first touch pad
    is installed on a steering wheel.
  3. The vehicle HMI system of claim 2, wherein
    the first touch pad
    comprises a plurality of touch pads.
  4. The vehicle HMI system of claim 3, wherein
    the plurality of touch pads
    are installed at positions where the driver can operate them by touch with a thumb while holding the steering wheel.
  5. The vehicle HMI system of claim 1, wherein
    the second touch pad
    is installed around a gear box.
  6. The vehicle HMI system of claim 1, wherein
    the control unit
    displays a result of executing the driver command on a display in augmented reality.
  7. The vehicle HMI system of claim 6, wherein
    the augmented reality display
    is a windshield of the vehicle.
  8. A vehicle HMI method comprising:
    receiving a first driver operation through a first touch pad provided in a first area of a vehicle;
    receiving a second driver operation through a second touch pad provided in a second area of the vehicle;
    recognizing a driver command based on at least one of the input first driver operation and the input second driver operation; and
    processing the recognized driver command.
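The method of claim 8 can be sketched as a small control flow: accept input from either touch pad, recognize a command from whichever inputs are present, then process it. This sketch is illustrative only; the class name, the string-based command model, and the combination rule are assumptions not specified in the claims.

```python
from typing import Optional

class VehicleHmi:
    """Illustrative sketch of the claimed two-touch-pad HMI method."""

    def __init__(self) -> None:
        # Stand-in for the AR display: processed commands are recorded here.
        self.executed = []

    def recognize(self, pad1_input: Optional[str],
                  pad2_input: Optional[str]) -> Optional[str]:
        """Recognize a driver command from at least one of the two pad inputs."""
        if pad1_input and pad2_input:
            # Both pads active: treat the pair as one combined command
            # (hypothetical combination rule).
            return f"combined:{pad1_input}+{pad2_input}"
        return pad1_input or pad2_input

    def process(self, command: Optional[str]) -> None:
        """Process the recognized command; a real system would render the
        execution result on the AR windshield display (claims 6-7)."""
        if command is not None:
            self.executed.append(command)

hmi = VehicleHmi()
hmi.process(hmi.recognize("volume_up", None))   # first pad only
hmi.process(hmi.recognize("swipe", "pinch"))    # both pads
```

The "at least one of" wording in the claim is what motivates `recognize` accepting either input alone as well as both together.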
PCT/KR2018/014824 2018-11-28 2018-11-28 Intuitive interaction method and system for augmented reality display for vehicle WO2020111308A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020180149324A KR20200068089A (en) 2018-11-28 2018-11-28 Intuitive Interaction Method and System for Automotive AR Display
KR10-2018-0149324 2018-11-28

Publications (1)

Publication Number Publication Date
WO2020111308A1 true WO2020111308A1 (en) 2020-06-04

Family

ID=70853318

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/014824 WO2020111308A1 (en) 2018-11-28 2018-11-28 Intuitive interaction method and system for augmented reality display for vehicle

Country Status (2)

Country Link
KR (1) KR20200068089A (en)
WO (1) WO2020111308A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102021207639A1 (en) 2021-07-16 2023-01-19 Volkswagen Aktiengesellschaft Method for operating a motor vehicle and motor vehicle

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7126583B1 (en) * 1999-12-15 2006-10-24 Automotive Technologies International, Inc. Interactive vehicle display system
US20110115606A1 (en) * 2009-11-16 2011-05-19 Broadcom Corporation Touch sensitive panel in vehicle for user identification
US20130106693A1 (en) * 2011-10-31 2013-05-02 Honda Motor Co., Ltd. Vehicle input apparatus
KR20150062317A (en) * 2013-11-29 2015-06-08 현대모비스 주식회사 Multimedia apparatus of an autombile
KR101561917B1 (en) * 2014-04-10 2015-11-20 엘지전자 주식회사 Vehicle control apparatus and method thereof


Also Published As

Publication number Publication date
KR20200068089A (en) 2020-06-15

Similar Documents

Publication Publication Date Title
WO2015099293A1 (en) Device and method for displaying user interface of virtual input device based on motion recognition
WO2012033345A1 (en) Motion control touch screen method and apparatus
CN110737374B (en) Operation method and electronic equipment
WO2012033361A2 (en) Method and apparatus for selecting region on screen of mobile device
WO2013141464A1 (en) Method of controlling touch-based input
WO2013048054A1 (en) Method of operating gesture based communication channel and portable terminal system for supporting the same
WO2012115307A1 (en) An apparatus and method for inputting command using gesture
WO2015122559A1 (en) Display device and method of controlling therefor
EP2673701A2 (en) Information display apparatus having at least two touch screens and information display method thereof
WO2015174597A1 (en) Voice-controllable image display device and voice control method for image display device
CN109933199B (en) Control method and device based on gestures, electronic equipment and storage medium
JP7331245B2 (en) Target position adjustment method and electronic device
EP2630561A1 (en) Method and apparatus for recognizing a gesture in a display
WO2019033655A1 (en) Method and apparatus for mistouch prevention, device, and storage medium
WO2013100727A1 (en) Display apparatus and image representation method using the same
WO2011005002A2 (en) Auxiliary touch monitor system which enables independent touch input, and independent touch input method for an auxiliary touch monitor
WO2020111308A1 (en) Intuitive interaction method and system for augmented reality display for vehicle
CN105183217A (en) Touch display device and touch display method
WO2012118271A1 (en) Method and device for controlling contents using touch, recording medium therefor, and user terminal having same
WO2020249103A1 (en) Control method and apparatus for intelligent interactive device
WO2016072610A1 (en) Recognition method and recognition device
WO2019203591A1 (en) High efficiency input apparatus and method for virtual reality and augmented reality
CN110908568B (en) Control method and device for virtual object
WO2013191332A1 (en) System for providing smartphone information in vehicle
CN110162257A (en) Multiconductor touch control method, device, equipment and computer readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18941457

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18941457

Country of ref document: EP

Kind code of ref document: A1