KR20150056322A - Apparatus for controlling menu of head-up display and method thereof - Google Patents


Info

Publication number
KR20150056322A
KR20150056322A (application KR1020130139203A)
Authority
KR
South Korea
Prior art keywords
hand
menu
movement
display
head
Prior art date
Application number
KR1020130139203A
Other languages
Korean (ko)
Inventor
박만수
오형석
Original Assignee
현대오트론 주식회사 (Hyundai Autron Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 현대오트론 주식회사 (Hyundai Autron Co., Ltd.)
Priority to KR1020130139203A priority Critical patent/KR20150056322A/en
Publication of KR20150056322A publication Critical patent/KR20150056322A/en

Classifications

    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to an apparatus for controlling a menu of a head-up display and a method thereof. The apparatus includes a head-up display; a sensor unit that detects at least one of a hand gesture, the shape of a hand movement, and its direction; and a control unit that processes the sensor information detected by the sensor unit to recognize at least one of the hand gesture, the shape of the hand movement, and its direction, and executes a menu corresponding to the resulting motion information.

Description

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to an apparatus and method for controlling a menu of a head-up display, and more particularly, to a menu control apparatus and method for a head-up display that allow the driver to operate the menu of the head-up display through hand gestures performed in the space inside the vehicle.

Recently, vehicles equipped with a head-up display (HUD) have become increasingly common.

The head-up display (HUD) is a device that presents driving information, such as instrument and navigation information, within a range that does not force the driver's eyes away from the forward view while driving a vehicle or piloting an aircraft. The first head-up displays were developed to provide flight information to aircraft, particularly fighter aircraft, during flight, and the vehicle head-up display was later developed on the same principle.

Assume the driver is traveling at about 100 km/h, and that moving the eyes down to the instrument panel and then refixing them on the road takes about 2 seconds; during that glance the vehicle covers roughly 55 meters, so there is always the possibility of an accident. One way to reduce this risk is to develop a head-up display for the vehicle, which presents the instrument panel information (speed, mileage, RPM, etc.) and navigation information on the driver's forward line of sight, so that the driver can take in the vehicle's important driving information and route information without taking his or her eyes off the road.

Such a head-up display improves the driver's convenience by presenting dashboard information or navigation information, but the conventional head-up display merely displays that information and offers no way to operate it.

Therefore, to display other information through the head-up display or to change the operation mode of the head-up display, the driver must press a menu button on the corresponding device (e.g., the navigation unit). This forces the driver's gaze toward the location of the menu button, increasing the possibility of an accident.

For example, when a navigation route is displayed through the head-up display, the menu shown on the head-up display cannot itself be operated to change the destination; the driver had to operate the menu by pressing it directly with a finger. The driver therefore cannot maintain the forward view, and as the gaze turns toward the controls, the possibility of an accident increases.

The background art of the present invention is disclosed in Korean Patent Laid-Open Publication No. 10-2009-0076242 (published on July 13, 2009; "Head-up display device for a vehicle and method for controlling the operation thereof").

SUMMARY OF THE INVENTION

The present invention has been made to solve the above-described problems, and its object is to provide a menu control apparatus and method for a head-up display that allow the menu of the head-up display to be operated, changed, or manipulated through the driver's hand gestures in space.

An apparatus for controlling a menu of a head-up display according to an aspect of the present invention includes: a head-up display; a sensor unit for detecting at least one of a hand gesture, the shape of a hand movement, and its direction; and a controller for processing the sensor information detected through the sensor unit to recognize at least one of the hand gesture, the shape of the hand movement, and its direction, and for executing a menu corresponding to the motion information according to the recognition result.
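The claimed sense-recognize-execute pipeline can be sketched roughly as follows. All names here (`SensorSample`, `recognize`, `MENU_ACTIONS`) and the mapping from motion direction to menu action are illustrative assumptions, not details from the patent:

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    """One sensor reading of hand displacement (hypothetical model)."""
    dx: float  # lateral displacement since the previous sample
    dy: float  # longitudinal displacement
    dz: float  # forward/backward displacement (depth)

def recognize(samples):
    """Reduce a stream of samples to a coarse motion direction label."""
    sx = sum(s.dx for s in samples)
    sy = sum(s.dy for s in samples)
    sz = sum(s.dz for s in samples)
    # Pick the axis with the largest net displacement.
    axis, value = max((("lateral", sx), ("longitudinal", sy), ("depth", sz)),
                      key=lambda p: abs(p[1]))
    sign = "+" if value >= 0 else "-"
    return axis + sign

# Invented mapping: which recognized motion triggers which menu action.
MENU_ACTIONS = {
    "lateral+": "next_menu",
    "lateral-": "previous_menu",
    "depth+": "select_menu",   # pushing toward the windshield selects
}

def execute(samples):
    """Run one pass of the sense -> recognize -> execute pipeline."""
    return MENU_ACTIONS.get(recognize(samples), "ignore")
```

A motion dominated by rightward lateral movement would then return `next_menu`, while an unmapped direction is ignored.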

In the present invention, the controller may output a preset menu on the screen of the head-up display according to the recognition result; output motion information corresponding to at least one of the recognized hand gesture, movement shape, and direction so that it overlaps the output menu; recognize the selection of a menu item according to at least one of a predetermined specific hand gesture, movement shape, and direction; and, according to at least one of another predetermined specific gesture, movement shape, and direction, move the displayed position of the selected menu item or execute it.

In the present invention, the sensor unit may include at least one of a sensor for detecting a driver's hand or finger moving in the lateral direction, a sensor for detecting movement in the longitudinal direction, and a sensor for detecting movement in the forward/backward direction.

In the present invention, the controller recognizes the speed and distance of the hand or finger moving in the lateral and longitudinal directions, recognizes the speed, distance, and depth of the hand or finger moving forward and backward, and recognizes the shape formed by the continuous movement of the hand or finger.
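The speed, distance, and depth quantities described here could be computed from sampled hand positions roughly as below; the sampling model and function name are assumptions for illustration, not the patent's implementation:

```python
import math

def motion_metrics(positions, dt):
    """positions: list of (x, y, z) hand positions sampled every dt seconds.

    Returns the distance travelled in the lateral/longitudinal (x/y) plane,
    the average planar speed, and the net depth change (forward/backward
    movement toward the display).
    """
    distance = 0.0
    for (x0, y0, _), (x1, y1, _) in zip(positions, positions[1:]):
        distance += math.hypot(x1 - x0, y1 - y0)
    elapsed = dt * (len(positions) - 1)
    speed = distance / elapsed if elapsed > 0 else 0.0
    depth = positions[-1][2] - positions[0][2]
    return distance, speed, depth
```

For instance, a hand moving 5 units across the plane over one second yields a planar speed of 5 units/s, independent of any simultaneous push toward the display.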

In the present invention, the motion information is displayed as an icon or substitute image on the screen of the head-up display, corresponding to the movement of the hand or finger, when the driver's hand moves into a predetermined space or area.

In the present invention, when the driver's hand moves into a preset space or area, the controller displays an icon or substitute image on the screen of the head-up display to indicate that the menu is ready for selection and execution by hand gesture.

In the present invention, the controller may cancel the indication that the menu is ready for selection and execution when the driver's hand leaves the predetermined space or area.
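The ready/cancel behavior of the two preceding paragraphs amounts to a containment test against a preset interaction region; the region bounds and class names below are invented for illustration:

```python
# Hypothetical preset interaction region, in normalized cabin coordinates.
REGION = {"x": (0.2, 0.8), "y": (0.1, 0.6), "z": (0.0, 0.4)}

def in_region(pos, region=REGION):
    """True if the (x, y, z) hand position lies inside the preset region."""
    x, y, z = pos
    return (region["x"][0] <= x <= region["x"][1] and
            region["y"][0] <= y <= region["y"][1] and
            region["z"][0] <= z <= region["z"][1])

class ReadyState:
    """Tracks whether gesture menu control is armed: entering the region
    shows the ready icon, leaving it cancels the indication."""
    def __init__(self):
        self.ready = False

    def update(self, hand_pos):
        self.ready = in_region(hand_pos)
        return "show_icon" if self.ready else "hide_icon"
```

Each new sensor reading simply re-evaluates containment, so lowering the hand out of the region cancels the ready indication on the next update.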

A method of controlling a menu of a head-up display according to another aspect of the present invention includes the steps of: a controller processing sensor information detected through a sensor unit to recognize at least one of a hand gesture, the shape of a hand movement, and its direction; and executing a menu corresponding to the motion information according to the recognition result.

In the present invention, the method further includes the steps of: outputting a preset menu on the screen of the head-up display according to the recognition result; and outputting motion information corresponding to the recognized hand gesture, movement shape, and direction so that it overlaps the output menu. The step of executing the menu includes: recognizing a selection of the corresponding menu item according to at least one of a preset specific hand gesture, movement shape, and direction; and moving the displayed position of the selected menu item, or executing it, according to at least one of another predetermined specific gesture, movement shape, and direction.

In the present invention, in the step of recognizing at least one of the hand gesture, the movement shape, and the direction, the controller recognizes the speed and distance of the hand or finger moving in the lateral and longitudinal directions, recognizes the speed, distance, and depth of the hand or finger moving forward or backward, and recognizes the shape formed by the continuous movement of the hand or finger.

In the present invention, the motion information is displayed as an icon or substitute image on the screen of the head-up display, corresponding to the movement of the hand or finger, when the driver's hand moves into a predetermined space or area.

In the present invention, in the step of outputting the motion information, when the driver's hand moves into a predetermined space or area, the controller displays an icon or substitute image on the screen of the head-up display to indicate that menu selection and execution by hand gesture are ready, and cancels that indication when the driver's hand leaves the predetermined space or area.

According to the present invention, the menu of a head-up display can be operated, changed, or manipulated through the driver's hand gestures in space, so the driver's gaze can stay fixed ahead and the menu can be operated quickly and easily while minimizing diversion of the driver's gaze.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is an exemplary diagram showing the schematic configuration of a menu control apparatus of a head-up display according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating a menu control method using a head-up display according to an embodiment of the present invention.
FIG. 3 is an exemplary view explaining a method of controlling a menu using hand gestures in FIG. 2.

Hereinafter, an embodiment of an apparatus and method for controlling a menu of a head-up display according to the present invention will be described with reference to the accompanying drawings.

In this process, the thicknesses of the lines and the sizes of the components shown in the drawings may be exaggerated for clarity and convenience of explanation. In addition, the terms described below are defined in consideration of the functions of the present invention and may vary depending on the intention or custom of the user or operator. Therefore, definitions of these terms should be made based on the contents throughout this specification.

FIG. 1 is a diagram illustrating a schematic configuration of a menu control apparatus for a head-up display according to an embodiment of the present invention.

Referring to FIG. 1, a menu control apparatus 100 of a head-up display according to an embodiment of the present invention includes a sensor unit 110, a control unit 120, and a head-up display 130.

The sensor unit 110 is installed inside the vehicle and detects movement (or hand movement) of the driver's hand (or finger).

At this time, an auxiliary means (not shown) for detecting the driver's hand motion (e.g., a ring, a glove, a bracelet, or a thimble) may be worn so that the sensor unit 110 can accurately detect the driver's hand motion. Of course, a device (e.g., an electronic device) that makes sensor detection easier may additionally be attached to this auxiliary means (not shown).

The sensor unit 110 includes at least one of a sensor for detecting movement of the driver's hand (or finger) in the lateral direction, a sensor for detecting movement in the longitudinal direction, and a sensor for detecting movement in the forward/backward direction.

For example, the sensor unit 110 may include at least one of an infrared sensor, an ultrasonic sensor, an image sensor, a high-frequency sensor, and a proximity sensor. Of course, the present invention is not limited to the above-described sensors and may further include any sensor better suited to detecting a hand motion or its direction.

The sensor unit 110 may be installed on either side, the top/bottom, and the rear of the vehicle interior. For example, an image sensor (not shown) may be installed at the rear of the interior so that at least one of the menu, the position of the hand (or finger), and its movement (motion) can be detected.

A camera (not shown) may additionally be provided to more easily detect at least one of the position, movement, and shape of the hand (or finger).

The control unit 120 processes the signals sensed by the sensor unit 110 to detect the driver's hand motion information (e.g., motion direction, motion shape). That is, when the driver's hand (or finger) moves continuously to form a specific shape, the controller 120 recognizes that movement. An image-processing algorithm may be used to recognize the motion shape of the hand gesture.

The control unit 120 recognizes a hand motion (movement direction, movement shape, etc.) by receiving a detection signal of the sensor unit 110 that detects movement of the hand (or finger).

The control unit 120 includes a lateral motion recognition unit 121, a longitudinal motion recognition unit 122, a fore/aft motion recognition unit 123, a motion shape recognition unit 124, and a motion information output unit 125. Each of these units 121 to 125 may perform its function separately, or all the functions may be performed integrally by the control unit 120. For convenience, the functions of the respective units 121 to 125 are described separately in this embodiment.

The lateral motion recognition unit 121 receives the detection signal of the sensor unit 110 and recognizes the speed and distance of the hand (or finger) moving in the lateral direction.

The longitudinal motion recognition unit 122 receives the detection signal of the sensor unit 110 and recognizes the speed and distance of the hand (or finger) moving in the longitudinal direction.

The fore/aft motion recognition unit 123 receives the detection signal of the sensor unit 110 and recognizes the speed, distance, and depth of the hand (or finger) moving forward or backward. Selection and execution of a menu can be detected through this fore/aft movement.

The motion shape recognition unit 124 detects the shape formed as the hand (or finger) moves continuously. For example, it detects motion shapes such as drawing a circle in space with the hand (or finger), clenching a fist, extending fingers into a scissors shape from an open palm, or flipping the hand from palm to back.
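One way to detect a shape such as the circle mentioned above is to test whether the sampled 2-D trajectory stays near a constant radius around its centroid and closes on itself. The tolerance and the whole classifier below are illustrative assumptions, not the patent's algorithm (the text notes an image-processing algorithm may be used):

```python
import math

def is_circle(points, tol=0.2):
    """True if the 2-D trajectory `points` stays near a common radius
    around its centroid and the path closes on itself."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    radii = [math.hypot(x - cx, y - cy) for x, y in points]
    mean_r = sum(radii) / len(radii)
    # All samples within tol of the mean radius -> roughly round.
    round_enough = all(abs(r - mean_r) <= tol * mean_r for r in radii)
    # First and last samples close together -> the path closes.
    closes = math.hypot(points[0][0] - points[-1][0],
                        points[0][1] - points[-1][1]) <= tol * mean_r
    return round_enough and closes
```

A straight swipe fails the constant-radius test, so it would not be confused with a circle gesture.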

The motion information of the recognized hand (or finger) is output to the head-up display 130 through the motion information output unit 125.

That is, as shown in FIG. 3, the motion information output unit 125 overlaps the motion information with the information (e.g., navigation, audio) displayed on the windshield through the head-up display 130. Accordingly, the moving position of the driver's hand (or finger) can be checked in real time, making it easy to select the desired menu.

The head-up display 130 overlaps the first information (e.g., navigation, audio, or vehicle instrument panel information) with the second information (e.g., the hand gesture and the direction/speed of the hand movement) and displays them as augmented reality. At this time, the second information may be displayed as an icon, a hand shape, or, with added depth information, a three-dimensional shape. The display method of the second information may vary according to the type of the first information already displayed.
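Overlaying the second information on the first can be thought of as mapping the hand position into HUD screen coordinates and hit-testing it against the displayed menu items. The coordinate ranges, resolution, and menu layout below are assumptions for illustration:

```python
# Normalized hand-sensing area (x, y) and an assumed HUD pixel resolution.
HAND_SPACE = ((0.0, 1.0), (0.0, 1.0))
SCREEN = (800, 480)

# Hypothetical menu layout: name -> (x0, y0, x1, y1) rectangle on screen.
MENU_ITEMS = {
    "navigation": (0, 0, 400, 240),
    "audio": (400, 0, 800, 240),
}

def to_screen(hand_xy):
    """Map a normalized hand position into HUD pixel coordinates."""
    (xmin, xmax), (ymin, ymax) = HAND_SPACE
    x = (hand_xy[0] - xmin) / (xmax - xmin) * SCREEN[0]
    y = (hand_xy[1] - ymin) / (ymax - ymin) * SCREEN[1]
    return x, y

def hovered_item(hand_xy):
    """Return the menu item under the overlaid icon, or None."""
    px, py = to_screen(hand_xy)
    for name, (x0, y0, x1, y1) in MENU_ITEMS.items():
        if x0 <= px < x1 and y0 <= py < y1:
            return name
    return None
```

The icon drawn at `to_screen(hand_xy)` tracks the hand in real time, and `hovered_item` identifies which displayed menu the driver is pointing at.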

FIG. 2 is a flowchart explaining a menu control method using a head-up display according to an embodiment of the present invention, and FIG. 3 is an exemplary view explaining a method of controlling a menu using hand gestures. Hereinafter, a method of controlling the menu displayed on the head-up display using hand gestures in the space inside the vehicle will be described with reference to FIGS. 2 and 3.

As shown in FIG. 2, the controller 120 receives the information detected by the sensor unit 110 mounted in the vehicle and recognizes the hand gesture and the movement direction of the hand (or finger) (S101).

The sensor unit 110 includes at least one sensor (e.g., an infrared sensor, an ultrasonic sensor, an image sensor, a high-frequency sensor, or a proximity sensor) for detecting the hand gesture and its movement direction.

The control unit 120 outputs a menu corresponding to the predetermined hand movement (or movement shape) and the movement direction of the hand (or finger) (S102).

For example, the control unit 120 displays a menu for executing a corresponding device (e.g., navigation, audio, hands-free, or an option menu for vehicle control) on the windshield of the vehicle through the head-up display 130, according to the hand gesture (or movement shape) and the movement direction of the hand (or finger).

The controller 120 outputs motion information according to the recognized hand motion (or motion shape) and the direction of movement of the hand (or finger) (S103).

For example, the control unit 120 displays a predetermined icon (or substitute image) on the screen of the head-up display 130 shown in the windshield, according to the hand gesture (or movement shape) and the movement direction of the hand (or finger).

Accordingly, the driver moves the hand (or finger) in a predetermined hand movement (or movement shape) to manipulate the desired menu while viewing the icon (or the alternative image) displayed on the screen.

The controller 120 recognizes the selection of the corresponding menu according to a preset specific hand gesture (or movement shape) or movement direction of the hand (or finger) performed at a specific position (i.e., a position corresponding to the menu over which the icon is displayed) (S104).

The control unit 120 then drags (or moves) the displayed position of the selected menu according to the specific hand gesture (or movement shape) or the movement direction of the hand (or finger), and displays it (S105).

As shown in FIG. 3, the menu control operation by hand gesture described above begins when the driver, who has been holding the steering wheel, moves a hand into a predetermined space (area) where the hand gesture (or movement) can be detected. For example, when the driver's hand moves into the predetermined space (area), the controller 120 displays an icon on the screen of the head-up display 130 to indicate that menu selection and execution by hand gesture are ready; when the hand leaves that space (e.g., when the hand is lowered or grips the steering wheel again), the indication (e.g., the displayed icon) is canceled.
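The overall interaction of FIGS. 2 and 3 (hand enters the area, the ready icon is shown, a gesture selects and drags, leaving the area cancels) can be summarized as a small state machine; the state and event names are invented for illustration:

```python
class MenuGestureFSM:
    """Sketch of the gesture interaction: IDLE until the hand enters the
    preset area, ARMED while the ready icon is shown, DRAGGING after a
    select gesture; leaving the area cancels back to IDLE."""

    def __init__(self):
        self.state = "IDLE"

    def handle(self, event):
        transitions = {
            ("IDLE", "hand_entered_area"): "ARMED",
            ("ARMED", "select_gesture"): "DRAGGING",
            ("DRAGGING", "release_gesture"): "ARMED",
            ("ARMED", "hand_left_area"): "IDLE",
            ("DRAGGING", "hand_left_area"): "IDLE",
        }
        # Unknown (state, event) pairs leave the state unchanged.
        self.state = transitions.get((self.state, event), self.state)
        return self.state
```

Lowering the hand (the `hand_left_area` event) cancels from any active state, matching the cancellation behavior described above.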

While the present invention has been particularly shown and described with reference to exemplary embodiments shown in the drawings, these are merely illustrative, and those skilled in the art will understand that various modifications and equivalent embodiments are possible. Accordingly, the technical scope of the present invention should be defined by the following claims.

110: sensor unit
120: control unit
121: lateral motion recognition unit
122: longitudinal motion recognition unit
123: fore/aft motion recognition unit
124: motion shape recognition unit
125: motion information output unit
130: head-up display

Claims (12)

1. An apparatus for controlling a menu of a head-up display, comprising:
a head-up display;
a sensor unit for detecting at least one of a hand gesture, the shape of a hand movement, and its direction; and
a controller for recognizing at least one of the hand gesture, the shape of the hand movement, and its direction by processing sensor information detected through the sensor unit, and for executing a menu corresponding to the motion information according to the recognition result.
2. The apparatus of claim 1, wherein the controller outputs a preset menu on the screen of the head-up display according to the recognition result, outputs the motion information according to at least one of the recognized hand gesture, movement shape, and direction so that it overlaps the output menu, recognizes the selection of a menu item according to at least one of a preset specific hand gesture, movement shape, and direction, and moves the displayed position of the selected menu item or executes it according to at least one of another predetermined specific gesture, movement shape, and direction.
3. The apparatus of claim 1, wherein the sensor unit includes at least one of a sensor for detecting movement of the driver's hand or finger in the lateral direction, a sensor for detecting movement in the longitudinal direction, and a sensor for detecting movement in the forward/backward direction.
4. The apparatus of claim 1, wherein the controller recognizes the speed and distance of the hand or finger moving in the lateral and longitudinal directions, recognizes the speed, distance, and depth of the hand or finger moving forward or backward, and recognizes the shape formed by the continuous movement of the hand or finger.
5. The apparatus of claim 1, wherein the motion information is an icon or substitute image displayed on the screen of the head-up display corresponding to the movement of the hand or finger, and is displayed when the driver's hand moves into a predetermined space or area.
6. The apparatus of claim 1, wherein the controller displays an icon or substitute image on the screen of the head-up display to indicate that menu selection and execution by hand gesture are ready when the driver's hand moves into a predetermined space or area.
7. The apparatus of claim 6, wherein the controller cancels the indication that the menu is ready for selection and execution when the driver's hand leaves the preset space or area.
8. A method of controlling a menu of a head-up display, comprising the steps of: a controller processing sensor information detected through a sensor unit to recognize at least one of a hand gesture, the shape of a hand movement, and its direction; and executing a menu corresponding to the motion information according to the recognition result.
9. The method of claim 8, further comprising the steps of: outputting a preset menu on a screen of the head-up display according to the recognition result; and outputting motion information corresponding to the recognized hand gesture, movement shape, and direction so that it overlaps the output menu, wherein the step of executing the menu includes: recognizing a selection of the menu according to at least one of a preset specific hand gesture, movement shape, and direction; and moving the displayed position of the selected menu, or executing it, according to at least one of a predetermined specific gesture, movement shape, and direction.
10. The method of claim 8, wherein, in the step of recognizing at least one of the hand gesture, the movement shape, and the direction, the controller recognizes the speed and distance of the hand or finger moving in the lateral and longitudinal directions, recognizes the speed, distance, and depth of the hand or finger moving forward or backward, and recognizes the shape formed by the continuous movement of the hand or finger.
11. The method of claim 8, wherein the motion information is displayed as an icon or substitute image on the screen of the head-up display corresponding to the movement of the hand or finger when the driver's hand moves into a predetermined space or area.
12. The method of claim 9, wherein, in the step of outputting the motion information, the controller displays an icon or substitute image on the screen of the head-up display to indicate that menu selection and execution by hand gesture are ready when the driver's hand moves into a predetermined space or area, and cancels the indication when the hand leaves the preset space or area.
KR1020130139203A 2013-11-15 2013-11-15 Apparatus for controlling menu of head-up display and method thereof KR20150056322A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130139203A KR20150056322A (en) 2013-11-15 2013-11-15 Apparatus for controlling menu of head-up display and method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020130139203A KR20150056322A (en) 2013-11-15 2013-11-15 Apparatus for controlling menu of head-up display and method thereof

Publications (1)

Publication Number Publication Date
KR20150056322A true KR20150056322A (en) 2015-05-26

Family

Family ID: 53391659

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130139203A KR20150056322A (en) 2013-11-15 2013-11-15 Apparatus for controlling menu of head-up display and method thereof

Country Status (1)

Country Link
KR (1) KR20150056322A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9802622B2 (en) 2015-07-03 2017-10-31 Lg Electronics Inc. Driver assistance apparatus and vehicle including the same
KR20180025378A (en) * 2016-08-30 2018-03-09 자동차부품연구원 System and method for provision of head up display information based on driver's gesture


Similar Documents

Publication Publication Date Title
JP5572761B2 (en) Vehicle control device
US20160132126A1 (en) System for information transmission in a motor vehicle
CN106427571B (en) Interactive operating device and method for operating same
JP4351599B2 (en) Input device
US9244527B2 (en) System, components and methodologies for gaze dependent gesture input control
JP5563153B2 (en) Operating device
US10162424B2 (en) Operation apparatus for vehicle
US20110063425A1 (en) Vehicle Operator Control Input Assistance
KR20140070798A (en) A display apparatus capable of moving image and the method thereof
US20130285949A1 (en) Control apparatus and computer program product for processing touchpad signals
US10482667B2 (en) Display unit and method of controlling the display unit
WO2018116565A1 (en) Information display device for vehicle and information display program for vehicle
JP5136948B2 (en) Vehicle control device
KR20150056322A (en) Apparatus for controlling menu of head-up display and method thereof
JP2016149094A (en) Vehicle information processing apparatus
KR20130076215A (en) Device for alarming image change of vehicle
JP6819539B2 (en) Gesture input device
RU2410259C2 (en) Interactive control device and method of operating interactive control device
EP4122739B1 (en) A user interface for a vehicle, a vehicle, and a method for operating a user interface for a vehicle
JP6236211B2 (en) Display device for transportation equipment
JP6984979B2 (en) In-vehicle display system and display setting change method
JP2023066255A (en) Onboard display control system and onboard display control method
JP2011107900A (en) Input display device
JP6371589B2 (en) In-vehicle system, line-of-sight input reception method, and computer program
JP2014191818A (en) Operation support system, operation support method and computer program

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
AMND Amendment
E902 Notification of reason for refusal
AMND Amendment
E601 Decision to refuse application
AMND Amendment