KR20140079025A - Method for providing a user interface using leg gesture recognition in a vehicle - Google Patents

Method for providing a user interface using leg gesture recognition in a vehicle

Info

Publication number
KR20140079025A
Authority
KR
South Korea
Prior art keywords
leg
gesture
occupant
passenger
vehicle
Prior art date
Application number
KR1020120148481A
Other languages
Korean (ko)
Inventor
김성운
Original Assignee
현대자동차주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 현대자동차주식회사
Priority to KR1020120148481A
Publication of KR20140079025A

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00Conjoint control of vehicle sub-units of different type or different function
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method for operating a user interface using leg gesture recognition in a vehicle according to an embodiment of the present invention comprises the steps of: processing image information of an occupant received from a gesture sensing unit; recognizing a leg gesture of the occupant from the received image information; and generating a control signal for controlling an electronic device within the vehicle corresponding to the recognized leg gesture. The present invention thus enables the occupant to control various electronic devices within the vehicle using leg movements while keeping both hands on the steering wheel, thereby enhancing the occupant's convenience and safety.

Description

TECHNICAL FIELD [0001] The present invention relates to a method of operating a user interface using leg gesture recognition in a vehicle.

The present invention relates to a method of operating a user interface for controlling devices inside a vehicle by recognizing a leg gesture of a vehicle occupant.

In recent years, various electronic devices have been mounted inside vehicles for the convenience of occupants. In addition to the conventional radio- and air-conditioner-centered electronics, devices such as navigation systems, hands-free mobile phone systems, and DMB receivers are now installed.

Electronic devices in existing vehicles provide a user interface through designated buttons, and in recent years the use of touch screens has also become commonplace. These devices must be operated by direct hand contact from the occupant. In other words, because such operation relies on the occupant's gaze and hand movements, safe driving can be expected only if a sufficient field of view and a proper driving posture are maintained.

Even a system that controls vehicle functions without requiring the occupant's gaze still fundamentally relies on the occupant's hand gestures. Although such operation is possible with one hand, keeping both hands on the steering wheel is preferable for operation accuracy, concentration, and safe driving.

Therefore, it is necessary to develop an interface technology that provides convenience to the user without interfering with driving.

It is an object of the present invention to provide a method by which an occupant can control various electronic devices inside a vehicle through leg movements while keeping both hands on the steering wheel.

A method of operating a user interface using in-vehicle leg gesture recognition according to an exemplary embodiment of the present invention includes the steps of: processing image information of an occupant received from a gesture sensing unit; recognizing a leg gesture of the occupant from the image information; and generating a control signal for driving an in-vehicle electronic device corresponding to the recognized leg gesture.

The processing of the image information received from the gesture sensing unit may include receiving image information in which an occupant is detected from the gesture sensing unit, extracting the occupant's body by removing the surrounding background from the image information, and tracing the image of the leg portion based on the extracted body to obtain the occupant's leg position or leg trajectory.

The step of recognizing the occupant's leg gesture may include determining whether a leg gesture matching the occupant's leg position or leg trajectory is stored, and, if so, recognizing the stored leg gesture as the occupant's leg gesture.

The method may further include determining whether there is a request from the occupant to start using the leg gesture recognition function before receiving the image information, and determining whether there is a request from the occupant to end its use.

The method may further include adjusting the deployment of a knee airbag according to the leg position or leg trajectory.

A user interface operating system using in-vehicle leg gesture recognition according to an embodiment of the present invention includes a gesture sensing unit for sensing movement and detecting an occupant, a leg gesture database for storing recognizable leg gesture information, and an electronic control unit for generating a control signal for driving an in-vehicle electronic device based on the image information in which the occupant is detected.

The system may further include an input unit for receiving request signals from the occupant to start and end the leg gesture recognition function, and an output unit for displaying the electronic device operations performed by the electronic control unit.

With the method of operating the user interface using leg gesture recognition according to the embodiment of the present invention, the system can recognize the occupant's leg gesture and operate the corresponding in-vehicle electronic device.

Accordingly, the occupant can control various electronic devices in the vehicle with leg gestures while keeping both hands on the steering wheel, thereby improving the occupant's convenience and safety.

FIG. 1 is a schematic diagram illustrating a user interface system using leg gesture recognition in a vehicle according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating a method of operating a user interface using leg gesture recognition in a vehicle according to an exemplary embodiment of the present invention.
FIG. 3 illustrates examples of leg gestures according to an embodiment of the present invention.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.

The present invention may be embodied in many different forms and is not limited to the embodiments described herein.

In addition, since the components shown in the drawings are arbitrarily shown for convenience of explanation, the present invention is not necessarily limited to those shown in the drawings.

FIG. 1 schematically shows a user interface device using leg gesture recognition according to an embodiment of the present invention.

Referring to FIG. 1, a user interface (UI) apparatus using leg gesture recognition includes an input unit 100, a gesture sensing unit 110, a leg gesture database 120, an electronic control unit (ECU) 130, and an output unit 140.

The input unit 100 includes a button, a touch screen, and the like. Although input is described here as generating an input signal through a button or a touch screen, an occupant's voice or gesture may also be used as an input method. Through the input unit 100, the occupant can request to start or stop using the leg gesture recognition function.

The gesture sensing unit 110 includes a camera, an optical sensor, an ultrasonic sensor, an image sensor, and the like. It may be positioned either below or above the steering wheel, and it senses movement to detect the occupant.

The leg gesture database 120 stores occupant leg gestures. Preferably, the stored leg gestures are preset for generally defined gestures. For example, preset leg gestures may be as shown in FIG. 3, and various other leg gestures are possible. That is, the leg gesture database 120 preferably stores various leg gesture information such as a toe-tapping gesture that taps the floor with the forefoot, a knee left/right gesture that rocks the knee from side to side while the sole of the foot remains fixed on the floor, and a knee up/down gesture.
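As a rough illustration of how such a store might be organized, the sketch below keeps each gesture as a short reference trajectory of 2-D leg positions bound to a device action. This is a minimal sketch under assumed representations: LegGesture, LegGestureDatabase, and the sample trajectories are hypothetical names, not the patent's implementation.

```python
# Minimal sketch of a leg-gesture template store. The class and field names
# and the sample trajectories are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class LegGesture:
    name: str                            # e.g. "toe_tap", "knee_left_right"
    template: List[Tuple[float, float]]  # reference trajectory samples
    action: str                          # device action bound to this gesture

@dataclass
class LegGestureDatabase:
    gestures: Dict[str, LegGesture] = field(default_factory=dict)

    def register(self, gesture: LegGesture) -> None:
        # Holds both factory-preset gestures and occupant-registered ones.
        self.gestures[gesture.name] = gesture

# Preset gestures corresponding to the examples in the text.
db = LegGestureDatabase()
db.register(LegGesture("toe_tap", [(0, 0), (0, -1), (0, 0)], "music_play_pause"))
db.register(LegGesture("knee_left_right", [(0, 0), (-1, 0), (1, 0), (0, 0)], "music_next"))
db.register(LegGesture("knee_up_down", [(0, 0), (0, 1), (0, 0)], "music_off"))
```

Occupant-registered gestures (described in the next paragraph) would go through the same register() path, which is why the store keeps them in a single dictionary keyed by name.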

In addition, the leg gesture database 120 stores leg gestures registered by the occupant. The occupant can select various leg positions or leg trajectories and store them as leg gestures. That is, it is preferable that each occupant directly registers his or her own leg gestures so that the system can recognize each occupant's distinct leg positions or trajectories without error.

The electronic control unit 130 performs image processing based on the image information in which the gesture sensing unit 110 has detected the occupant. That is, the occupant's body is extracted by removing the surrounding background from the image of the occupant. The extracted body is then classified into head, torso, arms, and legs based on a body model. The modeled leg-portion image is traced to obtain the occupant's leg position or leg trajectory.
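A minimal sketch of this stage, assuming a fixed camera view and a pre-captured background frame, is shown below. The patent's model-based classification into head/torso/arms/legs is reduced here to a crude "lower half of the occupant mask" heuristic, so this is a stand-in rather than the patent's method.

```python
# Sketch of background removal and leg tracking. The ROI heuristic and
# threshold value are assumptions; a production system would fit a body model.
from typing import Optional, Tuple

import cv2
import numpy as np

def extract_occupant(frame: np.ndarray, background: np.ndarray) -> np.ndarray:
    """Remove the static cabin background, keeping only the occupant."""
    diff = cv2.absdiff(frame, background)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 30, 255, cv2.THRESH_BINARY)
    return mask

def leg_position(mask: np.ndarray) -> Optional[Tuple[float, float]]:
    """Estimate the leg position as the centroid of the lower occupant region."""
    h = mask.shape[0]
    leg_region = mask[h // 2:, :]  # crude stand-in for model-based limb labeling
    m = cv2.moments(leg_region)
    if m["m00"] == 0:              # no occupant pixels found in the leg region
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"] + h // 2)
```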

The electronic control unit 130 also determines whether a leg gesture matching the obtained leg position or leg trajectory is stored in the leg gesture database 120. If a matching leg gesture is stored, the electronic control unit 130 recognizes it as the occupant's leg gesture. If no matching leg gesture is stored, the occupant's leg trajectory information is treated as an unidentifiable leg gesture and is not recognized.
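The patent does not name a matching algorithm, so the sketch below illustrates one plausible approach: resample the observed trajectory to a fixed length, pick the nearest stored template, and reject the input as unidentifiable when nothing is close enough. The distance metric and threshold are assumptions.

```python
# Sketch of template matching for leg trajectories. Metric and threshold
# are illustrative assumptions, not the patent's algorithm.
import numpy as np

def resample(traj, n: int = 16) -> np.ndarray:
    """Resample a trajectory of (x, y) points to n points by nearest index."""
    traj = np.asarray(traj, dtype=float)
    idx = np.linspace(0, len(traj) - 1, n)
    return np.array([traj[int(round(i))] for i in idx])

def match_gesture(trajectory, templates: dict, threshold: float = 0.5):
    """Return the name of the closest stored gesture, or None if none match."""
    obs = resample(trajectory)
    best_name, best_dist = None, float("inf")
    for name, template in templates.items():
        dist = float(np.mean(np.linalg.norm(obs - resample(template), axis=1)))
        if dist < best_dist:
            best_name, best_dist = name, dist
    # An unmatched trajectory is treated as unidentifiable and not recognized.
    return best_name if best_dist <= threshold else None
```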

In addition, the electronic control unit 130 determines whether to use the leg gesture recognition function according to the input signal of the input unit 100. That is, the electronic control unit 130 determines whether there is a request from the occupant to start or end the leg gesture recognition function, and controls the gesture sensing unit 110 accordingly.

Also, the electronic control unit 130 generates a control signal for driving the in-vehicle electronic device corresponding to the recognized leg gesture. The list of corresponding in-vehicle electronic devices is preferably stored in a database. The electronic control unit 130 generates a control signal for driving the selected in-vehicle electronic device to provide the operation desired by the occupant. For example, selectable in-vehicle electronic device operations include answering or rejecting a mobile phone call while driving, playing, pausing, or muting music, increasing or decreasing the volume, increasing or decreasing the airflow, and adjusting the sun visor.
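A control-signal generator of this kind can be illustrated as a simple dispatch table. The device/command pairs below follow the examples in the text, but the (device, command) signal format itself is an assumption.

```python
# Sketch of control-signal generation as a dispatch table. The signal format
# and the "knee_raise" binding are hypothetical.
from typing import Optional, Tuple

CONTROL_SIGNALS = {
    "toe_tap":         ("audio", "play_pause"),
    "knee_left_right": ("audio", "next_track"),
    "knee_up_down":    ("audio", "power_off"),
    "knee_raise":      ("phone", "answer_call"),  # hypothetical extra binding
}

def generate_control_signal(gesture_name: str) -> Optional[Tuple[str, str]]:
    """Map a recognized gesture to a (device, command) control signal."""
    # Unidentified gestures produce no control signal.
    return CONTROL_SIGNALS.get(gesture_name)
```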

Further, the electronic control unit 130 generates a control signal for adjusting the deployment of the knee airbag based on the leg position or leg trajectory. That is, information such as the position of the occupant's knee (from the leg position) and whether the occupant has folded the leg (from the leg trajectory) can be obtained. Therefore, by adjusting the deployment of the knee airbag accordingly, occupant safety can be ensured.
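The patent only states that deployment is adjusted to the leg position, so the sketch below invents a simple distance-based rule to show what such an adjustment could look like; the thresholds and force levels are hypothetical.

```python
# Hypothetical distance-based rule for knee-airbag adjustment. Zones and
# deployment levels are invented for illustration.
def airbag_deployment_level(knee_to_dash_m: float, leg_folded: bool) -> str:
    """Choose a deployment level from the knee-to-dashboard gap and leg pose."""
    if leg_folded or knee_to_dash_m < 0.15:  # knee very close: soften deployment
        return "low_force"
    if knee_to_dash_m < 0.30:
        return "medium_force"
    return "full_force"
```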

The output unit 140 includes a touch screen and a speaker, as well as the in-vehicle electronic devices being operated, such as a mobile phone, a music player, an air conditioner, and a sun visor. The operation status of the in-vehicle electronic device is also output to the screen.

FIG. 2 is a flowchart illustrating a method of operating a user interface using leg gesture recognition according to an embodiment of the present invention.

Referring to FIG. 2, the occupant requests the use of the leg gesture recognition function through the input unit 100 (S100). For example, the occupant may request the leg gesture recognition function to operate the audio system.

When the occupant requests the use of the leg gesture recognition function, the occupant is detected through the gesture sensing unit 110. The electronic control unit 130 then receives the image information of the detected occupant from the gesture sensing unit 110 (S110).

In step S120, the electronic control unit 130 removes the surrounding background from the image information of the occupant and extracts the occupant's body. The electronic control unit 130 then classifies the extracted body into torso, arms, and legs based on a body model and tracks only the leg-portion image (S130).

The electronic control unit 130 then traces the leg image to acquire the leg position and leg trajectory (S140).

Thereafter, the electronic control unit 130 determines whether a leg gesture matching the extracted leg position or leg trajectory is stored in the leg gesture database 120 (S150).

If a matching leg gesture is stored in the leg gesture database 120, the electronic control unit 130 recognizes the stored leg gesture as the occupant's leg gesture (S160).

Thereafter, the electronic control unit 130 generates a control signal for driving the in-vehicle electronic device corresponding to the recognized leg gesture (S170). That is, the electronic control unit 130 generates a control signal for driving the selected in-vehicle electronic device and provides the operation desired by the occupant. For example, audio system operations corresponding to preset leg gestures can take the form shown in FIG. 3. Referring to FIG. 3, the occupant can play or pause the music through the toe-tapping gesture, select music through the knee left/right gesture, and turn off the music through the knee up/down gesture.

At this time, the operation result of the in-vehicle electronic device is displayed through the output unit 140, and the leg gesture recognition function is terminated when the occupant requests its termination (S180).
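Putting steps S100 through S180 together, a minimal event loop might look like the sketch below. It reuses the hypothetical helpers from the earlier sketches (extract_occupant, leg_position, match_gesture, generate_control_signal) and assumes simple camera, display, and end-request interfaces, none of which come from the patent.

```python
# Minimal sketch of the S100-S180 flow. camera.read(), display.show(), and
# end_requested() are hypothetical interfaces; templates maps gesture names
# to reference trajectories as in the matching sketch.
def run_leg_gesture_ui(camera, background, templates, display, end_requested):
    trajectory = []
    while not end_requested():                       # S180: occupant may end the session
        frame = camera.read()                        # S110: image of the detected occupant
        mask = extract_occupant(frame, background)   # S120: remove the background
        pos = leg_position(mask)                     # S130/S140: track the leg position
        if pos is None:
            continue
        trajectory.append(pos)
        name = match_gesture(trajectory, templates)  # S150/S160: match against the database
        if name is not None:
            signal = generate_control_signal(name)   # S170: drive the selected device
            display.show(signal)                     # result shown on the output unit 140
            trajectory.clear()
```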

While the present invention has been described in connection with what is presently considered to be practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments, but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

100: Input unit
110: Gesture sensing unit
120: Leg gesture database
130: Electronic control unit
140: Output unit

Claims (7)

A method of operating a user interface using leg gesture recognition in a vehicle, the method comprising:
processing image information of an occupant received from a gesture sensing unit;
recognizing a leg gesture of the occupant from the image information of the occupant; and
generating a control signal for driving an in-vehicle electronic device corresponding to the recognized leg gesture.
The method according to claim 1,
wherein the processing of the image information of the occupant received from the gesture sensing unit comprises:
receiving the image information in which the occupant is detected from the gesture sensing unit;
extracting the occupant's body by removing the surrounding background from the image information of the occupant; and
tracing an image of a leg portion based on the extracted body to obtain a leg position or a leg trajectory of the occupant.
3. The method of claim 2,
wherein the recognizing of the occupant's leg gesture comprises:
determining whether a leg gesture matching the leg position or the leg trajectory of the occupant is stored; and
recognizing the stored leg gesture as the leg gesture of the occupant.
The method according to claim 1, further comprising:
determining whether there is a request from the occupant to use the leg gesture recognition function before receiving the image information in which the occupant is detected; and
determining whether there is a request from the occupant to end the use of the leg gesture recognition function.
The method of claim 3,
further comprising adjusting the deployment of a knee airbag according to the leg position or the leg trajectory.
A system for operating a user interface using leg gesture recognition in a vehicle, the system comprising:
a gesture sensing unit for sensing movement and detecting an occupant;
a leg gesture database storing recognizable leg gesture information; and
an electronic control unit for generating a control signal for driving an in-vehicle electronic device based on the image information of the occupant received from the gesture sensing unit,
wherein the electronic control unit executes a series of instructions for performing the method of any one of claims 1 to 5.
The system according to claim 6, further comprising:
an input unit for receiving request signals from the occupant to start and end the leg gesture recognition function; and
an output unit for displaying the in-vehicle electronic device operations performed by the electronic control unit.
KR1020120148481A 2012-12-18 2012-12-18 Method for providing a user interface using leg gesture recognition in a vehicle KR20140079025A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120148481A KR20140079025A (en) 2012-12-18 2012-12-18 Method for providing a user interface using leg gesture recognition in a vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020120148481A KR20140079025A (en) 2012-12-18 2012-12-18 Method for providing a user interface using leg gesture recognition in a vehicle

Publications (1)

Publication Number Publication Date
KR20140079025A true KR20140079025A (en) 2014-06-26

Family

ID=51130349

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120148481A KR20140079025A (en) 2012-12-18 2012-12-18 Method for providing a user interface using leg gesture recognition in a vehicle

Country Status (1)

Country Link
KR (1) KR20140079025A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150106781A (en) * 2014-03-12 2015-09-22 삼성메디슨 주식회사 The method and apparatus for controlling function of an ultrasound apparatus based on foot motion
KR101865362B1 (en) * 2016-12-08 2018-06-07 동명대학교산학협력단 Control system and method for mixed reality using foot gesture
CN108509023A (en) * 2017-02-27 2018-09-07 华为技术有限公司 The control method and device of onboard system
US10884510B2 (en) 2017-02-27 2021-01-05 Huawei Technologies Co., Ltd. Method and apparatus for controlling onboard system
US11275449B2 (en) 2017-02-27 2022-03-15 Huawei Technoloies Co., Ltd. Method and apparatus for controlling onboard system
US11847265B2 (en) 2017-02-27 2023-12-19 Huawei Technologies Co., Ltd. Method and apparatus for controlling onboard system
KR20230020310A (en) * 2021-08-03 2023-02-10 재단법인대구경북과학기술원 Electronic apparatus and Method for recognizing a user gesture thereof

Similar Documents

Publication Publication Date Title
KR101459441B1 (en) System and method for providing a user interface using finger start points shape recognition in a vehicle
JP5261554B2 (en) Human-machine interface for vehicles based on fingertip pointing and gestures
KR101550604B1 (en) Vehicle operation device
KR101490908B1 (en) System and method for providing a user interface using hand shape trace recognition in a vehicle
EP3237256B1 (en) Controlling a vehicle
KR101459445B1 (en) System and method for providing a user interface using wrist angle in a vehicle
KR101438615B1 (en) System and method for providing a user interface using 2 dimension camera in a vehicle
US10732760B2 (en) Vehicle and method for controlling the vehicle
KR101537936B1 (en) Vehicle and control method for the same
US10474357B2 (en) Touch sensing display device and method of detecting user input from a driver side or passenger side in a motor vehicle
US10133357B2 (en) Apparatus for gesture recognition, vehicle including the same, and method for gesture recognition
US10095313B2 (en) Input device, vehicle having the input device, and method for controlling the vehicle
US10649587B2 (en) Terminal, for gesture recognition and operation command determination, vehicle having the same and method for controlling the same
US20160349850A1 (en) Detection device and gesture input device
US11023786B2 (en) Device control apparatus
JP2017090613A (en) Voice recognition control system
KR20140079025A (en) Method for providing a user interface using leg gesture recognition in a vehicle
CN116529125A (en) Method and apparatus for controlled hand-held steering wheel gesture interaction
JP5136948B2 (en) Vehicle control device
JP6414420B2 (en) In-vehicle device operation device
KR101500412B1 (en) Gesture recognize apparatus for vehicle
WO2017212569A1 (en) Onboard information processing device, onboard device, and onboard information processing method
KR20140077037A (en) System and method for providing tactile sensations based on gesture
KR20200085970A (en) Vehcle and control method thereof
KR20150056322A (en) Apparatus for controlling menu of head-up display and method thereof

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application