KR20140079025A - Method for providing a user interface using leg gesture recognition in a vehicle - Google Patents
- Publication number
- KR20140079025A (application number KR1020120148481A)
- Authority
- KR
- South Korea
- Prior art keywords
- leg
- gesture
- occupant
- passenger
- vehicle
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W10/00—Conjoint control of vehicle sub-units of different type or different function
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- User Interface Of Digital Computer (AREA)
Description
The present invention relates to a method of operating a user interface for controlling devices inside a vehicle by recognizing a leg gesture of a vehicle occupant.
In recent years, various electronic devices have been installed inside vehicles for the convenience of occupants. In addition to the conventional radio- and air-conditioner-centered equipment, electronic devices such as navigation systems, hands-free mobile phone systems, and DMB receivers are now commonly mounted.
Electronic devices in existing vehicles provide a user interface through dedicated buttons, and in recent years the use of touch screens has also become common. These devices must be operated by direct hand contact. In other words, because operation depends on the occupant's gaze and hand movements, safe driving can be expected only when a sufficient field of view and a proper driving posture are maintained.
Gesture-based systems that control vehicle functions without requiring the occupant's gaze still generally rely on hand gestures. Although a vehicle can be steered with one hand, keeping both hands on the steering wheel is preferable in terms of operating accuracy, concentration, and overall driving safety.
Therefore, it is necessary to develop an interface technology that offers convenience to the user without interfering with driving.
It is an object of the present invention to provide a method by which an occupant can control various electronic devices inside a vehicle with leg movements while keeping both hands on the steering wheel.
A method of operating a user interface using in-vehicle leg gesture recognition according to an exemplary embodiment of the present invention includes processing image information of an occupant received from a gesture detection unit, recognizing a leg gesture of the occupant from the image information, and generating a control signal for driving an in-vehicle electronic device corresponding to the recognized leg gesture.
The processing of the image information may include receiving image information in which the occupant is detected from the gesture sensing unit, removing the surrounding (background) image from the received image information to extract the occupant's body, and tracking the image of the leg portion of the extracted body to obtain the occupant's leg position or leg trajectory.
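The processing chain described here (background removal, body extraction, leg tracking) could be sketched roughly as follows. The function name, array shapes, threshold, and lower-region fraction are illustrative assumptions, not details given in the patent:

```python
import numpy as np

def extract_leg_trajectory(frames, background, lower_frac=0.5, thresh=30):
    """Hypothetical sketch of the patent's processing steps: remove the
    surrounding (background) image, keep the occupant's body, and track
    the lower (leg) region to build a per-frame trajectory."""
    trajectory = []
    for frame in frames:
        # Step 1: background removal -> occupant silhouette mask
        diff = np.abs(frame.astype(int) - background.astype(int))
        body_mask = diff > thresh
        # Step 2: restrict to the lower portion of the image (legs)
        h = frame.shape[0]
        leg_mask = body_mask.copy()
        leg_mask[: int(h * (1 - lower_frac)), :] = False
        # Step 3: leg position = centroid of the leg-region pixels
        ys, xs = np.nonzero(leg_mask)
        if len(xs):
            trajectory.append((float(xs.mean()), float(ys.mean())))
    return trajectory
```

A production system would use a proper background-subtraction and body-tracking pipeline; the sketch only shows how a leg position or trajectory can fall out of the three claimed steps.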
Recognizing the leg gesture of the occupant may include determining whether a leg gesture matching the occupant's leg position or leg trajectory is stored, and recognizing the stored leg gesture as the occupant's leg gesture.
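The matching step could be sketched as a nearest-neighbor comparison of resampled trajectories against the stored gesture database. The patent does not specify a matching algorithm, so the distance measure, resampling scheme, and threshold below are illustrative assumptions:

```python
import numpy as np

def resample(traj, n=16):
    """Linearly resample a 2-D trajectory to n points so trajectories
    of different lengths can be compared point-by-point."""
    traj = np.asarray(traj, dtype=float)
    idx = np.linspace(0, len(traj) - 1, n)
    lo = np.floor(idx).astype(int)
    hi = np.ceil(idx).astype(int)
    frac = (idx - lo)[:, None]
    return traj[lo] * (1 - frac) + traj[hi] * frac

def match_gesture(traj, database, max_dist=5.0):
    """Return the name of the stored leg gesture closest to `traj`,
    or None when no stored gesture is within `max_dist` (i.e. no
    matching leg gesture is stored).  `database` maps gesture names
    to reference trajectories."""
    best_name, best_dist = None, max_dist
    q = resample(traj)
    for name, ref in database.items():
        d = float(np.mean(np.linalg.norm(q - resample(ref), axis=1)))
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name
```

Returning `None` corresponds to the claimed decision "whether a matching leg gesture is stored"; only a successful match is passed on to control-signal generation.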
The method may further include determining whether there is a request from the occupant to use the leg gesture recognition function before the image information is input, and determining whether there is a request from the occupant to end the use of the function.
The method may further include adjusting the deployment of a knee airbag according to the leg position or leg trajectory.
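As an illustration of this airbag-adjustment idea, one could select a deployment mode from the tracked leg position. The threshold, distance measure, and mode names below are illustrative assumptions; the patent does not state how the deployment is adjusted:

```python
def knee_airbag_mode(leg_distance_m, low_threshold_m=0.25):
    """Hypothetical sketch: when the tracked leg position puts the
    knees close to the instrument panel (small distance), select a
    reduced-force deployment; otherwise use the normal mode."""
    if leg_distance_m < low_threshold_m:
        return "reduced_force"
    return "normal"
```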
A user interface operating system using in-vehicle leg gesture recognition according to an embodiment of the present invention includes a gesture sensing unit for detecting the occupant and sensing movement, a leg gesture database storing recognizable leg gesture information, and an electronic control unit for generating a control signal for driving an in-vehicle electronic device on the basis of the image information of the occupant received from the gesture sensing unit.
The system may further include an input unit for receiving request signals from the occupant to start and end the leg gesture recognition function, and an output unit for displaying the electronic-device operations performed by the electronic control unit.
In the method of operating a user interface using leg gesture recognition according to the embodiment of the present invention, the system can recognize the occupant's leg gesture and operate the corresponding in-vehicle electronic device.
Accordingly, the occupant can control various electronic devices in the vehicle with leg gestures while keeping both hands on the steering wheel, thereby improving the occupant's convenience and safety.
FIG. 1 is a schematic diagram illustrating a user interface system using leg gesture recognition in a vehicle according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating a method of operating a user interface using leg gesture recognition in a vehicle according to an exemplary embodiment of the present invention.
FIG. 3 is an illustration of an example of a leg gesture according to an embodiment of the present invention.
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.
The present invention may be embodied in many different forms and is not limited to the embodiments described herein.
In addition, since the components shown in the drawings are depicted arbitrarily for convenience of explanation, the present invention is not necessarily limited to what is illustrated.
FIG. 1 schematically shows a user interface device using leg gesture recognition according to an embodiment of the present invention.
Referring to FIG. 1, the user interface (UI) apparatus using leg gesture recognition includes an input unit 100, a gesture detection unit 110, a leg gesture database 120, an electronic control unit 130, and an output unit 140.
The input unit 100 receives the occupant's request signals to start and end the leg gesture recognition function.
The gesture detection unit 110 detects the occupant and senses movement, producing image information in which the occupant is detected.
The leg gesture database 120 stores recognizable leg gesture information.
The electronic control unit 130 processes the image information received from the gesture detection unit 110, recognizes the occupant's leg gesture, and generates a control signal for driving the corresponding in-vehicle electronic device.
The output unit 140 includes a touch screen and a speaker; the in-vehicle electronic devices operated include a mobile phone, a music device, an air conditioner, a sun visor, and the like. The operation contents of the in-vehicle electronic device are output to the screen.
FIG. 2 is a flowchart illustrating a method of operating a user interface using leg gesture recognition according to an embodiment of the present invention.
Referring to FIG. 2, the passenger requests the use of the leg gesture recognition function through the input unit 100 (S100). For example, an occupant may request the use of a leg gesture recognition function for audio system manipulation.
When the occupant requests use of the leg gesture recognition function, the occupant is detected through the gesture detection unit 110, and image information in which the occupant is detected is input to the electronic control unit 130.
In step S120, the electronic control unit 130 removes the surrounding image from the received image information and extracts the occupant's body.
Then, the electronic control unit 130 tracks the image of the leg portion of the extracted body to obtain the occupant's leg position or leg trajectory.
Thereafter, the electronic control unit 130 determines whether a leg gesture matching the obtained leg position or leg trajectory is stored in the leg gesture database 120.
If a matching leg gesture is stored in the leg gesture database 120, the electronic control unit 130 recognizes it as the occupant's leg gesture.
Thereafter, the electronic control unit 130 generates a control signal for driving the in-vehicle electronic device corresponding to the recognized leg gesture.
At this time, the operation result of the in-vehicle electronic device is displayed through the output unit 140, and the leg gesture recognition function is terminated when the occupant requests its termination (S180).
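The overall flow, from the function-use request (S100) through gesture recognition to the termination request (S180), could be sketched as a simple event loop. The event names, callback signatures, and state labels below are illustrative assumptions, not part of the patent:

```python
from enum import Enum, auto

class Step(Enum):
    IDLE = auto()     # waiting for a use request (S100)
    SENSING = auto()  # capturing and processing occupant images
    DONE = auto()     # use-end request received (S180)

def gesture_session(events, recognize, dispatch):
    """Hypothetical driver for the S100-S180 flow.  `events` is an
    iterable of ("use_request" | "frame" | "end_request", payload);
    `recognize` maps the accumulated frames to a gesture name or None;
    `dispatch` issues the control signal for a recognized gesture."""
    state, frames, actions = Step.IDLE, [], []
    for kind, payload in events:
        if state is Step.IDLE and kind == "use_request":
            state, frames = Step.SENSING, []
        elif state is Step.SENSING and kind == "frame":
            frames.append(payload)
            gesture = recognize(frames)
            if gesture is not None:      # a matching gesture is stored
                actions.append(dispatch(gesture))
                frames = []
        elif kind == "end_request":      # S180: stop recognition
            state = Step.DONE
            break
    return actions
```

Note that frames arriving before a use request are ignored, matching the claimed check that the function was explicitly requested before image input.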
While the present invention has been described in connection with what is presently considered to be practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments, but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.
100: Input unit
110: Gesture detection unit
120: Leg gesture database
130: Electronic control unit
140: Output unit
Claims (7)
1. A method of operating a user interface using in-vehicle leg gesture recognition, comprising:
processing image information of an occupant detected by a gesture detection unit;
recognizing a leg gesture of the occupant from the image information of the occupant; and
generating a control signal for driving an in-vehicle electronic device corresponding to the recognized leg gesture.
2. The method of claim 1, wherein processing the image information comprises:
receiving the image information in which the occupant is detected from the gesture detection unit;
removing a surrounding image from the image information to extract the occupant's body; and
tracking an image of a leg portion of the extracted body to obtain a leg position or leg trajectory of the occupant.
3. The method of claim 2, wherein recognizing the leg gesture of the occupant comprises:
determining whether a leg gesture matching the leg position or leg trajectory of the occupant is stored; and
recognizing the stored leg gesture as the leg gesture of the occupant.
4. The method of claim 1, further comprising:
determining whether there is a request from the occupant to use the leg gesture recognition function before the image information is input; and
determining whether there is a request from the occupant to end use of the leg gesture recognition function.
5. The method of claim 2, further comprising adjusting deployment of a knee airbag according to the leg position or leg trajectory.
6. A user interface operating system using in-vehicle leg gesture recognition, comprising:
a gesture sensing unit for detecting an occupant and sensing movement;
a leg gesture database storing recognizable leg gesture information; and
an electronic control unit for generating a control signal for driving an in-vehicle electronic device on the basis of image information of the occupant received from the gesture sensing unit,
wherein the electronic control unit executes a series of instructions for performing the method of any one of claims 1 to 5.
7. The system of claim 6, further comprising:
an input unit for receiving request signals from the occupant to start and end the leg gesture recognition function; and
an output unit for displaying the in-vehicle electronic device operations of the electronic control unit.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020120148481A KR20140079025A (en) | 2012-12-18 | 2012-12-18 | Method for providing a user interface using leg gesture recognition in a vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020120148481A KR20140079025A (en) | 2012-12-18 | 2012-12-18 | Method for providing a user interface using leg gesture recognition in a vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20140079025A (en) | 2014-06-26 |
Family
ID=51130349
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020120148481A KR20140079025A (en) | 2012-12-18 | 2012-12-18 | Method for providing a user interface using leg gesture recognition in a vehicle |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20140079025A (en) |
- 2012-12-18: application KR1020120148481A filed in KR; published as KR20140079025A; status: not_active (Application Discontinuation)
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20150106781A (en) * | 2014-03-12 | 2015-09-22 | 삼성메디슨 주식회사 | The method and apparatus for controlling function of an ultrasound apparatus based on foot motion |
KR101865362B1 (en) * | 2016-12-08 | 2018-06-07 | 동명대학교산학협력단 | Control system and method for mixed reality using foot gesture |
CN108509023A (en) * | 2017-02-27 | 2018-09-07 | 华为技术有限公司 | The control method and device of onboard system |
US10884510B2 (en) | 2017-02-27 | 2021-01-05 | Huawei Technologies Co., Ltd. | Method and apparatus for controlling onboard system |
US11275449B2 | 2017-02-27 | 2022-03-15 | Huawei Technologies Co., Ltd. | Method and apparatus for controlling onboard system |
US11847265B2 (en) | 2017-02-27 | 2023-12-19 | Huawei Technologies Co., Ltd. | Method and apparatus for controlling onboard system |
KR20230020310A (en) * | 2021-08-03 | 2023-02-10 | 재단법인대구경북과학기술원 | Electronic apparatus and Method for recognizing a user gesture thereof |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101459441B1 (en) | System and method for providing a user interface using finger start points shape recognition in a vehicle | |
JP5261554B2 (en) | Human-machine interface for vehicles based on fingertip pointing and gestures | |
KR101550604B1 (en) | Vehicle operation device | |
KR101490908B1 (en) | System and method for providing a user interface using hand shape trace recognition in a vehicle | |
EP3237256B1 (en) | Controlling a vehicle | |
KR101459445B1 (en) | System and method for providing a user interface using wrist angle in a vehicle | |
KR101438615B1 (en) | System and method for providing a user interface using 2 dimension camera in a vehicle | |
US10732760B2 (en) | Vehicle and method for controlling the vehicle | |
KR101537936B1 (en) | Vehicle and control method for the same | |
US10474357B2 (en) | Touch sensing display device and method of detecting user input from a driver side or passenger side in a motor vehicle | |
US10133357B2 (en) | Apparatus for gesture recognition, vehicle including the same, and method for gesture recognition | |
US10095313B2 (en) | Input device, vehicle having the input device, and method for controlling the vehicle | |
US10649587B2 (en) | Terminal, for gesture recognition and operation command determination, vehicle having the same and method for controlling the same | |
US20160349850A1 (en) | Detection device and gesture input device | |
US11023786B2 (en) | Device control apparatus | |
JP2017090613A (en) | Voice recognition control system | |
KR20140079025A (en) | Method for providing a user interface using leg gesture recognition in a vehicle | |
CN116529125A (en) | Method and apparatus for controlled hand-held steering wheel gesture interaction | |
JP5136948B2 (en) | Vehicle control device | |
JP6414420B2 (en) | In-vehicle device operation device | |
KR101500412B1 (en) | Gesture recognize apparatus for vehicle | |
WO2017212569A1 (en) | Onboard information processing device, onboard device, and onboard information processing method | |
KR20140077037A (en) | System and method for providing tactile sensations based on gesture | |
KR20200085970A (en) | Vehcle and control method thereof | |
KR20150056322A (en) | Apparatus for controlling menu of head-up display and method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E601 | Decision to refuse application |