KR20120054739A - User interface providing system using user identification - Google Patents

User interface providing system using user identification

Info

Publication number
KR20120054739A
Authority
KR
South Korea
Prior art keywords
driver
user interface
user
passenger
display unit
Prior art date
Application number
KR1020100116001A
Other languages
Korean (ko)
Inventor
김성운
Original Assignee
현대자동차주식회사 (Hyundai Motor Company)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 현대자동차주식회사 (Hyundai Motor Company)
Priority to KR1020100116001A
Publication of KR20120054739A

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3664 Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 User authentication
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Human Computer Interaction (AREA)
  • Remote Sensing (AREA)
  • Computer Security & Cryptography (AREA)
  • Automation & Control Theory (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

PURPOSE: A user interface providing system using user identification is provided to distinguish a driver from fellow passengers, thereby providing a different user interface according to the situation of each user. CONSTITUTION: A camera module (10) receives an image of a user's body motion. A display unit (20) outputs a user interface for a vehicle peripheral device. A controller (30) generates human body image coordinate information based on the image, distinguishes the driver from the fellow passengers using the generated coordinate information, and provides different user interfaces on the display unit for the driver and the fellow passengers.

Description

Vehicle user interface provision system through user recognition {USER INTERFACE PROVIDING SYSTEM USING USER IDENTIFICATION}

The present invention relates to a vehicle user interface providing system through user recognition, and more particularly, to a vehicle user interface providing system that recognizes a driver and a passenger and provides an optimized user interface suited to the particular situation of each user.

Cars may be equipped with a variety of peripheral devices, such as navigation, DMB, and AV systems. Recently, with the development of user interface technologies for such peripheral devices, various techniques have been developed to let a driver easily operate the peripherals while the vehicle is in motion.

However, despite these advances in user interface technology, the complex functions of the various peripheral devices still interfere with safe driving. In other words, the driver is able to operate every function of every peripheral device, or is placed in situations where such operation is possible, which disperses the driver's concentration and increases the risk of an accident.

Accordingly, the present invention has been made to solve the above problems, and has an object to provide a vehicle user interface providing system that distinguishes and recognizes a driver and a passenger and provides different user interfaces according to the particular situation of each user.

In order to achieve the above object, a vehicle user interface providing system through user recognition according to the present invention includes a camera module for receiving an image of the user's body motion; a display unit configured to output a user interface for a vehicle peripheral device; and a control unit that generates human body image coordinate information based on the image input by the camera module, distinguishes the driver and the passenger using the generated coordinate information, and provides different user interfaces on the display unit according to the distinguished driver and passenger.

In this case, the human body image coordinate information preferably includes coordinate information on at least one of the user's hand and arm.

The control unit may provide a user interface on which different menu icons are formed according to the distinguished driver and passenger.

The controller may differently activate the menu icons of the user interface output on the display unit according to the distinguished driver and passenger.

The controller may divide the display into a first area on the driver's side and a second area on the passenger's side, and provide a user interface menu icon in a corresponding area selected according to the distinguished driver and the passenger.

According to the vehicle user interface providing system through user recognition of the present invention, safe driving can be supported by distinguishing and recognizing the driver and the passenger and providing each user with a different, optimized user interface.

In particular, the user interface provided to the driver can be limited to simple, minimized functions so as not to disturb the driver's concentration while driving, thereby preventing the distraction caused by complicated operation of vehicle peripherals and helping to prevent accidents.

FIG. 1 is a view showing the configuration of a vehicle user interface providing system through user recognition according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating a vehicle user interface providing method through user recognition according to an embodiment of the present invention.
FIG. 3 is a schematic diagram illustrating a camera module for receiving an upper-body image of a driver and a passenger according to an exemplary embodiment of the present invention.
FIG. 4 is a schematic diagram illustrating a rear-projection touch-sensitive display unit and a camera module receiving an image of a driver's or passenger's hand or arm according to another embodiment of the present invention.
FIGS. 5 to 7 are schematic views showing how different user interfaces are provided on the display unit for the driver and the passenger according to embodiments of the present invention.

Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art may easily implement the present invention.

FIG. 1 illustrates the configuration of a vehicle user interface providing system according to an exemplary embodiment of the present invention. The vehicle user interface providing system through user recognition according to the present invention includes a camera module 10, a display unit 20, and a controller 30. The camera module 10 receives an image of a user's body motion and transmits it to the controller 30, and the display unit 20 outputs a user interface for the vehicle peripheral devices. The vehicle peripheral devices referred to in the present invention are the various convenience devices provided in a vehicle, such as navigation, DMB, and vehicle AV devices. The display unit 20 is controlled by the control unit 30 of the present invention and may output a user interface provided by the control unit 30.
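The split of responsibilities among the three blocks can be illustrated with a short Python sketch; it is not part of the patent, and every class, method, and icon name in it is a hypothetical stand-in.

```python
# Illustrative sketch only: class, method, and icon names are hypothetical
# and not taken from the patent text.

class CameraModule:
    """Camera module (10): supplies images of the user's body motion."""
    def __init__(self, frames):
        self._frames = iter(frames)

    def capture(self):
        return next(self._frames, None)


class DisplayUnit:
    """Display unit (20): outputs a user interface for vehicle peripherals."""
    def render(self, menu_icons):
        print("Displaying icons:", ", ".join(menu_icons))


class Controller:
    """Control unit (30): derives coordinates from the camera image,
    tells the driver from a passenger, and pushes a role-specific UI."""
    DRIVER_ICONS = ["navigation", "music"]          # assumed example sets
    PASSENGER_ICONS = ["DMB", "photo album"]

    def __init__(self, camera, display):
        self.camera, self.display = camera, display

    def classify(self, hand_x, display_center_x):
        # Stand-in for the coordinate analysis described later: a hand
        # entering from the driver's side is treated as the driver (LHD car).
        return "driver" if hand_x < display_center_x else "passenger"

    def update(self, hand_x, display_center_x=0.5):
        role = self.classify(hand_x, display_center_x)
        icons = self.DRIVER_ICONS if role == "driver" else self.PASSENGER_ICONS
        self.display.render(icons)


# Example: a hand detected on the left (driver's) side of the display.
Controller(CameraModule([]), DisplayUnit()).update(hand_x=0.2)
```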

The controller 30 of the present invention determines whether the user using the user interface is a driver or a passenger based on the image input by the camera module 10 and provides a differentiated user interface according to each user. That is, the controller may generate coordinate information about the human body image based on the image input to the camera module 10, and distinguish the driver and the passenger using the generated coordinate information.

FIG. 2 illustrates a vehicle user interface providing method through user recognition according to an embodiment of the present invention. First, when the user interface providing system of the present invention is started (S110), it receives the user's body motion through the camera module. If no body motion of the user is input, the system waits until one is input (S112).

When the user's body motion is input through the camera module, the controller of the present invention extracts a human body image from the input image and generates its coordinate information (S120). Using the generated coordinate information, the controller distinguishes the driver and the passenger (S130) and generates a different user interface according to the distinguished user (S140). The user interface generated by the control unit may then be output through the display unit of the present invention (S150).
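The S110-S150 flow might be expressed as a simple polling loop, roughly as below; camera, controller, and display are assumed to expose methods like those sketched above, which the patent itself does not specify.

```python
# Hypothetical rendering of steps S110-S150 as a polling loop.
def run_user_interface_system(camera, controller, display):
    while True:                                               # S110: system running
        frame = camera.capture()
        if frame is None:                                     # S112: wait for body motion
            continue
        coords = controller.extract_body_coordinates(frame)   # S120: coordinate information
        role = controller.classify_user(coords)               # S130: driver or passenger
        ui = controller.build_ui(role)                        # S140: role-specific UI
        display.render(ui)                                    # S150: output on display unit
```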

In the present invention, various methods may be used to distinguish the driver from the passenger based on the image input to the camera module. First, the present invention may include a camera module 10 that photographs the upper bodies of the driver 1 and the passenger 2, as shown in FIG. 3. The camera module 10 may include an illumination unit 12, and may receive an image containing the driver 1 and the passenger 2 and transmit it to the controller 30.

In this case, the controller 30 may remove a previously stored background image in order to separate the background from the human body image in the image received through the camera module 10. A contiguous image mass of a predetermined size in the extracted human body image may then be taken as the upper-body shape of a person, and when several people are present, the individual body image masses may be labeled. Next, the torso, the head, and both arms are modeled around their joints within each image mass.
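As an illustration only, background removal and labelling of body-sized image masses could be done along the following lines with OpenCV; the patent names no library, and the threshold and area values here are arbitrary assumptions.

```python
import cv2

def extract_body_masses(frame_bgr, background_bgr, min_area=5000):
    """Subtract a previously captured background image and label the
    remaining foreground masses, assumed to be upper bodies."""
    diff = cv2.absdiff(frame_bgr, background_bgr)             # remove stored background
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 30, 255, cv2.THRESH_BINARY)
    count, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    masses = []
    for i in range(1, count):                                 # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] >= min_area:            # keep person-sized masses only
            masses.append({"label": i, "centroid": tuple(centroids[i])})
    return masses                                             # one entry per detected person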

The modeled human body and joints of the present invention can be tracked as coordinates in space to detect motion, and the user's hand and arm can be distinguished and recognized by tracking them. The coordinates may then be analyzed to determine whether the corresponding hand or arm belongs to the body of the driver 1 or to the body of the passenger 2.
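A minimal sketch of this assignment step, assuming a left-hand-drive cabin and image x-coordinates growing to the right; the helper and its inputs are hypothetical, not the patent's implementation.

```python
# Assign a tracked hand to the driver or the passenger by the x-coordinate
# of the nearest tracked torso (left-hand-drive cabin assumed).
def assign_hand_to_user(hand_xy, torso_centroids, cabin_center_x):
    """torso_centroids maps a body label to its (x, y) centroid."""
    owner, (torso_x, _) = min(
        torso_centroids.items(),
        key=lambda kv: (kv[1][0] - hand_xy[0]) ** 2 + (kv[1][1] - hand_xy[1]) ** 2,
    )
    # A torso left of the cabin centerline is taken to be the driver's.
    return owner, ("driver" if torso_x < cabin_center_x else "passenger")

# Example: the hand is closest to the left-side occupant.
print(assign_hand_to_user((220, 310), {"body_0": (200, 300), "body_1": (600, 320)}, 400))
```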

Meanwhile, according to another exemplary embodiment of the present invention, as shown in FIG. 4, a rear-projection touch-sensitive display may be used as the display unit 20, together with a camera module 10 that photographs the user's hand or arm over the display unit 20. Rear projection refers to a method in which light from a projector 22 provided behind the display unit 20 forms an image that can be viewed from the front.

In this case, the camera module 10 photographing the display unit 20 may acquire an image of the user's hand or arm making a touch input on the display unit 20. The image photographed by the camera module 10 is transmitted to the controller 30.

In the present invention, the controller 30 may remove a previously stored background image to separate the background from the human body image in the image received from the camera module 10. An image mass estimated to be an arm or a finger is then extracted from the human body image, and a process of verifying whether it actually is an arm or a finger is performed.

If the image mass is confirmed to be the user's arm or finger, the controller generates coordinate information and analyzes the extended shape and direction of the arm or finger. That is, the driver 1 and the passenger 2 may be distinguished by analyzing whether the input image shows a left hand or a right hand, and whether it enters from the left or the right side of the display unit 20.
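For the rear-projection embodiment, the left/right decision could be sketched as follows, again assuming a left-hand-drive layout and image x increasing to the right; the function is illustrative, not the patent's algorithm.

```python
def classify_touching_user(fingertip_x, wrist_x, display_center_x):
    """Decide whether a touching hand belongs to the driver or the passenger
    from the side of the display the arm reaches in from."""
    reaches_rightward = fingertip_x > wrist_x          # arm extends from left to right
    enters_from_left = wrist_x < display_center_x
    # In a left-hand-drive car the driver reaches the center display from the left.
    return "driver" if (enters_from_left and reaches_rightward) else "passenger"

print(classify_touching_user(fingertip_x=340, wrist_x=180, display_center_x=400))  # driver
```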

When the controller 30 has distinguished the driver from the passenger through the above process, a different user interface may be provided on the display unit 20 according to the distinguished user. FIGS. 5 to 7 illustrate different embodiments of the method for providing a user interface according to the present invention.

First, as shown in FIG. 5, in the present invention a user interface with different menu icons may be provided on the display unit according to the distinguished driver 1 and passenger 2.

That is, when the hand or arm of the driver 1 attempts to operate the display unit 20 as shown in FIG. 5(a), the controller recognizes this and outputs a user interface on which icons the driver 1 may execute, such as navigation and music playback, are formed. Likewise, when the hand or arm of the passenger 2 attempts to operate the display unit 20 as shown in FIG. 5(b), the controller recognizes this and outputs a user interface on which icons the passenger 2 may execute, such as DMB or a photo album, are formed.

As described above, according to this exemplary embodiment of the present invention, the user interface carrying the driver's menu icons may be distinguished from the user interface carrying the passenger's menu icons. In the present invention, the driver's menu icons and the passenger's menu icons are the menu icons provided for the driver and the passenger, respectively, and the driver's menu icons may be limited to functions that preserve the driver's concentration on driving. The specific composition of the driver's and passenger's menu icons may, however, vary according to the settings.
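A configuration-style sketch of this first embodiment; the icon sets below are placeholders, since the description notes that the actual composition may vary with the settings.

```python
# Placeholder icon sets per recognized user (FIG. 5 embodiment).
MENU_ICONS = {
    "driver": ["navigation", "music"],                       # limited, driving-safe set
    "passenger": ["DMB", "photo album", "navigation", "music"],
}

def build_interface(role):
    """Return the menu icons to form on the user interface for this user."""
    return MENU_ICONS[role]

print(build_interface("driver"))        # ['navigation', 'music']
print(build_interface("passenger"))     # ['DMB', 'photo album', 'navigation', 'music']
```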

Next, according to another exemplary embodiment of the present invention, as shown in FIG. 6, a user interface with the same menu icons may be output on the display unit for both the driver 1 and the passenger 2, but the activation of the menu icons formed on the user interface may differ depending on whether the controller recognizes the driver 1 or the passenger 2.

That is, as shown in FIG. 6, the same basic menu icons (navigation, DMB, photo album) may be output on the display unit 20, but when the hand or arm of the driver 1 attempts to operate the display unit 20, the control unit recognizes this and may activate only the navigation icon, which is a driver menu icon. When the hand or arm of the passenger 2 attempts to operate the display unit 20, all menu icons may be activated, or only the menu icons designated as passenger menu icons may be activated.
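The same-icons/different-activation variant can be expressed as a flag per icon, roughly as below; the icon names and the driver subset are assumptions for illustration.

```python
# FIG. 6 embodiment: one icon set, activation depends on the recognized user.
BASIC_ICONS = ["navigation", "DMB", "photo album"]
ACTIVE_FOR = {
    "driver": {"navigation"},            # only the driver menu icon stays active
    "passenger": set(BASIC_ICONS),       # or a designated passenger subset
}

def render_icons(role):
    """Return (icon, is_active) pairs for the user interface."""
    return [(icon, icon in ACTIVE_FOR[role]) for icon in BASIC_ICONS]

print(render_icons("driver"))
# [('navigation', True), ('DMB', False), ('photo album', False)]
```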

Meanwhile, according to yet another embodiment of the present invention, for the convenience of both the driver and the passenger, menu icons may be provided in different areas of the display unit according to the driver 1 and the passenger 2, as shown in FIG. 7. That is, in the present invention the display unit 20 may be divided into a first area on the driver's side and a second area on the passenger's side, and a user interface menu icon may be provided in the corresponding area selected according to the distinguished driver or passenger. If the driver's seat is on the left and the passenger seat on the right, the left and right halves of the display unit 20 may be set as the first area and the second area, respectively.

At this time, when the hand or arm of the driver 1 attempts to operate the display unit 20 as illustrated in FIG. 7(a), the controller recognizes this and the driver menu icons may be output on the left side (first area) of the display unit 20. Likewise, when the hand or arm of the passenger 2 attempts to operate the display unit 20 as shown in FIG. 7(b), the controller recognizes this and the passenger menu icons may be output on the right side (second area) of the display unit 20. Conversely, when the driver's seat is on the right and the passenger seat on the left, the first area may be formed on the right side and the second area on the left side.
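The area selection of this last embodiment reduces to a small helper such as the following sketch; the seat-side parameter mirrors the left-hand-drive and right-hand-drive cases in the description, while the function itself is hypothetical.

```python
# FIG. 7 embodiment: first area = driver's side, second area = passenger's side.
def target_area(role, driver_seat_side="left"):
    """Return which half of the display unit (20) should carry the menu icons."""
    first_area = driver_seat_side                          # e.g. "left" in a LHD vehicle
    second_area = "right" if first_area == "left" else "left"
    return first_area if role == "driver" else second_area

print(target_area("driver"))               # 'left'  -> FIG. 7(a)
print(target_area("passenger"))            # 'right' -> FIG. 7(b)
print(target_area("driver", "right"))      # RHD layout: first area on the right
```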

The user interfaces according to the embodiments of the present invention described above may be output on the display unit 20. According to an embodiment of the present invention, the display unit 20 is preferably a touch-sensitive screen, and the user may execute a corresponding menu by touching a menu icon of the user interface output on that screen.

While the present invention has been described above through specific embodiments, those skilled in the art may make modifications and changes without departing from the spirit and scope of the present invention. Therefore, anything that a person skilled in the art to which the present invention belongs can readily infer from the detailed description and the embodiments is interpreted as falling within the scope of the present invention.

1: driver 2: passenger
10: camera module 20: display unit
30: control unit

Claims (5)

A vehicle user interface providing system through user recognition, comprising:
a camera module for receiving an image of a user's body motion;
a display unit configured to output a user interface for a vehicle peripheral device; and
a control unit that generates human body image coordinate information based on the image input by the camera module, distinguishes the driver and the passenger using the generated coordinate information, and provides different user interfaces on the display unit according to the distinguished driver and passenger.
The system of claim 1,
wherein the human body image coordinate information includes coordinate information on at least one of the user's hand and arm.
The system of claim 1,
wherein the control unit provides, on the display unit, a user interface having different menu icons formed according to the distinguished driver and passenger.
The system of claim 1,
wherein the control unit differently activates the menu icons of the user interface output on the display unit according to the distinguished driver and passenger.
The system of claim 1,
wherein the control unit divides the display unit into a first area on the driver's side and a second area on the passenger's side, and provides a user interface menu icon in the corresponding area selected according to the distinguished driver and passenger.
KR1020100116001A 2010-11-22 2010-11-22 User interface providing system using user identification KR20120054739A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020100116001A KR20120054739A (en) 2010-11-22 2010-11-22 User interface providing system using user identification

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020100116001A KR20120054739A (en) 2010-11-22 2010-11-22 User interface providing system using user identification

Publications (1)

Publication Number Publication Date
KR20120054739A true KR20120054739A (en) 2012-05-31

Family

ID=46270444

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020100116001A KR20120054739A (en) 2010-11-22 2010-11-22 User interface providing system using user identification

Country Status (1)

Country Link
KR (1) KR20120054739A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9245179B2 (en) 2013-03-29 2016-01-26 Hyundai Motor Company Driver recognition system and recognition method for vehicle
EP3456576B1 (en) * 2017-09-15 2023-08-30 LG Electronics Inc. Vehicle control device and vehicle including the same
KR20190088090A (en) * 2017-12-26 2019-07-26 엘지전자 주식회사 Display device mounted on vehicle
WO2020141820A1 (en) * 2019-01-02 2020-07-09 Samsung Electronics Co., Ltd. Electronic apparatus and controlling method thereof
KR20200087358A (en) * 2019-01-02 2020-07-21 삼성전자주식회사 Electronic apparatus and controlling method of the electronic apparatus
US11623525B2 (en) 2019-01-02 2023-04-11 Samsung Electronics Co., Ltd. Electronic apparatus and controlling method thereof
CN114968434A (en) * 2021-02-22 2022-08-30 上海博泰悦臻网络技术服务有限公司 Method, system, medium and device for using one screen of different person

Similar Documents

Publication Publication Date Title
US10481757B2 (en) Eye gaze control system
JP6214752B2 (en) Display control device, display control method for display control device, gaze direction detection system, and calibration control method for gaze direction detection system
KR102029842B1 (en) System and control method for gesture recognition of vehicle
JP6015547B2 (en) Line-of-sight input device
KR20140079162A (en) System and method for providing a user interface using finger start points shape recognition in a vehicle
US20160004321A1 (en) Information processing device, gesture detection method, and gesture detection program
US20180232195A1 (en) Electronic device and method for sharing images
CN105584368A (en) System For Information Transmission In A Motor Vehicle
KR20110117966A (en) Apparatus and method of user interface for manipulating multimedia contents in vehicle
KR101879334B1 (en) Apparatus for indentifying a proximity object and method for controlling the same
KR20120054739A (en) User interface providing system using user identification
CN113994312A (en) Method for operating a mobile terminal by means of a gesture recognition and control device, motor vehicle and head-mounted output device
KR20140072734A (en) System and method for providing a user interface using hand shape trace recognition in a vehicle
KR101806172B1 (en) Vehicle terminal control system and method
KR20140079160A (en) System and method for providing a user interface using 2 dimension camera in a vehicle
JP4848997B2 (en) Incorrect operation prevention device and operation error prevention method for in-vehicle equipment
WO2014103217A1 (en) Operation device and operation detection method
JP5875337B2 (en) Input device
JP5136948B2 (en) Vehicle control device
CN111638786B (en) Display control method, device, equipment and storage medium of vehicle-mounted rear projection display system
JP5912177B2 (en) Operation input device, operation input method, and operation input program
KR101709129B1 (en) Apparatus and method for multi-modal vehicle control
KR20140079025A (en) Method for providing a user interface using leg gesture recognition in a vehicle
JP4849193B2 (en) Incorrect operation prevention device and operation error prevention method for in-vehicle equipment
JP6390380B2 (en) Display operation device

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application