KR20110035162A - Method and mobile terminal for display processing using eyes recognition - Google Patents


Info

Publication number
KR20110035162A
KR20110035162A (publication) · KR1020090092766A (application)
Authority
KR
South Korea
Prior art keywords
gaze
mobile terminal
information
screen
size
Prior art date
Application number
KR1020090092766A
Other languages
Korean (ko)
Other versions
KR101242531B1 (en)
Inventor
강석훈
김건오
김경진
박명순
서승교
양재모
유병철
임민주
홍상우
Original Assignee
에스케이텔레콤 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 에스케이텔레콤 주식회사 (SK Telecom Co., Ltd.)
Priority to KR1020090092766A priority Critical patent/KR101242531B1/en
Publication of KR20110035162A publication Critical patent/KR20110035162A/en
Application granted granted Critical
Publication of KR101242531B1 publication Critical patent/KR101242531B1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00597 Acquiring or recognising eyes, e.g. iris verification
    • G06K9/00604 Acquisition
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges
    • H04M1/72 Substation extension arrangements; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selecting
    • H04M1/725 Cordless telephones
    • H04M1/72519 Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status
    • H04M1/72522 With means for supporting locally a plurality of applications to increase the functionality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/52 Details of telephonic subscriber devices including functional features of a camera

Abstract

The present invention relates to a screen processing method and a mobile terminal using gaze recognition, and aims to let the user adjust the size of a desired area of the screen without manipulating an input device several times, and to provide the convenience of operating the mobile terminal stably with one hand. According to the present invention, a mobile terminal displaying information on a screen captures a user image with at least one camera unit. The user's gaze is tracked by analyzing the captured user image. The mobile terminal adjusts, i.e., enlarges or reduces, the size of the information portion on which the gaze is concentrated during gaze tracking. Also, when the gaze moves after the specific area on which it was concentrated has been enlarged or reduced, the mobile terminal may move the enlarged or reduced portion along with the gaze. Therefore, the user can enlarge or reduce a specific area of the information displayed on the screen simply by concentrating the gaze on it.

Description

Method and mobile terminal for display processing using eyes recognition

The present invention relates to a screen processing technology, and more particularly, to a screen processing method and a mobile terminal using gaze recognition for adjusting and displaying a size of a portion where a user's gaze is concentrated on a screen of a mobile terminal.

Information and communication technology using wired and wireless communication networks has grown very fast in a short period, driven by new technologies and functions that stimulate consumer purchasing. In particular, as mobile terminal technology has developed beyond running simple applications, various applications are now installed to meet users' needs. As a result, users can enjoy voice information, text information, image information, Moving Picture Experts Group Audio Layer 3 (MP3) music, games, and the like through the mobile terminal.

Meanwhile, due to the limitation of its screen size, a mobile terminal displays information such as images and text at a small size. Therefore, to check the information displayed in a specific area of the screen, the user enlarges that area by operating an input device such as a keypad or a touch pad. That is, the user operates the input device to select a screen processing mode in which the information displayed on the screen can be processed, and selects the specific area to be enlarged in that mode. The user must then call up function keys for processing the selected area and choose, among them, the function key for enlarging it.

As described above, it is inconvenient that the user has to manipulate the input device several times just to enlarge information in a specific area of the screen. Moreover, manipulating the input device many times usually requires holding the mobile terminal with one hand while operating the input device with the other, so in situations where both hands cannot be used freely (for example, on a crowded subway or bus), using the function of enlarging information in a specific area may be inconvenient.

In addition, an input error may occur when the input device is manipulated several times, because the probability of an input error increases with the number of key inputs. In particular, input errors due to key input occur more frequently in touch screen type mobile terminals than in keypad types: a touch screen type mobile terminal displays a virtual keypad on the screen and receives key input through the user's touch, and in the process of touching a key, the user frequently touches another key adjacent to the target key.

Accordingly, an object of the present invention is to provide a screen processing method and a mobile terminal using gaze recognition, which can adjust the size of a specific area of the screen simply by concentrating the gaze, without manipulating an input device several times.

Another object of the present invention is to provide a screen processing method and a mobile terminal using gaze recognition, which provide the operational convenience of stably operating the mobile terminal with one hand.

In order to achieve the above objects, the present invention provides a mobile terminal for processing a screen using gaze recognition, comprising a display unit, at least one camera unit, and a control unit. The display unit displays information on the screen. The at least one camera unit photographs the user. The controller receives the user image photographed by the camera unit, analyzes the user image to track the user's gaze in real time, and adjusts and displays on the screen the size of the information portion on which the gaze is concentrated during the gaze tracking.

The mobile terminal according to the present invention further includes a storage unit for storing a size adjusting method according to the type of information displayed on the screen. In this case, the controller may extract from the storage unit the size adjusting method corresponding to the type of information displayed on the screen, and adjust and display the size of the information portion on which the gaze is concentrated according to the extracted method.

The mobile terminal according to the present invention further includes an input unit for receiving a user's selection signal. When an enlargement request signal is input as the selection signal through the input unit, the controller enlarges and displays the information portion on which the gaze is concentrated during gaze tracking. When a reduction request signal is input as the selection signal through the input unit, the controller reduces and displays that portion.

In the mobile terminal according to the present invention, the input unit may include a size adjusting key comprising an enlargement key for inputting the enlargement request signal and a reduction key for inputting the reduction request signal.

The present invention also provides a screen processing method of a mobile terminal using gaze recognition, including: a photographing step in which a mobile terminal displaying information on a screen captures a user image with at least one camera unit; a tracking step in which the mobile terminal analyzes the captured user image to track the user's gaze in real time; and a display step in which the mobile terminal adjusts and displays the size of the information portion on which the gaze is concentrated during the gaze tracking.

In the screen processing method of the mobile terminal according to the present invention, the tracking step may include: extracting, by the mobile terminal, a face image from the user image captured by the camera unit; extracting, by the mobile terminal, eyes from the extracted face image; and tracking the user's gaze from the positional change of the extracted eyes.

In the screen processing method of the mobile terminal according to the present invention, the display step may include: calculating, by the mobile terminal, the position of the information on which the gaze is concentrated during the gaze tracking; and adjusting and displaying, by the mobile terminal, the size of the information around the calculated position.

In the screen processing method of the mobile terminal according to the present invention, the calculating step may include: calculating, by the mobile terminal, the time during which the gaze stops during the gaze tracking; determining, when the calculated time exceeds a threshold time, that the gaze is concentrated; and calculating, by the mobile terminal, the position of the information on which the gaze is concentrated.

In the screen processing method of the mobile terminal according to the present invention, the step of calculating the position may include: calculating, by the mobile terminal, the gaze direction from the image information of the extracted eyes; and computing, by the mobile terminal, the relative position of the eyes with respect to the screen from the intersection of the calculated gaze direction and the screen, thereby calculating the position of the information on which the gaze is concentrated.

In the screen processing method of the mobile terminal according to the present invention, the display step may include: moving, by the mobile terminal, the calculated position to the center of the screen; and adjusting and displaying, by the mobile terminal, the size of the information with respect to the center of the screen.

In the screen processing method of the mobile terminal according to the present invention, in the display step, the mobile terminal may adjust the size of only the area around the calculated position.

In the screen processing method of the mobile terminal according to the present invention, the display step may include: starting, by the mobile terminal, the size adjustment of the information when the calculated time exceeds the threshold time; and stopping the size adjustment when the gaze moves away from the position of the information on which it was concentrated.

In the screen processing method of the mobile terminal according to the present invention, in the display step, the scaling ratio of the information may be proportional to the calculated time.

The screen processing method of the mobile terminal according to the present invention may further include, after the display step, restoring the size-adjusted area to its original state when the gaze leaves the screen.

The screen processing method of the mobile terminal according to the present invention may further include, after the display step, continuously adjusting and restoring, by the mobile terminal, the size of the information portions along the path through which the gaze moves when the gaze moves within the screen.

In the screen processing method of the mobile terminal according to the present invention, in the step of adjusting and restoring the size, the mobile terminal may display the information at the point where the gaze is currently located with its size adjusted, and restore and display the information at the points the gaze has passed.

In the screen processing method of the mobile terminal according to the present invention, the display step may include at least one of: enlarging and displaying, by the mobile terminal, the information portion on which the gaze is concentrated during the gaze tracking upon receiving the enlargement request signal; and reducing and displaying that portion upon receiving the reduction request signal.

In the screen processing method of the mobile terminal according to the present invention, in the display step, the mobile terminal may receive the enlargement request signal or the reduction request signal through at least one of a key input, a voice input, and a gesture input.

In the screen processing method of the mobile terminal according to the present invention, in the display step, the mobile terminal may display a pointer on the information portion on which the gaze is concentrated.

The information may include at least one of a menu item, text information, a virtual keypad, image information, and video information.

According to the present invention, since the mobile terminal tracks the user's gaze and adjusts, i.e., enlarges or reduces, the size of the portion of the screen on which the gaze is concentrated during the gaze tracking, the user can easily zoom in or out on the screen simply by concentrating the gaze, without manipulating the input unit (input device) several times.

When the user's gaze moves within the screen while the portion on which the gaze was concentrated is enlarged or reduced, the mobile terminal may continuously adjust and restore the size of the information in the direction in which the gaze moves. That is, along the path through which the gaze moves, the mobile terminal displays the information at the point where the gaze is currently located with its size adjusted, and restores and displays the information at the points the gaze has passed.

In addition, the user can use the screen size adjustment function simply by concentrating the gaze while holding the mobile terminal with one hand, and can restore the resized screen to its original state by moving the gaze, providing the operational convenience of stably operating the mobile terminal with one hand.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a block diagram illustrating a configuration of a mobile terminal for processing a screen using gaze recognition according to an exemplary embodiment of the present invention.

Referring to FIG. 1, the mobile terminal 10 according to the present embodiment includes an input unit 11, a storage unit 13, a camera unit 15, a display unit 17, and a controller 19, and processes the screen using gaze recognition.

The mobile terminal 10 according to the present embodiment is a terminal having a camera unit 15 for gaze recognition, and includes a mobile communication terminal, a notebook computer, a personal digital assistant (PDA), a portable multimedia player (PMP), and a digital broadcast receiver such as a DMB (Digital Multimedia Broadcasting) or DVB (Digital Video Broadcasting) receiver, but is not limited thereto.

The input unit 11 provides a plurality of keys for operating the mobile terminal 10, and generates a selection signal according to the user's key selection and transmits it to the controller 19. Through the input unit 11, the user may request display of information on the screen of the display unit 17, and may request execution of a screen size adjustment mode that enlarges or reduces information displayed on the screen through gaze recognition. The input unit 11 may be provided with a size adjusting key, comprising an enlargement key for inputting an enlargement request signal that requests enlargement of the information portion on which the gaze is concentrated, and a reduction key for inputting a reduction request signal that requests its reduction. The input unit 11 may be a keypad, a pointing device such as a touch pad, or an input device such as a touch screen.

The storage unit 13 stores a program necessary for controlling the operation of the mobile terminal 10 and data generated while the program is executed, and includes one or more volatile and nonvolatile memory devices. The storage unit 13 stores an execution program that processes the screen using gaze recognition, for example, by enlarging or reducing the information displayed in the area on which the gaze is concentrated.

In this case, the information displayed on the screen is information that can be displayed as an image on the mobile terminal 10 and includes, but is not limited to, a menu item, text information, a virtual keypad, image information, or video information. The storage unit 13 may store a size adjusting method according to the type of information. The size adjusting methods include, but are not limited to, a first method that partially resizes only the area in the vicinity of the portion on which the gaze is concentrated, and a second method that resizes the information as a whole around the portion on which the gaze is concentrated. The method may be set by the user or set by default. For example, the first method may be set for menu items, text information, and the virtual keypad, and the second method may be set for image information and video information.
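
The per-type policy described above can be sketched as a simple lookup table. This is an illustrative Python fragment, not part of the patent; the type names, the "partial"/"whole" labels, and the `resize_method_for` helper are all hypothetical:

```python
# Per-type resizing policy the storage unit might hold.
# "partial" = first method: resize only the area around the gaze point.
# "whole"   = second method: resize the entire information around that point.
RESIZE_METHOD = {
    "menu_item": "partial",
    "text": "partial",
    "virtual_keypad": "partial",
    "image": "whole",
    "video": "whole",
}

def resize_method_for(info_type, overrides=None):
    """Return the resizing method for an information type.

    The text says the method may be set by the user or by default, so
    user overrides take precedence over the defaults above.
    """
    if overrides and info_type in overrides:
        return overrides[info_type]
    return RESIZE_METHOD[info_type]
```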

The camera unit 15 performs a function of photographing a subject, and in particular acquires an image of the user for processing the screen using gaze recognition. The camera unit 15 may include an image sensor, a signal processor, and an image processor. The image sensor converts the optical signal of the captured image into an analog signal, and the signal processor converts the analog signal into a digital signal. The image processor obtains image data by processing the image signal input through the signal processor, and outputs the acquired image data through the controller 19 or the display unit 17, or stores it in the storage unit 13. The mobile terminal 10 may include at least one camera unit 15 to track the user's gaze, or the camera unit 15 may include at least one image sensor.

The display unit 17 displays information stored in the storage unit 13 including various function menus executed in the mobile terminal 10. The display unit 17 adjusts the size of the information of the portion where the eyes are concentrated under the control of the controller 19 and displays it on the screen. As the display unit 17, a liquid crystal display (LCD) or a touch screen may be used. The touch screen simultaneously serves as a display device and an input device.

The controller 19 is a microprocessor that performs the overall control operation of the mobile terminal 10. In particular, the controller 19 controls real-time screen processing using gaze recognition. That is, the controller 19 receives a user image photographed by the camera unit 15. The controller 19 analyzes the received user image to track the eyes of the user in real time. In addition, the controller 19 adjusts and displays the size of the information portion where the gaze is concentrated during gaze tracking.

When the controller 19 receives the enlargement request signal, it enlarges and displays the information portion on which the gaze is concentrated; when it receives the reduction request signal, it reduces and displays that portion. Although the enlargement request signal and the reduction request signal are disclosed here as being input to the controller 19 through specific keys of the input unit 11, for example an enlargement key and a reduction key, the present invention is not limited thereto; for example, the enlargement request signal or the reduction request signal may also be input to the controller 19 through a voice input or a gesture input.

The controller 19 may track the user's gaze as follows. First, the controller 19 extracts a face image from the user image captured by the camera unit 15. The controller 19 then extracts eyes from the extracted face image, and tracks the user's gaze from the positional change of the extracted eyes. For example, the face image may be extracted using one of principal component analysis (PCA), Fisher discriminant analysis (FDA), and independent component analysis (ICA). The controller 19 may extract the eyes, which are face components, from the extracted face image using AdaBoost or a support vector machine (SVM). Here, AdaBoost is a learning algorithm for extracting the shape of an object; see "A decision-theoretic generalization of on-line learning and an application to boosting" by Yoav Freund and Robert E. Schapire, in Computational Learning Theory: EuroCOLT '95, pp. 23-37, Springer-Verlag, 1995.
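
As an illustrative sketch (not part of the patent), the last step above — turning the positional change of the detected eyes into gaze points on the screen — might look as follows. The detection itself (PCA/AdaBoost/SVM as described) is abstracted away; the `gain` calibration factor, the assumption that the first sample corresponds to looking at the screen center, and all names are hypothetical:

```python
def track_gaze(eye_positions, screen_center, gain=5.0):
    """Map a sequence of detected eye centers to screen gaze points.

    The first sample is taken as the neutral position (looking at the
    screen center); later samples are mapped by their displacement from
    it, scaled by a hypothetical calibration gain (image px -> screen px).
    """
    if not eye_positions:
        return []
    ex0, ey0 = eye_positions[0]
    cx, cy = screen_center
    points = []
    for ex, ey in eye_positions:
        # displacement of the eye in the camera image, scaled to screen
        points.append((cx + gain * (ex - ex0), cy + gain * (ey - ey0)))
    return points
```

A real implementation would replace this linear mapping with a calibrated model of the eye and head pose; the sketch only shows the position-change-to-gaze step named in the text.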

In addition, the controller 19 may use a face modeling technique such as the Active Shape Model (ASM) or the Active Appearance Model (AAM) to extract a face image from the received user image and to extract eyes from the extracted face image.

The controller 19 may adjust and display the size of the information portion on which the gaze is concentrated during gaze tracking as follows. First, the controller 19 calculates the position of the information on which the gaze is concentrated. To do so, the controller 19 calculates the time during which the gaze stops during gaze tracking and determines whether the calculated time exceeds the set threshold time. If it does, the controller 19 determines that the user is concentrating the gaze and calculates the position of the information on which the gaze is concentrated. On the other hand, if the gaze moves before the threshold time is exceeded, the controller 19 determines that the user is not concentrating the gaze.
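
The dwell-time logic above — restart the timer when the gaze moves, treat the gaze as concentrated once the stop time exceeds a threshold, and restore when the gaze leaves the screen — can be sketched as a small state machine. This is an illustrative Python fragment under assumed names and a hypothetical default threshold, not the patent's implementation:

```python
DWELL_THRESHOLD = 0.8  # seconds; hypothetical default

def zoom_actions(samples, threshold=DWELL_THRESHOLD):
    """Replay (time, point) gaze samples and decide when to zoom or restore.

    point is an (x, y) screen coordinate, or None when the gaze leaves
    the screen. Returns a list of (time, action) pairs: 'zoom' once the
    gaze dwells past the threshold, 'restore' when it leaves the screen.
    """
    actions = []
    dwell_start = None
    last = None
    zoomed = False
    for t, p in samples:
        if p is None:                  # gaze off-screen: restore original size
            if zoomed:
                actions.append((t, "restore"))
                zoomed = False
            dwell_start = last = None
        elif p != last:                # gaze moved: restart the dwell timer
            dwell_start, last = t, p
        elif not zoomed and t - dwell_start >= threshold:
            actions.append((t, "zoom"))  # dwell exceeded threshold: enlarge
            zoomed = True
    return actions
```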

The controller 19 may calculate the position of the information on which the gaze is concentrated as follows. The controller 19 calculates the gaze direction from the extracted image information of the eyes, and may compute the relative position of the eyes with respect to the screen from the intersection of the calculated gaze direction and the screen, thereby calculating the position of the information on which the gaze is concentrated. Alternatively, the controller 19 may calculate that position by computing the relative position of the eyes with respect to the screen using an infrared sensor (not shown).
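
The intersection of the calculated gaze direction with the screen can be sketched as a ray-plane intersection. As an illustrative assumption (not stated in the patent), the screen is taken to lie in the z = 0 plane of a calibrated coordinate frame shared with the eye position and gaze direction:

```python
def gaze_screen_intersection(eye_pos, gaze_dir):
    """Intersect the ray eye_pos + t * gaze_dir with the plane z = 0.

    Returns the (x, y) screen coordinate of the gaze point, or None when
    the gaze is parallel to the screen or points away from it.
    """
    (ex, ey, ez), (dx, dy, dz) = eye_pos, gaze_dir
    if dz == 0:
        return None               # gaze parallel to the screen plane
    t = -ez / dz                  # solve ez + t * dz = 0
    if t <= 0:
        return None               # intersection would be behind the eye
    return (ex + t * dx, ey + t * dy)
```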

The controller 19 adjusts and displays the size of the information based on the calculated position, and may do so as follows. The controller 19 may adjust and display the size of the information around the calculated current position. Alternatively, the controller 19 may move the calculated position to the center of the screen and adjust the size of the information with respect to the center of the screen. In either case, the controller 19 may adjust the size of the information as a whole around the calculated position or the moved center position, or may partially adjust the size of only the area around that position. For example, FIG. 5 illustrates the latter and FIG. 6 illustrates the former; FIGS. 5 and 6 will be described later.

Alternatively, the controller 19 may adjust the size according to the type of information displayed on the screen, for example as follows, but is not limited thereto. When the information is a menu item, the controller 19 may partially enlarge or reduce only the area in the vicinity of the current position of the specific menu item on which the gaze is concentrated. When the information is text information, the controller 19 may enlarge or reduce the text portion on which the gaze is concentrated in the same manner as a menu item. When the information is a virtual keypad, the controller 19 may enlarge or reduce the key portion on which the gaze is concentrated in the same manner as a menu item. When the information is image information or video information, the controller 19 may enlarge or reduce the image or video as a whole around the portion on which the gaze is concentrated.

Alternatively, the controller 19 may display a pointer at the portion on which the gaze is concentrated, so that the user may select the corresponding part after enlarging or reducing a specific part of the information by concentrating the gaze.

Meanwhile, when enlarging or reducing the area on which the gaze is concentrated, the controller 19 starts the enlargement or reduction when the time during which the gaze stops exceeds the threshold time, and stops it when the gaze moves from the position of the concentrated information to another place. In this case, the scaling ratio of the information is proportional to the time during which the gaze stops, and the information may be enlarged or reduced continuously or stepwise in proportion to that time. The scaling factor may be set by the user or set by default.
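
A minimal sketch of the dwell-proportional scaling described above, in illustrative Python. The threshold, growth rate, step size, and cap are hypothetical defaults; the patent leaves these to user or default settings:

```python
def zoom_factor(dwell_s, threshold_s=0.8, rate=0.5, max_scale=3.0,
                stepwise=False):
    """Scale factor as a function of how long the gaze has dwelled.

    Below the threshold no zoom is applied (factor 1.0); beyond it the
    factor grows in proportion to the excess dwell time, capped at
    max_scale. With stepwise=True the factor grows in 0.5x increments,
    modeling the text's 'continuous or stepwise' proportionality.
    """
    if dwell_s < threshold_s:
        return 1.0
    scale = 1.0 + rate * (dwell_s - threshold_s)
    if stepwise:
        scale = 1.0 + 0.5 * int((scale - 1.0) / 0.5)
    return min(scale, max_scale)
```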

After the information is enlarged or reduced, the controller 19 continuously tracks the user's gaze. If the gaze leaves the screen, the controller 19 restores the size-adjusted area to its original state. When the information was enlarged or reduced around the calculated current position, the controller 19 restores the corresponding area to its original state; when the calculated position was moved to the center of the screen, the controller 19 restores the state before the enlargement or reduction, or restores the area that was resized with respect to the center of the screen.

Meanwhile, when the user's gaze moves within the screen while the portion on which the gaze was concentrated is enlarged or reduced, the controller 19 may continuously adjust and restore the size of the information in the direction in which the gaze moves. That is, the controller 19 may continuously adjust and restore the size of the information portions on the path along which the gaze moves: the information at the point where the gaze is currently located is displayed enlarged or reduced, and the information at the points the gaze has passed is restored and displayed.
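
Treating the screen as a set of hypothetical tiles, the adjust-and-restore behavior along the gaze path can be sketched as follows. This is illustrative only; the tile abstraction and all names are assumptions, not the patent's implementation:

```python
def update_zoomed_tiles(path):
    """Replay a gaze path over screen tiles.

    Exactly one tile (the one under the current gaze point) is kept
    enlarged; any tile the gaze has moved away from is restored.
    Returns (zoomed, restored): the tile currently enlarged and the
    ordered list of tiles that were enlarged and then restored.
    """
    zoomed = None
    restored = []
    for tile in path:
        if tile != zoomed:
            if zoomed is not None:
                restored.append(zoomed)   # gaze moved on: restore old tile
            zoomed = tile                 # enlarge the tile under the gaze
    return zoomed, restored
```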

Meanwhile, although the present embodiment discloses an example in which the controller 19 performs the screen processing on the portion on which the gaze is concentrated, the mobile terminal 10 may instead include a screen processing unit separate from the controller 19 to perform that screen processing.

As such, the mobile terminal 10 according to the present embodiment tracks the user's gaze and displays the portion on which the gaze is concentrated enlarged or reduced on the screen during gaze tracking, so that the user can easily zoom in or out simply by concentrating the gaze, without manipulating the input unit 11 several times. When the user's gaze moves within the screen while a portion is enlarged or reduced, the mobile terminal 10 may continuously adjust and restore the size of the information in the direction in which the gaze moves; that is, the size of the information portions on the path through which the gaze moves may be continuously adjusted and restored. In addition, the user can use the screen size adjustment function of the mobile terminal 10 simply by concentrating the gaze while holding the mobile terminal 10 with one hand, and can restore the resized screen by moving the gaze, so that the operational convenience of stably operating the mobile terminal 10 with one hand can be provided.

An example of the screen processing method using gaze recognition of the mobile terminal 10 according to the present embodiment will be described with reference to FIGS. 1 to 4. FIG. 2 is a flowchart illustrating a screen processing method of a mobile terminal using gaze recognition according to an exemplary embodiment of the present invention. FIG. 3 is a detailed flowchart of the gaze tracking step of FIG. 2. FIG. 4 is a detailed flowchart of the step of FIG. 2 in which the size of the information portion on which the gaze is concentrated is adjusted and displayed.

First, in step S20, the mobile terminal 10 displays the information on the screen of the display unit 17. The information includes menu items, text information, virtual keypads, image information or video information.

Next, in step S25, the mobile terminal 10 selects a screen size adjustment mode according to the user's selection signal. When the enlargement request signal is input as the user's selection signal, the mobile terminal 10 executes a screen enlargement mode; when the reduction request signal is input, it executes a screen reduction mode. Meanwhile, although the present embodiment discloses an example in which the enlargement request signal and the reduction request signal are input to the mobile terminal 10 through specific keys of the input unit 11, for example an enlargement key and a reduction key, the present invention is not limited thereto; for example, these signals may also be input through a voice input or a gesture input.

Meanwhile, the present embodiment discloses an example in which the screen size adjustment mode is entered according to a user's selection signal after step S20 is performed, but the mode may also be set as a default.

Next, in operation S30, the mobile terminal 10 captures an image of the user including the face of the user through the camera unit 15.

Next, in step S40, the mobile terminal 10 tracks the user's gaze in real time by analyzing the photographed user image. Step S40 will be described in detail with reference to FIG. 3 as follows. First, in step S41, the mobile terminal 10 extracts a face image from the captured image. Subsequently, in step S43, the mobile terminal 10 extracts the eyes from the extracted face image. In step S45, the mobile terminal 10 tracks the gaze of the user from changes in the position of the extracted eyes. For example, the mobile terminal 10 may extract the user's face image from the captured image using one of PCA, FDA, and ICA, extract the eyes from the extracted face image using AdaBoost or an SVM, and track the gaze of the user from the positional change of the extracted eyes.
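The extraction-and-tracking loop of steps S41 to S45 can be sketched as follows. This is a minimal illustration: since the patent does not fix a concrete implementation, the PCA/FDA/ICA face extractor and the AdaBoost/SVM eye extractor named above are replaced by placeholder functions that read annotations from synthetic frames.

```python
# Minimal sketch of steps S41-S45. The detector functions are
# placeholders standing in for the PCA/FDA/ICA face extractor and
# the AdaBoost/SVM eye extractor named in the description.

def extract_face(frame):
    """Placeholder face extractor (S41): returns the face bounding
    box, here assumed to be stored with the synthetic frame."""
    return frame["face_box"]

def extract_eyes(face_box, frame):
    """Placeholder eye extractor (S43): returns (x, y) pupil centers."""
    return frame["eye_centers"]

def track_gaze(frames):
    """S45: yield the frame-to-frame displacement of the mean eye
    position; the gaze is tracked from this positional change."""
    prev = None
    for frame in frames:
        face = extract_face(frame)
        left, right = extract_eyes(face, frame)
        center = ((left[0] + right[0]) / 2.0, (left[1] + right[1]) / 2.0)
        if prev is not None:
            yield (center[0] - prev[0], center[1] - prev[1])
        prev = center

# Two synthetic frames in which the eyes shift 4 px to the right:
frames = [
    {"face_box": (40, 40, 120, 120), "eye_centers": ((70, 80), (110, 80))},
    {"face_box": (42, 40, 120, 120), "eye_centers": ((74, 80), (114, 80))},
]
deltas = list(track_gaze(frames))
print(deltas)  # [(4.0, 0.0)]
```

In a real implementation the two placeholder extractors would be replaced by trained detectors operating on camera frames; the tracking loop itself is unchanged.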

Next, in step S50, the mobile terminal 10 adjusts and displays the size of the information portion on which the gaze is concentrated. Step S50 will be described in detail with reference to FIG. 4 as follows.

First, in step S51, the mobile terminal 10 calculates the time for which the gaze stops on the screen. Next, in step S53, the mobile terminal 10 determines whether the calculated time exceeds the set threshold time.

If the calculated time does not exceed the threshold as a result of the determination in step S53, the mobile terminal 10 returns to step S51. If the gaze moves before the threshold time is exceeded, the mobile terminal 10 maintains the current display state of the information.
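The dwell test of steps S51 and S53 amounts to a timer that restarts whenever the gaze moves. A minimal sketch follows; the 0.8-second threshold and 30-pixel stillness radius are illustrative values, not taken from the patent.

```python
import time

class DwellDetector:
    """Tracks how long the gaze stays within a small radius (S51) and
    reports focus once the threshold time is exceeded (S53). The
    default threshold and radius are illustrative assumptions."""

    def __init__(self, threshold_s=0.8, radius_px=30.0):
        self.threshold_s = threshold_s
        self.radius_px = radius_px
        self.anchor = None   # gaze point the dwell timer is anchored to
        self.start = None    # time at which the current dwell began

    def update(self, gaze_xy, now=None):
        """Feed one gaze sample; returns True once the gaze has
        dwelled at (roughly) one point longer than the threshold."""
        now = time.monotonic() if now is None else now
        if self.anchor is None or self._dist(gaze_xy, self.anchor) > self.radius_px:
            # Gaze moved: restart the dwell timer at the new point.
            self.anchor = gaze_xy
            self.start = now
            return False
        return (now - self.start) > self.threshold_s

    @staticmethod
    def _dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

d = DwellDetector()
assert not d.update((100, 100), now=0.0)  # timer starts
assert not d.update((102, 101), now=0.5)  # still under threshold
assert d.update((101, 100), now=1.0)      # dwell exceeded: gaze focused
assert not d.update((300, 220), now=1.1)  # gaze moved: timer resets
```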

If the calculated time exceeds the threshold as a result of the determination in step S53, the mobile terminal 10 determines in step S55 that the user is focusing the gaze and calculates the location of the focused information. At this time, the mobile terminal 10 calculates the gaze direction from the extracted image information of the eyes. The mobile terminal 10 may calculate the position of the gaze-focused information by deriving the relative position of the gaze with respect to the screen from the intersection of the calculated gaze direction and the screen. Alternatively, the mobile terminal 10 may calculate the position of the gaze-focused information by deriving the relative position of the gaze with respect to the screen using an infrared sensor.
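The intersection-based position calculation described above can be sketched as a ray-plane intersection. The coordinate frame below (eye position in front of a screen lying in the plane z = 0) is an assumption for illustration; a real device would calibrate these coordinates to the camera and panel geometry.

```python
def gaze_screen_point(eye_pos, gaze_dir):
    """Intersect the gaze ray with the screen plane z = 0.
    eye_pos: (x, y, z) eye position in front of the screen (z > 0).
    gaze_dir: (dx, dy, dz) gaze direction toward the screen (dz < 0).
    Returns the (x, y) screen point, or None if the gaze ray never
    reaches the screen. Units and axes are illustrative assumptions."""
    ex, ey, ez = eye_pos
    dx, dy, dz = gaze_dir
    if dz >= 0:       # gaze parallel to or away from the screen
        return None
    t = -ez / dz      # ray parameter at which z becomes 0
    return (ex + t * dx, ey + t * dy)

# Eye 300 mm in front of the panel, looking slightly right and down:
point = gaze_screen_point((0.0, 0.0, 300.0), (0.1, -0.2, -1.0))
# the gaze lands about 30 mm right of and 60 mm below the eye axis
print(point)
```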

Next, in step S56, the mobile terminal 10 determines whether the size control key selected in step S25 is the enlargement key or the reduction key.

If the enlargement key was selected as a result of the determination in step S56, the mobile terminal 10 enlarges the information based on the calculated position in step S57. For example, the mobile terminal 10 may enlarge the information as follows: it starts enlarging the information when the time for which the gaze stops exceeds the threshold time, and stops the enlargement when the gaze moves from the location of the focused information to another place.

At this time, the magnification of the information is proportional to the time for which the gaze stops, and the information may be enlarged continuously or stepwise in proportion to that time. In step S57, the mobile terminal 10 may enlarge and display the information based on the calculated current position. Alternatively, the mobile terminal 10 may first move the calculated position to the center of the screen and then enlarge the information about the screen center. In either case, the mobile terminal 10 may enlarge the information as a whole about the calculated position or the moved center position, or may enlarge only the area around that position.
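The dwell-proportional magnification, continuous or stepwise, can be sketched as a single function. The threshold, growth rate, and magnification cap below are illustrative constants, not values specified by the patent.

```python
def magnification(dwell_s, threshold_s=1.0, rate_per_s=0.5,
                  step_s=None, max_mag=4.0):
    """Magnification proportional to how long the gaze has stopped.
    With step_s=None the zoom grows continuously; with step_s set it
    grows stepwise. All constants are illustrative assumptions."""
    extra = max(0.0, dwell_s - threshold_s)   # time beyond the threshold
    if step_s is not None:
        extra = (extra // step_s) * step_s    # quantize into zoom steps
    return min(max_mag, 1.0 + rate_per_s * extra)

assert magnification(0.5) == 1.0               # below threshold: no zoom
assert magnification(3.0) == 2.0               # continuous: 1 + 0.5 * 2 s
assert magnification(3.5, step_s=1.0) == 2.0   # stepwise: 2 whole seconds
assert magnification(100.0) == 4.0             # capped at max_mag
```

The inverse of the same factor can serve as the "opposite magnification" used when the adjusted area is later restored.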

If the reduction key was selected as a result of the determination in step S56, the mobile terminal 10 reduces the information based on the calculated position in step S58. Since the reduction is performed in the same way as the enlargement, a detailed description thereof is omitted.

Next, in step S60, the mobile terminal 10 determines whether the gaze deviates from the screen after the information is enlarged or reduced. That is, after the information is enlarged or reduced, the mobile terminal 10 continues to track the eyes of the user, monitoring whether the gaze moves and, in particular, whether it leaves the screen.

As a result of the determination in step S60, when the line of sight does not leave the screen, the mobile terminal 10 maintains the display state of the information according to step S50.

If the gaze deviates from the screen as a result of the determination in step S60, the mobile terminal 10 restores the adjusted portion to its original state in step S70. For example, when the user's gaze leaves the screen, the mobile terminal 10 restores the size-adjusted area to its original state. When the size of the information was adjusted at the calculated current position, the mobile terminal 10 restores the corresponding area to its original state. When the calculated position was moved to the center of the screen, the mobile terminal 10 may restore the information to its state before the resizing, or may restore it by applying, about the calculated position, the inverse magnification corresponding to the applied magnification.

The screen processing method using gaze recognition of the mobile terminal 10 according to the present exemplary embodiment will now be described with reference to the screen examples of FIGS. 1, 5, and 6. Since the information portion on which the gaze is concentrated is enlarged or reduced when the screen is processed using gaze recognition, the following description uses an example in which that portion is enlarged.

First, when information is enlarged by focusing the gaze, an area near the focused information portion may be enlarged around it, as shown in FIG. 5. That is, as shown in FIG. 5(a), an image 81 is displayed as information on the screen of the display unit 17. The image 81 illustrates a moon over a mountain, but is not limited thereto. When the user focuses the gaze on the moon on the screen, the display unit 17 of the mobile terminal enlarges and displays the area near the moon, as shown in FIG. 5(b). Reference numeral 83 denotes the portion where the gaze is concentrated, which may or may not actually be displayed, and reference numeral 85 denotes the enlarged image.

In addition, when information is enlarged by focusing the gaze, the focused information portion may be moved to the center of the screen and then enlarged and displayed, as shown in FIG. 6. That is, when the user focuses the gaze on the moon as shown in FIG. 6(a), the mobile terminal 10 moves the moon to the center O of the screen as shown in FIG. 6(b), and enlarges and displays the image around the moon now located at the screen center. Reference numeral 87 denotes the enlarged image.

Alternatively, when information is enlarged by focusing the gaze, the mobile terminal 10 may move the focused information portion to the center of the screen and then enlarge and display only the area near the moon on which the gaze is concentrated.

On the other hand, reducing and displaying information by focusing the gaze proceeds in the same manner as the enlargement described above, except that the focused information is reduced instead of enlarged, so a detailed description is omitted.

Another example of a screen processing method of a mobile terminal using gaze recognition according to an embodiment of the present invention will be described with reference to FIGS. 1 and 7. FIG. 7 is a flowchart illustrating another example of a screen processing method of a mobile terminal using gaze recognition according to an exemplary embodiment of the present invention.

First, since steps S20 to S50 proceed in the same manner as the steps disclosed in FIGS. 2 to 4, their detailed description is omitted; the description below begins from step S60, which is performed after step S50.

Next, in step S60, the mobile terminal 10 determines whether the gaze moves after the size of the information is adjusted. That is, after the size of the information is adjusted, the mobile terminal 10 continues to track the eyes of the user and determines whether the gaze moves.

When there is no movement of the gaze as a result of the determination in step S60, the mobile terminal 10 maintains the display state of the information where the gaze is concentrated.

If the gaze moves as a result of the determination in step S60, the mobile terminal 10 moves the size-adjusted portion of the information in the direction of the gaze movement in step S70. That is, when the gaze of the user moves within the screen while the size of the focused portion is adjusted, the mobile terminal 10 continuously adjusts and restores the information in the direction in which the gaze moves: the information portion at the point where the gaze is currently located is size-adjusted and displayed, while the information portions at the points the gaze has passed are restored and displayed.
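The adjust-and-restore behaviour of step S70 can be sketched by modelling the screen as tiles, each with its own zoom factor; the tile model and the zoom factor of 2.0 are illustrative assumptions, not part of the patent.

```python
def follow_gaze(tile_zoom, path, zoom=2.0):
    """Sketch of step S70: tile_zoom maps a tile coordinate to its
    current zoom factor. As the gaze sweeps along `path`, the tile
    under the gaze is magnified and every tile the gaze has passed
    is restored to its original size (factor 1.0)."""
    for tile in path:
        for passed in list(tile_zoom):
            tile_zoom[passed] = 1.0   # restore where the gaze has passed
        tile_zoom[tile] = zoom        # magnify where the gaze is now
    return tile_zoom

screen = {(0, 0): 1.0, (1, 0): 1.0, (2, 0): 1.0}
# Gaze sweeps left to right across the top row of tiles:
print(follow_gaze(screen, [(0, 0), (1, 0), (2, 0)]))
# -> {(0, 0): 1.0, (1, 0): 1.0, (2, 0): 2.0}
```

Only the tile currently under the gaze ends up magnified; every tile on the path behind it has been restored, matching the "adjust where the gaze is, restore where it has passed" behaviour described above.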

Therefore, after gazing at one specific area of the information, the user can check the rest of the content through gaze movement alone, with the size of the information displayed on the screen adjusted continuously along the way, without having to dwell on each other part for the threshold time.

Next, in step S80, the mobile terminal 10 determines whether the moving line of sight deviates from the screen.

If the gaze does not deviate from the screen as a result of the determination in step S80, the mobile terminal 10 repeats the process from step S60.

If the gaze deviates from the screen as a result of the determination in step S80, the mobile terminal 10 performs the process again from step S40. For example, the mobile terminal 10 may return to step S40 as soon as the gaze leaves the screen, or only when the gaze departure time exceeds a predetermined time. When the gaze deviates from the screen, the mobile terminal 10 displays the information in its original state.

Another example of the screen processing method using the gaze recognition of the mobile terminal 10 according to the present embodiment will be described with reference to the screen example of FIG. 8 as follows. In this case, FIG. 8 discloses an example in which the portion of the information where the eyes are concentrated is enlarged.

First, as shown in FIG. 8(a), an image 81 is displayed as information on the screen of the display unit 17. The image 81 illustrates a moon over a mountain, but is not limited thereto.

Next, when the user focuses the gaze on the moon on the screen, the mobile terminal 10 enlarges and displays the area near the moon, as shown in FIG. 8(b). Reference numeral 83 denotes the portion where the gaze is concentrated, which may or may not actually be displayed, and reference numeral 85 denotes the enlarged image.

As shown in FIG. 8(c), when the user's gaze moves from the moon to the left, the enlarged portion of the information moves in the direction of the gaze. At this time, the information portion at the point where the gaze is now located, for example the mountain, is enlarged and displayed, while the portion the gaze has passed, for example the moon, is restored and displayed.

On the other hand, the embodiments of the present invention disclosed in the specification and drawings are merely presented specific examples to aid understanding, and are not intended to limit the scope of the present invention. In addition to the embodiments disclosed herein, it is apparent to those skilled in the art that other modifications based on the technical idea of the present invention may be implemented.

The present invention relates to a screen processing method and a mobile terminal using gaze recognition, in which the mobile terminal tracks the gaze of the user and adjusts, i.e., enlarges or reduces, the size of the information portion of the screen on which the gaze is concentrated during gaze tracking. Therefore, the user can easily enlarge or reduce a desired portion of the screen simply by focusing attention, without operating the input unit several times. In addition, since the user can perform the screen size adjustment function of the mobile terminal only by focusing the gaze while holding the mobile terminal with one hand, the user is provided with the convenience of operating the mobile terminal stably with one hand.

FIG. 1 is a block diagram illustrating a configuration of a mobile terminal that processes a screen using gaze recognition according to an exemplary embodiment of the present invention.

FIG. 2 is a flowchart illustrating an example of a screen processing method of a mobile terminal using gaze recognition according to an exemplary embodiment of the present invention.

FIG. 3 is a detailed flowchart of the gaze tracking step of FIG. 2.

FIG. 4 is a detailed flowchart of the step of adjusting and displaying the size of the information portion on which the gaze of FIG. 2 is concentrated.

FIGS. 5 and 6 illustrate screen examples according to the screen processing method of FIG. 2.

FIG. 7 is a flowchart illustrating another example of a screen processing method of a mobile terminal using gaze recognition according to an exemplary embodiment of the present invention.

FIG. 8 is a diagram illustrating a screen according to the screen processing method of FIG. 7.

Description of Reference Numerals

10: mobile terminal

11: input unit

13: storage unit

15: camera unit

17: display unit

19: control unit

Claims (20)

  1. A display unit for displaying information on the screen;
    At least one camera unit for photographing a user;
    A control unit configured to receive a user image photographed by the camera unit, analyze the user image to track a user's gaze in real time, and adjust the size of the information portion where the gaze is concentrated during the gaze tracking to display on the display unit ;
    A mobile terminal for processing a screen using gaze recognition, characterized in that it comprises a.
  2. The method of claim 1,
    A storage unit which stores a resizing method according to the type of information displayed on the screen;
    More,
    The control unit extracts a size adjusting method according to the type of information displayed on the screen from the storage unit, and adjusts and displays the size of the information portion where the gaze is concentrated according to the extracted size adjusting method. A mobile terminal that processes screens using recognition.
  3. The method of claim 1,
    An input unit configured to receive a user selection signal;
    More,
    When the magnification request signal is input to the selection signal through the input unit, the control unit enlarges and displays the information portion in which the gaze is concentrated among the gaze tracking,
    When the reduction request signal is inputted to the selection signal through the input unit, the controller reduces and displays the information portion in which the gaze is concentrated among the gaze tracking, and the mobile terminal processes the screen using gaze recognition. .
  4. The method of claim 3, wherein the input unit,
    And a size control key including an enlarged key for inputting the enlarged request signal and a reduced key for inputting the reduced request signal.
  5. A photographing step of photographing, by a mobile terminal displaying information on the screen, a user image with at least one camera unit;
    A tracking step of the mobile terminal analyzing the photographed user image to track the gaze of the user in real time;
    A display step of displaying, by the mobile terminal, the size of the information portion in which the gaze is concentrated during the gaze tracking;
    Screen processing method of a mobile terminal using gaze recognition, characterized in that it comprises a.
  6. The method of claim 5, wherein the tracking step,
    Extracting a face image from the user image photographed by the camera unit;
    The mobile terminal extracts eyes from the extracted face image, and tracks the eyes of the user by tracking the position change of the extracted eyes;
    Screen processing method of a mobile terminal using gaze recognition, characterized in that it comprises a.
  7. The method of claim 5, wherein the displaying step,
    Calculating, by the mobile terminal, a position of the information where the gaze is concentrated during the gaze tracking;
    A control display step of displaying, by the mobile terminal, the size of the information based on the calculated position;
    Screen processing method of a mobile terminal using gaze recognition, characterized in that it comprises a.
  8. The method of claim 7, wherein the calculating step,
    Calculating, by the mobile terminal, the time at which the gaze stops during the gaze tracking;
    Determining, by the portable terminal, to focus the gaze when the calculated time exceeds a threshold time;
    Calculating, by the portable terminal, the location of the information where the gaze is concentrated;
    Screen processing method of a mobile terminal using gaze recognition, characterized in that it comprises a.
  9. The method of claim 8, wherein the calculating of the position comprises:
    Calculating, by the portable terminal, a gaze direction based on the extracted image information of the eye;
    Calculating, by the portable terminal, the position of the information on which the gaze is concentrated by calculating a relative position of the gaze with respect to the screen from an intersection of the calculated gaze direction and the screen;
    Screen processing method of a mobile terminal using gaze recognition, characterized in that it comprises a.
  10. The method of claim 7, wherein the adjustment display step,
    Moving, by the mobile terminal, the calculated position to the center of the screen;
    Displaying, by the portable terminal, the size of the information with respect to the center of the screen;
    Screen processing method of a mobile terminal using gaze recognition, characterized in that it comprises a.
  11. The method of claim 7, wherein in the adjustment display step,
    And the mobile terminal adjusts the size of an area near the calculated position based on the calculated position.
  12. The method of claim 7, wherein the adjustment display step,
    Starting the size adjustment of the information when the calculated time exceeds the threshold time;
    Stopping, by the mobile terminal, the size adjustment of the information when the gaze moves from the location of the information on which the gaze is concentrated;
    Screen processing method of a mobile terminal using gaze recognition, characterized in that it comprises a.
  13. The method of claim 12, wherein in the adjustment display step,
    The size adjustment rate of the information is proportional to the calculated time, in the screen processing method of a mobile terminal using gaze recognition.
  14. The method according to claim 5, which is performed after the displaying step,
    Restoring, by the portable terminal, the size-adjusted area to the original state when the gaze is out of the screen;
    Screen processing method of a mobile terminal using gaze recognition further comprising a.
  15. The method according to claim 5, which is performed after the displaying step,
    A size adjusting and restoring step of displaying, by the mobile terminal, continuously adjusting and restoring the size of the information portion on the path along which the line of sight moves when the line of sight moves on the screen;
    Screen processing method of a mobile terminal using gaze recognition further comprising a.
  16. The method of claim 15, wherein in the size adjustment and restoration step,
    The mobile terminal is a screen processing method of a mobile terminal using gaze recognition, characterized in that the information portion of the point where the gaze is located is adjusted to display the information portion, the information portion of the point passed by the gaze is restored and displayed.
  17. The method of claim 5, wherein the displaying step,
    When the mobile terminal receives the magnification request signal, enlarging and displaying the information portion in which the gaze is concentrated in the gaze tracking;
    When the mobile terminal receives the reduction request signal, the mobile terminal reduces and displays the information portion in which the gaze is concentrated in the gaze tracking;
    Screen processing method of a mobile terminal using gaze recognition, characterized in that it comprises at least one of the above steps.
  18. The method of claim 17, wherein in the displaying step,
    The mobile terminal receives the enlargement request signal or the reduction request signal through at least one of a key input, a voice input, and a gesture input.
  19. The method of claim 5, wherein in the displaying step,
    And displaying a pointer on the information portion where the gaze is concentrated, by the mobile terminal.
  20. The method according to any one of claims 5 to 19,
    The information may include at least one of a menu item, text information, a virtual keypad, image information, and video information.
KR1020090092766A 2009-09-30 2009-09-30 Method and mobile terminal for display processing using eyes recognition KR101242531B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020090092766A KR101242531B1 (en) 2009-09-30 2009-09-30 Method and mobile terminal for display processing using eyes recognition


Publications (2)

Publication Number Publication Date
KR20110035162A true KR20110035162A (en) 2011-04-06
KR101242531B1 KR101242531B1 (en) 2013-03-12

Family

ID=44043539

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020090092766A KR101242531B1 (en) 2009-09-30 2009-09-30 Method and mobile terminal for display processing using eyes recognition

Country Status (1)

Country Link
KR (1) KR101242531B1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101254037B1 (en) * 2009-10-13 2013-04-12 에스케이플래닛 주식회사 Method and mobile terminal for display processing using eyes and gesture recognition
KR101479471B1 (en) * 2012-09-24 2015-01-13 네이버 주식회사 Method and system for providing advertisement based on user sight
WO2015037767A1 (en) * 2013-09-16 2015-03-19 Lg Electronics Inc. Image display apparatus and method for operating the same
US9019170B2 (en) 2013-03-14 2015-04-28 Lg Electronics Inc. Display device and method for controlling the same
US9684372B2 (en) 2012-11-07 2017-06-20 Samsung Electronics Co., Ltd. System and method for human computer interaction
US9851788B2 (en) 2013-06-11 2017-12-26 Samsung Electronics Co., Ltd Visibility improvement method based on eye tracking, machine-readable storage medium and electronic device
US9928846B2 (en) 2013-12-11 2018-03-27 Samsung Electronics Co., Ltd Method and electronic device for tracking audio
US10354359B2 (en) 2013-08-21 2019-07-16 Interdigital Ce Patent Holdings Video display with pan function controlled by viewing direction

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5990954A (en) * 1994-04-12 1999-11-23 Canon Kabushiki Kaisha Electronic imaging apparatus having a functional operation controlled by a viewpoint detector
JP2000132152A (en) 1998-10-27 2000-05-12 Sharp Corp Display device
KR101109582B1 (en) * 2004-11-02 2012-02-06 삼성전자주식회사 Apparatus and method for controlling position of image when the imame is enlarged or reduced

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101254037B1 (en) * 2009-10-13 2013-04-12 에스케이플래닛 주식회사 Method and mobile terminal for display processing using eyes and gesture recognition
KR101479471B1 (en) * 2012-09-24 2015-01-13 네이버 주식회사 Method and system for providing advertisement based on user sight
US10235693B2 (en) 2012-09-24 2019-03-19 Naver Corporation Method and system for providing advertisement based on gaze of user
US9684372B2 (en) 2012-11-07 2017-06-20 Samsung Electronics Co., Ltd. System and method for human computer interaction
US9019170B2 (en) 2013-03-14 2015-04-28 Lg Electronics Inc. Display device and method for controlling the same
US10203754B2 (en) 2013-06-11 2019-02-12 Samsung Electronics Co., Ltd. Visibility improvement method based on eye tracking, machine-readable storage medium and electronic device
US9851788B2 (en) 2013-06-11 2017-12-26 Samsung Electronics Co., Ltd Visibility improvement method based on eye tracking, machine-readable storage medium and electronic device
US10514758B2 (en) 2013-06-11 2019-12-24 Samsung Electronics Co., Ltd. Visibility improvement method based on eye tracking, machine-readable storage medium and electronic device
US10354359B2 (en) 2013-08-21 2019-07-16 Interdigital Ce Patent Holdings Video display with pan function controlled by viewing direction
US10055016B2 (en) 2013-09-16 2018-08-21 Lg Electronics Inc. Image display apparatus and method for operating the same
US20160224109A1 (en) * 2013-09-16 2016-08-04 Lg Electronics Inc. Image display apparatus and method for operating the same
WO2015037767A1 (en) * 2013-09-16 2015-03-19 Lg Electronics Inc. Image display apparatus and method for operating the same
US9928846B2 (en) 2013-12-11 2018-03-27 Samsung Electronics Co., Ltd Method and electronic device for tracking audio

Also Published As

Publication number Publication date
KR101242531B1 (en) 2013-03-12


Legal Events

Date Code Title Description
A201 Request for examination
N231 Notification of change of applicant
E902 Notification of reason for refusal
E90F Notification of reason for final refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20160302

Year of fee payment: 4

LAPS Lapse due to unpaid annual fee