KR101372484B1 - User device which enhanced response speed and response speed enhancing method using data pre-load - Google Patents
User device which enhanced response speed and response speed enhancing method using data pre-load
- Publication number
- KR101372484B1 (application KR1020130050090A)
- Authority
- KR
- South Korea
- Prior art keywords
- gaze
- preload
- target object
- area
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
The present invention relates to a user device and a method for increasing GUI response speed. More specifically, the present invention relates to a user device and a method for pre-loading data so that an immediate response can be made when a user selects a particular GUI object.
Background of the Invention

Devices equipped with an operating system that uses a graphical user interface (GUI) as the human-computer interface are in widespread use, and various attempts have been made to increase the GUI response speed of mobile terminals.
Korean Patent Publication No. 2010-0045868 describes a technique in which a specific function is performed upon mouse-over. U.S. Patent No. 8,112,619 discloses a technique for preloading an application in order to increase its execution speed. However, neither document discloses a user device or method that preloads data so that an immediate response can be made when the user selects a particular GUI object.
The technical problem to be solved by the present invention is to provide a user device with an increased response speed that, before the user selects a specific object provided on a graphical user interface (GUI), predicts the selection based on the pointer position and the user's gaze and preloads the data that would be loaded when the object is selected.
Another technical problem to be solved by the present invention is to provide a data preloading method that increases the response speed of a user device by predicting, based on the pointer position and the user's gaze, that a specific object provided on a graphical user interface (GUI) will be selected, and preloading the data that would be loaded upon that selection before the user actually makes it.
The objects of the present invention are not limited to those mentioned above; other objects not mentioned will be clearly understood by those skilled in the art from the following description.
According to one aspect of the present invention, there is provided a user device including: a pupil recognizing unit that calculates gaze coordinates indicating the position of the user's pupils; a preload performing unit that, when a pointer has been located within the preload area of a selection target object for a predetermined time and the gaze position determined by the gaze coordinates and the preload area are included in the same gaze division area, preloads the target object data to be loaded when the selection target object is selected; and a data access unit that accesses the preloaded target object data when the selection target object is selected by the user's manipulation.
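The trigger described above can be sketched in code. The following is a minimal illustration, not the patent's implementation: the class name, the dwell-time threshold, and the region-mapping function are invented for this sketch. It preloads only when the pointer has stayed inside an object's preload area for a predetermined time and the gaze position falls in the same gaze division area as that preload area:

```python
class PreloadDecider:
    """Sketch of the preload trigger described above: the pointer must
    dwell inside the object's preload area for a minimum time, and the
    gaze must lie in the same gaze division area as the preload area."""

    def __init__(self, dwell_seconds, region_of):
        self.dwell_seconds = dwell_seconds  # required pointer dwell time
        self.region_of = region_of          # maps an (x, y) point to a region id
        self._enter_time = None             # when the pointer entered the area

    def should_preload(self, now, pointer_in_area, gaze_xy, area_center_xy):
        # Track how long the pointer has stayed inside the preload area.
        if not pointer_in_area:
            self._enter_time = None
            return False
        if self._enter_time is None:
            self._enter_time = now
        dwelled = (now - self._enter_time) >= self.dwell_seconds
        # Preload only if the gaze and the preload area share a division area.
        same_region = self.region_of(gaze_xy) == self.region_of(area_center_xy)
        return dwelled and same_region
```

For example, `region_of` could divide the screen into 100-pixel cells, i.e. `lambda p: (p[0] // 100, p[1] // 100)`.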
According to an embodiment of the present disclosure, when the gaze position and the preload area are included in different gaze division areas, the preload performing unit may perform the preload according to the moving direction and the moving distance of the gaze position relative to the gaze division area that includes the preload area, where the moving direction is a direction toward the gaze division area that includes the preload area.
According to an embodiment, whether the gaze is moving toward that gaze division area may be determined based on whether the angle defined by the current gaze position and the gaze position after the movement falls within a preset angle range. The preload performing unit may load the target object data and store it in a preload cache unit.
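The angle test described above can be illustrated as follows. This is a sketch under assumed geometry, not the patent's implementation; the 30-degree tolerance is an arbitrary example of the "preset angle range". The bearing of the gaze movement is compared with the bearing from the previous gaze position toward the target region, and the movement counts as being toward the region when the difference lies within the tolerance:

```python
import math

def moving_toward(prev_gaze, curr_gaze, target_xy, max_deviation_deg=30.0):
    """Return True when the gaze movement from prev_gaze to curr_gaze
    points toward target_xy within max_deviation_deg degrees."""
    move_angle = math.atan2(curr_gaze[1] - prev_gaze[1],
                            curr_gaze[0] - prev_gaze[0])
    target_angle = math.atan2(target_xy[1] - prev_gaze[1],
                              target_xy[0] - prev_gaze[0])
    # Smallest absolute difference between the two bearings, in degrees.
    diff = math.degrees(abs(move_angle - target_angle))
    diff = min(diff, 360.0 - diff)
    return diff <= max_deviation_deg
```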
According to an embodiment of the present disclosure, when no input selecting the selection target object is made within the predetermined time, the preload performing unit may delete the target object data stored in the preload cache unit. The apparatus may further include a gaze recording unit that records the movement history of the gaze coordinates, a gaze movement pattern determination unit that determines a gaze movement pattern from the data recorded in the gaze recording unit, and an area division personalization unit that arranges the gaze division areas according to the determined gaze movement pattern.
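The cache-eviction behaviour described above can be sketched as a small time-to-live cache. The names and the expiry-on-access design are invented for this illustration; the patent only specifies that unselected target object data is deleted after the predetermined time:

```python
import time

class PreloadCache:
    """Holds preloaded target object data and drops it when the object
    is not selected within a time-to-live, mirroring the deletion
    behaviour described above."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # object id -> (data, time stored)

    def put(self, obj_id, data, now=None):
        self._store[obj_id] = (data, time.time() if now is None else now)

    def get(self, obj_id, now=None):
        """Return cached data, or None if absent or expired; expired
        entries are deleted on access."""
        now = time.time() if now is None else now
        entry = self._store.get(obj_id)
        if entry is None:
            return None
        data, stored_at = entry
        if now - stored_at > self.ttl:
            del self._store[obj_id]  # selection never came: drop the data
            return None
        return data
```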
According to another aspect of the present invention, there is provided a data preloading method including: calculating gaze coordinates indicating the position of the user's pupils; preloading target object data to be loaded when a selection target object is selected, when a pointer has been positioned within the preload area of the selection target object for a predetermined time and the gaze position determined by the gaze coordinates and the preload area are included in the same gaze division area; and accessing the preloaded target object data when the selection target object is selected by the user's manipulation.
According to an embodiment of the present disclosure, the performing of the preload may include, when the gaze position and the preload area are included in different gaze division areas, performing the preload according to the moving direction and the moving distance of the gaze position relative to the gaze division area that includes the preload area, where the moving direction is a direction toward that gaze division area.
According to an embodiment, whether the gaze is moving toward the gaze division area may be determined based on whether the angle defined by the current gaze position and the gaze position after the movement falls within a preset angle range.
According to the present invention, when a user selects a specific object on the GUI, the time required to load the content obtained by selecting that object can be shortened.
The effects of the present invention are not limited to those exemplified above; further effects are described within the specification.
FIG. 1 schematically illustrates a user device according to an embodiment of the present invention.
FIGS. 2 and 3 schematically illustrate an object, a preload area, a pointer position, and a gaze division area according to an embodiment of the present invention.
FIGS. 4 and 5 schematically illustrate a case where a pointer position and a gaze position exist in the same gaze division area according to an exemplary embodiment of the present invention.
FIG. 6 schematically illustrates a case where a pointer position and a gaze position exist in different gaze division areas according to an exemplary embodiment of the present invention.
FIGS. 7 to 9 schematically illustrate conditions for preloading when a pointer position and a gaze position exist in different gaze division areas according to an exemplary embodiment of the present invention.
FIG. 10 schematically illustrates an embodiment in which the gaze division area is dynamically set according to an embodiment of the present invention.
FIG. 11 schematically illustrates a case in which preloading is performed in a user device according to another embodiment of the present invention.
FIG. 12 is a flowchart illustrating the steps of preloading when a pointer position and a gaze position exist in the same gaze division area according to an exemplary embodiment of the present invention.
FIG. 13 is a flowchart illustrating the steps of preloading when the pointer position and the gaze position exist in different gaze division areas according to an exemplary embodiment of the present invention.
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. The advantages and features of the present invention, and the manner of achieving them, will become apparent from the embodiments described below in detail with reference to the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. The invention is defined only by the scope of the claims. Like reference numerals refer to like elements throughout the specification.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the meaning commonly understood by one of ordinary skill in the art to which this invention belongs. Commonly used terms that are defined in dictionaries are not to be interpreted ideally or excessively unless explicitly defined otherwise herein.
In addition, each block may represent a module, segment, or portion of code that includes one or more executable instructions for executing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the blocks may occur out of the order shown. For example, two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending on the functionality involved.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. As used herein, singular forms include plural forms unless the context clearly indicates otherwise. The terms "comprises" and/or "comprising" specify the presence of stated features, regions, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, regions, integers, steps, operations, elements, and/or components.
Although the terms first, second, and so on are used to describe various components, these components are of course not limited by these terms. These terms are used only to distinguish one component from another. Therefore, a first component mentioned below may be a second component within the technical scope of the present invention.
The term 'unit' used in these embodiments refers to software or a hardware component such as an FPGA or an ASIC, and a 'unit' performs certain roles. However, 'unit' is not limited to software or hardware. A 'unit' may be configured to reside on an addressable storage medium and may be configured to execute on one or more processors. Thus, by way of example, a 'unit' includes components such as software components, object-oriented software components, class components, and task components, as well as processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. The functionality provided by the components and 'units' may be combined into a smaller number of components and 'units' or further separated into additional components and 'units'.
Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings. The present invention may be embodied in many different forms and is not limited to the embodiments described below.
FIG. 1 schematically illustrates a user device according to an embodiment of the present invention.
Referring to FIG. 1, the
The
The
The
The pointer coordinate
The
The
The
The
The
The
2 and 3 schematically illustrate the
2 and 3, each of the
The
4 and 5 schematically illustrate a case where the
4 and 5, when the
FIG. 6 schematically illustrates a case where the
Referring to FIG. 6, the object to be selected 400 of the two
For example, as shown in FIG. 7, even when the
In FIG. 8, a criterion for determining the moving directions I, II, and III of the
In FIG. 9, criteria for determining the moving distances A, B, and C of the
FIG. 10 schematically illustrates an embodiment in which the
Referring to FIG. 10, when the
FIG. 11 schematically illustrates a case in which a preload is performed in the
Referring to FIG. 11, as another embodiment of the
12 is a flowchart illustrating steps in which preloading is performed when the
Referring to FIG. 12, the
FIG. 13 is a flowchart illustrating a step in which preloading is performed when the
Referring to FIG. 13, the
It will be understood by those skilled in the art that the present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. Therefore, the above-described embodiments should be understood as illustrative in all aspects and not restrictive. The scope of the present invention is defined by the following claims rather than by the detailed description, and all changes or modifications derived from the meaning and scope of the claims and their equivalents should be interpreted as being included in the scope of the present invention.
10: user device 20: gaze division area
30: preload area 400: selection target object
410, 420, 430: object 50: pointer position
60: gaze position
Claims (11)
A preload performing unit configured to preload target object data to be loaded when a selection target object is selected, when a pointer has been positioned within the preload area of the selection target object for a predetermined time or more and the gaze position determined by gaze coordinates and the preload area are included in the same gaze division area; and
a data access unit configured to access the preloaded target object data when the selection target object is selected by the user's manipulation.
When the gaze position and the preload area are included in different gaze division areas, the preload performing unit performs the preload according to the moving direction and the moving distance of the gaze position relative to the gaze division area that includes the preload area.
The moving direction is a direction toward the gaze division area including the preload area.
Whether the direction is toward the gaze division area is determined based on whether an angle determined by the current gaze position and the gaze position after the movement is included in a preset angle range.
The preload performing unit loads the target object data and stores it in a preload cache unit.
The preload performing unit deletes the target object data stored in the preload cache unit when no input selecting the selection target object is made within the predetermined time.
A gaze recording unit for recording a movement history of the gaze coordinates;
a gaze movement pattern determination unit for determining a gaze movement pattern using data recorded in the gaze recording unit; and
an area division personalization unit for arranging the gaze division areas according to the determined gaze movement pattern.
Preloading target object data to be loaded when a selection target object is selected, when a pointer has been positioned within the preload area of the selection target object for a predetermined time or more and the gaze position determined by gaze coordinates and the preload area are included in the same gaze division area; and
accessing the preloaded target object data when the selection target object is selected by the user's operation.
Performing the preload includes, when the gaze position and the preload area are included in different gaze division areas, performing the preload according to the moving direction and the moving distance of the gaze position relative to the gaze division area that includes the preload area.
The moving direction is a direction toward the gaze division area including the preload area.
Whether the direction is toward the gaze division area is determined based on whether an angle determined by the current gaze position and the gaze position after the movement is included in a preset angle range.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130050090A KR101372484B1 (en) | 2013-05-03 | 2013-05-03 | User device which enhanced response speed and response speed enhancing method using data pre-load |
PCT/KR2013/009039 WO2014058233A1 (en) | 2012-10-11 | 2013-10-10 | Method for increasing gui response speed of user device through data preloading, and said user device |
US14/434,970 US9886177B2 (en) | 2012-10-11 | 2013-10-10 | Method for increasing GUI response speed of user device through data preloading, and said user device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130050090A KR101372484B1 (en) | 2013-05-03 | 2013-05-03 | User device which enhanced response speed and response speed enhancing method using data pre-load |
Publications (1)
Publication Number | Publication Date |
---|---|
KR101372484B1 true KR101372484B1 (en) | 2014-03-11 |
Family
ID=50648189
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020130050090A KR101372484B1 (en) | 2012-10-11 | 2013-05-03 | User device which enhanced response speed and response speed enhancing method using data pre-load |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101372484B1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20150108662A (en) * | 2014-03-18 | 2015-09-30 | 연세대학교 산학협력단 | Data pre-load management method and terminal device thereof |
KR20150111158A (en) * | 2014-03-25 | 2015-10-05 | 연세대학교 산학협력단 | Pre-loaded data management method and terminal device thereof |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10293689A (en) * | 1997-04-18 | 1998-11-04 | Kobe Nippon Denki Software Kk | Object program preload method for window system and recording medium recorded with program for the same |
KR100574045B1 (en) * | 2004-11-10 | 2006-04-26 | 주식회사 네오엠텔 | Aparatus for playing multimedia contents and method thereof |
KR20100081406A (en) * | 2009-01-06 | 2010-07-15 | 엘지전자 주식회사 | Mobile terminal and method for inputting instructions thereto |
KR20120035771A (en) * | 2010-10-06 | 2012-04-16 | 엘지전자 주식회사 | Mobile terminal and control method thereof |
- 2013-05-03: KR KR1020130050090A patent/KR101372484B1/en active IP Right Grant
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10293689A (en) * | 1997-04-18 | 1998-11-04 | Kobe Nippon Denki Software Kk | Object program preload method for window system and recording medium recorded with program for the same |
KR100574045B1 (en) * | 2004-11-10 | 2006-04-26 | 주식회사 네오엠텔 | Aparatus for playing multimedia contents and method thereof |
KR20100081406A (en) * | 2009-01-06 | 2010-07-15 | 엘지전자 주식회사 | Mobile terminal and method for inputting instructions thereto |
KR20120035771A (en) * | 2010-10-06 | 2012-04-16 | 엘지전자 주식회사 | Mobile terminal and control method thereof |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20150108662A (en) * | 2014-03-18 | 2015-09-30 | 연세대학교 산학협력단 | Data pre-load management method and terminal device thereof |
KR101655832B1 (en) * | 2014-03-18 | 2016-09-22 | 연세대학교 산학협력단 | Data pre-load management method and terminal device thereof |
KR20150111158A (en) * | 2014-03-25 | 2015-10-05 | 연세대학교 산학협력단 | Pre-loaded data management method and terminal device thereof |
KR101639993B1 (en) * | 2014-03-25 | 2016-07-15 | 연세대학교 산학협력단 | Pre-loaded data management method and terminal device thereof |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9886177B2 (en) | Method for increasing GUI response speed of user device through data preloading, and said user device | |
KR102059913B1 (en) | Tag storing method and apparatus thereof, image searching method using tag and apparauts thereof | |
US10592050B2 (en) | Systems and methods for using hover information to predict touch locations and reduce or eliminate touchdown latency | |
EP3234732B1 (en) | Interaction with 3d visualization | |
JP6129879B2 (en) | Navigation technique for multidimensional input | |
KR102230630B1 (en) | Rapid gesture re-engagement | |
US9823753B2 (en) | System and method for using a side camera for free space gesture inputs | |
AU2014200924B2 (en) | Context awareness-based screen scroll method, machine-readable storage medium and terminal therefor | |
US10409366B2 (en) | Method and apparatus for controlling display of digital content using eye movement | |
US20130145304A1 (en) | Confirming input intent using eye tracking | |
KR102614046B1 (en) | Method for obtaining bio data and an electronic device thereof | |
US20150316981A1 (en) | Gaze calibration | |
EP2662756A1 (en) | Touch screen palm input rejection | |
US20170344220A1 (en) | Collaboration with 3d data visualizations | |
US20120131513A1 (en) | Gesture Recognition Training | |
US10089000B2 (en) | Auto targeting assistance for input devices | |
WO2015183766A1 (en) | Gaze tracking for one or more users | |
US9317199B2 (en) | Setting a display position of a pointer | |
US9092085B2 (en) | Configuring a touchpad setting based on the metadata of an active application of an electronic device | |
US20170344104A1 (en) | Object tracking for device input | |
CN108509133B (en) | Search component display method and device | |
US20120287063A1 (en) | System and method for selecting objects of electronic device | |
KR101372484B1 (en) | User device which enhanced response speed and response speed enhancing method using data pre-load | |
US20160034027A1 (en) | Optical tracking of a user-guided object for mobile platform user input | |
KR20140105116A (en) | Method for controlling operation and an electronic device thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant | ||
FPAY | Annual fee payment | Payment date: 20170227; Year of fee payment: 4 |
FPAY | Annual fee payment | Payment date: 20180309; Year of fee payment: 5 |
FPAY | Annual fee payment | Payment date: 20190318; Year of fee payment: 6 |