CN104731340B - Cursor position determination method and terminal device - Google Patents

Cursor position determination method and terminal device Download PDF

Info

Publication number
CN104731340B
CN104731340B CN201510150267.3A
Authority
CN
China
Prior art keywords
described
position
cursor
target location
information input
Prior art date
Application number
CN201510150267.3A
Other languages
Chinese (zh)
Other versions
CN104731340A (en)
Inventor
韦在胜
张腾
Original Assignee
努比亚技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 努比亚技术有限公司
Priority to CN201510150267.3A
Publication of CN104731340A
Application granted
Publication of CN104731340B

Links

Abstract

The invention discloses a cursor position determination method. The method comprises the following steps: opening an eye-focus recognition mode in an information input box of a terminal device; and determining the target position of the cursor in the information input box through eye-focus recognition. The invention also discloses a terminal device. The cursor position determination method and terminal device provided by the present invention can quickly position the cursor during information input based on eyeball recognition technology, enabling the user to quickly determine the position at which information is to be inserted. This facilitates insertion, editing and similar operations during information input, thereby improving the user experience.

Description

Cursor position determination method and terminal device

Technical field

The present invention relates to the field of communication technology, and in particular to a cursor position determination method and a terminal device.

Background technology

Existing eyeball recognition technology is commonly used in the following ways: for example, the state of the user's eyes is detected to control the screen-lock timeout of a terminal device, so that as long as the user's eyes are detected to be looking at the screen of a terminal device such as a mobile phone, the screen does not turn off even if the user performs no operation. In addition, eyeball recognition technology has been extended to further applications, such as controlling page scrolling up and down with the eyes; supporting eye-controlled video playback, where the video player automatically pauses as soon as the user looks away and resumes once the gaze returns to the screen; and using the eyes to perform some computer operations, such as controlling the scrolling of an IE page. However, these schemes for controlling video playback, web page browsing and the like have not been applied to the control of text input.

The foregoing is provided only to assist in understanding the technical solution of the present invention, and does not constitute an admission that the foregoing is prior art.

Summary of the invention

The main object of the present invention is to provide a cursor position determination method and a terminal device, aiming to quickly position the cursor during information input based on eyeball recognition technology.

To achieve the above object, the present invention provides a cursor position determination method, which comprises the following steps:

Opening an eye-focus recognition mode in an information input box of a terminal device;

Determining the target position of the cursor in the information input box through eye-focus recognition.

Preferably, the step of determining the target position of the cursor in the information input box through eye-focus recognition includes:

Obtaining the distance between the two eyes, and a first distance and a second distance from the two eyes, respectively, to an initial gaze convergence position;

Calculating the initial position of the cursor in the information input box according to the distance between the eyes, the first distance and the second distance;

Obtaining a horizontal distance component and/or a vertical distance component between the initial gaze convergence position and a shifted gaze convergence position;

Calculating the target position of the cursor in the information input box according to the initial position, the horizontal distance component and/or the vertical distance component.

Preferably, after the step of determining the target position of the cursor in the information input box through eye-focus recognition, the method further includes:

Moving the cursor from the initial position to the target position;

Controlling the terminal device to input information at the target position in the information input box.

Preferably, the step of moving the cursor from the initial position to the target position includes:

Obtaining the speed at which the eye focus moves;

Obtaining, according to a proportional relationship between the speed of eye-focus movement and the cursor movement amplitude, the cursor movement amplitude corresponding to the speed at which the eye focus moves;

Moving the cursor from the initial position to the target position according to the cursor movement amplitude.

Preferably, the step of moving the cursor from the initial position to the target position according to the cursor movement amplitude includes:

Determining whether the distance between the initial position and the target position exceeds a preset distance;

If so, increasing the cursor movement amplitude by means of a side key;

Moving the cursor from the initial position to the target position according to the increased cursor movement amplitude.

Preferably, the step of moving the cursor from the initial position to the target position according to the cursor movement amplitude includes:

Determining whether an intermediate position exists between the initial position and the target position;

If so, obtaining a cursor compensation distance between the intermediate position and the target position;

Automatically compensating the cursor movement amplitude by means of the keyboard, and moving the cursor from the intermediate position to the target position.

In addition, to achieve the above object, the present invention further provides a terminal device, which includes:

A start-up module, configured to open an eye-focus recognition mode in an information input box of the terminal device;

A cursor position determination module, configured to determine the target position of the cursor in the information input box through eye-focus recognition.

Preferably, the cursor position determination module includes: a first acquisition unit, configured to obtain the distance between the two eyes, and a first distance and a second distance from the two eyes, respectively, to the initial gaze convergence position;

A first calculation unit, configured to calculate the initial position of the cursor in the information input box according to the distance between the eyes, the first distance and the second distance;

A second acquisition unit, configured to obtain the horizontal distance component and/or the vertical distance component between the initial gaze convergence position and the shifted gaze convergence position;

A second calculation unit, configured to calculate the target position of the cursor in the information input box according to the initial position, the horizontal distance component and/or the vertical distance component.

Preferably, the terminal device further includes:

A movement module, configured to move the cursor from the initial position to the target position;

An input module, configured to control the terminal device to input information at the target position in the information input box.

Preferably, the movement module includes:

An acquisition unit, configured to obtain the speed at which the eye focus moves;

A calculation unit, configured to obtain, according to the proportional relationship between the speed of eye-focus movement and the cursor movement amplitude, the cursor movement amplitude corresponding to the speed at which the eye focus moves;

A movement unit, configured to move the cursor from the initial position to the target position according to the cursor movement amplitude.

Preferably, the movement unit includes:

A first judging subunit, configured to determine whether the distance between the initial position and the target position exceeds the preset distance;

A first compensation processing subunit, configured to, if so, increase the cursor movement amplitude by means of the side key;

A first movement subunit, configured to move the cursor from the initial position to the target position according to the increased cursor movement amplitude.

Preferably, the movement unit includes:

A second judging subunit, configured to determine whether an intermediate position exists between the initial position and the target position;

A second compensation processing subunit, configured to, if so, obtain the cursor compensation distance between the intermediate position and the target position;

A second movement subunit, configured to automatically compensate the cursor movement amplitude by means of the keyboard and move the cursor from the intermediate position to the target position.

According to the cursor position determination method and terminal device provided by the present invention, an eye-focus recognition mode is opened in the information input box of the terminal device, and the target position of the cursor in the information input box is determined through eye-focus recognition, so that during information input the user can quickly position the cursor simply through eyeball recognition. In this way, the user can quickly determine the position at which information is to be inserted, which facilitates insertion, editing and similar operations during information input, thereby improving the user experience.

Brief description of the drawings

Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing various embodiments of the present invention;

Fig. 2 is a schematic diagram of a wireless communication system of the mobile terminal shown in Fig. 1;

Fig. 3 is a schematic flowchart of an embodiment of the cursor position determination method of the present invention;

Fig. 4 is a detailed schematic flowchart of step S20 in Fig. 3;

Fig. 5 is a schematic flowchart of another embodiment of the cursor position determination method of the present invention;

Fig. 6 is a detailed schematic flowchart of step S30 in Fig. 5;

Fig. 7 is a detailed schematic flowchart of a first embodiment of step S303 in Fig. 6;

Fig. 8 is a detailed schematic flowchart of a second embodiment of step S303 in Fig. 6;

Fig. 9 is a functional block diagram of an embodiment of the terminal device of the present invention;

Fig. 10 is a detailed functional block diagram of the cursor position determination module in Fig. 9;

Fig. 11 is a functional block diagram of another embodiment of the terminal device of the present invention;

Fig. 12 is a detailed functional block diagram of the movement module in Fig. 11;

Fig. 13 is a detailed functional block diagram of a first embodiment of the movement unit in Fig. 12;

Fig. 14 is a detailed functional block diagram of a second embodiment of the movement unit in Fig. 12.

The realization of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.

Detailed description of the invention

It should be understood that the specific embodiments described herein are only intended to explain the present invention and are not intended to limit the present invention.

A terminal implementing various embodiments of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "component" or "unit" used to denote elements are used only to facilitate the description of the present invention and have no specific meaning in themselves. Therefore, "module" and "component" may be used interchangeably.

Terminals may be implemented in various forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable media players) and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. In the following it is assumed that the terminal is a mobile terminal. However, those skilled in the art will understand that, apart from elements used specifically for mobile purposes, the structure according to the embodiments of the present invention can also be applied to terminals of the fixed type.

Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing various embodiments of the present invention.

The mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190 and the like. Fig. 1 shows a mobile terminal with various components, but it should be understood that not all of the illustrated components are required; more or fewer components may alternatively be implemented. The elements of the mobile terminal are described in detail below.

The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114 and a location information module 115.

The broadcast receiving module 111 receives broadcast signals and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to a terminal. The broadcast signals may include TV broadcast signals, radio broadcast signals, data broadcast signals and the like, and may further include broadcast signals combined with TV or radio broadcast signals. The broadcast-related information may also be provided via a mobile communication network, in which case it may be received by the mobile communication module 112. The broadcast signals may exist in various forms; for example, they may exist in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB), an electronic service guide (ESG) of digital video broadcast-handheld (DVB-H), and so on. The broadcast receiving module 111 may receive signal broadcasts using various types of broadcast systems. In particular, it may receive digital broadcasts using digital broadcasting systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcast-handheld (DVB-H), the data broadcasting system of media forward link only (MediaFLO) and integrated services digital broadcasting-terrestrial (ISDB-T). The broadcast receiving module 111 may be constructed to be suitable for the various broadcast systems providing broadcast signals as well as the above digital broadcasting systems. Broadcast signals and/or broadcast-related information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).

The mobile communication module 112 transmits radio signals to and/or receives radio signals from at least one of a base station (e.g. an access point, a Node B, etc.), an external terminal and a server. Such radio signals may include voice call signals, video call signals or various types of data transmitted and/or received according to text and/or multimedia messages.

The wireless Internet module 113 supports wireless Internet access for the mobile terminal. This module may be internally or externally coupled to the terminal. The wireless Internet access technologies involved in this module may include WLAN (wireless LAN) (Wi-Fi), WiBro (wireless broadband), WiMAX (worldwide interoperability for microwave access), HSDPA (high-speed downlink packet access) and the like.

The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technologies include Bluetooth™, radio frequency identification (RFID), Infrared Data Association (IrDA), ultra-wideband (UWB), ZigBee™ and the like.

The location information module 115 is a module for checking or obtaining location information of the mobile terminal. A typical example of the location information module is a GPS (global positioning system) module. According to the current technology, the GPS module 115 calculates distance information from three or more satellites together with accurate time information and applies triangulation to the calculated information, thereby accurately calculating three-dimensional current location information according to longitude, latitude and altitude. At present, the method for calculating position and time information uses three satellites and corrects the error of the calculated position and time information by using another satellite. In addition, the GPS module 115 can calculate speed information by continuously calculating the current location information in real time.

The A/V input unit 120 is used to receive audio or video signals. The A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode, and the processed image frames may be displayed on a display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or another storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided according to the structure of the mobile terminal. The microphone 122 may receive sounds (audio data) in an operating mode such as a phone call mode, a recording mode or a voice recognition mode, and can process such sounds into audio data. In the case of the phone call mode, the processed audio (voice) data may be converted into a format that can be transmitted to a mobile communication base station via the mobile communication module 112 for output. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.

The user input unit 130 may generate key input data according to commands input by the user to control various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information and may include a keyboard, a dome switch, a touch pad (for example, a touch-sensitive component that detects changes in resistance, pressure, capacitance and the like caused by being touched), a jog wheel, a jog stick and the like. In particular, when the touch pad is superimposed on the display unit 151 as a layer, a touch screen may be formed.

The sensing unit 140 detects the current state of the mobile terminal 100 (for example, the open or closed state of the mobile terminal 100), the position of the mobile terminal 100, the presence or absence of the user's contact with the mobile terminal 100 (i.e. touch input), the orientation of the mobile terminal 100, the acceleration or deceleration movement and direction of the mobile terminal 100 and the like, and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 can sense whether the slide-type phone is open or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled to an external device. The sensing unit 140 may include a proximity sensor 141, which will be described below in connection with the touch screen.

The interface unit 170 serves as an interface through which at least one external device can be connected to the mobile terminal 100. For example, the external devices may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port and the like. The identification module may store various information for authenticating the user of the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM) and the like. In addition, a device having an identification module (hereinafter referred to as an "identification device") may take the form of a smart card; therefore, the identification device may be connected to the mobile terminal 100 via a port or other connection means. The interface unit 170 may be used to receive input (e.g. data information, power, etc.) from an external device and transfer the received input to one or more elements within the mobile terminal 100, or may be used to transfer data between the mobile terminal and an external device.

In addition, when the mobile terminal 100 is connected to an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transferred to the mobile terminal. Various command signals or power input from the cradle may serve as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (e.g. audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audible and/or tactile manner. The output unit 150 may include the display unit 151, an audio output module 152, an alarm unit 153 and the like.

The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in the phone call mode, the display unit 151 may display a user interface (UI) or a graphical user interface (GUI) related to a call or other communication (such as text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in the video call mode or the image capture mode, the display unit 151 may display captured and/or received images, a UI or GUI showing the video or images and related functions, and so on.

Meanwhile, when the display unit 151 and the touch pad are superimposed on each other as layers to form a touch screen, the display unit 151 may serve as both an input device and an output device. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin-film transistor LCD (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display and the like. Some of these displays may be constructed to be transparent to allow the user to view from the outside, and may be referred to as transparent displays; a typical transparent display may be, for example, a TOLED (transparent organic light-emitting diode) display. According to a particular desired embodiment, the mobile terminal 100 may include two or more display units (or other display devices); for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect touch input pressure as well as touch input position and touch input area.

The audio output module 152 may, when the mobile terminal is in a mode such as a call signal reception mode, a call mode, a recording mode, a voice recognition mode or a broadcast reception mode, convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound. Moreover, the audio output module 152 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g. a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer and the like.

The alarm unit 153 may provide output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input and the like. In addition to audio or video output, the alarm unit 153 may provide output in a different manner to notify of the occurrence of an event. For example, the alarm unit 153 may provide output in the form of vibration; when a call, a message or some other incoming communication is received, the alarm unit 153 may provide tactile output (i.e. vibration) to notify the user. By providing such tactile output, the user is able to recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 may also provide output notifying of the occurrence of an event via the display unit 151 or the audio output module 152.

The memory 160 may store software programs and the like for the processing and control operations performed by the controller 180, or may temporarily store data that has been output or is to be output (e.g. a phone book, messages, still images, video, etc.). Moreover, the memory 160 may store data on the vibrations and audio signals of various patterns that are output when a touch is applied to the touch screen.

The memory 160 may include at least one type of storage medium, including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g. SD or DX memory, etc.), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disc and the like. Moreover, the mobile terminal 100 may cooperate with a network storage device that performs the storage function of the memory 160 over a network connection.

The controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communication, video calls and the like. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data; the multimedia module 181 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 may perform pattern recognition processing to recognize handwriting input or picture drawing input performed on the touch screen as characters or images.

The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides appropriate power required to operate each element and component.

The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware or any combination thereof. For hardware implementation, the embodiments described herein may be implemented using at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor and an electronic unit designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 180. For software implementation, embodiments such as procedures or functions may be implemented with separate software modules that allow at least one function or operation to be performed. Software code may be implemented by a software application (or program) written in any suitable programming language, and the software code may be stored in the memory 160 and executed by the controller 180.

So far, the mobile terminal has been described in terms of its functions. Below, for the sake of brevity, a slide-type mobile terminal is taken as an example among various types of mobile terminals such as folder-type, bar-type, swing-type and slide-type mobile terminals. Therefore, the present invention can be applied to any type of mobile terminal and is not limited to the slide-type mobile terminal.

The mobile terminal 100 shown in Fig. 1 may be constructed to operate with wired and wireless communication systems, as well as satellite-based communication systems, that transmit data via frames or packets.

A communication system in which a mobile terminal according to the present invention is operable will now be described with reference to Fig. 2.

Such communication systems may use different air interfaces and/or physical layers. For example, the air interfaces used by the communication systems include, for example, frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA), the universal mobile telecommunications system (UMTS) (in particular, long term evolution (LTE)), the global system for mobile communications (GSM) and the like. As a non-limiting example, the following description relates to a CDMA communication system, but such teaching applies equally to other types of systems.

Referring to Fig. 2, a CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275 and a mobile switching center (MSC) 280. The MSC 280 is configured to form an interface with a public switched telephone network (PSTN) 290. The MSC 280 is also configured to form an interface with the BSCs 275, which may be coupled to the base stations 270 via backhaul lines. The backhaul lines may be constructed according to any one of several known interfaces, including, for example, E1/T1, ATM, IP, PPP, frame relay, HDSL, ADSL or xDSL. It will be understood that a system as shown in Fig. 2 may include a plurality of BSCs 275.

Each BS 270 may serve one or more sectors (or regions), each sector covered by an omnidirectional antenna or an antenna pointed in a particular direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be constructed to support a plurality of frequency assignments, and each frequency assignment has a specific spectrum (e.g. 1.25 MHz, 5 MHz, etc.).

The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. The BS 270 may also be referred to as a base transceiver subsystem (BTS) or by other equivalent terms. In such a case, the term "base station" may be used to refer broadly to a single BSC 275 and at least one BS 270. A base station may also be referred to as a "cell site". Alternatively, the individual sectors of a particular BS 270 may be referred to as a plurality of cell sites.

As shown in Fig. 2, a broadcast transmitter (BT) 295 transmits broadcast signals to the mobile terminals 100 operating within the system. The broadcast receiving module 111 shown in Fig. 1 is provided at the mobile terminal 100 to receive the broadcast signals transmitted by the BT 295. In Fig. 2, several global positioning system (GPS) satellites 300 are shown. The satellites 300 help to locate at least one of the plurality of mobile terminals 100.

In Fig. 2, a plurality of satellites 300 are depicted, but it will be understood that useful positioning information may be obtained with any number of satellites. The GPS module 115 shown in Fig. 1 is typically configured to cooperate with the satellites 300 to obtain the desired positioning information. Instead of or in addition to GPS tracking technology, other technologies capable of tracking the location of the mobile terminal may be used. In addition, at least one GPS satellite 300 may selectively or additionally handle satellite DMB transmission.

As one typical operation of the wireless communication system, the BS 270 receives reverse link signals from various mobile terminals 100. The mobile terminals 100 typically engage in calls, messaging and other types of communication. Each reverse link signal received by a particular base station 270 is processed within that particular BS 270, and the resulting data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions including the coordination of soft handoff procedures between BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for forming an interface with the PSTN 290. Similarly, the PSTN 290 forms an interface with the MSC 280, the MSC forms an interface with the BSCs 275, and the BSCs 275 correspondingly control the BSs 270 to transmit forward link signals to the mobile terminals 100.

Based on the above mobile terminal hardware structure and communication system, various embodiments of the cursor position determination method of the present invention are proposed.

In one embodiment, as shown in Fig. 3, the cursor position determination method includes the following steps:

Step S10: opening an eye-focus recognition mode in an information input box of a terminal device;

In this embodiment, when the eye-focus recognition mode of the terminal device is opened, the terminal device pops up an information input box for the user to input and edit information. In other embodiments, the information input box may be opened first, and the eye-focus recognition mode opened later when needed. In this embodiment, the terminal device includes a mobile phone, a tablet computer, a notebook computer and the like. A shortcut key may be provided on the side of the terminal device, such as a mobile phone, to quickly start or close the eye-focus recognition mode. It can be understood that the terminal device can monitor the rotation direction of the user's eyeballs with a built-in or external infrared sensor, and thereby determine the position in the information input box at which information will be inserted.

Step S20: determining the target position of the cursor in the information input box through eye-focus recognition;

In this embodiment, when the eye-focus recognition mode of the terminal device is opened, the information input box can display the cursor mark prominently. The infrared sensor can sense the position in the information input box at which the eyes' gaze converges; specifically, the exact position of the cursor in the information input box can be obtained by calculation. When the eye-focus position moves, the cursor moves accordingly, for example left and right or up and down. When the infrared sensor senses that the gaze stays at a certain convergence position for more than a predetermined time, for example 2 s, that convergence position can be determined to be the target position of the cursor. A rough sketch of this dwell-based selection is given below.
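The following Python sketch illustrates the dwell-based selection described above. The 2-second hold time is taken from the example in the text; the polling loop, the `sample_gaze` callback and the pixel tolerance are assumptions introduced only for illustration and are not part of the patent.

```python
import math
import time

DWELL_SECONDS = 2.0      # hold time taken from the example above
DWELL_RADIUS_PX = 20.0   # assumed tolerance for "staying" at one position

def detect_target_position(sample_gaze):
    """Poll gaze convergence positions until one is held for DWELL_SECONDS.

    `sample_gaze` is a caller-supplied function returning the current
    (x, y) gaze convergence position inside the information input box.
    """
    anchor = sample_gaze()
    anchor_time = time.monotonic()
    while True:
        point = sample_gaze()
        if math.dist(point, anchor) > DWELL_RADIUS_PX:
            # Gaze moved away: restart the dwell timer at the new position.
            anchor, anchor_time = point, time.monotonic()
        elif time.monotonic() - anchor_time >= DWELL_SECONDS:
            return anchor      # held long enough: this is the target position
        time.sleep(0.05)       # ~20 Hz polling
```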

In other embodiments, after opening the eye-focus recognition mode, the determined target position of the cursor may also be used to select returning to the home screen; on the home screen of the terminal device, the determined target position of the cursor may be used to select an application to be started; or the determined target position of the cursor may be used to select a thumbnail file to be opened, and so on.

According to the cursor position determination method provided by the present invention, an eye-focus recognition mode is opened in the information input box of the terminal device, and the target position of the cursor in the information input box is determined through eye-focus recognition, so that during information input the user can quickly position the cursor simply through eyeball recognition. In this way, the user can quickly determine the position at which information is to be inserted, which facilitates insertion, editing and similar operations during information input, thereby improving the user experience.

In one embodiment, as shown in Fig. 4, on the basis of the embodiment of Fig. 3, step S20 includes:

Step S201: obtaining the distance between the two eyes, and a first distance and a second distance from the two eyes, respectively, to the initial gaze convergence position;

In this embodiment, the distance between the eyes may be taken as the distance between the inner canthi of the two eyes, the distance between the outer canthi of the two eyes, the distance between the centers of the two pupils, or the like. The infrared sensor is used to sense and monitor the first distance and the second distance from the two eyes, respectively, to the initial gaze convergence position in the information input box.

Step S202: calculating the initial position of the cursor in the information input box according to the distance between the eyes, the first distance and the second distance;

In this embodiment, taking the pupil centers as an example, let the distance between the eyes be L0, the first distance from the left eye to the initial gaze convergence position be L1, and the second distance from the right eye to the initial gaze convergence position be L2. The memory of the terminal device prestores the following method for computing the initial position (a sketch of the computation is given after this list):

1. With the pupil center of the left eye as the origin, describe a circle in space with radius L1; its trajectory is S1.

2. With the pupil center of the right eye, located at distance L0 from the pupil center of the left eye, as the origin, describe a circle in space with radius L2; its trajectory is S2.

3. The intersection point of S1 and S2 is the initial position.
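As a rough illustration of the construction in steps 1–3 above, the Python sketch below intersects the two circles in a single plane, assuming the left pupil sits at the origin and the right pupil at (L0, 0); a real device would solve the equivalent problem in 3D, and the example distances are invented.

```python
import math

def initial_position(l0, l1, l2):
    """Return the intersection of a circle of radius l1 about the left pupil
    (at the origin) and a circle of radius l2 about the right pupil at
    (l0, 0), keeping the solution with positive y (toward the screen)."""
    # Standard two-circle intersection: x from the radical line, y from l1.
    x = (l0 ** 2 + l1 ** 2 - l2 ** 2) / (2 * l0)
    y_squared = l1 ** 2 - x ** 2
    if y_squared < 0:
        raise ValueError("the circles do not intersect: inconsistent distances")
    return x, math.sqrt(y_squared)

# Example: eyes 6.5 cm apart, both gaze distances 40 cm
# -> the convergence point lies midway between the eyes, about 39.9 cm away.
print(initial_position(6.5, 40.0, 40.0))
```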

Step S203: obtaining the horizontal distance component and/or the vertical distance component between the initial gaze convergence position and the shifted gaze convergence position;

In this embodiment, when the initial gaze convergence position and the shifted gaze convergence position are in the same row, the horizontal distance component between the initial gaze convergence position and the shifted gaze convergence position is obtained; when the initial gaze convergence position and the shifted gaze convergence position are in the same column, the vertical distance component between the initial gaze convergence position and the shifted gaze convergence position is obtained; when the initial gaze convergence position and the shifted gaze convergence position are in neither the same row nor the same column, both the horizontal distance component and the vertical distance component between the initial gaze convergence position and the shifted gaze convergence position are obtained.

Step S204: calculating the target position of the cursor in the information input box according to the initial position, the horizontal distance component and/or the vertical distance component.

In this embodiment, let the horizontal distance component be X and the vertical distance component be Y. When the initial gaze convergence position and the shifted gaze convergence position are in the same row, the target position is the position reached by moving the initial position left or right by the distance component X; when they are in the same column, the target position is the position reached by moving the initial position up or down by the distance component Y; when they are in neither the same row nor the same column, the target position is the position reached by moving the initial position diagonally by the displacement component √(X² + Y²). It can be understood that the specific direction in which the cursor moves is determined by the direction of movement from the initial gaze convergence position to the shifted gaze convergence position. A sketch of this computation is given below.
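A minimal sketch of step S204 under the same assumptions: the initial position and the distance components are given in screen units, and the signs of the components are assumed to encode the direction of movement.

```python
import math

def target_position(initial, dx, dy):
    """Shift the cursor's initial (x, y) position by the horizontal component
    dx and/or the vertical component dy between the initial and the shifted
    gaze convergence positions; also return the diagonal displacement."""
    x0, y0 = initial
    target = (x0 + dx, y0 + dy)
    displacement = math.hypot(dx, dy)   # sqrt(X^2 + Y^2), as described above
    return target, displacement

# Same row (dy = 0): a purely horizontal move; otherwise a diagonal move.
print(target_position((100.0, 50.0), 30.0, 0.0))    # ((130.0, 50.0), 30.0)
print(target_position((100.0, 50.0), 30.0, -12.0))  # ((130.0, 38.0), ~32.3)
```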

In one embodiment, as shown in Fig. 5, on the basis of the embodiment of Fig. 3, the method further includes, after step S20:

Step S30: moving the cursor from the initial position to the target position;

In this embodiment, the cursor is moved by the corresponding distance component in the direction from the initial gaze convergence position to the shifted gaze convergence position, so that the cursor moves from the initial position to the target position.

Step S40: controlling the terminal device to input information at the target position in the information input box.

In this embodiment, once the cursor has moved from the initial position to the target position, the information to be inserted is input into the information input box via a physical or virtual keyboard. It can be understood that when the user, having completed information input, finds that some information still needs to be input, the eye-focus recognition mode can be opened again and the cursor moved to the target position, so that the inserted information can be entered. It can also be understood that the direction keys or other user-defined keys of the physical or virtual keyboard can likewise control the movement of the cursor. The input information includes letters, words, symbols, numbers, emoticon pictures and the like.

In one embodiment, as shown in Fig. 6, on the basis of the embodiment of Fig. 5, step S30 includes:

Step S301: obtaining the speed at which the eye focus moves;

Step S302: obtaining, according to the proportional relationship between the speed of eye-focus movement and the cursor movement amplitude, the cursor movement amplitude corresponding to the speed at which the eye focus moves;

Step S303: moving the cursor from the initial position to the target position according to the cursor movement amplitude.

In this embodiment, the speed at which the eye focus moves is obtained, and the cursor movement amplitude corresponding to that speed is obtained according to the proportional relationship, prestored in the terminal device, between the speed of eye-focus movement and the cursor movement amplitude. That is, the faster the eye focus moves, the larger the cursor movement amplitude, which improves the efficiency of controlling the cursor mark. A sketch of this mapping is given below.
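A sketch of the proportional relationship described above. The proportionality constant is an assumption; it is chosen here as a 3-second window so that the numbers match the worked example of the first embodiment below.

```python
def cursor_amplitude(focus_speed_chars_per_s, window_s=3.0):
    """Cursor movement amplitude (in characters) proportional to the speed of
    eye-focus movement; window_s is the assumed proportionality constant."""
    return focus_speed_chars_per_s * window_s

# 5 characters/second maps to a 15-character amplitude over 3 seconds,
# matching the figures used in the side-key example below.
print(cursor_amplitude(5.0))   # 15.0
```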

In a first embodiment, as shown in Fig. 7, on the basis of the embodiment of Fig. 6, step S303 includes:

Step S3031: determining whether the distance between the initial position and the target position exceeds a preset distance;

Step S3032: if so, increasing the cursor movement amplitude by means of a side key;

Step S3033: moving the cursor from the initial position to the target position according to the increased cursor movement amplitude.

In this preferred embodiment, if the obtained speed of eye-focus movement is relatively slow, i.e. the corresponding cursor movement amplitude is small, and the distance between the initial position and the target position is large, the volume adjustment key "+" provided on the side of the terminal device, such as a mobile phone, can be used to increase the cursor movement amplitude, and the volume adjustment key "-" can be used to reduce it. For example, suppose the eye focus moves at 5 characters per second, so that the cursor movement amplitude is 15 characters in 3 seconds. When the character distance between the initial position and the target position exceeds a predetermined value, for example 50 characters, pressing the adjustment key "+" increases the cursor movement amplitude, so that it becomes 25 characters in 3 seconds. In this way, the cursor position can be located not only quickly but also accurately. A sketch of this adjustment is given below.
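A minimal sketch of steps S3031–S3032 using the example figures from the paragraph above (the 50-character preset distance and the boosted 25-character amplitude); how the side key is actually wired to this check is device-specific and not shown.

```python
PRESET_DISTANCE_CHARS = 50    # example preset distance from the text
BOOSTED_AMPLITUDE_CHARS = 25  # amplitude after pressing the "+" side key

def amplitude_after_side_key(distance_chars, base_amplitude_chars):
    """If the character distance between the initial and target positions
    exceeds the preset distance, increase the cursor movement amplitude
    (as when the volume "+" side key is pressed); otherwise keep it."""
    if distance_chars > PRESET_DISTANCE_CHARS:
        return max(base_amplitude_chars, BOOSTED_AMPLITUDE_CHARS)
    return base_amplitude_chars

print(amplitude_after_side_key(60, 15))   # 25 -> boosted
print(amplitude_after_side_key(30, 15))   # 15 -> unchanged
```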

In a second embodiment, as shown in Fig. 8, on the basis of the embodiment of Fig. 6, step S303 includes:

Step S3134: determining whether an intermediate position exists between the initial position and the target position;

Step S3135: if so, obtaining the cursor compensation distance between the intermediate position and the target position;

Step S3136: automatically compensating the cursor movement amplitude by means of the keyboard, and moving the cursor from the intermediate position to the target position.

In this preferred embodiment, in actual operation there may be a problem of relatively low sensitivity: when the gaze moves quickly from the initial convergence position to the shifted convergence position, the cursor does not necessarily move to the exact position, for example ending up a few characters short of, or a few characters beyond, the target position. In that case, the position the cursor moves to is not the target position but an intermediate position. The cursor compensation distance between the intermediate position and the target position is obtained; if the intermediate position is before the target position, the virtual or physical keyboard is used to move the cursor onward, via the direction keys, to the exact target position; if the intermediate position is beyond the target position, the virtual or physical keyboard is used to move the cursor back, via the direction keys, to the exact target position. In this way, the sensitivity of eye-focus recognition can be improved. A sketch of this compensation is given below.
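The following sketch illustrates the keyboard compensation of steps S3134–S3136, treating positions as character indices in the input box; the key names are placeholders, and how the presses are injected is left to the platform.

```python
def compensation_key_presses(intermediate_index, target_index):
    """Return the direction-key presses that move the cursor from the
    intermediate position it actually reached to the target position."""
    gap = target_index - intermediate_index
    if gap == 0:
        return []                          # already on target
    key = "RIGHT" if gap > 0 else "LEFT"   # short of the target -> move onward
    return [key] * abs(gap)

# Cursor stopped 3 characters short of the target position:
print(compensation_key_presses(47, 50))   # ['RIGHT', 'RIGHT', 'RIGHT']
```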

Based on the above mobile terminal hardware structure and communication system, various embodiments of the terminal device of the present invention are proposed.

In one embodiment, as shown in Fig. 9, the terminal device 1 includes:

A start-up module 10, configured to open an eye-focus recognition mode in an information input box of the terminal device;

In this embodiment, when the eye-focus recognition mode of the terminal device is opened, the terminal device pops up an information input box for the user to input and edit information. In other embodiments, the information input box may be opened first, and the eye-focus recognition mode opened later when needed. In this embodiment, the terminal device includes a mobile phone, a tablet computer, a notebook computer and the like. A shortcut key may be provided on the side of the terminal device, such as a mobile phone, to quickly start or close the eye-focus recognition mode. It can be understood that the terminal device can monitor the rotation direction of the user's eyeballs with a built-in or external infrared sensor, and thereby determine the position in the information input box at which information will be inserted.

A cursor position determination module 20, configured to determine the target position of the cursor in the information input box through eye-focus recognition.

In this embodiment, when the eye-focus recognition mode of the terminal device is opened, the information input box can display the cursor mark prominently. The infrared sensor can sense the position in the information input box at which the eyes' gaze converges; specifically, the exact position of the cursor in the information input box can be obtained by calculation. When the eye-focus position moves, the cursor moves accordingly, for example left and right or up and down. When the infrared sensor senses that the gaze stays at a certain convergence position for more than a predetermined time, for example 2 s, that convergence position can be determined to be the target position of the cursor.

In other embodiments, after opening the eye-focus recognition mode, the determined target position of the cursor may also be used to select returning to the home screen; on the home screen of the terminal device, the determined target position of the cursor may be used to select an application to be started; or the determined target position of the cursor may be used to select a thumbnail file to be opened, and so on.

According to the terminal device provided by the present invention, an eye-focus recognition mode is opened in the information input box of the terminal device, and eye-focus recognition is used to determine the target position of the cursor in the information input box, so that during information input the user can quickly position the cursor simply through eyeball recognition. In this way, the user can quickly determine the position at which information is to be inserted, which facilitates insertion, editing and similar operations during information input, thereby improving the user experience.

In a first embodiment, as shown in Fig. 10, on the basis of the embodiment of Fig. 9, the cursor position determination module 20 includes:

A first acquisition unit 201, configured to obtain the distance between the two eyes, and a first distance and a second distance from the two eyes, respectively, to the initial gaze convergence position;

In this embodiment, the distance between the eyes may be taken as the distance between the inner canthi of the two eyes, the distance between the outer canthi of the two eyes, the distance between the centers of the two pupils, or the like. The infrared sensor is used to sense and monitor the first distance and the second distance from the two eyes, respectively, to the gaze convergence position in the information input box.

A first calculation unit 202, configured to calculate the initial position of the cursor in the information input box according to the distance between the eyes, the first distance and the second distance.

In this embodiment, taking the pupil centers as an example, let the distance between the eyes be L0, the first distance from the center of the left pupil to the gaze convergence position be L1, and the second distance from the center of the right pupil to the gaze convergence position be L2. The memory of the terminal device prestores the following method for computing the initial position:

1. With the pupil center of the left eye as the origin, describe a circle in space with radius L1; its trajectory is S1.

2. With the pupil center of the right eye, located at distance L0 from the pupil center of the left eye, as the origin, describe a circle in space with radius L2; its trajectory is S2.

3. The intersection point of S1 and S2 is the initial position.

A second acquisition unit 203, configured to obtain the horizontal distance component and/or the vertical distance component between the initial gaze convergence position and the shifted gaze convergence position;

In this embodiment, when the initial gaze convergence position and the shifted gaze convergence position are in the same row, the horizontal distance component between the two positions is obtained; when they are in the same column, the vertical distance component between the two positions is obtained; when they are in neither the same row nor the same column, both the horizontal distance component and the vertical distance component between the two positions are obtained.

A second calculation unit 204, configured to calculate the target position of the cursor in the information input box according to the initial position, the horizontal distance component and/or the vertical distance component.

In this embodiment, let the horizontal distance component be X and the vertical distance component be Y. When the initial gaze convergence position and the shifted gaze convergence position are in the same row, the target position is the position reached by moving the initial position left or right by the distance component X; when they are in the same column, the target position is the position reached by moving the initial position up or down by the distance component Y; when they are in neither the same row nor the same column, the target position is the position reached by moving the initial position diagonally by the displacement component √(X² + Y²). It can be understood that the specific direction in which the cursor moves is determined by the direction of movement from the initial gaze convergence position to the shifted gaze convergence position.

In one embodiment, as shown in Fig. 11, on the basis of the embodiment of Fig. 9, the terminal device 1 further includes:

A movement module 30, configured to move the cursor from the initial position to the target position;

In this embodiment, the cursor is moved by the corresponding distance component in the direction from the initial gaze convergence position to the shifted gaze convergence position, so that the cursor moves from the initial position to the target position.

An input module 40, configured to control the terminal device to input information at the target position in the information input box.

In this embodiment, once the cursor has moved from the initial position to the target position, the information to be inserted is input into the information input box via a physical or virtual keyboard. It can be understood that when the user, having completed information input, finds that some information still needs to be input, the eye-focus recognition mode can be opened again and the cursor moved to the target position, so that the inserted information can be entered. It can also be understood that the direction keys or other user-defined keys of the physical or virtual keyboard can likewise control the movement of the cursor. The input information includes letters, words, symbols, numbers, emoticon pictures and the like.

In one embodiment, as shown in Fig. 12, on the basis of the embodiment of Fig. 11, the movement module 30 includes:

An acquisition unit 301, configured to obtain the speed at which the eye focus moves;

A calculation unit 302, configured to obtain, according to the proportional relationship between the speed of eye-focus movement and the cursor movement amplitude, the cursor movement amplitude corresponding to the speed at which the eye focus moves;

A movement unit 303, configured to move the cursor from the initial position to the target position according to the cursor movement amplitude.

In this embodiment, the speed at which the eye focus moves is obtained, and the cursor movement amplitude corresponding to that speed is obtained according to the proportional relationship, prestored in the terminal device, between the speed of eye-focus movement and the cursor movement amplitude. That is, the faster the eye focus moves, the larger the cursor movement amplitude, which improves the efficiency of controlling the cursor mark.

In the first embodiment, as shown in figure 13, on the basis of the embodiment of above-mentioned Figure 12, this reality Executing in example, described mobile unit 303 includes:

First judgment sub-unit 3031, for judging that the distance between described initial position and target location is No exceed predeterminable range;

First compensation deals subelement 3032, moves width for the most then increasing described cursor by side switch Degree;

First mover unit 3033, for according to the described cursor mobile range increased, by described cursor Move to target location from described initial position.

In this preferred embodiment, if the acquired eye-focus movement speed is relatively slow, i.e. the corresponding cursor movement amplitude is small, and the distance between the initial position and the target location is large, the volume "+" key provided on the side of the terminal device, for example a mobile phone, can be used to increase the cursor movement amplitude, and the volume "-" key can be used to reduce it. For example, assume the eye focus moves at 5 characters per second, so that the cursor movement amplitude within 3 seconds is 15 characters; when the character distance between the initial position and the target location exceeds a predetermined value, say 50 characters, pressing the "+" key increases the cursor movement amplitude to a compensated value of 25 characters within 3 seconds. In this way, the cursor can not only be positioned quickly but also be moved to its destination accurately.
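The side-key adjustment can be sketched as follows, reusing the example figures above (a 50-character preset distance and an amplitude raised from 15 to 25 characters per 3-second window); the method and constant names are assumptions for illustration only.

```java
/**
 * Sketch of the side-key compensation described in this embodiment, using its
 * example figures. Key handling and identifiers are assumptions.
 */
public final class SideKeyCompensator {

    private static final int PRESET_DISTANCE_CHARS = 50;
    private static final int STEP_CHARS = 10;   // assumed boost per "+" press

    /**
     * Returns the amplitude to use for the current window. When the character
     * distance to the target exceeds the preset distance, the volume "+" side
     * key raises the amplitude and the volume "-" key lowers it.
     */
    public static int adjust(int baseAmplitude, int distanceChars,
                             boolean volumeUp, boolean volumeDown) {
        if (distanceChars <= PRESET_DISTANCE_CHARS) {
            return baseAmplitude;                      // e.g. stay at 15 characters per 3 s
        }
        int amplitude = baseAmplitude;
        if (volumeUp)   amplitude += STEP_CHARS;       // e.g. raise 15 to 25 characters
        if (volumeDown) amplitude = Math.max(1, amplitude - STEP_CHARS);
        return amplitude;
    }
}
```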

In a second embodiment, as shown in Fig. 14, on the basis of the embodiment of Fig. 12 described above, the moving unit 303 includes:

A second judging subunit 3034, configured to judge whether an intermediate position exists between the initial position and the target location;

A second compensation processing subunit 3035, configured to, if so, obtain the cursor compensation distance between the intermediate position and the target location;

A second moving subunit 3036, configured to compensate the cursor movement amplitude automatically by means of the keyboard and move the cursor from the intermediate position to the target location.

In this preferred embodiment, because the recognition sensitivity may be relatively low in actual operation, the cursor may not land on the exact position when the gaze shifts quickly from the initial convergence position to the shifted convergence position, for example stopping a few characters before or after the target location. In that case the position the cursor has moved to is not the target location but an intermediate position. The cursor compensation distance between the intermediate position and the target location is then obtained; if the intermediate position is in front of the target location, the virtual or physical keyboard is used to move the cursor backward to the exact target location by means of the arrow keys; if the intermediate position is behind the target location, the virtual or physical keyboard is used to move the cursor forward to the exact target location by means of the arrow keys. In this way, the effective sensitivity of eye-focus recognition can be improved.
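A minimal sketch of this compensation step is given below; the one-character-per-arrow-key step size and all identifiers are assumptions, since the embodiment only states that the arrow keys move the cursor forward or backward to the exact target location.

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Sketch of the intermediate-position compensation: if the cursor stopped a
 * few characters short of (or past) the target, the remaining distance is
 * closed with arrow-key moves. Identifiers are illustrative only.
 */
public final class GazeSensitivityCompensator {

    public enum ArrowKey { LEFT, RIGHT }

    /** Emits the arrow-key presses needed to move from the intermediate
     *  character index to the target character index. */
    public static List<ArrowKey> compensate(int intermediateIndex, int targetIndex) {
        int compensation = targetIndex - intermediateIndex;  // cursor compensation distance
        ArrowKey key = compensation >= 0 ? ArrowKey.RIGHT : ArrowKey.LEFT;
        List<ArrowKey> presses = new ArrayList<>();
        for (int i = 0; i < Math.abs(compensation); i++) {
            presses.add(key);   // each press moves the cursor one character toward the target
        }
        return presses;
    }
}
```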

It should be noted that, as used herein, the terms "include", "comprise", and any variants thereof are intended to be non-exclusive, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. In the absence of further limitation, an element defined by the statement "including a ..." does not exclude the presence of other identical elements in the process, method, article, or device that includes that element.

The serial numbers of the above embodiments of the present invention are for description only and do not indicate the relative merits of the embodiments.

Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments may be implemented by software plus a necessary general-purpose hardware platform, and may of course also be implemented by hardware, but in many cases the former is the preferable implementation. Based on such an understanding, the technical solution of the present invention, or in other words the part of it that contributes over the prior art, may be embodied in the form of a software product. The computer software product is stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) and includes a number of instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present invention.

The above are only preferred embodiments of the present invention and do not thereby limit the scope of its claims. Any equivalent structural or process transformation made using the content of the description and drawings of the present invention, or any direct or indirect application in other related technical fields, is likewise included within the scope of patent protection of the present invention.

Claims (10)

1. A cursor position determination method, characterized in that the cursor position determination method comprises the following steps:
opening an eye-focus recognition mode in an information input box of a terminal device;
acquiring, through eye-focus recognition, the gaze convergence position of the eyeball in the information input box, and calculating a target location of a cursor in the information input box according to the convergence position so as to perform information input;
wherein the step of acquiring, through eye-focus recognition, the gaze convergence position of the eyeball in the information input box and calculating the target location of the cursor in the information input box according to the convergence position so as to perform information input comprises:
acquiring the spacing between the two eyes and, respectively, a first distance and a second distance from the two eyes to an initial gaze convergence position;
calculating an initial position of the cursor in the information input box according to the spacing, the first distance, and the second distance;
acquiring a horizontal distance component and/or a vertical distance component between the initial gaze convergence position and a shifted gaze convergence position; and
calculating the target location of the cursor in the information input box according to the initial position and the horizontal distance component and/or vertical distance component.
2. The cursor position determination method as claimed in claim 1, characterized in that, after the step of acquiring, through eye-focus recognition, the gaze convergence position of the eyeball in the information input box and calculating the target location of the cursor in the information input box according to the convergence position so as to perform information input, the method further comprises:
moving the cursor from the initial position to the target location; and
controlling the terminal device to input information at the target location in the information input box.
3. The cursor position determination method as claimed in claim 2, characterized in that the step of moving the cursor from the initial position to the target location comprises:
acquiring the speed at which the eye focus moves;
obtaining, according to the proportional relationship between eye-focus movement speed and cursor movement amplitude, the cursor movement amplitude corresponding to the speed at which the eye focus moves; and
moving the cursor from the initial position to the target location according to the cursor movement amplitude.
4. The cursor position determination method as claimed in claim 3, characterized in that the step of moving the cursor from the initial position to the target location according to the cursor movement amplitude comprises:
judging whether the distance between the initial position and the target location exceeds a preset distance;
if so, increasing the cursor movement amplitude by means of a side key; and
moving the cursor from the initial position to the target location according to the increased cursor movement amplitude.
5. The cursor position determination method as claimed in claim 3, characterized in that the step of moving the cursor from the initial position to the target location according to the cursor movement amplitude comprises:
judging whether an intermediate position exists between the initial position and the target location;
if so, obtaining a cursor compensation distance between the intermediate position and the target location; and
compensating the cursor movement amplitude automatically by means of a keyboard, and moving the cursor from the intermediate position to the target location.
6. A terminal device, characterized in that the terminal device comprises:
a start module, configured to open an eye-focus recognition mode in an information input box of the terminal device; and
a cursor position determination module, configured to acquire, through eye-focus recognition, the gaze convergence position of the eyeball in the information input box, and to calculate a target location of a cursor in the information input box according to the convergence position so as to perform information input;
wherein the cursor position determination module comprises:
a first acquiring unit, configured to acquire the spacing between the two eyes and, respectively, a first distance and a second distance from the two eyes to an initial gaze convergence position;
a first computing unit, configured to calculate an initial position of the cursor in the information input box according to the spacing, the first distance, and the second distance;
a second acquiring unit, configured to acquire a horizontal distance component and/or a vertical distance component between the initial gaze convergence position and a shifted gaze convergence position; and
a second computing unit, configured to calculate the target location of the cursor in the information input box according to the initial position and the horizontal distance component and/or vertical distance component.
7. The terminal device as claimed in claim 6, characterized in that the terminal device further comprises:
a moving module, configured to move the cursor from the initial position to the target location; and
an input module, configured to control the terminal device to input information at the target location in the information input box.
8. The terminal device as claimed in claim 7, characterized in that the moving module comprises:
an acquiring unit, configured to acquire the speed at which the eye focus moves;
a computing unit, configured to obtain, according to the proportional relationship between eye-focus movement speed and cursor movement amplitude, the cursor movement amplitude corresponding to the speed at which the eye focus moves; and
a moving unit, configured to move the cursor from the initial position to the target location according to the cursor movement amplitude.
9. The terminal device as claimed in claim 8, characterized in that the moving unit comprises:
a first judging subunit, configured to judge whether the distance between the initial position and the target location exceeds a preset distance;
a first compensation processing subunit, configured to, if so, increase the cursor movement amplitude by means of a side key; and
a first moving subunit, configured to move the cursor from the initial position to the target location according to the increased cursor movement amplitude.
10. The terminal device as claimed in claim 8, characterized in that the moving unit comprises:
a second judging subunit, configured to judge whether an intermediate position exists between the initial position and the target location;
a second compensation processing subunit, configured to, if so, obtain a cursor compensation distance between the intermediate position and the target location; and
a second moving subunit, configured to compensate the cursor movement amplitude automatically by means of a keyboard and move the cursor from the intermediate position to the target location.
CN201510150267.3A 2015-03-31 2015-03-31 Cursor position determines method and terminal device CN104731340B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510150267.3A CN104731340B (en) 2015-03-31 2015-03-31 Cursor position determines method and terminal device

Publications (2)

Publication Number Publication Date
CN104731340A CN104731340A (en) 2015-06-24
CN104731340B true CN104731340B (en) 2016-08-17

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101201695A (en) * 2006-12-26 2008-06-18 谢振华 Mouse system for extracting and tracing based on ocular movement characteristic
CN102662476A (en) * 2012-04-20 2012-09-12 天津大学 Gaze estimation method
CN102981616A (en) * 2012-11-06 2013-03-20 中兴通讯股份有限公司 Identification method and identification system and computer capable of enhancing reality objects

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140049462A1 (en) * 2012-08-20 2014-02-20 Google Inc. User interface element focus based on user's gaze

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant