CN110196640A - Operation control method and terminal - Google Patents

Operation control method and terminal

Info

Publication number
CN110196640A
Authority
CN
China
Prior art keywords
information
coordinate
screen
eyeball
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910467860.9A
Other languages
Chinese (zh)
Inventor
李沛德
马韶靖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201910467860.9A
Publication of CN110196640A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/193 Preprocessing; Feature extraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

Embodiments of the present invention provide an operation control method and a terminal. The method comprises: obtaining eye motion information of a user through true feeling pixels in a front camera module of the terminal, where the true feeling pixels are used to detect and output contour information of moving objects; and executing a target control operation associated with the eye motion information. In the embodiments of the present invention, the true feeling pixels capture the movement of the user's eyes clearly and precisely, so the motion information of the user's eyes can be obtained in real time. This improves the real-time performance of eye motion detection and thereby enables more accurate eye-based control.

Description

Operation control method and terminal
Technical field
Embodiments of the present invention relate to the field of communication technology, and in particular to an operation control method and a terminal.
Background technique
An eyeball-information extraction and feedback control system for terminals is one application of motion-information extraction. To realize eyeball control, the industry mainly uses image recognition algorithms. This approach is limited by frame rate and image quality, so its range of applicability is narrow and it suffers from large recognition errors and recognition delays. In bright environments it is limited by platform throughput; in low-light environments it is limited by exposure time, so detection is not real-time and problems such as detection delay arise.
Summary of the invention
Embodiments of the present invention provide an operation control method and a terminal, to solve the problem in prior-art eye-control schemes that eye-movement recognition has poor real-time performance.
In order to solve the above technical problem, the present invention adopts the following technical solutions.
In a first aspect, an operation control method is provided, comprising:
obtaining eye motion information of a user through true feeling pixels in a front camera module of a terminal, where the true feeling pixels are used to detect and output contour information of moving objects; and
executing a target control operation associated with the eye motion information.
In a second aspect, a terminal is provided, comprising:
a first obtaining module, configured to obtain eye motion information of a user through true feeling pixels in a front camera module of the terminal, where the true feeling pixels are used to detect and output contour information of moving objects; and
a processing module, configured to execute a target control operation associated with the eye motion information obtained by the first obtaining module.
In a third aspect, a terminal is provided, comprising a processor, a memory, and a computer program stored on the memory and runnable on the processor, where the computer program, when executed by the processor, implements the steps of the operation control method described above.
In a fourth aspect, a computer-readable storage medium is provided, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the operation control method described above.
In the embodiments of the present invention, the true feeling pixels capture the movement of the user's eyes clearly and precisely, so the motion information of the user's eyes can be obtained in real time. This improves the real-time performance of eye motion detection and thereby enables more accurate eye-based control.
Detailed description of the invention
Fig. 1 is a flow diagram of the operation control method provided by an embodiment of the present invention;
Fig. 2 is a schematic diagram of true feeling pixels on an image sensor provided by an embodiment of the present invention;
Fig. 3 is a schematic comparison of a moving object as captured by normal pixels and by true feeling pixels, provided by an embodiment of the present invention;
Fig. 4 is a schematic diagram of an eyeball image obtained by true feeling pixels, provided by an embodiment of the present invention;
Fig. 5 is a schematic diagram of blinking, provided by an embodiment of the present invention;
Fig. 6 is a schematic diagram of calibration points, provided by an embodiment of the present invention;
Fig. 7 is a schematic diagram of the eye state when the eyeball gazes at each calibration point, provided by an embodiment of the present invention;
Fig. 8 is a schematic diagram of the eyeball image formed on the image sensor when the eyeball gazes at a calibration point, provided by an embodiment of the present invention;
Fig. 9 is a schematic diagram of the coordinate relationship between a calibration point and the eyeball image, provided by an embodiment of the present invention;
Fig. 10 is a circuit diagram of a true feeling pixel provided by an embodiment of the present invention;
Fig. 11 is a schematic diagram of a squint (strabismus), provided by an embodiment of the present invention;
Fig. 12 is a schematic comparison of squinting and gazing at the same-direction calibration point, provided by an embodiment of the present invention;
Fig. 13 is the first block diagram of a terminal provided by an embodiment of the present invention;
Fig. 14 is the second block diagram of a terminal provided by an embodiment of the present invention.
Specific embodiment
Exemplary embodiments of the present invention are described in more detail below with reference to the accompanying drawings. Although the accompanying drawings show exemplary embodiments of the present invention, it should be understood that the present invention may be implemented in various forms and should not be limited by the embodiments set forth here. Rather, these embodiments are provided so that the present invention can be understood thoroughly and its scope can be fully conveyed to those skilled in the art.
According to one aspect of the embodiments of the present invention, an operation control method is provided.
Referring to Fig. 1, the operation control method includes:
Step 101: obtaining eye motion information of a user through true feeling pixels in a front camera module of a terminal.
Referring to Fig. 2, the true feeling pixels described here are disposed on the image sensor in the front camera module (shown as A in Fig. 2). There are at least two true feeling pixels, distributed on the image sensor at a preset density, and they are used to detect and output the contour information of moving objects.
A true feeling pixel differs from a normal pixel as follows: normal pixels integrate optical information over a period of time (related to the frame rate) and are then read out one by one in sequence, whereas true feeling pixels can each independently output the contour of an object whose brightness changes. At the pixel clock frequency, a true feeling pixel senses changes in ambient brightness in real time, converting shifts in ambient brightness into changes in current and then into changes in a digital signal. If the change in a true feeling pixel's digital signal exceeds preset thresholds VH or VL (VH and VL are, respectively, the pixel's digital signal value at the previous clock tick plus or minus a threshold), the pixel can request a readout from the system and output a data packet containing coordinate information, brightness information, and time information. As shown in Fig. 3, the left image is the moving object as seen by a person, the top-right image is the moving object as captured by normal pixels, and the bottom-right image is the moving object as captured by true feeling pixels. As can be seen from Fig. 3, normal pixels cannot represent the trajectory and state of the moving object well, while true feeling pixels capture the trajectory and state of the object clearly and precisely.
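The per-pixel readout rule described above can be sketched in a few lines: a pixel fires an event only when its digital signal leaves the band between the previous reading minus a threshold (VL) and plus a threshold (VH). This is a minimal illustration under those assumptions; the function name, sample values, and packet layout are invented for the example, not taken from the patent.

```python
# Hypothetical sketch of the true feeling pixel readout rule: an event is
# reported only when the digital signal moves outside the band
# [previous value - threshold, previous value + threshold].

def pixel_events(samples, threshold, timestamps=None):
    """Return (time, value, delta) packets for samples crossing VH or VL."""
    if timestamps is None:
        timestamps = range(len(samples))
    events = []
    prev = samples[0]
    for t, value in zip(timestamps, samples):
        vh, vl = prev + threshold, prev - threshold  # band tracks last reading
        if value > vh or value < vl:
            events.append((t, value, value - prev))  # coordinate info would join here
        prev = value  # reference updates at every clock tick
    return events

# A slow drift below the threshold produces no events; a sudden edge does.
flat = pixel_events([100, 101, 102, 103], threshold=5)
edge = pixel_events([100, 101, 140, 141], threshold=5)
print(len(flat), len(edge))  # 0 1
```

This is why only moving contours are reported: static background stays inside the band and is never read out.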
In the embodiments of the present invention, after it is detected that the terminal has entered the eye-control mode, the motion information of the user's eyes is obtained in real time through the true feeling pixels on the image sensor of the front camera module, so that the terminal can be operated and controlled through the eye motion information.
Step 102: executing a target control operation associated with the eye motion information.
In this step, a target control operation associated with the eye motion information obtained by the true feeling pixels is determined according to that information, and the target control operation is executed so as to operate the terminal.
In the embodiments of the present invention, the true feeling pixels capture the movement of the user's eyes clearly and precisely, so the motion information of the user's eyes can be obtained in real time. This improves the real-time performance of eye motion detection and thereby enables more accurate eye-based control.
Optionally, the eye motion information in the embodiments of the present invention includes at least one of eyeball movement information and blink information.
The blink information described here includes at least one of: left-eye-only blink information, right-eye-only blink information, and simultaneous left- and right-eye blink information. Left-eye blink information refers to the information collected when the left eye blinks while the right eye stays open, that is, when only the left eye blinks. Similarly, right-eye blink information refers to the information collected when the right eye blinks while the left eye stays open, that is, when only the right eye blinks. As shown in Fig. 5, the first to third rows are, respectively, schematic diagrams of a left-eye blink, a right-eye blink, and a simultaneous blink of both eyes. To distinguish deliberate blinks from normal ones, the blink information described here is collected and generated only when the duration of the user's blink action is greater than or equal to a preset threshold (for example, 0.5 seconds).
In the embodiments of the present invention, by obtaining the movement information of the eyeball, the screen position the eyeball is gazing at can be determined, and operations such as focusing for a photograph or executing a photographing instruction can be performed at that position, which makes it easier for the user to operate the terminal.
In the embodiments of the present invention, by obtaining blink information, control operations associated with the blink information can be executed according to preset operation instructions. For example, left-eye blink information may be assigned to a click instruction for a target object, right-eye blink information to an interface-back instruction, and a double blink (the left and right eyes blinking simultaneously) to a return-to-main-interface instruction. Then, when the detected eye motion information is a left-eye blink, the click instruction for the target object is executed; when it is a right-eye blink, the interface-back instruction is executed; and when it is a simultaneous blink of both eyes, the return-to-main-interface instruction is executed. In this way the user can control the terminal by blinking, without manual operation, which makes the terminal easier to use. It should be understood, of course, that the control instructions assigned to blink information above can be adjusted as needed; for example, a right-eye-only blink could be assigned to the click instruction for the target object, a simultaneous blink of both eyes to the interface-back instruction, and a left-eye-only blink to the return-to-main-interface instruction.
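The blink handling above combines two rules: a minimum duration (so ordinary blinks are ignored) and a lookup from which eyes blinked to a control instruction. Below is a small sketch following the example assignment in the text; all identifiers and the command strings are hypothetical, not from the patent.

```python
# Illustrative mapping from blink information to control instructions:
# left blink -> tap, right blink -> interface back, both eyes -> home.
# The 0.5 s minimum duration separates deliberate blinks from normal ones.

MIN_BLINK_SECONDS = 0.5  # preset threshold from the text's example

BLINK_COMMANDS = {
    ("left",): "tap_target",           # click instruction for the gazed object
    ("right",): "interface_back",      # interface-back instruction
    ("left", "right"): "return_home",  # both eyes blink -> main interface
}

def blink_to_command(closed_eyes, duration_seconds):
    """Return the control command for a blink, or None for a normal blink."""
    if duration_seconds < MIN_BLINK_SECONDS:
        return None  # too short: treat as an ordinary, unintentional blink
    return BLINK_COMMANDS.get(tuple(sorted(closed_eyes)))

print(blink_to_command(["left"], 0.7))           # tap_target
print(blink_to_command(["right"], 0.2))          # None (normal blink)
print(blink_to_command(["left", "right"], 0.6))  # return_home
```

Reassigning commands, as the text allows, is just editing the dictionary.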
Optionally, before obtaining the user's eye motion information through the true feeling pixels in the front camera module of the terminal in step 101, the method includes: obtaining initial eye information collected by the front camera.
The initial information described here includes: the eye socket contour and eyeball size of the user's eyes, and the first coordinate positions, in the image sensor coordinate system, of the eyeball image when the eyeball gazes at each of at least two preset calibration points on the terminal screen.
After the eye-control mode is entered, the user may first be prompted to calibrate. For example, the eye socket contour and eyeball size of the user's eyes are calibrated so that eye motion information can be recognized accurately during subsequent eye control.
Optionally, the initial eye information further includes at least one of blink information and squint information, so that motion information such as blinks and squints can be recognized accurately during eye control.
In the embodiments of the present invention, preset calibration points can be displayed on the terminal screen and the user prompted to look at them. Because the eyeball is black and the white of the eye is white, when the user looks at different calibration points the displacement of the eyeball causes a brightness change at the eyeball's original position, and the true feeling pixels capture the region where the brightness changes; the position of the eyeball can therefore be confirmed from the positions of the points whose brightness changed. As shown in Fig. 4, the black part is the eyeball; the eyeball image captured by the true feeling pixels is the dashed figure, and the black dot inside the dashed figure marks the eyeball center. The position of the eyeball center is usually taken as the position of the eyeball image in the image sensor coordinate system.
For example, as shown in Fig. 6, the circular patterns are the calibration points displayed on the terminal screen, which may also be called calibration prompt regions. Fig. 6 shows nine calibration points: a first calibration point in the upper-left corner of the screen, a second in the upper-right corner, a third in the lower-left corner, a fourth in the lower-right corner, a fifth midway between the first and second points, a sixth midway between the first and third points, a seventh midway between the second and fourth points, an eighth midway between the third and fourth points, and a ninth at the center of the screen. The user can gaze at each calibration point as prompted to complete the calibration process.
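The nine-point layout described above (four corners, four edge midpoints, and the center) is fully determined by the screen size, so it can be generated rather than hard-coded. The sketch below assumes an example 1080 x 2340 screen with (x, y) pixel coordinates and the origin at the top-left corner; these values are illustrative only.

```python
# Generate the nine calibration points from the screen dimensions alone:
# left/middle/right columns crossed with top/middle/bottom rows.

def calibration_points(width, height):
    xs = (0, width // 2, width - 1)   # left edge, horizontal middle, right edge
    ys = (0, height // 2, height - 1) # top edge, vertical middle, bottom edge
    return [(x, y) for y in ys for x in xs]  # row by row, nine points

points = calibration_points(1080, 2340)
print(len(points))  # 9
print(points[0])    # (0, 0): the first calibration point, top-left corner
print(points[4])    # (540, 1170): the text's ninth point, the screen center
```

Note the generated list is ordered row by row, whereas the text numbers corners first; only the set of positions matters for calibration.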
Fig. 7 shows schematic diagrams of the eye when the eyeball gazes at each calibration point. Reading each row from left to right: the first row shows the eyeball gazing at the first, fifth, and second calibration points; the second row shows the eyeball gazing at the sixth, ninth, and seventh calibration points; the third row shows the eyeball gazing at the third, eighth, and fourth calibration points.
In the embodiments of the present invention, by having the user gaze at the calibration points, a coordinate mapping relationship can be established between the image sensor coordinate system and the screen coordinate system, so that the position of the eyeball image in the image sensor coordinate system can be projected into the screen coordinate system to determine the screen position the eyeball is gazing at. Specifically, the coordinate mapping relationship is established from the first coordinate positions of the eyeball image when the eyeball gazes at each of the at least two preset calibration points, together with the second coordinate positions of the corresponding calibration points in the screen coordinate system.
The true feeling pixels can obtain the eyeball position when the user gazes at a calibration point and record this first coordinate position on the image sensor; combined with the second coordinate position of the preset calibration point in the screen coordinate system, the coordinate mapping relationship between the image sensor coordinate system and the screen coordinate system is established. The eyeball motion information captured in real time by the true feeling pixels can then be mapped into the screen coordinate system in real time, realizing accurate eye control.
Fig. 8 shows the eyeball image formed on the image sensor when the eyeball gazes at a calibration point on the terminal screen. Fig. 9 shows the coordinate of a calibration point in the screen coordinate system (taking the coordinate of the center of the calibration pattern) and, when the eyeball gazes at that calibration point, the corresponding coordinate of the eyeball image on the image sensor.
In the embodiments of the present invention, establishing the coordinate mapping relationship between the image sensor coordinate system and the screen coordinate system from the first and second coordinate positions may include: displaying N preset calibration points and, while the user's eyeball gazes at them, determining the maximum active region of the eyeball image in the image sensor coordinate system when the user's eyeball gazes at the terminal screen; then, from the first coordinate positions when the eyeball gazes at each of the N preset calibration points and the second coordinate positions of the corresponding calibration points in the screen coordinate system, establishing the coordinate mapping relationship between the coordinates of the maximum active region and the screen coordinate system.
Here N is an integer greater than or equal to 2. When there are two preset calibration points, they may be displayed at two diagonal positions of the terminal screen, such as the first and fourth calibration points, or the second and third calibration points, in Fig. 6.
The size of the terminal's screen area can be determined by the preset calibration points displayed on the screen. When the user gazes at these calibration points, the positions of the eyeball images formed on the image sensor can be recorded, giving the maximum active region of the eyeball image on the image sensor that corresponds to the eyeball gazing across the whole screen. A coordinate mapping relationship is then established between the coordinates of this maximum active region on the image sensor and the screen coordinate system (which can also be understood as the screen area coordinates). With this mapping in place, the eyeball motion information captured in real time by the true feeling pixels can be mapped into the screen coordinate system accurately and in real time, realizing accurate eye control.
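The mapping step above can be sketched concisely. This is a minimal illustration assuming the maximum active region and the screen are both axis-aligned rectangles and the mapping between them is linear; the rectangle values and function names are invented for the example, not taken from the patent.

```python
# Once calibration has established the eyeball image's maximum active region
# on the sensor, a sensor coordinate is projected into screen coordinates by
# linear interpolation between the two rectangles.

def make_sensor_to_screen(active_region, screen_size):
    """active_region: (x_min, y_min, x_max, y_max) on the image sensor;
    screen_size: (width, height) in screen pixels."""
    x_min, y_min, x_max, y_max = active_region
    width, height = screen_size

    def project(sx, sy):
        u = (sx - x_min) / (x_max - x_min)  # normalized position in the region
        v = (sy - y_min) / (y_max - y_min)
        return (u * width, v * height)

    return project

# Assumed example: an 80 x 180 sensor region mapped onto a 1080 x 2340 screen.
project = make_sensor_to_screen((200, 150, 280, 330), (1080, 2340))
print(project(200, 150))  # (0.0, 0.0): region corner -> screen corner
print(project(240, 240))  # (540.0, 1170.0): region center -> screen center
```

A real implementation would also need the affine correction discussed later when the head moves, since this fixed mapping is only valid for the calibration-time pose.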
Optionally, when the eye motion information is eyeball movement information, obtaining the user's eye motion information through the true feeling pixels in the front camera module of the terminal in step 101 includes: obtaining, through the true feeling pixels, first movement trajectory coordinate information of the eyeball image in the image sensor coordinate system during the movement; and, according to the coordinate mapping relationship between the image sensor coordinate system and the screen coordinate system, projecting the first movement trajectory coordinate information into second movement trajectory coordinate information in the screen coordinate system. Executing the target control operation associated with the eyeball motion information then includes: executing the target control operation corresponding to the second movement trajectory coordinate information.
Generally, when the eye motion information is eyeball movement information, the position information of the eyeball image captured by the true feeling pixels during the eyeball's movement (that is, the movement trajectory coordinate information) is mapped point by point into the screen coordinate system through the coordinate mapping relationship between the image sensor coordinate system and the screen coordinate system (specifically, the mapping between the eyeball image's maximum active region in the image sensor coordinate system and the screen coordinate system), so as to determine the target control operation corresponding to this eyeball movement.
For example, while the eyeball moves from gazing at the center of the screen to gazing at the upper edge of the screen, the true feeling pixels capture the eyeball's movement trajectory, and at the same time the position information of the eyeball image formed on the image sensor during the movement is mapped into the screen coordinate system in real time according to the coordinate mapping relationship between the two coordinate systems.
Further, projecting the first movement trajectory coordinate information into the second movement trajectory coordinate information in the screen coordinate system, according to the coordinate mapping relationship between the image sensor coordinate system and the screen coordinate system, comprises:
detecting a change in the coordinate position of the eye socket contour center of the user's eyes relative to the screen coordinate system; when such a change is detected, obtaining, through the true feeling pixels, relative coordinate position change information between the eye socket contour center and the screen coordinate system; adjusting the coordinate mapping relationship between the image sensor coordinate system and the screen coordinate system according to the relative coordinate position change information and an affine transformation; and projecting the first movement trajectory coordinate information into the second movement trajectory coordinate information in the screen coordinate system according to the adjusted coordinate mapping relationship.
The relative position change information described here includes at least one of the following: a change in the coordinate position of the eye socket contour center relative to the screen coordinate system in the direction parallel to the screen; a change in the coordinate position of the eye socket contour center relative to the screen coordinate system in the direction perpendicular to the screen; and a change in the spatial angle of the eye socket contour center relative to the screen coordinate system.
The relative position of the user's eyes and the terminal screen is not fixed. Once the relative position changes, mapping the position coordinates of the eyeball image on the image sensor into the screen coordinate system using the previously established coordinate mapping relationship would introduce errors. Therefore, in the embodiments of the present invention, when a position change of the user's eyes (that is, of the eye socket contour center) relative to the terminal screen is detected, the coordinate mapping relationship between the image sensor coordinate system and the screen coordinate system is adjusted according to the relative position change information and an affine transformation, and coordinates are then mapped according to the adjusted relationship, so that accurate eye control is maintained.
The relative position change information between the eye socket contour center of the user's eyes and the terminal screen can be detected by the true feeling pixels. As shown in Fig. 10, the circuit of a true feeling pixel in the embodiments of the present invention includes two photodiodes, PD1 and PD2, representing left and right photosensitive units. From the photosensitive signals output by the two units, the phase difference between the two images can be determined, so the true feeling pixel has a ranging function: it can detect the spatial translation, rotation, and scaling (that is, change in longitudinal distance) of the user's eyes relative to the terminal screen, from which a translation matrix T, a rotation matrix R, and a scaling matrix can be calculated. The coordinate mapping relationship between the image sensor coordinate system and the screen coordinate system can then be adjusted accordingly, and accurate eye control continues.
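The adjustment described above can be illustrated with a small 2D sketch. It assumes the translation, rotation, and scaling reported by the ranging-capable pixels are already known, and simply undoes them so a coordinate can be re-expressed in the calibration-time frame before the established mapping is applied; the function and the sample numbers are hypothetical, not from the patent.

```python
# Undo a detected translation T, rotation R, and scaling of the eye pose
# relative to its pose at calibration time, so the original sensor-to-screen
# mapping remains usable.

import math

def affine_correct(point, translation, angle_rad, scale):
    """Map a current sensor coordinate back into the calibration-time frame."""
    x, y = point
    # Remove the translation first, then the rotation, then the scaling.
    x -= translation[0]
    y -= translation[1]
    c, s = math.cos(-angle_rad), math.sin(-angle_rad)
    x, y = c * x - s * y, s * x + c * y
    return (x / scale, y / scale)

# Example: the eye drifted 10 px right and moved to half the distance
# (image scaled by 2) since calibration, with no rotation.
corrected = affine_correct((110, 50), translation=(10, 0), angle_rad=0.0, scale=2.0)
print(corrected)  # (50.0, 25.0): back in the calibration-time frame
```

In practice T, R, and the scale would be estimated continuously from the phase-detection signals; this sketch only shows how they feed into the mapping adjustment.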
Further, executing the target control operation corresponding to the second movement trajectory coordinate information comprises: determining, from the second movement trajectory coordinate information, whether the user's eyeball is gazing at the screen area; when it is, controlling a position indicator to move to the screen position the eyeball is gazing at; and when it is not, determining that the second movement trajectory coordinate information is squint (strabismus) information and executing the target control operation corresponding to that squint information.
In the embodiments of the present invention, judging whether the user's eyeball is gazing at the screen area distinguishes different kinds of eye movement information, so that the target control operation corresponding to the second movement trajectory coordinate information can be determined more accurately.
When the user's eyeball is gazing at the screen area, the second movement trajectory coordinate information is determined to be an instruction to move the position indicator displayed on the terminal screen. The position indicator indicates the screen area the eyeball is currently gazing at, so the user can see which displayed content the eyeball has currently locked onto. To reduce the indicator's interference with the content the user is watching, it can be made semi-transparent. Its shape can be designed as needed, for example a circle or an arrow.
When the user's eyeball is not gazing at the screen area, the second movement trajectory coordinate information is determined to be squint information, and the target control operation corresponding to that squint information is executed. The squint information described here is motion information of the eyeball moving beyond the screen area, and includes at least: squinting upward, downward, to the left, and to the right.
Determining from the second movement trajectory coordinate information whether the user's eyeball is gazing at the screen area comprises: determining, at the moment the eyeball stops moving in the second movement trajectory, the third coordinate position of the gaze position in the screen coordinate system; when the third coordinate position is within the display range of the screen area, determining that the user's eyeball is gazing at the screen area; and when the third coordinate position is beyond the display range of the screen area, determining that the user's eyeball is not gazing at the screen area.
In the embodiment of the present invention, when stopping mobile according to eyeball, position coordinates of the position in screen coordinate system are watched attentively, really Fixed this eyeball movement is the movement in screen area, the strabismus movement being also above outside screen area, due to being with screen The indication range in region is reference, therefore judging result is more accurate.Wherein, when eyeball stops mobile, eyeball fixes position When position coordinates in screen coordinate system are in the coordinates regional of screen area, determine that user eyeball is look at screen area, That is this eyeball movement is the movement in screen area;When eyeball stops mobile, eyeball fixes position is in screen coordinate system In position coordinates when being not in the coordinates regional of screen area, determine that user eyeball does not watch screen area attentively, i.e. this eyeball Movement is strabismus movement.
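The decision described above can be sketched in a few lines. This is only an illustration of the logic, not part of the patent: the function name, coordinate convention, and screen resolution are assumptions.

```python
# Assumed screen resolution in pixels; the patent only requires that the
# display range of the screen area be known in the screen coordinate system.
SCREEN_W, SCREEN_H = 1080, 2340

def classify_gaze(third_coord):
    """Classify the position where the eyeball stopped moving (already
    projected into the screen coordinate system): 'gaze' if it lies inside
    the display range of the screen area, otherwise 'squint' (movement
    beyond the screen area)."""
    x, y = third_coord
    if 0 <= x < SCREEN_W and 0 <= y < SCREEN_H:
        return "gaze"
    return "squint"
```

A "gaze" result would move the position indicator to the stop position; a "squint" result would trigger the preset squint operation.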
To better distinguish squint information from eyeball movement information within the screen area, the following may be set: relative to the eyeball gazing at the center of the screen, the displacement of the eyeball offset during a squint is greater than or equal to 1.5 times the displacement when gazing at the calibration point in the same direction.
As shown in Figure 11, the figure corresponding to A is the eyeball image formed on the image sensor when the eyeball gazes at the ninth calibration point (i.e., the center of the screen), and the figure corresponding to A' represents the position of the screen area gazed at when the eyeball gazes at the ninth calibration point. The figure corresponding to B is the eyeball image formed on the image sensor when the eyeball gazes at the sixth calibration point (i.e., the left edge of the screen), and the figure corresponding to B' represents the position of the screen area gazed at when the eyeball gazes at the sixth calibration point. The figure corresponding to C is the eyeball image formed on the image sensor during a squint, and the figure corresponding to C' represents the gaze position relative to the terminal screen during a leftward squint. Here, the sixth calibration point is the calibration point in the same direction as the squint direction.
The displacement of the eyeball moving from gazing at the ninth calibration point to gazing at the sixth calibration point is x1, and the displacement when the eyeball switches from gazing at the ninth calibration point to a leftward squint is x2, where x2 should be greater than or equal to 1.5·x1. It should of course be understood that the factor of 1.5 described here can be adjusted according to actual needs, but it should at least be guaranteed that x2 is greater than x1. Figure 12 shows comparison schematic diagrams of the eyeball gazing at the sixth calibration point versus a leftward squint, and of the eyeball gazing at the seventh calibration point versus a rightward squint.
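The x2 ≥ 1.5·x1 rule can be expressed as a simple threshold test. A minimal sketch under the assumption that both displacements are measured in the same units on the image sensor; the function name and the configurable factor are illustrative only:

```python
def is_squint(displacement, calib_displacement, factor=1.5):
    """True when the eyeball's offset from the screen-centre gaze position
    (x2) reaches `factor` times the displacement measured when gazing at the
    calibration point in the same direction (x1). The patent notes the
    factor is adjustable but must keep x2 > x1, i.e. factor > 1."""
    return displacement >= factor * calib_displacement
```

With x1 = 3.0, a measured offset of 5.0 is classified as a squint, while 4.0 (below 1.5·x1 = 4.5) is treated as in-screen movement.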
In the embodiment of the present invention, the squint information may be preset as a page-turning instruction; in this case, a page-turning instruction corresponding to the squint direction indicated by the squint information is executed. The squint information may also be preset as a screen-sliding instruction; in this case, a screen-sliding operation corresponding to the squint direction indicated by the squint information is executed. With such a design, the user no longer needs to operate manually, which facilitates the user's use of the terminal.
For example, when a user reading a novel needs to turn the page or slide up and down, the page turning or vertical sliding can be controlled by the squint action. For instance, when the terminal detects that the eyeball looks below the screen and moves beyond the screen area, the page is turned to the next section automatically, realizing an automatic page-turning function and facilitating the user's operation.
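The preset mapping from squint direction to instruction can be sketched as a lookup table. The command names and the "page"/"slide" mode switch are assumptions for illustration; the patent only specifies that each squint direction corresponds to a preset page-turning or screen-sliding instruction:

```python
def squint_command(direction, mode="page"):
    """Map a detected squint direction ('up', 'down', 'left', 'right') to a
    preset instruction; returns None for an unrecognized direction."""
    page = {"up": "page_up", "down": "page_down",
            "left": "page_back", "right": "page_forward"}
    slide = {"up": "slide_up", "down": "slide_down",
             "left": "slide_left", "right": "slide_right"}
    table = page if mode == "page" else slide
    return table.get(direction)
```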
Further, the terminal provided in the embodiment of the present invention can also be applied to photographing. For example, when focusing during photographing, the focus can be set to the image position gazed at by the eyeball, thereby realizing auto-focusing, and the focus point can switch at any time with the eyeball gaze position. In addition, the user can trigger the shutter by blinking, so that the user can hold the photographing device with both hands when taking pictures, reducing jitter and improving the yield of usable shots.
In conclusion, in the embodiment of the present invention, the movement of the user's eyes is captured clearly and definitely by the true feeling pixels, the motion information of the user's eyes can be obtained in real time, and the real-time performance of eye motion information detection is improved, thereby realizing more accurate eye control.
According to another aspect of the embodiments of the present invention, a terminal is provided, which can implement the details of the above operation control method and achieve the same technical effect.
As shown in Figure 13, the terminal includes:
A first obtaining module 1301, configured to obtain the eye motion information of a user through the true feeling pixels in the front camera module of the terminal.
The true feeling pixels are used to detect and output the contour information of a moving object.
An execution module 1302, configured to execute the target control operation associated with the eye motion information obtained by the first obtaining module 1301.
Optionally, the true feeling pixels are disposed on the image sensor in the front camera module, and the number of true feeling pixels is at least two. The eye motion information includes at least one of: movement information of the eyeball, and blink information.
Optionally, the terminal further includes:
A second obtaining module, configured to obtain the initial eye information collected by the front camera.
The initial information includes: the eye socket contour and eyeball size of the user's eyes, and the first coordinate positions of the eyeball image in the image sensor coordinate system when the eyeball gazes at each of at least two preset calibration points on the terminal screen.
Optionally, the initial eye information further includes at least one of: blink information, and squint information.
Optionally, the blink information includes at least one of: left-eye blink information, right-eye blink information, and simultaneous left-eye and right-eye blink information.
The blink information is information obtained when the duration of the user's blink action is greater than or equal to a preset threshold.
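The duration condition on blink information can be sketched as follows. The threshold value, timestamps, and return shape are illustrative assumptions; the patent only requires that a blink be reported when the blink action lasts at least a preset threshold:

```python
BLINK_THRESHOLD_S = 0.4  # assumed preset threshold, in seconds

def detect_blink(close_t, open_t, threshold=BLINK_THRESHOLD_S):
    """Report blink information only when the eyes-closed duration reaches
    the preset threshold; shorter closures (natural blinks) are ignored."""
    duration = open_t - close_t
    if duration >= threshold:
        return {"blink": True, "duration": duration}
    return None
```

Filtering by duration lets deliberate blinks (e.g. the shutter trigger described above) be distinguished from involuntary ones.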
Optionally, the terminal further includes:
A mapping relationship establishing module, configured to establish the coordinate mapping relationship between the image sensor coordinate system and the screen coordinate system according to the first coordinate positions when the eyeball gazes at each of the at least two preset calibration points and the second coordinate positions of the corresponding preset calibration points in the screen coordinate system.
Optionally, the mapping relationship establishing module includes:
A display submodule, configured to display N preset calibration points.
A determining submodule, configured to determine, in the process of the user's eyeball gazing at the N preset calibration points displayed by the display submodule, the maximum active area of the eyeball image on the image sensor coordinate system when the user's eyeball gazes at the terminal screen.
A mapping relationship establishing submodule, configured to establish the coordinate mapping relationship between the coordinates of the maximum active area determined by the determining submodule and the screen coordinate system according to the first coordinate positions when the eyeball gazes at each of the N preset calibration points and the second coordinate positions of the corresponding preset calibration points in the screen coordinate system.
N is an integer greater than or equal to 2.
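One minimal way to build such a mapping from calibration data is a per-axis linear fit from two calibration gazes: sensor coordinate → screen coordinate on each axis independently. This is a simplified sketch of the idea (the patent's mapping may be richer, e.g. a full affine fit over more points); all names are assumptions:

```python
def fit_axis_map(p_sensor, p_screen, q_sensor, q_screen):
    """Fit screen = a*sensor + b per axis from two calibration
    correspondences: (first coordinate position on the sensor, second
    coordinate position of the same calibration point on the screen)."""
    maps = []
    for i in (0, 1):
        a = (q_screen[i] - p_screen[i]) / (q_sensor[i] - p_sensor[i])
        b = p_screen[i] - a * p_sensor[i]
        maps.append((a, b))
    return maps

def to_screen(maps, pt):
    """Project an eyeball-image position into the screen coordinate system."""
    return tuple(a * c + b for (a, b), c in zip(maps, pt))
```

Once fitted, `to_screen` is what projects the first movement trajectory coordinates into the second (screen-system) trajectory coordinates.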
Optionally, the eye motion information includes the movement information of the eyeball.
The first obtaining module 1301 includes:
An obtaining submodule, configured to obtain, through the true feeling pixels, the first movement trajectory coordinate information of the eyeball image in the image sensor coordinate system during movement.
A projecting submodule, configured to project, according to the coordinate mapping relationship between the image sensor coordinate system and the screen coordinate system, the first movement trajectory coordinate information obtained by the obtaining submodule into the second movement trajectory coordinate information in the screen coordinate system.
The execution module 1302 includes:
An execution submodule, configured to execute the target control operation corresponding to the second movement trajectory coordinate information.
Optionally, the projecting submodule includes:
A detection unit, configured to detect a coordinate position change of the eye socket contour center of the user's eyes relative to the screen coordinate system.
An obtaining unit, configured to obtain, through the true feeling pixels, the relative coordinate position change information between the eye socket contour center of the user's eyes and the screen coordinate system when the detection unit detects that a coordinate position change of the eye socket contour center relative to the screen coordinate system occurs.
A mapping relationship adjusting unit, configured to adjust the coordinate mapping relationship between the image sensor coordinate system and the screen coordinate system according to the relative coordinate position change information obtained by the obtaining unit and an affine transformation.
A projecting unit, configured to project the first movement trajectory coordinate information into the second movement trajectory coordinate information in the screen coordinate system according to the coordinate mapping relationship between the image sensor coordinate system and the screen coordinate system as adjusted by the mapping relationship adjusting unit.
Optionally, the relative coordinate position change information includes at least one of the following: coordinate position change information of the eye socket contour center of the user's eyes relative to the screen coordinate system in the direction parallel to the screen; coordinate position change information of the eye socket contour center of the user's eyes relative to the screen coordinate system in the direction perpendicular to the screen; and spatial angle change information of the eye socket contour center of the user's eyes relative to the screen coordinate system.
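The affine adjustment of the mapping for head movement can be sketched as composing the calibrated sensor-to-screen map with a correction built from the three change components just listed: a translation for parallel-screen movement, a uniform scale for perpendicular (depth) movement, and a rotation for the spatial-angle change. This is an illustrative 2-D model under those assumptions, not the patent's exact transform:

```python
import math

def compose(A, B):
    """Compose two 2x3 affine maps so that result(x) = A(B(x))."""
    (a11, a12, a13), (a21, a22, a23) = A
    (b11, b12, b13), (b21, b22, b23) = B
    return (
        (a11*b11 + a12*b21, a11*b12 + a12*b22, a11*b13 + a12*b23 + a13),
        (a21*b11 + a22*b21, a21*b12 + a22*b22, a21*b13 + a22*b23 + a23),
    )

def adjusted_map(base, dx, dy, depth_scale, theta):
    """Correct the calibrated sensor->screen map `base` for a relative move
    of the eye socket contour centre: (dx, dy) parallel to the screen,
    depth_scale for perpendicular movement, theta for the angle change."""
    c, s = math.cos(theta), math.sin(theta)
    correction = ((depth_scale * c, -depth_scale * s, dx),
                  (depth_scale * s,  depth_scale * c, dy))
    return compose(correction, base)

def apply_map(M, pt):
    (m11, m12, m13), (m21, m22, m23) = M
    x, y = pt
    return (m11*x + m12*y + m13, m21*x + m22*y + m23)
```

The adjusted map then replaces the calibrated one when projecting the first movement trajectory into the screen coordinate system, so the user need not recalibrate after shifting position.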
Optionally, the execution submodule includes:
A determining unit, configured to determine, according to the second movement trajectory coordinate information, whether the user's eyeball gazes at the screen area.
A first execution unit, configured to control the position indicator to move to the screen position gazed at by the eyeball when the determining unit determines that the user's eyeball gazes at the screen area.
The position indicator is used to indicate the screen position gazed at by the eyeball.
A second execution unit, configured to determine that the second movement trajectory coordinate information is squint information and execute the target control operation corresponding to the squint information when the determining unit determines that the user's eyeball does not gaze at the screen area.
Optionally, the determining unit includes:
A first determining subunit, configured to determine, at the moment the eyeball stops moving in the second movement trajectory coordinate information, the third coordinate position of the eyeball gaze position in the screen coordinate system;
A second determining subunit, configured to determine that the user's eyeball gazes at the screen area when the third coordinate position determined by the first determining subunit is within the display range of the screen area;
A third determining subunit, configured to determine that the user's eyeball does not gaze at the screen area when the third coordinate position determined by the first determining subunit is beyond the display range of the screen area.
In the embodiment of the present invention, the movement of the user's eyes is captured clearly and definitely by the true feeling pixels, the motion information of the user's eyes can be obtained in real time, and the real-time performance of eye motion information detection is improved, thereby realizing more accurate eye control.
Figure 14 is a schematic diagram of the hardware structure of a terminal implementing each embodiment of the present invention.
The terminal 1400 includes, but is not limited to: a radio frequency unit 1401, a network module 1402, an audio output unit 1403, an input unit 1404, a sensor 1405, a display unit 1406, a user input unit 1407, an interface unit 1408, a memory 1409, a processor 1410, a power supply 1411, and other components. Those skilled in the art will understand that the terminal structure shown in Figure 14 does not constitute a limitation on the terminal; the terminal may include more or fewer components than illustrated, combine certain components, or have a different component arrangement. In the embodiments of the present invention, the terminal includes, but is not limited to, a mobile phone, a tablet computer, a laptop computer, a palmtop computer, an in-vehicle terminal, a wearable device, a pedometer, and the like.
The processor 1410 is configured to obtain the eye motion information of a user through the true feeling pixels in the front camera module of the terminal, and to execute the target control operation associated with the eye motion information.
The true feeling pixels are used to detect and output the contour information of a moving object.
In the embodiment of the present invention, the movement of the user's eyes is captured clearly and definitely by the true feeling pixels, the motion information of the user's eyes can be obtained in real time, and the real-time performance of eye motion information detection is improved, thereby realizing more accurate eye control.
It should be understood that, in the embodiments of the present invention, the radio frequency unit 1401 can be used for receiving and sending signals during information transmission and reception or during a call. Specifically, after receiving downlink data from a base station, it delivers the data to the processor 1410 for processing; in addition, it sends uplink data to the base station. In general, the radio frequency unit 1401 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 1401 can also communicate with a network and other devices through a wireless communication system.
The terminal provides wireless broadband Internet access for the user through the network module 1402, for example, helping the user to send and receive e-mail, browse web pages, and access streaming media.
The audio output unit 1403 can convert audio data received by the radio frequency unit 1401 or the network module 1402, or stored in the memory 1409, into an audio signal and output it as sound. Moreover, the audio output unit 1403 can also provide audio output related to a specific function executed by the terminal 1400 (for example, a call signal reception sound, a message reception sound, etc.). The audio output unit 1403 includes a speaker, a buzzer, a receiver, and the like.
The input unit 1404 is used to receive an audio or video signal. The input unit 1404 may include a graphics processing unit (Graphics Processing Unit, GPU) 14041 and a microphone 14042. The graphics processing unit 14041 processes still picture or video image data obtained by an image capture apparatus (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 1406. The image frames processed by the graphics processing unit 14041 may be stored in the memory 1409 (or another storage medium) or sent via the radio frequency unit 1401 or the network module 1402. The microphone 14042 can receive sound and process it into audio data. In a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 1401 and output.
The terminal 1400 further includes at least one sensor 1405, such as an optical sensor, a motion sensor, and other sensors. Specifically, the optical sensor includes an ambient light sensor and a proximity sensor. The ambient light sensor can adjust the brightness of the display panel 14061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 14061 and/or the backlight when the terminal 1400 is moved to the ear. As a kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), and can detect the magnitude and direction of gravity when stationary; it can be used to identify the terminal posture (such as portrait/landscape switching, related games, magnetometer pose calibration) and for vibration-recognition-related functions (such as a pedometer, tapping), and the like. The sensor 1405 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described in detail here.
The display unit 1406 is used to display information input by the user or information provided to the user. The display unit 1406 may include a display panel 14061, which may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED), or the like.
The user input unit 1407 can be used to receive input number or character information and to generate key signal input related to user settings and function control of the terminal. Specifically, the user input unit 1407 includes a touch panel 14071 and other input devices 14072. The touch panel 14071, also referred to as a touch screen, collects the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 14071 with a finger, a stylus, or any other suitable object or accessory). The touch panel 14071 may include two parts: a touch detection apparatus and a touch controller. The touch detection apparatus detects the user's touch orientation, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection apparatus, converts it into contact coordinates, sends them to the processor 1410, and receives and executes commands sent by the processor 1410. In addition, the touch panel 14071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 14071, the user input unit 1407 may also include other input devices 14072. Specifically, the other input devices 14072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and a switch key), a trackball, a mouse, and a joystick, which are not described in detail here.
Further, the touch panel 14071 can be overlaid on the display panel 14061. After the touch panel 14071 detects a touch operation on or near it, it transmits the operation to the processor 1410 to determine the type of the touch event, and the processor 1410 then provides a corresponding visual output on the display panel 14061 according to the type of the touch event. Although in Figure 14 the touch panel 14071 and the display panel 14061 are two independent components implementing the input and output functions of the terminal, in some embodiments the touch panel 14071 and the display panel 14061 can be integrated to implement the input and output functions of the terminal, which is not specifically limited here.
The interface unit 1408 is an interface for connecting an external device to the terminal 1400. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 1408 can be used to receive input (for example, data information, electric power, etc.) from an external device and transfer the received input to one or more elements in the terminal 1400, or to transmit data between the terminal 1400 and an external device.
The memory 1409 can be used to store software programs and various data. The memory 1409 may mainly include a program storage area and a data storage area, where the program storage area can store an operating system and application programs required by at least one function (such as a sound playing function and an image playing function), and the data storage area can store data created according to the use of the mobile phone (such as audio data and a phone book). In addition, the memory 1409 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another solid-state storage device.
The processor 1410 is the control center of the terminal. It uses various interfaces and lines to connect all parts of the entire terminal, and executes the various functions of the terminal and processes data by running or executing the software programs and/or modules stored in the memory 1409 and calling the data stored in the memory 1409, thereby performing overall monitoring of the terminal. The processor 1410 may include one or more processing units; preferably, the processor 1410 can integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 1410.
The terminal 1400 may also include a power supply 1411 (such as a battery) for supplying power to all components. Preferably, the power supply 1411 can be logically connected to the processor 1410 through a power management system, so as to implement functions such as charging management, discharging management, and power consumption management through the power management system.
In addition, the terminal 1400 includes some functional modules not shown, which are not described in detail here.
Preferably, the embodiment of the present invention also provides a terminal, including a processor 1410, a memory 1409, and a computer program stored in the memory 1409 and executable on the processor 1410. When the computer program is executed by the processor 1410, each process of the above operation control method embodiment is implemented, and the same technical effect can be achieved. To avoid repetition, details are not described here again.
The embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, each process of the above operation control method embodiment is implemented, and the same technical effect can be achieved. To avoid repetition, details are not described here again. The computer-readable storage medium is, for example, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, an optical disk, or the like.
It should be noted that, in this document, the terms "include", "comprise", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that includes a series of elements not only includes those elements but also includes other elements that are not explicitly listed, or also includes elements inherent to such a process, method, article, or apparatus. In the absence of further restrictions, an element limited by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or apparatus that includes that element.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by means of software plus a necessary general hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes several instructions to cause a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in each embodiment of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the above specific embodiments, which are merely illustrative rather than restrictive. Under the inspiration of the present invention, those skilled in the art can also make many other forms without departing from the scope protected by the purpose of the present invention and the claims, all of which fall within the protection of the present invention.

Claims (15)

1. An operation control method, applied to a terminal, comprising:
obtaining eye motion information of a user through true feeling pixels in a front camera module of the terminal, wherein the true feeling pixels are used to detect and output contour information of a moving object; and
executing a target control operation associated with the eye motion information.
2. The method according to claim 1, wherein the true feeling pixels are disposed on an image sensor in the front camera module, and the number of true feeling pixels is at least two; and the eye motion information comprises at least one of: movement information of an eyeball, and blink information.
3. The method according to claim 1, wherein before the obtaining eye motion information of the user through the true feeling pixels in the front camera module of the terminal, the method further comprises:
obtaining initial eye information collected by the front camera;
wherein the initial information comprises: an eye socket contour and an eyeball size of the user's eyes, and first coordinate positions of an eyeball image in an image sensor coordinate system when the eyeball gazes at each of at least two preset calibration points on a terminal screen.
4. The method according to claim 3, wherein the initial eye information further comprises at least one of: blink information, and squint information.
5. The method according to claim 2 or 4, wherein the blink information comprises at least one of: left-eye blink information, right-eye blink information, and simultaneous left-eye and right-eye blink information;
wherein the blink information is information obtained when the duration of the user's blink action is greater than or equal to a preset threshold.
6. The method according to claim 3, wherein after the obtaining initial eye information collected by the front camera, the method further comprises:
establishing a coordinate mapping relationship between the image sensor coordinate system and a screen coordinate system according to the first coordinate positions when the eyeball gazes at each of the at least two preset calibration points and second coordinate positions of the corresponding preset calibration points in the screen coordinate system.
7. The method according to claim 6, wherein the establishing a coordinate mapping relationship between the image sensor coordinate system and the screen coordinate system according to the first coordinate positions when the eyeball gazes at each of the at least two preset calibration points and the second coordinate positions of the corresponding preset calibration points in the screen coordinate system comprises:
displaying N preset calibration points;
determining, in the process of the user's eyeball gazing at the N preset calibration points, a maximum active area of the eyeball image on the image sensor coordinate system when the user's eyeball gazes at the terminal screen; and
establishing the coordinate mapping relationship between coordinates of the maximum active area and the screen coordinate system according to the first coordinate positions when the eyeball gazes at each of the N preset calibration points and the second coordinate positions of the corresponding preset calibration points in the screen coordinate system;
wherein N is an integer greater than or equal to 2.
8. The method according to claim 7, wherein the eye motion information comprises movement information of the eyeball;
the obtaining eye motion information of the user through the true feeling pixels in the front camera module of the terminal comprises:
obtaining, through the true feeling pixels, first movement trajectory coordinate information of the eyeball image in the image sensor coordinate system during movement; and
projecting, according to the coordinate mapping relationship between the image sensor coordinate system and the screen coordinate system, the first movement trajectory coordinate information into second movement trajectory coordinate information in the screen coordinate system; and
the executing a target control operation associated with the eye motion information comprises:
executing a target control operation corresponding to the second movement trajectory coordinate information.
9. The method according to claim 8, wherein the projecting, according to the coordinate mapping relationship between the image sensor coordinate system and the screen coordinate system, the first movement trajectory coordinate information into the second movement trajectory coordinate information in the screen coordinate system comprises:
detecting a coordinate position change of an eye socket contour center of the user's eyes relative to the screen coordinate system;
when it is detected that a coordinate position change of the eye socket contour center of the user's eyes relative to the screen coordinate system occurs, obtaining, through the true feeling pixels, relative coordinate position change information between the eye socket contour center of the user's eyes and the screen coordinate system;
adjusting the coordinate mapping relationship between the image sensor coordinate system and the screen coordinate system according to the relative coordinate position change information and an affine transformation; and
projecting the first movement trajectory coordinate information into the second movement trajectory coordinate information in the screen coordinate system according to the adjusted coordinate mapping relationship between the image sensor coordinate system and the screen coordinate system.
10. The method according to claim 9, wherein the relative coordinate position change information comprises at least one of the following: coordinate position change information of the eye socket contour center of the user's eyes relative to the screen coordinate system in a direction parallel to the screen; coordinate position change information of the eye socket contour center of the user's eyes relative to the screen coordinate system in a direction perpendicular to the screen; and spatial angle change information of the eye socket contour center of the user's eyes relative to the screen coordinate system.
11. The method according to claim 8, wherein performing the target control operation corresponding to the second movement trajectory coordinate information comprises:
determining, according to the second movement trajectory coordinate information, whether the user's eyeball is gazing at the screen region;
when the user's eyeball is gazing at the screen region, controlling a position indicator to move to the screen position gazed at by the eyeball, wherein the position indicator is used to indicate the screen position gazed at by the eyeball; and
when the user's eyeball is not gazing at the screen region, determining that the second movement trajectory coordinate information is squint information, and performing a target control operation corresponding to the squint information.
12. The method according to claim 11, wherein determining, according to the second movement trajectory coordinate information, whether the user's eyeball is gazing at the screen region comprises:
determining a third coordinate position, in the screen coordinate system, of the eyeball gaze position at the moment when the eyeball stops moving in the second movement trajectory coordinate information;
when the third coordinate position is within the display range of the screen region, determining that the user's eyeball is gazing at the screen region; and
when the third coordinate position is beyond the display range of the screen region, determining that the user's eyeball is not gazing at the screen region.
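A minimal sketch of the checks in claims 11 and 12 (the function names, the screen-size parameters, and the returned operation labels are invented for illustration; the claims do not prescribe an implementation): the fixation point taken at the moment the eyeball stops moving is compared against the display range of the screen region, and the result selects between moving the position indicator and handling squint information:

```python
def is_gazing_at_screen(third_coord, screen_w, screen_h):
    """Return True if the fixation position (screen coordinates) falls
    inside the display range [0, screen_w) x [0, screen_h)."""
    x, y = third_coord
    return 0 <= x < screen_w and 0 <= y < screen_h

def handle_fixation(third_coord, screen_w, screen_h):
    """Dispatch per claims 11-12: move the position indicator to the
    gazed position, otherwise treat the trajectory as squint information."""
    if is_gazing_at_screen(third_coord, screen_w, screen_h):
        return ("move_indicator", third_coord)
    return ("squint_operation", None)
```

With an assumed 1080x2340 screen, a fixation at (100, 200) would move the indicator, while one at (1200, 10) would fall outside the display range and be treated as squint information.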
13. A terminal, comprising:
a first obtaining module, configured to obtain eye movement information of a user through a true feeling pixel in a front camera module of the terminal, wherein the true feeling pixel is used to detect and output contour information of a moving object; and
an execution module, configured to perform a target control operation associated with the eye movement information obtained by the first obtaining module.
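The two-module terminal of claim 13 could be wired together as below (a sketch under stated assumptions: the class names, the callable standing in for the true feeling pixel, and the movement-to-operation mapping are all invented for illustration):

```python
class FirstObtainingModule:
    """Obtains eye movement information via the true feeling pixel, which
    detects and outputs contour information of moving objects."""
    def __init__(self, true_feeling_pixel):
        self.pixel = true_feeling_pixel  # callable yielding a movement sample
    def obtain(self):
        return self.pixel()

class ExecutionModule:
    """Performs the target control operation associated with the eye
    movement information obtained by the first obtaining module."""
    def __init__(self, operations):
        self.operations = operations  # mapping: movement -> operation label
    def execute(self, movement):
        return self.operations.get(movement, "no_op")

# Stand-in wiring: a fixed "blink" sample mapped to a "screenshot" operation.
obtain = FirstObtainingModule(lambda: "blink")
execute = ExecutionModule({"blink": "screenshot"})
result = execute.execute(obtain.obtain())
```

In a real terminal the first module would wrap the front camera hardware and the mapping would cover the gaze and squint operations of claims 11 and 12; this sketch only shows the module boundary.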
14. A terminal, comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the operation control method according to any one of claims 1 to 12.
15. A computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and the computer program, when executed by a processor, implements the steps of the operation control method according to any one of claims 1 to 12.
CN201910467860.9A 2019-05-31 2019-05-31 Operation control method and terminal Pending CN110196640A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910467860.9A CN110196640A (en) 2019-05-31 2019-05-31 Operation control method and terminal

Publications (1)

Publication Number Publication Date
CN110196640A (en) 2019-09-03

Family

ID=67753470

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910467860.9A Pending CN110196640A (en) 2019-05-31 2019-05-31 Operation control method and terminal

Country Status (1)

Country Link
CN (1) CN110196640A (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050273185A1 (en) * 2004-06-02 2005-12-08 Winfried Teiwes Method and apparatus for eye tracking latency reduction
CN101909219A (en) * 2010-07-09 2010-12-08 深圳超多维光电子有限公司 Stereoscopic display method, tracking type stereoscopic display and image processing device
CN103310186A (en) * 2012-02-29 2013-09-18 三星电子株式会社 Method for correcting user's gaze direction in image, machine-readable storage medium and communication terminal
CN107003744A (en) * 2016-12-01 2017-08-01 深圳前海达闼云端智能科技有限公司 Viewpoint determines method, device, electronic equipment and computer program product
CN107256375A (en) * 2017-01-11 2017-10-17 西南科技大学 Human body sitting posture monitoring method before a kind of computer
US9836895B1 (en) * 2015-06-19 2017-12-05 Waymo Llc Simulating virtual objects
CN107884947A (en) * 2017-11-21 2018-04-06 中国人民解放军海军总医院 Auto-stereoscopic mixed reality operation simulation system
CN108491072A (en) * 2018-03-05 2018-09-04 京东方科技集团股份有限公司 A kind of virtual reality exchange method and device
WO2019067731A1 (en) * 2017-09-28 2019-04-04 Zermatt Technologies Llc Method and device for eye tracking using event camera data


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111182230A (en) * 2019-12-31 2020-05-19 维沃移动通信有限公司 Image processing method and device
CN111722708A (en) * 2020-04-29 2020-09-29 中国人民解放军战略支援部队信息工程大学 Eye movement-based multi-dimensional geographic information self-adaptive intelligent interaction method and device
CN111722708B (en) * 2020-04-29 2021-06-08 中国人民解放军战略支援部队信息工程大学 Eye movement-based multi-dimensional geographic information self-adaptive intelligent interaction method and device
CN111831119A (en) * 2020-07-10 2020-10-27 Oppo广东移动通信有限公司 Eyeball tracking method and device, storage medium and head-mounted display equipment
CN112288855A (en) * 2020-10-29 2021-01-29 张也弛 Method and device for establishing eye gaze model of operator
CN112672054A (en) * 2020-12-25 2021-04-16 维沃移动通信有限公司 Focusing method and device and electronic equipment
CN113223048A (en) * 2021-04-20 2021-08-06 深圳瀚维智能医疗科技有限公司 Hand-eye calibration precision determination method and device, terminal equipment and storage medium
CN113223048B (en) * 2021-04-20 2024-02-27 深圳瀚维智能医疗科技有限公司 Method and device for determining hand-eye calibration precision, terminal equipment and storage medium
CN114690909A (en) * 2022-06-01 2022-07-01 润芯微科技(江苏)有限公司 AI visual self-adaption method, device, system and computer readable medium

Similar Documents

Publication Publication Date Title
CN110196640A (en) Operation control method and terminal
CN107580184B (en) Image capturing method and mobile terminal
WO2019109729A1 (en) Bone posture determining method and device, and computer readable storage medium
CN110647865A (en) Face gesture recognition method, device, equipment and storage medium
CN109660719A (en) Information prompting method and mobile terminal
CN110177221A (en) Method and device for capturing high-dynamic-range images
CN110136142A (en) Image cropping method and device, and electronic equipment
CN108712603B (en) Image processing method and mobile terminal
TWI701941B (en) Method, apparatus and electronic device for image processing and storage medium thereof
CN108182896B (en) Brightness detection method and device, and mobile terminal
US9412190B2 (en) Image display system, image display apparatus, image display method, and non-transitory storage medium encoded with computer readable program
CN110199251A (en) Display device and remote operation control device
CN110198412A (en) Video recording method and electronic equipment
CN106484085A (en) Method for displaying a real-world object in a head-mounted display, and head-mounted display
CN107948498B (en) Method for eliminating camera moiré patterns, and mobile terminal
CN107864336B (en) Image processing method and mobile terminal
CN109218626A (en) Photographing method and terminal
CN110136190A (en) Distance measurement method and electronic equipment
CN109474786A (en) Preview image generation method and terminal
CN108289151A (en) Application program operation method and mobile terminal
CN109120800A (en) Application icon adjustment method and mobile terminal
CN107092359A (en) Virtual reality viewing-angle repositioning method, device and terminal
CN109379539A (en) Screen light compensation method and terminal
CN109241832A (en) Face liveness detection method and terminal device
CN109525837A (en) Image generation method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190903