CN107003744A - Viewpoint determination method, device, electronic equipment and computer program product - Google Patents

Viewpoint determination method, device, electronic equipment and computer program product

Info

Publication number
CN107003744A
CN107003744A (application CN201680002751.4A; granted as CN107003744B)
Authority
CN
China
Prior art keywords
eyeball
user
viewpoint
infrared camera
user eyeball
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201680002751.4A
Other languages
Chinese (zh)
Other versions
CN107003744B (en)
Inventor
骆磊 (Luo Lei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cloudminds Shenzhen Robotics Systems Co Ltd
Cloudminds Inc
Original Assignee
Cloudminds Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cloudminds Inc filed Critical Cloudminds Inc
Publication of CN107003744A publication Critical patent/CN107003744A/en
Application granted granted Critical
Publication of CN107003744B publication Critical patent/CN107003744B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Position Input By Displaying (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a viewpoint determination method, device, electronic equipment and computer program product. The method includes: obtaining the imaging position of the user's eyeball on the image sensor of an infrared camera and the rotation angle of the user's eyeball; determining the distance between the user's eyeball and the lens of the infrared camera; determining the position of the user's eyeball according to the imaging position and the distance; and determining the viewpoint position of the eyeball according to the position of the user's eyeball and the rotation angle. With the technical solution provided by the invention, the user does not need to cooperate with the system equipment to perform repeated calibration during viewpoint determination; the user's viewpoint position can be determined directly from what the infrared camera captures, which reduces the user's passive participation, shortens the time the system spends determining the user's viewpoint, makes the viewpoint determination process simpler and more efficient, and improves the user experience.

Description

Viewpoint determination method, device, electronic equipment and computer program product
Technical field
The present invention relates to the technical field of eye tracking, and in particular to a viewpoint determination method, device, electronic equipment and computer program product.
Background technology
Eye-movement control is currently a relatively cutting-edge technology: the user no longer needs a mouse, arrow keys, a touchpad or similar equipment to control the cursor. Instead, the eyeball is tracked directly, and the cursor goes wherever the eyeball looks, so that cursor movement is controlled by gaze and operating efficiency is greatly improved.
Eye-movement control relies on determining the viewpoint of the user's eyeball. Existing viewpoint determination technology requires a tedious calibration: the user must sit in a relatively fixed position, the height of the line of sight must fall within a specific range, and the eyeball must then fixate a series of calibration points on the screen in turn before the viewpoint position of the user's eyeball can be determined. If the relative position between the user and the system equipment changes after calibration is completed, recalibration is needed before the viewpoint position can be determined again. Therefore, to avoid repeated tedious calibration during viewpoint determination, the spatial position of the user's eyeball needs to be kept as constant as possible, which degrades the user experience.
The content of the invention
The embodiments of the present invention propose a viewpoint determination method, device, electronic equipment and computer program product, which are mainly intended to improve the convenience of the viewpoint determination process and thereby reduce the user's passive participation in it.
In one aspect, an embodiment of the present invention provides a viewpoint determination method, comprising the following steps:
obtaining the imaging position of the user's eyeball on the image sensor of an infrared camera and the rotation angle of the user's eyeball;
determining the distance between the user's eyeball and the lens of the infrared camera;
determining the position of the user's eyeball according to the imaging position and the distance;
determining the viewpoint position of the eyeball according to the position of the user's eyeball and the rotation angle.
In another aspect, an embodiment of the present invention provides a viewpoint determination device, including:
an acquisition module, for obtaining the imaging position of the user's eyeball on the image sensor of an infrared camera and the rotation angle of the user's eyeball;
a viewpoint position determination module, for determining the distance between the user's eyeball and the lens of the infrared camera, determining the reference position of the user's eyeball according to the imaging position and the distance, and determining the viewpoint position of the eyeball according to the reference position and the rotation angle.
In another aspect, an embodiment of the present invention provides an electronic device, which includes: an infrared camera, a memory, and one or more processors; and one or more modules, the one or more modules being stored in the memory and configured to be executed by the one or more processors, the one or more modules including instructions for performing each step of any of the above methods. In another aspect, an embodiment of the present invention provides a computer program product for use in combination with an electronic device including a camera, the computer program product comprising a computer-readable storage medium and a computer program mechanism embedded therein, the computer program mechanism including instructions for performing each step of any of the above methods.
The beneficial effects of the present invention are as follows:
In the technical solution provided by the embodiments of the present invention, the user's eyeball is photographed by an infrared camera, from which the imaging position of the eyeball on the image sensor of the infrared camera and the distance between the eyeball and the lens of the infrared camera are obtained, so that the position of the user's eyeball is determined and, combined with the rotation angle of the eyeball, the viewpoint position of the eyeball is determined. The viewpoint determination process requires no repeated calibration; the user's viewpoint position can be determined directly from what the infrared camera captures, which reduces the user's passive participation, shortens the time the system spends determining the user's viewpoint, makes the viewpoint determination process simpler and more efficient, and improves the user experience.
Brief description of the drawings
Specific embodiments of the present invention are described below with reference to the accompanying drawings, in which:
Fig. 1 is a flow chart of the steps of a viewpoint determination method provided in an embodiment of the present invention;
Fig. 2 is a schematic structural diagram of a viewpoint determination device provided in an embodiment of the present invention;
Fig. 3 is a schematic diagram of the process of determining the viewpoint of the user's eyeball in an embodiment of the present invention;
Fig. 4 is a schematic diagram of an electronic device provided in an embodiment of the present invention.
Detailed description of the embodiments
To make the technical solutions and advantages of the present invention clearer, exemplary embodiments of the present invention are described in more detail below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present invention, not an exhaustive list of all of them. Where no conflict arises, the features of the embodiments in this description may be combined with one another.
The inventor noticed during the course of the invention that existing eye-movement control technology requires the user's viewpoint position to be determined before eye-movement control is performed, and that during this process the user may need to cooperate with the system equipment to carry out multi-point calibration several times before the accuracy of the determined viewpoint position can be guaranteed. The user must also keep the relative position between himself and the system equipment as fixed as possible; if that relative position changes, the user has to cooperate with the system equipment to repeat the multi-point calibration. The existing viewpoint determination process is therefore complicated and time-consuming, and the user experience is poor.
In view of the above deficiencies, an embodiment of the present invention proposes a viewpoint determination method, which is described below.
Fig. 1 is a flow chart of the steps of a viewpoint determination method provided in an embodiment of the present invention. As shown in Fig. 1, the method comprises the following steps:
S101: obtain the imaging position of the user's eyeball on the image sensor of the infrared camera and the rotation angle of the user's eyeball;
S102: determine the distance between the user's eyeball and the lens of the infrared camera;
S103: determine the position of the user's eyeball according to the imaging position and the distance;
S104: determine the viewpoint position of the eyeball according to the position of the user's eyeball and the rotation angle.
In some embodiments, the image of the user's eyeball is captured by the infrared camera. By analyzing the eyeball image, the rotation angle of the user's eyeball can be obtained, and the distance between the user's eyeball and the lens of the infrared camera can be determined from the size of the eyeball in the image. The distance can of course also be measured with other tools, such as an infrared laser rangefinder, an AF rangefinder, or a RealSense-like three-dimensional ranging device. The position of the user's eyeball is determined from the imaging position of the eyeball on the image sensor of the infrared camera combined with the eyeball-to-lens distance determined above; from the determined eyeball position and the eyeball rotation angle, the line of sight of the user's eyeball can be obtained, and in turn the viewpoint position where that line of sight intersects the plane of the screen the user is viewing. It can thus be seen that the viewpoint determination method provided in the embodiments of the present invention does not require the user to cooperate with the system equipment to perform multi-point calibration several times during viewpoint determination; the user's viewpoint position can be determined directly from what the infrared camera captures, which reduces the user's passive participation, shortens the time the system spends determining the user's viewpoint, makes the viewpoint determination process simpler and more efficient, and improves the user experience.
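As a rough illustration of how the eyeball-to-lens distance might be inferred from the apparent size of the eyeball in the image (one of the options mentioned above), the following Python sketch uses a simple similar-triangles (pinhole) model. The function name, the assumed average iris diameter and the pixel pitch are illustrative assumptions, not values taken from the patent.

```python
def estimate_eye_to_lens_distance(iris_width_px, focal_length_mm,
                                  pixel_pitch_mm, iris_diameter_mm=11.7):
    """Rough estimate of L1 (eyeball-to-lens distance) from the iris size in the image.

    Similar triangles: iris_diameter / L1 = iris_width_on_sensor / f,
    so L1 = f * iris_diameter / iris_width_on_sensor.
    (Average iris diameter and pixel pitch are assumptions for illustration.)
    """
    iris_width_on_sensor_mm = iris_width_px * pixel_pitch_mm
    return focal_length_mm * iris_diameter_mm / iris_width_on_sensor_mm

# Example: a 60-pixel-wide iris with f = 4 mm and 3 um pixels gives L1 of roughly 260 mm.
print(estimate_eye_to_lens_distance(60, 4.0, 0.003))
```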
In some embodiments, the distance between the user's eyeball and the lens of the infrared camera that is determined may be the distance from the center of the user's two eyeballs to the lens, in which case the imaging position of the user's eyeball is the center of the imaging of the two eyeballs on the image sensor; or it may be the distance from either one of the user's eyeballs to the lens, in which case the imaging position is where that eyeball is imaged on the image sensor.
In some embodiments, obtaining the imaging position of the user's eyeball on the image sensor of the infrared camera and the rotation angle of the user's eyeball, and determining the distance between the user's eyeball and the lens of the infrared camera, can be carried out simultaneously or in a set order, one after the other.
In a specific implementation, determining the viewpoint position of the eyeball according to the position of the user's eyeball and the rotation angle may include: determining a reference position according to the position of the user's eyeball, the reference position being a point on the screen at which the line connecting it to the position of the user's eyeball is perpendicular to the screen the user's eyeball is watching; and determining the viewpoint position of the eyeball according to the reference position and the rotation angle.
In some embodiments, determining the viewpoint position of the eyeball according to the position of the user's eyeball and the rotation angle can be realized as follows: from the determined position of the user's eyeball and the rotation angle of the eyeball, the line of sight of the user's eyeball can be obtained, and the intersection of that line of sight with the plane of the screen the user is viewing is the user's viewpoint position. Alternatively, the reference position at which the line from the position of the user's eyeball is perpendicular to the screen the eyeball is watching can be determined from the position of the user's eyeball, and the viewpoint position of the eyeball determined according to the reference position and the rotation angle.
In a specific implementation, determining the reference position according to the position of the user's eyeball may include: establishing a two-dimensional coordinate space on the image sensor, with the center point of the image sensor as the origin where the x-axis and y-axis of the coordinate space intersect; obtaining the coordinates of the imaging position of the user's eyeball on the image sensor of the infrared camera; and determining, according to the following formulas, the angle β between the incident ray from the user's eyeball and the image sensor, the distance L2 from the origin where the x-axis and y-axis intersect to the reference position, and the coordinates of the reference position:
β = arctan(f/b);
L2 = L1*cos β;
the coordinates of the reference position are (-L2*cos α, L2*sin α);
where f is the focal length of the infrared camera, b is the distance between the imaging position and the origin, L1 is the distance between the user's eyeball and the lens of the infrared camera, and α is the angle between the imaging position and the x-axis.
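These formulas can be translated directly into code. The following Python sketch (function and variable names chosen here for illustration; angles in radians) computes β, the distance L2 from the origin to the reference position, and the reference position coordinates from f, b, L1 and α; the sign of the coordinates depends on the quadrant of the imaging point, as noted later in the worked example.

```python
import math

def reference_position(f, b, L1, alpha):
    """Compute beta, L2 and the reference position per the formulas above.

    f     : focal length of the infrared camera
    b     : distance from the imaging position to the sensor origin
    L1    : distance from the user's eyeball to the camera lens
    alpha : angle between the imaging position and the x-axis (radians)
    """
    beta = math.atan(f / b)                              # beta = arctan(f/b)
    L2 = L1 * math.cos(beta)                             # L2 = L1*cos(beta)
    ref = (-L2 * math.cos(alpha), L2 * math.sin(alpha))  # reference position coordinates
    return beta, L2, ref
```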
In some embodiments of the viewpoint determination method provided by the present invention, the relative position of the infrared camera and the screen the user is viewing is fixed and the resolution of the screen is known; after the two-dimensional coordinate space is established on the image sensor, both lie in the same two-dimensional coordinate space, so the coordinate position of each pixel on the screen is known. The coordinates of the user's viewpoint position can therefore be determined, and subsequent eye-movement control can be realized.
In a specific implementation, determining the viewpoint position of the eyeball according to the reference position and the rotation angle includes: according to the reference position and the horizontal angle component j and vertical angle component k of the rotation angle, determining the coordinates of the viewpoint position of the eyeball as (-L1*cos(arctan(f/b))*cos α + L1*sin(arctan(f/b))*tan(j), L1*cos(arctan(f/b))*sin α + L1*sin(arctan(f/b))*tan(k)).
In some embodiments of the viewpoint determination method provided by the present invention, the infrared camera captures an image of the user's eyeball, from which the rotation angle of the eyeball can be obtained; the rotation angle is split into a horizontal angle component j and a vertical angle component k, which are combined with the reference position by shifting it along the x-axis and y-axis directions, yielding the corresponding coordinates of the user's eyeball viewpoint position.
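Building on the reference-position computation, a minimal sketch of the closed-form viewpoint computation given above (j and k in radians; signs depend on the quadrant and the direction of eyeball rotation):

```python
import math

def viewpoint_position(f, b, L1, alpha, j, k):
    """Viewpoint coordinates from the closed-form expression in the text.

    j, k : horizontal and vertical angle components of the eyeball rotation.
    """
    beta = math.atan(f / b)
    L2 = L1 * math.cos(beta)   # distance from the origin to the reference position
    L3 = L1 * math.sin(beta)   # distance from the eyeball to the reference position
    x = -L2 * math.cos(alpha) + L3 * math.tan(j)
    y = L2 * math.sin(alpha) + L3 * math.tan(k)
    return x, y
```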
In a specific implementation, determining the viewpoint position of the eyeball according to the position of the user's eyeball and the rotation angle includes: when the change in the imaging position of the user's eyeball on the image sensor of the infrared camera is greater than a first threshold, and/or the change in the imaging size of the user's eyeball on the image sensor of the infrared camera is greater than a second threshold, determining the position of the user's eyeball according to the imaging position of the user's eyeball on the image sensor of the infrared camera after the change and the distance between the user's eyeball and the lens of the infrared camera after the change; when the change in the imaging position of the user's eyeball on the image sensor of the infrared camera is less than or equal to the first threshold, and the change in the imaging size of the user's eyeball on the image sensor of the infrared camera is less than or equal to the second threshold, determining the position of the user's eyeball according to the original imaging position and distance.
In some embodiments, the user's viewpoint is determined in real time. When the user performs eye-movement control, the control may involve a continuous sequence of viewpoints, and the relative position between the user's eyeball and the screen may change because the user walks about or turns the head. Such a change may be produced by a deliberate action of the user, or by an unconscious or slight movement. For changes that may result from the user's deliberate actions, the viewpoint determination method provided by the present invention can detect the change in the position of the user's eyeball imaging on the image sensor and/or the change in the size of that imaging. When the change in the imaging position is greater than the first threshold, and/or the change in the imaging size is greater than the second threshold, the reference position is redetermined according to the changed distance between the user's eyeball and the lens and the changed eyeball imaging, and the eyeball viewpoint position is then determined according to the redetermined reference position and the eyeball rotation angle after the change. When the change in the imaging position is less than or equal to the first threshold and the change in the imaging size is less than or equal to the second threshold, there is no need to recompute the reference position: the previously calculated reference position can continue to be used in the ongoing viewpoint determination, and the eyeball viewpoint position is determined from that reference position and the eyeball rotation angle after the change. Therefore, the viewpoint determination method provided in the embodiments of the present invention can support continuous, repeated eye-movement control of the screen by the user without repeated correction during the control process; a single re-check is enough for automatic calibration, redetermining the exact position of the user's fixation point, without requiring the user to cooperate repeatedly with the system equipment, which shortens the viewpoint determination time and makes control operations smoother.
In addition, the first threshold and the second threshold set in the viewpoint determination method provided by the present invention may be tolerance values; reasonable thresholds can be derived from experiments on equipment of a specific specification, and the thresholds may differ for equipment of different specifications. Those skilled in the art may set them freely according to the specific implementation environment or experience.
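One simple way to express the threshold logic described above — redetermine the reference position only when the imaging position or the imaging size of the eyeball has changed beyond its tolerance — is sketched below; the data layout, function name and threshold values are illustrative assumptions.

```python
import math

def needs_new_reference(prev_pos, prev_size, curr_pos, curr_size,
                        first_threshold, second_threshold):
    """Return True when the reference position must be redetermined.

    prev_pos, curr_pos   : (x, y) imaging positions on the image sensor
    prev_size, curr_size : imaging sizes of the eyeball on the image sensor
    """
    position_change = math.hypot(curr_pos[0] - prev_pos[0],
                                 curr_pos[1] - prev_pos[1])
    size_change = abs(curr_size - prev_size)
    # Redetermine when either change exceeds its threshold; otherwise the
    # previously calculated reference position can continue to be used.
    return position_change > first_threshold or size_change > second_threshold
```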
In some embodiments, the viewpoint determination method provided by the present invention may also prompt the user to direct the line of sight perpendicular to the screen in order to obtain the reference position. In implementation, the prompt to make the user's line of sight perpendicular to the screen may take any form, such as a voice prompt or a picture prompt.
Based on the same inventive concept, an embodiment of the present invention also provides a viewpoint determination device. Since the principle by which these devices solve the problem is similar to the viewpoint determination method, the implementation of these devices may refer to the implementation of the method, and repeated parts are not described again.
Fig. 2 is a schematic structural diagram of a viewpoint determination device provided in an embodiment of the present invention. As shown in Fig. 2, it includes:
an acquisition module 201, for obtaining the imaging position of the user's eyeball on the image sensor of the infrared camera and the rotation angle of the user's eyeball;
a viewpoint position determination module 202, for determining the distance between the user's eyeball and the lens of the infrared camera, determining the reference position of the user's eyeball according to the imaging position and the distance, and determining the viewpoint position of the eyeball according to the reference position and the rotation angle.
In some embodiments, the viewpoint determination device provided by the present invention is preferably fixed in the same plane as the screen; for example, it may be installed directly below, directly above, or obliquely below the screen. It may of course also be fixed in a position that is not coplanar with the screen. In addition, the viewpoint determination device provided in the embodiments of the present invention may be installed separately from the screen, or may be integrated into the screen as a single unit.
In a specific implementation, the viewpoint position determination module is used to determine the viewpoint position of the eyeball according to the position of the user's eyeball and the rotation angle, including: determining a reference position according to the position of the user's eyeball, the reference position being a point on the screen at which the line connecting it to the position of the user's eyeball is perpendicular to the screen the user's eyeball is watching; and determining the viewpoint position of the eyeball according to the reference position and the rotation angle.
In a specific implementation, the viewpoint position determination module is used to determine the reference position according to the position of the user's eyeball, including: establishing a two-dimensional coordinate space on the image sensor, with the center point of the image sensor as the origin where the x-axis and y-axis of the coordinate space intersect; obtaining the coordinates of the imaging position of the user's eyeball on the image sensor of the infrared camera; and determining, according to the following formulas, the angle β between the incident ray from the user's eyeball and the image sensor, the distance L2 from the origin where the x-axis and y-axis intersect to the reference position, and the coordinates of the reference position:
β = arctan(f/b);
L2 = L1*cos β;
the coordinates of the reference position are (-L2*cos α, L2*sin α);
where f is the focal length of the infrared camera, b is the distance between the imaging position and the origin, L1 is the distance between the user's eyeball and the lens of the infrared camera, and α is the angle between the imaging position and the x-axis.
In a specific implementation, the viewpoint position determination module is used to determine the viewpoint position of the eyeball according to the reference position and the rotation angle, including: according to the reference position and the horizontal angle component j and vertical angle component k of the rotation angle, determining the coordinates of the viewpoint position of the eyeball as (-L1*cos(arctan(f/b))*cos α + L1*sin(arctan(f/b))*tan(j), L1*cos(arctan(f/b))*sin α + L1*sin(arctan(f/b))*tan(k)).
In a specific implementation, the viewpoint position determination module is used to determine the viewpoint position of the eyeball according to the position of the user's eyeball and the rotation angle, including: when the change in the imaging position of the user's eyeball on the image sensor of the infrared camera is greater than the first threshold, and/or the change in the imaging size of the user's eyeball on the image sensor of the infrared camera is greater than the second threshold, determining the position of the user's eyeball according to the imaging position of the user's eyeball on the image sensor of the infrared camera after the change and the distance between the user's eyeball and the lens of the infrared camera after the change; when the change in the imaging position of the user's eyeball on the image sensor of the infrared camera is less than or equal to the first threshold, and the change in the imaging size of the user's eyeball on the image sensor of the infrared camera is less than or equal to the second threshold, determining the position of the user's eyeball according to the imaging position and the distance.
To facilitate implementation of the present invention, a specific example is given below.
The viewpoint determination device provided in the embodiments of the present invention may be arranged directly below the screen; it may of course also be arranged elsewhere relative to the screen, for example not in the same plane as the screen, or directly above it, obliquely below it, to its left, and so on. In addition, the viewpoint determination device provided in the embodiments of the present invention may be installed separately from the screen, or may be integrated into the screen.
According to the imaging principle of a convex lens, the user's eyeball is imaged on the image sensor through the lens. A two-dimensional coordinate space with an x-axis and a y-axis can be established on the image sensor, since the numbers of horizontal and vertical pixels of the image sensor are known. For example, suppose the image sensor is 1920x1080 pixels, its center is taken as the (0, 0) position of the xy-axes, and the user's eyeball is imaged at the 1500th pixel from the left and the 800th pixel from the top. Then tan α = (800-540)/(1500-960), from which the value of the angle α between the imaging point of the user's eyeball and the x-axis can be obtained. Since each pixel is generally square, suppose its side length is a (a known value) and there is no gap between pixels, and suppose the straight-line distance between the center point of the eyeball image on the image sensor and the center point of the sensor is b. Then (a*(800-540))² + (a*(1500-960))² = b², from which b, the distance from the imaging point of the user's eyeball to the origin, can be obtained.
Since the distance from the lens to the image sensor after focusing is complete, i.e. the focal length, is a known value that can be obtained in real time from inside the camera, suppose it is f, and suppose the incidence angle of the light from the center of the user's eyeball with respect to the sensor plane at this moment is β. Then tan β = f/b, and hence β = arctan(f/b).
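The worked example above can be reproduced numerically. In the Python sketch below, the pixel pitch a and the focal length f are illustrative values only; the 1920x1080 sensor and the imaging point (1500th pixel from the left, 800th from the top) are taken from the example:

```python
import math

width_px, height_px = 1920, 1080
px, py = 1500, 800                                  # eyeball imaging point (from left, from top)
dx, dy = px - width_px // 2, py - height_px // 2    # offsets from the sensor centre (960, 540)

alpha = math.atan2(dy, dx)                          # tan(alpha) = (800 - 540) / (1500 - 960)
a = 0.003                                           # assumed pixel side length (3 um), in mm
b = a * math.hypot(dx, dy)                          # b^2 = (a*(800-540))^2 + (a*(1500-960))^2
f = 4.0                                             # assumed focal length, in mm
beta = math.atan(f / b)                             # light incidence angle at the sensor plane

print(math.degrees(alpha), b, math.degrees(beta))
```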
Fig. 3 is a schematic diagram of the process of determining the viewpoint of the user's eyeball in an embodiment of the present invention. As shown in Fig. 3, suppose A is the position of the user's eyeball, B is the reference position, and C is the user's viewpoint position; AB is the perpendicular from the position of the user's eyeball to the screen the eyeball is watching, AO is the distance between the user's eyeball and the lens, and β is the angle between OA and the screen plane (the image sensor plane). Suppose the distance measured from the ranging module to the eye is L1 and the length of OB is L2. Then, in the right triangle AOB in solid space, cos β = OB/OA, and hence L2 = L1*cos β. Suppose the length of AB is L3; then L3 = L1*sin β. Since the angle between OB and the x-axis is α, as derived earlier, the coordinates of point B are (-L2*cos α, L2*sin α), the signs depending on the quadrant.
Suppose the user's eyeball fixates point C on the screen. The infrared camera obtains the angle through which the user's eyeball rotates when fixating point C, and the rotation angle of the eyeball is split into a horizontal rotation component and a vertical rotation component, namely angle j and angle k. The fixation point then moves to B2 along the x-axis component and to B1 along the y-axis component; the length of BB1 is L3*tan(k) and the length of BB2 is L3*tan(j), so the coordinates of point C are (-L2*cos α + L3*tan(j), L2*sin α + L3*tan(k)). Substituting the earlier equations, the coordinates of point C are:
x = -L1*cos(arctan(f/b))*cos α + L1*sin(arctan(f/b))*tan(j);
y = L1*cos(arctan(f/b))*sin α + L1*sin(arctan(f/b))*tan(k), the signs depending on the quadrant and the direction of eyeball rotation. Here L1 is the measured distance, f is the lens focal length, b and α are obtained directly from the imaging point of the eye center on the sensor, and j and k are the horizontal and vertical components of the eyeball rotation obtained by the eye-movement detection system. Once the coordinates of point C are obtained, the viewpoint position of the human eye is obtained.
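Following Fig. 3 step by step — B from L2 and α, then C by shifting B along the axes by BB2 = L3*tan(j) and BB1 = L3*tan(k) — gives the same result as the closed-form coordinates above. A sketch under the same sign conventions (point labels follow Fig. 3; the function name is illustrative):

```python
import math

def viewpoint_via_fig3(L1, f, b, alpha, j, k):
    """Construct point C step by step, following the geometry of Fig. 3."""
    beta = math.atan(f / b)
    L2 = L1 * math.cos(beta)                            # length of OB
    L3 = L1 * math.sin(beta)                            # length of AB
    B = (-L2 * math.cos(alpha), L2 * math.sin(alpha))   # reference position B
    BB2 = L3 * math.tan(j)                              # shift along the x-axis
    BB1 = L3 * math.tan(k)                              # shift along the y-axis
    return (B[0] + BB2, B[1] + BB1)                     # C = (-L2*cos a + L3*tan j, L2*sin a + L3*tan k)
```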
In implementation, considering that the user's eye-movement control may be a continuous process, after the user's viewpoint position has been determined to be point C and the control completed, the user then fixates another position on the screen — assume point D, as shown in Fig. 3. If it is now determined that the change in the imaging position of the user's eyeball on the image sensor of the infrared camera is less than or equal to the first threshold, and the change in the imaging size of the user's eyeball on the image sensor of the infrared camera is less than or equal to the second threshold, point B can continue to be used as the reference position: the angle through which the eyeball rotates when fixating point D is split into a horizontal component and a vertical component, which are combined with reference position B to obtain the position coordinates of viewpoint D.
In another embodiment, after the user's viewpoint position has been determined to be point C and the control completed, the user fixates another position on the screen — assume point E, as shown in Fig. 3. If it is now determined that the change in the imaging position of the user's eyeball on the image sensor of the infrared camera is greater than the first threshold, and the change in the imaging size of the user's eyeball on the image sensor of the infrared camera is greater than the second threshold, the reference position needs to be redetermined according to the imaging position of the user's eyeball on the image sensor of the infrared camera after the change and the distance between the user's eyeball and the lens of the infrared camera after the change (the determination of the reference position is the same as in the implementation above and is not repeated), and the coordinate position of viewpoint E is then determined.
Based on the same inventive concept, an embodiment of the present invention also provides an electronic device. Since its principle is similar to the viewpoint determination method, its implementation may refer to the implementation of the method, and repeated parts are not described again.
Fig. 4 is a schematic diagram of an electronic device provided in an embodiment of the present invention. As shown in Fig. 4, the electronic device provided in the embodiment of the present invention includes: an infrared camera 403, a memory 401, and one or more processors 402; and one or more modules, the one or more modules being stored in the memory and configured to be executed by the one or more processors, the one or more modules including instructions for performing each step of any of the above methods. Based on the same inventive concept, an embodiment of the present invention also provides a computer program product for use in combination with an electronic device including a camera. Since its principle is similar to the viewpoint determination method, its implementation may refer to the implementation of the method, and repeated parts are not described again.
The computer program product provided in the embodiments of the present invention for use in combination with an electronic device including a camera comprises a computer-readable storage medium and a computer program mechanism embedded therein, the computer program mechanism including instructions for performing each step of any of the preceding methods.
The viewpoint determination scheme provided by the present invention avoids a complicated calibration process and no longer requires the user to keep a fixed standing or sitting position; the user can move about quite freely in front of the screen. The scheme has greater intelligence and flexibility, shortens the time needed to determine the viewpoint, markedly improves the user experience, opens up many new application scenarios, and greatly enhances practicality.
For convenience of description, the parts of the device described above are divided into modules or units by function and described separately. Of course, when implementing the present invention, the functions of the modules or units may be realized in one or more pieces of software or hardware.
It should be understood by those skilled in the art that the embodiments of the present invention may be provided as a method, a system or a computer program product. Therefore, the present invention may take the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical memory, etc.) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of the method, equipment (system) and computer program product according to the embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be realized by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce a device for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or another programmable data processing device to work in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device, the instruction device realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thereby provide steps for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Although preferred embodiments of the present invention have been described, those skilled in the art, once they know the basic inventive concept, can make other changes and modifications to these embodiments. Therefore, the appended claims are intended to be construed as including the preferred embodiments and all changes and modifications falling within the scope of the present invention.

Claims (12)

1. A viewpoint determination method, characterized in that it comprises the following steps:
obtaining the imaging position of the user's eyeball on the image sensor of an infrared camera and the rotation angle of the user's eyeball;
determining the distance between the user's eyeball and the lens of the infrared camera;
determining the position of the user's eyeball according to the imaging position and the distance;
determining the viewpoint position of the eyeball according to the position of the user's eyeball and the rotation angle.
2. The method of claim 1, characterized in that determining the viewpoint position of the eyeball according to the position of the user's eyeball and the rotation angle includes:
determining a reference position according to the position of the user's eyeball, the reference position being a point on the screen at which the line connecting it to the position of the user's eyeball is perpendicular to the screen the user's eyeball is watching;
determining the viewpoint position of the eyeball according to the reference position and the rotation angle.
3. The method of claim 2, characterized in that determining the reference position according to the position of the user's eyeball includes:
establishing a two-dimensional coordinate space on the image sensor, with the center point of the image sensor as the origin where the x-axis and y-axis of the coordinate space intersect; obtaining the coordinates of the imaging position of the user's eyeball on the image sensor of the infrared camera; and determining, according to the following formulas, the angle β between the incident ray from the user's eyeball and the image sensor, the distance L2 from the origin where the x-axis and y-axis intersect to the reference position, and the coordinates of the reference position:
β = arctan(f/b);
L2 = L1*cos β;
the coordinates of the reference position are (-L2*cos α, L2*sin α);
where f is the focal length of the infrared camera, b is the distance between the imaging position and the origin, L1 is the distance between the user's eyeball and the lens of the infrared camera, and α is the angle between the imaging position and the x-axis.
4. The method of claim 2, characterized in that determining the viewpoint position of the eyeball according to the reference position and the rotation angle includes:
according to the reference position and the horizontal angle component j and vertical angle component k of the rotation angle, determining the coordinates of the viewpoint position of the eyeball as (-L1*cos(arctan(f/b))*cos α + L1*sin(arctan(f/b))*tan(j), L1*cos(arctan(f/b))*sin α + L1*sin(arctan(f/b))*tan(k)).
5. The method of claim 1, characterized in that determining the viewpoint position of the eyeball according to the position of the user's eyeball and the rotation angle includes:
when the change in the imaging position of the user's eyeball on the image sensor of the infrared camera is greater than a first threshold, and/or the change in the imaging size of the user's eyeball on the image sensor of the infrared camera is greater than a second threshold, determining the position of the user's eyeball according to the imaging position of the user's eyeball on the image sensor of the infrared camera after the change and the distance between the user's eyeball and the lens of the infrared camera after the change.
6. A viewpoint determination device, characterized in that it includes:
an acquisition module, for obtaining the imaging position of the user's eyeball on the image sensor of an infrared camera and the rotation angle of the user's eyeball;
a viewpoint position determination module, for determining the distance between the user's eyeball and the lens of the infrared camera, determining the reference position of the user's eyeball according to the imaging position and the distance, and determining the viewpoint position of the eyeball according to the reference position and the rotation angle.
7. The device of claim 6, characterized in that the viewpoint position determination module is used to determine the viewpoint position of the eyeball according to the position of the user's eyeball and the rotation angle, including:
determining a reference position according to the position of the user's eyeball, the reference position being a point on the screen at which the line connecting it to the position of the user's eyeball is perpendicular to the screen the user's eyeball is watching;
determining the viewpoint position of the eyeball according to the reference position and the rotation angle.
8. The device of claim 7, characterized in that the viewpoint position determination module is used to determine the reference position according to the position of the user's eyeball, including:
establishing a two-dimensional coordinate space on the image sensor, with the center point of the image sensor as the origin where the x-axis and y-axis of the coordinate space intersect; obtaining the coordinates of the imaging position of the user's eyeball on the image sensor of the infrared camera; and determining, according to the following formulas, the angle β between the incident ray from the user's eyeball and the image sensor, the distance L2 from the origin where the x-axis and y-axis intersect to the reference position, and the coordinates of the reference position:
β = arctan(f/b);
L2 = L1*cos β;
the coordinates of the reference position are (-L2*cos α, L2*sin α);
where f is the focal length of the infrared camera, b is the distance between the imaging position and the origin, L1 is the distance between the user's eyeball and the lens of the infrared camera, and α is the angle between the imaging position and the x-axis.
9. The device of claim 7, characterized in that the viewpoint position determination module is used to determine the viewpoint position of the eyeball according to the reference position and the rotation angle, including:
according to the reference position and the horizontal angle component j and vertical angle component k of the rotation angle, determining the coordinates of the viewpoint position of the eyeball as (-L1*cos(arctan(f/b))*cos α + L1*sin(arctan(f/b))*tan(j), L1*cos(arctan(f/b))*sin α + L1*sin(arctan(f/b))*tan(k)).
10. The device of claim 6, characterized in that the viewpoint position determination module is used to determine the viewpoint position of the eyeball according to the position of the user's eyeball and the rotation angle, including:
when the change in the imaging position of the user's eyeball on the image sensor of the infrared camera is greater than a first threshold, and/or the change in the imaging size of the user's eyeball on the image sensor of the infrared camera is greater than a second threshold, determining the position of the user's eyeball according to the imaging position of the user's eyeball on the image sensor of the infrared camera after the change and the distance between the user's eyeball and the lens of the infrared camera after the change;
when the change in the imaging position of the user's eyeball on the image sensor of the infrared camera is less than or equal to the first threshold, and the change in the imaging size of the user's eyeball on the image sensor of the infrared camera is less than or equal to the second threshold, determining the position of the user's eyeball according to the imaging position and the distance.
11. An electronic device, characterized in that the electronic device includes: an infrared camera, a memory, and one or more processors; and one or more modules, the one or more modules being stored in the memory and configured to be executed by the one or more processors, the one or more modules including instructions for performing each step of the method of any one of claims 1 to 5.
12. A computer program product for use in combination with an electronic device including a camera, characterized in that the computer program product comprises a computer-readable storage medium and a computer program mechanism embedded therein, the computer program mechanism including instructions for performing each step of the method of any one of claims 1 to 5.
CN201680002751.4A 2016-12-01 2016-12-01 Viewpoint determination method, apparatus and electronic equipment Active CN107003744B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/108222 WO2018098772A1 (en) 2016-12-01 2016-12-01 Method and apparatus for determining viewpoint, electronic device, and computer program product

Publications (2)

Publication Number Publication Date
CN107003744A true CN107003744A (en) 2017-08-01
CN107003744B CN107003744B (en) 2019-05-10

Family

ID=59431128

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201680002751.4A Active CN107003744B (en) 2016-12-01 2016-12-01 Viewpoint determination method, apparatus and electronic equipment

Country Status (2)

Country Link
CN (1) CN107003744B (en)
WO (1) WO2018098772A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019062056A1 (en) * 2017-09-26 2019-04-04 广景视睿科技(深圳)有限公司 Smart projection method and system, and smart terminal
CN109960412A (en) * 2019-03-22 2019-07-02 北京七鑫易维信息技术有限公司 A kind of method and terminal device based on touch-control adjustment watching area
CN109995986A (en) * 2017-12-29 2019-07-09 北京亮亮视野科技有限公司 Control the mobile method of intelligent glasses shooting visual angle
CN110196640A (en) * 2019-05-31 2019-09-03 维沃移动通信有限公司 A kind of method of controlling operation thereof and terminal
CN110244853A (en) * 2019-06-21 2019-09-17 四川众信互联科技有限公司 Gestural control method, device, intelligent display terminal and storage medium
CN110286753A (en) * 2019-06-11 2019-09-27 福建天泉教育科技有限公司 Video attention rate judgment method, storage medium
CN110377158A (en) * 2019-07-22 2019-10-25 北京七鑫易维信息技术有限公司 The calibration method and electronic equipment of eyeball tracking based on variation field range
CN112286350A (en) * 2020-10-27 2021-01-29 珠海格力电器股份有限公司 Equipment control method and device, electronic equipment, electronic device and processor

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050200806A1 (en) * 2004-03-12 2005-09-15 Honda Motor Co., Ltd. Line-of-sight detection method and apparatus therefor
CN102473033A (en) * 2009-09-29 2012-05-23 阿尔卡特朗讯 Method for viewing points detecting and apparatus thereof
CN102981736A (en) * 2012-10-29 2013-03-20 华为终端有限公司 Screen unlocking method and screen unlocking terminal
CN103514462A (en) * 2012-06-27 2014-01-15 株式会社读卖广告社 Fixation line determination method, fixation line determination device, eyeball convolution point determination method and notice point determination device
CN103809737A (en) * 2012-11-13 2014-05-21 华为技术有限公司 Method and device for human-computer interaction

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102662476B (en) * 2012-04-20 2015-01-21 天津大学 Gaze estimation method
JP2014064784A (en) * 2012-09-26 2014-04-17 Renesas Microsystem:Kk Visual line detection device, visual line detection method and program
CN103604412B (en) * 2013-10-30 2015-11-18 北京智谷睿拓技术服务有限公司 Localization method and locating device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050200806A1 (en) * 2004-03-12 2005-09-15 Honda Motor Co., Ltd. Line-of-sight detection method and apparatus therefor
CN102473033A (en) * 2009-09-29 2012-05-23 阿尔卡特朗讯 Method for viewing points detecting and apparatus thereof
CN102473033B (en) * 2009-09-29 2015-05-27 阿尔卡特朗讯 Method for viewing points detecting and apparatus thereof
CN103514462A (en) * 2012-06-27 2014-01-15 株式会社读卖广告社 Fixation line determination method, fixation line determination device, eyeball convolution point determination method and notice point determination device
CN102981736A (en) * 2012-10-29 2013-03-20 华为终端有限公司 Screen unlocking method and screen unlocking terminal
CN103809737A (en) * 2012-11-13 2014-05-21 华为技术有限公司 Method and device for human-computer interaction

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LIU, Tao: "Research and Optimized Implementation of Eye-Tracking Technology", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019062056A1 (en) * 2017-09-26 2019-04-04 广景视睿科技(深圳)有限公司 Smart projection method and system, and smart terminal
CN109995986A (en) * 2017-12-29 2019-07-09 北京亮亮视野科技有限公司 Control the mobile method of intelligent glasses shooting visual angle
CN109960412A (en) * 2019-03-22 2019-07-02 北京七鑫易维信息技术有限公司 A kind of method and terminal device based on touch-control adjustment watching area
CN109960412B (en) * 2019-03-22 2022-06-07 北京七鑫易维信息技术有限公司 Method for adjusting gazing area based on touch control and terminal equipment
CN110196640A (en) * 2019-05-31 2019-09-03 维沃移动通信有限公司 A kind of method of controlling operation thereof and terminal
CN110286753A (en) * 2019-06-11 2019-09-27 福建天泉教育科技有限公司 Video attention rate judgment method, storage medium
CN110286753B (en) * 2019-06-11 2022-06-07 福建天泉教育科技有限公司 Video attention judging method and storage medium
CN110244853A (en) * 2019-06-21 2019-09-17 四川众信互联科技有限公司 Gestural control method, device, intelligent display terminal and storage medium
CN110377158A (en) * 2019-07-22 2019-10-25 北京七鑫易维信息技术有限公司 The calibration method and electronic equipment of eyeball tracking based on variation field range
CN112286350A (en) * 2020-10-27 2021-01-29 珠海格力电器股份有限公司 Equipment control method and device, electronic equipment, electronic device and processor

Also Published As

Publication number Publication date
CN107003744B (en) 2019-05-10
WO2018098772A1 (en) 2018-06-07

Similar Documents

Publication Publication Date Title
CN107003744B (en) Viewpoint determines method, apparatus and electronic equipment
US11544874B2 (en) System and method for calibration of machine vision cameras along at least three discrete planes
CA2961921C (en) Camera calibration method using a calibration target
KR102052873B1 (en) Method and apparatus for generating a three-dimensional model of a region of interest using an imaging system
JP5474202B2 (en) Method and apparatus for detecting a gazing point based on face detection and image measurement
US20160335475A1 (en) 3d image analyzer for determining the gaze direction
US9805509B2 (en) Method and system for constructing a virtual image anchored onto a real-world object
CN105487555B (en) A kind of station keeping method and device of unmanned plane
US8913125B2 (en) Electronic device and method for regulating coordinates of probe measurement system
CN105929963B (en) It is a kind of for tracking the method and detection device of eyeball position
EP3353632A1 (en) Eye-tracking enabled wearable devices
CN105898107B (en) A kind of target object grasp shoot method and system
KR20080111474A (en) Three-dimensional sensing using speckle patterns
CN108363519B (en) Distributed infrared visual detection and projection fusion automatic correction touch display system
CN110312111B (en) Apparatus, system, and method for automatic calibration of image devices
CN103795935B (en) A kind of camera shooting type multi-target orientation method and device based on image rectification
CN107396097B (en) A kind of method and apparatus of the parallax test of virtual reality device
CN106843507A (en) A kind of method and system of virtual reality multi-person interactive
JP2015207861A (en) Imaging device and method
CN105427282B (en) A kind of test method and device of 3D positioning accuracies
CN105333818B (en) 3d space measuring method based on monocular-camera
CN110441326A (en) Defect inspection method and detection device and computer readable storage medium
CN107667522A (en) Adjust the length of live image
JP6817527B1 (en) Information processing equipment, programs and information processing systems
US9842402B1 (en) Detecting foreground regions in panoramic video frames

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant