CN106669139A - Auxiliary selecting method for electronic-sport-game players - Google Patents

Auxiliary selecting method for electronic-sport-game players

Info

Publication number
CN106669139A
CN106669139A (application CN201611098201.5A)
Authority
CN
China
Prior art keywords
player
point
target point
pupil center
observation
Prior art date
Legal status
Granted
Application number
CN201611098201.5A
Other languages
Chinese (zh)
Other versions
CN106669139B (en)
Inventor
左瑜
张捷
单瑚
Current Assignee
Zhou Chun
Original Assignee
Xian Cresun Innovation Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Xian Cresun Innovation Technology Co Ltd filed Critical Xian Cresun Innovation Technology Co Ltd
Priority to CN201611098201.5A
Publication of CN106669139A
Application granted
Publication of CN106669139B
Legal status: Active


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 9/00: Games not otherwise provided for
    • A63F 9/0096: Reaction time games
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63B: APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 69/00: Training appliances or apparatus for special sports
    • A63B 69/0053: Apparatus generating random stimulus signals for reaction-time training involving a substantial physical effort

Landscapes

  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The invention relates to an auxiliary selection method for esports players. After a player puts on VR glasses, N target points (where N is an integer greater than or equal to 1) appear in a VR scene according to a preset rule, guiding the player to observe them. The speed with which the player observes the N target points is then measured, completing the auxiliary selection of the player. The method of these embodiments accurately and quantitatively measures a player's reaction speed to newly appearing target points, avoiding the subjective evaluation otherwise involved in selecting esports players.

Description

Auxiliary selection method for esports players
Technical field
The present invention relates to the field of electronic devices, and in particular to an auxiliary selection method for esports (competitive video gaming) players.
Background technology
With the Ministry of Education's Department of Vocational and Adult Education issuing the Notice on the Declaration of Proposed Enrollment Majors for Higher Vocational Schools in 2017, which added "E-sports and Management" as a new major in 2016, national policy support for this activity has become increasingly clear. This has markedly raised the degree to which esports is accepted by the general public and has attracted ever more people to join in. Many esports players hope to represent their country in top-level international competitions and win honor for it. Selecting promising esports athletes is therefore receiving growing attention from the industry.
For an esports athlete, reaction speed is one expression of speed as a physical quality. One very important form of reaction speed in esports games is the length of time needed to find a target in the field of view. For example, suppose a player with a reaction time of 0.25 s competes against a player with a reaction time of 0.15 s: within a very short time the 0.15 s player has already found the target and acted, while the 0.25 s player has only just registered the target's exact position. The advantage of the faster player is therefore clear.
At present, when esports players are selected, there is no method that accurately measures the speed with which a candidate finds a target, so the assessment of this ability remains rather rough.
Summary of the invention
Therefore, to overcome the technological defects and shortcomings of the prior art, the present invention proposes an auxiliary selection method for esports players, comprising:
after a player puts on VR glasses, having N target points appear in a VR scene according to a preset rule to guide the player to observe them, where N is an integer greater than or equal to 1;
measuring the speed with which the player observes the N target points, to complete the auxiliary selection of the player.
On the basis of the above embodiment, having the N target points appear in the VR scene according to the preset rule comprises:
the N target points appearing simultaneously in the VR scene;
correspondingly, measuring the speed with which the player observes the N target points comprises:
counting the total time the player takes to observe all N target points in turn.
On the basis of the above embodiment, having the N target points appear in the VR scene according to the preset rule comprises: the N target points appearing one after another in the VR scene;
correspondingly, measuring the speed with which the player observes the N target points comprises:
separately counting the time the player needs to observe each target point.
On the basis of the above embodiment, counting the time the player needs to observe each target point comprises:
after the player puts on the VR glasses, recording the moment each target point appears in the VR scene as a first moment, and recording the moment the player observes that target point as a second moment;
counting, for each target point, the time interval between its first moment and second moment.
On the basis of the above embodiment, before the N target points appear in the VR scene according to the preset rule, the method further comprises:
determining the pupil center of the player;
determining the matching relationship between the player's pupil center and the player's observation point.
On the basis of the above embodiment, determining the pupil center of the player comprises:
the player wearing VR glasses that include an infrared camera;
the infrared camera photographing the player's eye to form an eye image;
processing the eye image to obtain the pupil center of the player.
On the basis of the above embodiment, processing the eye image comprises:
converting the eye image to grayscale to obtain a gray-level map;
determining the point with the minimum gray value in the gray-level map;
determining the pupil center of the player from the minimum-gray point.
On the basis of the above embodiment, determining the matching relationship between the player's pupil center and the player's observation point comprises:
establishing a matching model between the player's pupil center and the player's observation point, the input of the matching model being the player's pupil center and its output being the observation point obtained from that pupil center.
On the basis of the above embodiment, establishing the matching model between the player's pupil center and the player's observation point comprises:
the player wearing the VR glasses and observing a VR scene that includes a screen, on which K observation points with known coordinates appear one after another;
the player observing the K observation points in turn, the pupil center being recorded each time the player observes one of them;
obtaining the matching model from the recorded pupil centers and the coordinates of the observation points.
On the basis of the above embodiment, measuring the speed with which the player observes the N target points comprises:
determining, from the player's pupil center and the matching relationship, the position of the player's current observation point during observation;
if the difference between the position of the current observation point and the position of a target point that appeared according to the preset rule is less than a threshold, determining that the player has successfully observed that target point;
recording the interval between the moment the target point appears and the moment the player successfully observes it, to complete the measurement of the reaction speed.
Other aspects and features of the present invention will become apparent from the following detailed description taken with reference to the accompanying drawings. It should be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the invention, for which reference should be made to the appended claims. It should also be noted that, unless otherwise indicated, the drawings need not be to scale; they merely attempt to illustrate conceptually the structures and procedures described herein.
Description of the drawings
Specific embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Fig. 1 is a flow chart of an auxiliary selection method for esports players provided by an embodiment of the present invention.
Detailed description of the embodiments
To make the above objects, features, and advantages of the present invention clearer and easier to understand, specific embodiments of the invention are described in detail below with reference to the accompanying drawings.
The present invention proposes an auxiliary selection method for esports players, which includes:
after a player puts on VR glasses, having N target points appear in a VR scene according to a preset rule to guide the player to observe them, where N is an integer greater than or equal to 1;
measuring the speed with which the player observes the N target points, to complete the auxiliary selection of the player.
Further, on the basis of the above embodiment, the N target points may appear in the VR scene according to the preset rule as follows:
the N target points appear simultaneously in the VR scene;
correspondingly, measuring the speed with which the player observes the N target points includes:
counting the total time the player takes to observe all N target points in turn.
Alternatively, on the basis of the above embodiment, the N target points may instead appear one after another in the VR scene;
correspondingly, measuring the speed with which the player observes the N target points includes:
separately counting the time the player needs to observe each target point.
Further, on the basis of the above embodiment, counting the time the player needs to observe each target point may specifically be:
after the player puts on the VR glasses, recording the moment each target point appears in the VR scene as a first moment, and recording the moment the player observes that target point as a second moment;
counting, for each target point, the time interval between its first moment and second moment.
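The first-moment/second-moment bookkeeping above can be sketched as a small timer class. This is an illustrative helper, not part of the patent; the name `ReactionTimer` and its methods are assumptions:

```python
import time

class ReactionTimer:
    """Per-target bookkeeping: the moment a target point appears in the VR
    scene (first moment) and the moment the player observes it (second
    moment); the reaction time is the interval between the two."""

    def __init__(self):
        self.appeared = {}   # target id -> first moment
        self.times = {}      # target id -> second moment minus first moment

    def target_shown(self, target_id, t=None):
        self.appeared[target_id] = time.monotonic() if t is None else t

    def target_observed(self, target_id, t=None):
        second = time.monotonic() if t is None else t
        self.times[target_id] = second - self.appeared[target_id]

    def total_time(self):
        # one simple aggregate over all recorded intervals
        return sum(self.times.values())
```

In use, `target_shown` would be called when the scene displays a target and `target_observed` when the gaze check succeeds; the explicit `t` argument exists so the logic can be exercised with fixed timestamps.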
Further, on the basis of the above embodiment, before the N target points appear in the VR scene according to the preset rule, the following steps are also performed:
determining the pupil center of the player;
determining the matching relationship between the player's pupil center and the player's observation point.
Further, on the basis of the above embodiment, the pupil center of the player may be determined as follows:
the player wears VR glasses that include an infrared camera;
the infrared camera photographs the player's eye to form an eye image;
the eye image is processed to obtain the pupil center of the player.
Further, on the basis of the above embodiment, processing the eye image may use the following steps:
the eye image is converted to grayscale to obtain a gray-level map;
the point with the minimum gray value in the gray-level map is determined;
the pupil center of the player is determined from the minimum-gray point.
Here, converting the eye image to grayscale to obtain the gray-level map may specifically use the following method:
A coordinate system is set up on the gray-level map, and f0(x, y) denotes the original gray value at the point with coordinates (x, y). The eye image may be preprocessed with an enhancement operator to increase the contrast between the player's pupil region and the non-pupil region of the image. One such enhancement operator En is expressed as follows:
En = c * lg(1 + double(f0));
where En is the enhancement operator, f0 is the original gray value, and c is a constant coefficient.
Preferably, after the preprocessing of the eye image has been performed, a Laplacian algorithm may also be applied to filter the image, which helps to denoise it.
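The enhancement and filtering steps can be illustrated with a minimal NumPy sketch. The constant c and the 4-neighbour Laplacian kernel are assumptions, since the text does not fix them:

```python
import numpy as np

def enhance(gray, c=1.0):
    """Log enhancement operator En = c * lg(1 + f0): stretches the dark
    (pupil) range of grey values relative to the bright range."""
    return c * np.log10(1.0 + np.asarray(gray, dtype=np.float64))

def laplacian_filter(img):
    """Plain 4-neighbour Laplacian response, as a sketch of the filtering
    step mentioned in the text (border pixels are left at zero)."""
    img = np.asarray(img, dtype=np.float64)
    k = np.array([[0.0, 1.0, 0.0], [1.0, -4.0, 1.0], [0.0, 1.0, 0.0]])
    out = np.zeros_like(img)
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            out[i, j] = np.sum(img[i - 1:i + 2, j - 1:j + 2] * k)
    return out
```

On a constant image the Laplacian response is zero, which is a quick sanity check that the kernel weights sum to zero.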
Here, on the basis of the above embodiment, determining the point with the minimum gray value in the gray-level map may use the following steps:
The minimum-gray point is estimated on the gray-level map; let its coordinates be (xmin, ymin). The formula for xmin and ymin, which the original gives as an image, combines the minimum and summation operations named below; one reading consistent with those operations is
xmin = argmin over x of (sum over y of f(x, y)), ymin = argmin over y of (sum over x of f(x, y)),
where min denotes taking the minimum, sum denotes summation, and f(i, j) denotes the gray value of the image at coordinates (i, j).
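One reading of the min/sum formula, which is an assumption since the original formula appears as an image, computes the seed from row and column gray-level projections:

```python
import numpy as np

def darkest_point(gray):
    """Coarse pupil seed (xmin, ymin): the row whose total grey level is
    lowest and the column whose total grey level is lowest. This
    projection reading of the min/sum formula is an assumption."""
    g = np.asarray(gray, dtype=np.float64)
    x_min = int(np.argmin(g.sum(axis=1)))   # darkest row
    y_min = int(np.argmin(g.sum(axis=0)))   # darkest column
    return x_min, y_min
```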
Further, on the basis of the above embodiment, determining the player's pupil center from the minimum-gray point may specifically use the following steps:
starting from the point (xmin, ymin) on the gray-level map, the gradient of the gray value is computed along specified ray directions, and the position where the gradient reaches its maximum is taken as the position of a pupil edge point of the player;
the pupil center of the player is then obtained from the positions of the pupil edge points.
Here, the position where the gradient reaches its maximum is taken as a pupil edge point for the following reason: the inside and the outside of the pupil reflect infrared light very differently. In the captured image the gray level of the pupil region is significantly lower than that of the other regions, while the non-pupil region appears brighter, so at the edge the gradient changes sharply. In the processed image, along a specified direction, the gray value changes abruptly at the boundary between the pupil region and the non-pupil region, where the gradient reaches its maximum; the positions of the pupil edge points are judged accordingly.
For example, if f(i, j) is the gray value of the image at coordinates (i, j), the partial differences of the gray value along the ray direction can be computed (the explicit formula appears as an image in the original); the point where the directional gray gradient D reaches its maximum is then a pupil edge point.
Here, obtaining the player's pupil center from the positions of the pupil edge points may use the following method:
a center is computed over all pupil edge points; specifically, the central value xmean of all pupil edge points along the X direction of the coordinate system and the central value ymean along the Y direction are obtained;
the pupil center coordinates of the player are then determined as (xmean, ymean).
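The ray-gradient edge search and the (xmean, ymean) center can be sketched as follows. The number of rays, the rounding of ray samples, and the discrete gradient used here are assumptions not fixed by the text:

```python
import numpy as np

def edge_points_from_seed(gray, seed, n_rays=8):
    """Walk along rays from the dark seed point and mark, on each ray, the
    pixel where the grey-level jump between consecutive samples is
    largest: a sketch of the 'maximum gradient = pupil edge' rule."""
    gray = np.asarray(gray, dtype=np.float64)
    h, w = gray.shape
    edges = []
    for k in range(n_rays):
        theta = 2.0 * np.pi * k / n_rays
        prev, best, best_pt = None, -np.inf, None
        for r in range(1, max(h, w)):
            i = int(round(seed[0] + r * np.sin(theta)))
            j = int(round(seed[1] + r * np.cos(theta)))
            if not (0 <= i < h and 0 <= j < w):
                break                      # ray left the image
            if prev is not None:
                grad = abs(gray[i, j] - prev)
                if grad > best:
                    best, best_pt = grad, (i, j)
            prev = gray[i, j]
        if best_pt is not None:
            edges.append(best_pt)
    return edges

def center_of(points):
    """Pupil centre as the mean of the edge points: (xmean, ymean)."""
    return np.asarray(points, dtype=np.float64).mean(axis=0)
```

On a synthetic dark disk inside a bright field, the edge points land just outside the disk boundary and their mean recovers the disk center.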
Alternatively, the player's pupil center may be obtained from the positions of the pupil edge points by another method, as follows:
the multiple pupil edge points are fitted to form an approximately elliptic curve, and the center of this ellipse-like curve is taken as the pupil center point.
Specifically, the present invention uses a mixed iterative algorithm combining least squares (LSM) and random sample consensus (RANSAC) to fit the multiple pupil edge points into an ellipse-like curve. This method both overcomes the limited accuracy of plain least squares and remedies the poor time efficiency of the RANSAC algorithm. The specific steps are as follows:
Step a: any 5 of the N feature points are chosen, and a first elliptic equation is fitted to them by least squares;
Step b: the N feature points are screened into inliers and outliers against the first elliptic equation using the random sample consensus algorithm, counting M inliers and N minus M outliers;
In this embodiment, points that fall on the ellipse-like curve are regarded as inliers; the invention is, of course, not limited to this.
Step c: it is judged whether the inlier occupancy M/N is less than a first threshold t1. If so, the 5 points are deemed atypical feature points and the fitted ellipse an atypical ellipse, and step a is executed again; if not, the 5 points are deemed typical feature points, and step d is executed;
Step d: any 5 points are chosen from the M inliers, the first elliptic equation is optimized by least squares to form a second elliptic equation, and the N feature points are screened into inliers and outliers against the second elliptic equation using random sample consensus, finally counting M1 inliers and N minus M1 outliers;
Step e: it is judged whether the inlier occupancy M1/N exceeds a second threshold t2. If so, the iteration ends and the second elliptic equation is taken as the optimal equation; if not, step d is executed again.
Alternatively, an embodiment of the present invention provides another mixed iterative algorithm combining least squares (LSM) and random sample consensus (RANSAC) for fitting the multiple pupil edge points into an ellipse-like curve. The specific steps are as follows:
Step S1: any 5 of the N feature points are chosen and an ellipse is fitted by LSM; the remaining N minus 5 points are substituted into the elliptic equation, which RANSAC uses to screen inliers from outliers, counting M inliers and N minus M outliers among the N points;
Step S2: if the inlier occupancy M/N is less than the threshold t1, the 5 points chosen in step S1 are considered atypical feature points and the fitted ellipse an atypical ellipse, and the algorithm jumps back to step S1; otherwise, the 5 points chosen in step S1 are considered typical feature points, and the algorithm proceeds to step S3;
Step S3: all M inlier feature points screened in step S1 are substituted into the LSM equations to obtain an optimized elliptic equation; the RANSAC algorithm is then rerun, substituting the remaining feature points into the new elliptic equation to screen inliers again, yielding M1 inliers and N minus M1 outliers;
Step S4: the inlier occupancy M1/N is computed. If it exceeds the threshold t2, the iteration ends and the current elliptic equation is considered the optimal equation; otherwise the inlier ratio does not meet the threshold, and the algorithm jumps back to step S3 for another iteration.
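The mixed LSM/RANSAC iteration (steps a to e, or S1 to S4) can be illustrated with the following sketch. To keep it short it fits a circle rather than a general ellipse, and it refits on all inliers rather than a 5-point subsample; those simplifications, and the tolerance and threshold values, are assumptions made purely for illustration:

```python
import numpy as np

def fit_circle(pts):
    """Algebraic least-squares fit of x^2 + y^2 + D*x + E*y + F = 0."""
    pts = np.asarray(pts, dtype=np.float64)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    b = -(pts[:, 0] ** 2 + pts[:, 1] ** 2)
    D, E, F = np.linalg.lstsq(A, b, rcond=None)[0]
    cx, cy = -D / 2.0, -E / 2.0
    return cx, cy, np.sqrt(cx ** 2 + cy ** 2 - F)

def lsm_ransac_fit(points, tol=0.5, t1=0.3, t2=0.8, max_iter=50, seed=0):
    """Sample 5 points, least-squares fit, screen inliers (occupancy vs
    t1), refit on the inliers, stop once occupancy exceeds t2."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=np.float64)
    n = len(pts)
    for _ in range(max_iter):
        sample = pts[rng.choice(n, 5, replace=False)]
        cx, cy, r = fit_circle(sample)
        d = np.abs(np.hypot(pts[:, 0] - cx, pts[:, 1] - cy) - r)
        if (d < tol).sum() / n < t1:       # atypical sample: resample
            continue
        cx, cy, r = fit_circle(pts[d < tol])   # optimise on the inliers
        d = np.abs(np.hypot(pts[:, 0] - cx, pts[:, 1] - cy) - r)
        if (d < tol).sum() / n >= t2:      # occupancy above t2: done
            return cx, cy, r
    return cx, cy, r
```

With a handful of gross outliers in the point set, the occupancy checks reject samples that include them, which is the time-versus-accuracy trade the combined scheme aims at.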
Further, on the basis of the above embodiment, the matching relationship between the player's pupil center and the player's observation point may be determined as follows:
a matching model is established between the player's pupil center and the player's observation point; the input of the matching model is the player's pupil center, and its output is the observation point obtained from that pupil center.
Specifically, the matching model between the player's pupil center and the player's observation point is established in the following steps:
the player wears the VR glasses and observes a VR scene that includes a screen, on which K observation points with known coordinates appear one after another;
the player observes the K observation points in turn, and the pupil center is recorded each time the player observes one of them;
the matching model is obtained from the recorded pupil centers and the coordinates of the observation points.
Specifically, the matching model (X, Y) = F(x, y) relating the coordinates (X, Y) of the K observation points to the pupil-center coordinates (x, y) may be obtained with the following method:
a form of the matching model is chosen (its explicit formula appears as an image in the original);
the parameters a, b, c, d, e, f, g, h, k, l, m and n of the matching model are determined from the coordinates (X, Y) of the K points and the corresponding pupil-center coordinates (x, y), which completes the establishment of the matching model.
Specifically, the coordinates of the K points appearing in turn on the screen are recorded as X = (X1, X2, X3, ..., Xk) and Y = (Y1, Y2, Y3, ..., Yk), and the corresponding pupil-center coordinates as x = (x1, x2, x3, ..., xk) and y = (y1, y2, y3, ..., yk). The establishment of the model can then be expressed in matrix form, where F(x, y) denotes the relation between x and y and M denotes the mapping parameters between the pupil-center positions and the screen coordinates.
The model established by the present invention, together with its matrix form, is given as images in the original. In this model, when K = 6 is taken, X and Y each comprise 6 screen coordinates and the pupil correspondingly has 6 center coordinates: X = (X1, X2, X3, X4, X5, X6) and Y = (Y1, Y2, Y3, Y4, Y5, Y6); likewise, x = (x1, x2, x3, x4, x5, x6) and y = (y1, y2, y3, y4, y5, y6). The matrix equation can then be rewritten as a system of equations in the twelve parameters.
Solving this system of equations yields a, b, c, d, e, f, g, h, k, l, m and n, and thus the matching model.
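Since the explicit model appears as images in the original, the sketch below assumes a common second-order polynomial gaze-mapping form consistent with the twelve parameters a through n and K = 6 calibration points: X = a + b*x + c*y + d*x*y + e*x^2 + f*y^2 and Y = g + h*x + k*y + l*x*y + m*x^2 + n*y^2. This form is an assumption, not the patent's verified formula:

```python
import numpy as np

def calibrate(pupil_xy, screen_XY):
    """Solve for the 12 mapping parameters from K >= 6 calibration pairs
    by linear least squares over the quadratic feature vector."""
    x, y = np.asarray(pupil_xy, dtype=np.float64).T
    A = np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])
    X, Y = np.asarray(screen_XY, dtype=np.float64).T
    px = np.linalg.lstsq(A, X, rcond=None)[0]   # a, b, c, d, e, f
    py = np.linalg.lstsq(A, Y, rcond=None)[0]   # g, h, k, l, m, n
    return px, py

def gaze_point(pupil, px, py):
    """Matching-model output: screen point (X, Y) for a pupil center."""
    x, y = pupil
    feats = np.array([1.0, x, y, x * y, x ** 2, y ** 2])
    return float(feats @ px), float(feats @ py)
```

With more than 6 calibration points the least-squares solution also averages out measurement noise, which is one reason a solver is preferable to inverting a 6 by 6 system directly.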
Further, on the basis of the above embodiment, measuring the speed with which the player observes the N target points may be done as follows:
from the player's pupil center and the matching relationship, the position of the player's current observation point during observation is determined;
if the difference between the position of the current observation point and the position of a target point that appeared according to the preset rule is less than a threshold, it is determined that the player has successfully observed that target point;
the interval between the moment a target point appears and the moment the player successfully observes it is recorded, completing the measurement of the reaction speed.
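The gaze-versus-target check and the reaction-time statistic can be sketched as follows. The threshold value, the sampled gaze trace, and the helper names are illustrative assumptions:

```python
import math

def observed(gaze, target, threshold=50.0):
    """A target counts as successfully observed when the distance between
    the current observation point and the target is below the threshold
    (the threshold value here is an assumption)."""
    return math.hypot(gaze[0] - target[0], gaze[1] - target[1]) < threshold

def reaction_time(appear_t, gaze_trace, target, threshold=50.0):
    """gaze_trace: iterable of (t, (X, Y)) observation-point samples.
    Returns the interval between the target's appearance and the first
    sample that hits it, or None if the target is never observed."""
    for t, gaze in gaze_trace:
        if t >= appear_t and observed(gaze, target, threshold):
            return t - appear_t
    return None
```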
The auxiliary selection method for esports players provided by the embodiments of the present invention can accurately and quantitatively measure a player's reaction speed to newly appearing target points, avoiding the subjective evaluation that otherwise arises when esports players are selected.
In summary, specific examples have been used herein to explain the principles and embodiments of the present invention; the above description of the embodiments is intended only to help in understanding the method of the invention and its core idea. At the same time, those of ordinary skill in the art may, following the idea of the invention, make changes to the specific embodiments and the scope of application. Accordingly, the contents of this description should not be construed as limiting the invention, whose scope of protection shall be defined by the appended claims.

Claims (10)

1. An auxiliary selection method for esports players, characterized by comprising:
after a player puts on VR glasses, having N target points appear in a VR scene according to a preset rule to guide the player to observe them, where N is an integer greater than or equal to 1;
measuring the speed with which the player observes the N target points, to complete the auxiliary selection of the player.
2. The method of claim 1, characterized in that having the N target points appear in the VR scene according to the preset rule comprises:
the N target points appearing simultaneously in the VR scene;
correspondingly, measuring the speed with which the player observes the N target points comprises:
counting the total time the player takes to observe all N target points in turn.
3. The method of claim 1, characterized in that having the N target points appear in the VR scene according to the preset rule comprises: the N target points appearing one after another in the VR scene;
correspondingly, measuring the speed with which the player observes the N target points comprises:
separately counting the time the player needs to observe each target point.
4. The method of claim 3, characterized in that counting the time the player needs to observe each target point comprises:
after the player puts on the VR glasses, recording the moment each target point appears in the VR scene as a first moment, and recording the moment the player observes that target point as a second moment;
counting, for each target point, the time interval between its first moment and second moment.
5. The method of claim 1, characterized in that, before the N target points appear in the VR scene according to the preset rule, the method further comprises:
determining the pupil center of the player;
determining the matching relationship between the player's pupil center and the player's observation point.
6. The method of claim 5, characterized in that determining the pupil center of the player comprises:
the player wearing VR glasses that include an infrared camera;
the infrared camera photographing the player's eye to form an eye image;
processing the eye image to obtain the pupil center of the player.
7. The method of claim 6, characterized in that processing the eye image comprises:
converting the eye image to grayscale to obtain a gray-level map;
determining the point with the minimum gray value in the gray-level map;
determining the pupil center of the player from the minimum-gray point.
8. The method of claim 5, characterized in that determining the matching relationship between the player's pupil center and the player's observation point comprises:
establishing a matching model between the player's pupil center and the player's observation point, the input of the matching model being the player's pupil center and its output being the observation point obtained from that pupil center.
9. The method of claim 8, characterized in that establishing the matching model between the player's pupil center and the player's observation point comprises:
the player wearing the VR glasses and observing a VR scene that includes a screen, on which K observation points with known coordinates appear one after another;
the player observing the K observation points in turn, the pupil center being recorded each time the player observes one of them;
obtaining the matching model from the recorded pupil centers and the coordinates of the observation points.
10. The method of claim 5, characterized in that measuring the speed with which the player observes the N target points comprises:
determining, from the player's pupil center and the matching relationship, the position of the player's current observation point during observation;
if the difference between the position of the current observation point and the position of a target point that appeared according to the preset rule does not exceed a threshold, determining that the player has successfully observed that target point;
recording the interval between the moment the target point appears and the moment the player successfully observes it, to complete the measurement of the reaction speed.
CN201611098201.5A 2016-12-03 2016-12-03 Auxiliary selection method for esports players Active CN106669139B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611098201.5A CN106669139B (en) 2016-12-03 2016-12-03 Auxiliary selection method for esports players

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611098201.5A CN106669139B (en) 2016-12-03 2016-12-03 Auxiliary selection method for esports players

Publications (2)

Publication Number Publication Date
CN106669139A (en) 2017-05-17
CN106669139B CN106669139B (en) 2019-12-27

Family

ID=58867509

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611098201.5A Active CN106669139B (en) 2016-12-03 2016-12-03 Auxiliary selection method for esports players

Country Status (1)

Country Link
CN (1) CN106669139B (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1584917A (en) * 2004-06-11 2005-02-23 清华大学 Living body iris patterns collecting method and collector
CN101779960A (en) * 2010-02-24 2010-07-21 沃建中 Test system and method of stimulus information cognition ability value
CN102076387A (en) * 2008-04-29 2011-05-25 百利游戏有限公司 Biofeedback for gaming device, such as electronic gaming machine (egm)
US20120244935A1 (en) * 2011-03-24 2012-09-27 International Business Machines Corporation Synchronizing game character display with game player viewing detection
CN103748599A (en) * 2011-05-20 2014-04-23 爱福露恩斯公司 Systems and methods for measuring reactions of head, eyes, eyelids and pupils
CN104983435A (en) * 2015-08-07 2015-10-21 北京环度智慧智能技术研究所有限公司 Stimulus information establishing method for interest orientation value test

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108205395A (en) * 2018-01-16 2018-06-26 安徽慧视金瞳科技有限公司 A kind of calibration point centre coordinate precise positioning method
CN108205395B (en) * 2018-01-16 2021-03-23 安徽慧视金瞳科技有限公司 Method for accurately positioning center coordinates of calibration points

Also Published As

Publication number Publication date
CN106669139B (en) 2019-12-27

Similar Documents

Publication Publication Date Title
CN109670380B (en) Motion recognition and posture estimation method and device
CN102595185B (en) Stereo image quality objective evaluation method
CN109858466A (en) A kind of face critical point detection method and device based on convolutional neural networks
CN107423678A (en) A kind of training method and face identification method of the convolutional neural networks for extracting feature
CN109191428A (en) Full-reference image quality evaluating method based on masking textural characteristics
CN111415304A (en) Underwater vision enhancement method and device based on cascade deep network
CN108711407A (en) Display effect adjusting method, regulating device, display equipment and storage medium
CN107909588A (en) Partition system under MRI cortex based on three-dimensional full convolutional neural networks
CN106780446A (en) It is a kind of to mix distorted image quality evaluating method without reference
CN108038839A (en) Twisted-pair feeder lay real-time detection method on a kind of flow production line
CN108460720A (en) A method of changing image style based on confrontation network model is generated
CN106886992A (en) A kind of quality evaluating method of many exposure fused images of the colour based on saturation degree
CN102567734A (en) Specific value based retina thin blood vessel segmentation method
CN110135508A (en) Model training method, device, electronic equipment and computer readable storage medium
CN106669139A (en) Auxiliary selecting method for electronic-sport-game players
CN109558797B (en) Method for distinguishing human body balance disorder based on gravity center area model under visual stimulation
CN107451975A (en) A kind of view-based access control model weights similar picture quality clarification method
DE112017008230T5 (en) METHOD AND APPARATUS FOR MAPPING A VIRTUAL ENVIRONMENT TO A PHYSICAL ENVIRONMENT
CN103065298A (en) Vision significance detection method imitating retina filtering
CN109905694A (en) A kind of quality evaluating method of three-dimensional video-frequency, device and equipment
Langlois et al. Uncovering visual priors in spatial memory using serial reproduction.
CN114241426B (en) Outdoor intelligent gymnasium management system
JP7095849B1 (en) Eyewear virtual fitting system, eyewear selection system, eyewear fitting system and eyewear classification system
CN102769772B (en) Method and device for evaluating video sequence distortion
CN110944165B (en) Stereoscopic image visual comfort level improving method combining perceived depth quality

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20191031

Address after: 318000, No. 618, west section of Taizhou Avenue, Jiaojiang District, Zhejiang

Applicant after: RESEARCH INSTITUTE OF ZHEJIANG University TAIZHOU

Address before: 710071 Shaanxi city of Xi'an province high tech Zone Road No. 86 leading Times Square (B) second building 1 unit 22 floor room 12202 No. 51

Applicant before: XI'AN CREATION KEJI Co.,Ltd.

GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20201118

Address after: Room 201, No.208 Xinggang West Road, Damaiyu street, Taizhou City, Zhejiang Province

Patentee after: Qihuan Technology Co.,Ltd.

Address before: 318000, No. 618, west section of Taizhou Avenue, Jiaojiang District, Zhejiang

Patentee before: RESEARCH INSTITUTE OF ZHEJIANG University TAIZHOU

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20220909

Address after: Room 2704, Building 2, Wanda Huafu, No. 888, Maocheng Middle Road, Yinzhou District, Ningbo City, Zhejiang Province, 315100

Patentee after: Zhou Chun

Address before: 317604 Room 201, 208 Xinggang West Road, Damaiyu street, Taizhou City, Zhejiang Province

Patentee before: Qihuan Technology Co.,Ltd.