CN106708257A - Game interaction method and device - Google Patents
Game interaction method and device
- Publication number
- CN106708257A CN106708257A CN201611041552.2A CN201611041552A CN106708257A CN 106708257 A CN106708257 A CN 106708257A CN 201611041552 A CN201611041552 A CN 201611041552A CN 106708257 A CN106708257 A CN 106708257A
- Authority
- CN
- China
- Prior art keywords
- expression
- user
- default
- parameter
- game
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
- G06V40/176—Dynamic expression
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Multimedia (AREA)
- Processing Or Creating Images (AREA)
Abstract
The invention provides a game interaction method and device. The method includes: obtaining a preset expression parameter of a user under a preset expression; obtaining a real-time expression parameter of the user; and judging the preset expression degree of the user, where the preset expression degree characterizes how closely the real-time expression matches the preset expression; the game display interface is then adjusted according to the preset expression degree. The method and device can adjust the game display interface according to the user's natural expression, improving the user experience.
Description
Technical field
This disclosure relates to the field of human-computer interaction, and in particular to a game interaction method and apparatus.
Background technology
In existing motion-sensing or virtual-reality games, the user's interaction intent is generally judged by capturing the user's limb movements. However, when a user wants to adjust the visual focus of the display interface during play — for example, to magnify a display object so as to see it more clearly, or to aim at a display object — this usually has to be done by manually operating hardware. Because manual operation requires interacting with hardware, it introduces interruptions and delays that harm the fluency of the gaming experience.
Accordingly, a display-interface adjustment method that gives the user a smooth gaming experience is needed.
It should be noted that the information disclosed in the Background section above is only intended to enhance understanding of the background of this disclosure, and therefore may include information that does not constitute prior art known to a person of ordinary skill in the art.
The content of the invention
The purpose of the disclosure is to provide a game interaction method and apparatus that overcome, at least to some extent, one or more problems caused by the limitations and defects of the related art.
According to a first aspect of the embodiments of the present disclosure, a game interaction method is provided, including:
obtaining a preset expression parameter of a user under a preset expression;
obtaining a real-time expression parameter of the user;
judging the preset expression degree of the user, the preset expression degree characterizing the closeness between the real-time expression and the preset expression;
adjusting a game display interface according to the preset expression degree.
In an exemplary embodiment of the disclosure, obtaining the preset expression parameter of the user includes:
obtaining first coordinates of facial feature points of the user in a normal state;
obtaining second coordinates of the facial feature points of the user under the preset expression;
obtaining the preset expression parameter from the first coordinates and the second coordinates.
In an exemplary embodiment of the disclosure, obtaining the real-time expression parameter of the user includes:
tracking the facial feature points to obtain third coordinates of the facial feature points;
obtaining the real-time expression parameter from the first coordinates and the third coordinates.
In an exemplary embodiment of the disclosure, the preset facial expression includes a squinting expression and an aiming expression.
In an exemplary embodiment of the disclosure, obtaining the preset expression parameter of the user includes obtaining the sight-line focal position of the user in the normal state.
In an exemplary embodiment of the disclosure, the preset expression degree of the user is judged by the ratio of the real-time expression parameter to the preset expression parameter.
In an exemplary embodiment of the disclosure, when the ratio is greater than or equal to 1, a preset interface is displayed.
In an exemplary embodiment of the disclosure, when the ratio is less than 1, the focal length of the game display interface is adjusted according to the ratio.
According to an aspect of this disclosure, a game interaction device is provided, including:
a parameter setting module for obtaining a preset expression parameter of a user under a preset expression;
an expression monitoring module for obtaining a real-time expression parameter of the user;
an expression judging module for judging a preset expression degree of the user;
an interface adjusting module for adjusting a game display interface according to the preset expression degree.
In an exemplary embodiment of the disclosure, the preset facial expression includes a squinting expression and an aiming expression.
The game interaction method of the disclosure adjusts the game display interface by judging the user's facial expression, adapting to the user's intent without requiring active control by the user. Compared with traditional game interaction modes, it has the advantages of fast interaction speed and natural interaction, greatly improving the user's gaming experience.
It should be appreciated that the general description above and the detailed description below are exemplary and explanatory only, and do not limit the disclosure.
Brief description of the drawings
The accompanying drawings are incorporated into and constitute a part of this specification; they show embodiments consistent with the disclosure and, together with the specification, serve to explain its principles. Evidently, the drawings described below are only some embodiments of the disclosure, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 schematically shows a flow chart of a game interaction method in an exemplary embodiment of the disclosure.
Fig. 2 schematically shows facial feature points in an exemplary embodiment of the disclosure.
Fig. 3 schematically shows facial feature parameters in an exemplary embodiment of the disclosure.
Fig. 4A schematically shows a game display interface at the initial display focal length in an exemplary embodiment of the disclosure.
Fig. 4B schematically shows a game display interface at the real-time display focal length in an exemplary embodiment of the disclosure.
Fig. 5A schematically shows a normal display interface in an exemplary embodiment of the disclosure.
Fig. 5B schematically shows an aiming display interface in an exemplary embodiment of the disclosure.
Fig. 6 schematically shows a block diagram of a game interaction device in an exemplary embodiment of the disclosure.
Specific embodiment
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments can, however, be implemented in various forms and should not be understood as limited to the examples set forth herein; rather, these embodiments are provided so that the disclosure will be more thorough and complete and will fully convey the concepts of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in one or more embodiments in any suitable manner. In the following description, many specific details are provided to give a full understanding of the embodiments of this disclosure. Those skilled in the art will appreciate, however, that the technical solutions of the disclosure may be practiced while omitting one or more of the specific details, or by using other methods, components, devices, steps, and so on. In other cases, well-known solutions are not shown or described in detail to avoid obscuring aspects of the disclosure.
In addition, the drawings are only schematic illustrations of the disclosure; identical reference numerals in the figures denote the same or similar parts, and their repeated description is omitted. Some of the block diagrams shown in the drawings are functional entities that do not necessarily correspond to physically or logically independent entities. These functional entities may be implemented in software, in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Example embodiments of the disclosure are described in detail below with reference to the accompanying drawings.
Fig. 1 is a flow chart of a game interaction method 100 in an exemplary embodiment of the disclosure.
With reference to Fig. 1, the game interaction method 100 may include:
Step S102: obtaining a preset expression parameter of a user under a preset expression;
Step S104: obtaining a real-time expression parameter of the user;
Step S106: judging the preset expression degree of the user, the preset expression degree characterizing the closeness between the real-time expression and the preset expression;
Step S108: adjusting a game display interface according to the preset expression degree.
The game interaction method in this exemplary embodiment adjusts the game display interface by judging the user's facial expression, adapting to the user's intent without requiring active control by the user. Compared with traditional game interaction modes, it has the advantages of fast interaction speed and natural interaction, greatly improving the user's gaming experience.
Each step of the game interaction method in this exemplary embodiment of the disclosure is further described below.
In step S102, a preset expression parameter of the user under a preset expression is obtained.
The game interaction method provided by the disclosure is applicable to devices with a motion-capture function, such as motion-sensing game devices and virtual-reality devices.
Motion capture is a technology for accurately measuring the three-dimensional motion of a moving object. Based on computer-graphics principles, several video capture devices arranged in space (or trackers attached to the moving object) record the motion of the object in the form of images; a computer then processes the image data to obtain the spatial coordinates of the different objects (trackers) at different measurement times.
Further, in an exemplary embodiment of the disclosure, step S102 — obtaining the preset expression parameter of the user under the preset expression — may include:
Step S1022: obtaining first coordinates of the facial feature points of the user in a normal state.
First, a model of the user's face and its feature points are obtained by reverse modeling. An optical sensor of the device can capture the spectrum of light reflected from the user's facial muscles in the normal state. This may be the spectrum of natural light reflected from the facial muscles, or the spectrum of light projected onto the facial muscles by the device and reflected back. By analyzing this spectrum, the user's face can be modeled by reverse modeling — for example, by surface reconstruction.
Surface reconstruction is a reverse-modeling method. During surface reconstruction, the discrete data points are first parameterized; an initial surface is then formed from the boundary of the data-point block, and the relative position of each data point is expressed on that initial surface, so that by obtaining the parameters (u, v) of each data point, every data point becomes a real node of the surface patch. After the nodes are arranged in order — for example, by sorting the node order, node number, and node parameter value of each node — a patch with an ordered set of nodes is obtained.
In this exemplary embodiment, the feature points of the user's face can be used as the data points for surface reconstruction. The international standards that MPEG-4 defines for 3D facial animation [ISO/IEC 1988a] and [ISO/IEC 1988b] can be introduced here; both standards include a definition of the FAP (Facial Animation Parameter). The FAP is a general-purpose parameter that can be used to represent facial motion data; its value is usually expressed in FAPU (Facial Animation Parameter Units), so that the same FAP produces the same muscle change when applied to different models.
In this exemplary embodiment, the preset facial expression may include a squinting expression and an aiming expression. This step may therefore include capturing the motion of the muscles around the user's eyes in order to obtain the user's expression. Periocular muscles such as the frontalis, temporalis, orbicularis oculi, corrugator supercilii, and nasalis can be taken as the muscles that best reflect the user's expressive features; feature points are marked at the positions of these muscles, and a locally refined model of the region around the user's eyes is then obtained by a reverse-modeling method such as surface reconstruction. It is worth mentioning that the reverse-modeling method is not limited to this; those skilled in the art can select a reverse-modeling method according to the actual situation.
After the face model of the user is obtained, feature points can be selected and their coordinates recorded. The coordinates of the facial feature points at this time are recorded as the first coordinates, representing the user's facial expression features in the normal state.
Step S1024: obtaining second coordinates of the facial feature points of the user under the preset expression.
The user can be prompted to make the preset expression. Methods of prompting the user include, but are not limited to, a visual prompt on the interface or an audio prompt from the device. After the prompt is issued, the coordinates of the user's facial feature points can be captured in real time through image processing, analysis, and pattern recognition to judge the user's facial muscle movement, and the user's expression is recorded once the facial muscles hold steady.
The coordinates of the user's facial feature points at this time can be recorded as the second coordinates, representing the user's facial expression features under the preset expression.
Step S1026: obtaining the preset expression parameter from the first coordinates and the second coordinates.
The distances between the selected feature points can be computed from the first coordinates to obtain a first facial expression parameter K0, and from the second coordinates to obtain a second facial expression parameter K1. The ratio K1/K0 is then calculated and recorded as the preset expression parameter.
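For illustration, the calibration in steps S1022–S1026 could be sketched as follows. The specific coordinates and the pairwise-distance definition of the parameter are assumptions for the sketch, not the exact geometry used by the method:

```python
import math

def expression_parameter(points):
    """Sum of pairwise distances between the selected feature points —
    a stand-in for the distance-based facial expression parameter."""
    total = 0.0
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            total += math.dist(points[i], points[j])
    return total

# Hypothetical calibration data: first coordinates (neutral state)
# and second coordinates (preset squinting expression).
neutral = [(0.0, 0.0), (3.0, 0.0), (0.0, 4.0)]
squint = [(0.0, 0.0), (2.4, 0.0), (0.0, 3.2)]

K0 = expression_parameter(neutral)   # first facial expression parameter
K1 = expression_parameter(squint)    # second facial expression parameter
preset_param = K1 / K0               # recorded as the preset expression parameter
```

With these example coordinates the squint shrinks every inter-point distance, so the recorded ratio K1/K0 comes out below 1.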
When the preset facial expression includes a squinting expression and an aiming expression, obtaining the preset expression parameter of the user may include, in addition to the coordinates of the user's periocular feature points, obtaining the sight-line focal position of the user in the normal state. For example, the user's sight-line focal position can be determined by tracking the user's pupil position; the position is set in advance according to the coordinates of the display interface, and when the user's sight-line focus is detected to stay at that position for longer than a preset time threshold, a corresponding game operation is triggered (e.g., in a shooting game, aiming, focusing, and shooting are performed).
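The dwell-time trigger just described might look like this in outline; the sample format, radius, and thresholds are illustrative assumptions:

```python
def gaze_dwell_trigger(samples, target, radius, min_dwell):
    """samples: (timestamp_seconds, (x, y)) gaze-focus readings in order.
    Returns True once the gaze has stayed within `radius` of `target`
    continuously for at least `min_dwell` seconds."""
    dwell_start = None
    for t, (x, y) in samples:
        tx, ty = target
        if (x - tx) ** 2 + (y - ty) ** 2 <= radius ** 2:
            if dwell_start is None:
                dwell_start = t   # gaze arrived: start the dwell clock
            if t - dwell_start >= min_dwell:
                return True       # e.g. trigger aim-and-shoot in a shooting game
        else:
            dwell_start = None    # gaze left the target: restart the clock
    return False
```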
In step S104, the real-time expression parameter of the user is obtained.
The facial feature points described above are tracked to obtain third coordinates of the facial feature points, and the real-time expression parameter is obtained from the first coordinates and the third coordinates. The distances between the selected feature points are computed from the third coordinates to obtain a third facial expression parameter K2; the real-time expression parameter can then be expressed, for example, as K2/K0.
In step S106, the preset expression degree of the user is judged; the preset expression degree characterizes the closeness between the real-time expression and the preset expression.
The preset expression degree of the user is judged by the ratio of the real-time expression parameter to the preset expression parameter. For example, the closeness between the real-time expression and the preset expression can be obtained as K2/K1.
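A minimal sketch of the two ratios used in steps S104 and S106 (function names are illustrative):

```python
def realtime_parameter(K2, K0):
    """Real-time expression parameter, expressed relative to the neutral state."""
    return K2 / K0

def preset_expression_degree(K2, K1):
    """Preset expression degree: closeness of the real-time expression to the
    preset one. Per the method, a value >= 1 is read as the preset
    expression having been made (or over-performed)."""
    return K2 / K1
```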
In step S108, the game display interface is adjusted according to the preset expression degree.
After the above judgment of the user's expression, the game display interface can be adjusted according to the closeness between the real-time expression and the preset expression. For example, when the ratio is greater than or equal to 1, a preset interface is displayed. A ratio greater than 1 indicates that the user's real-time expression has exceeded the facial-muscle movement range of the preset expression — the preset expression has been over-performed; a ratio equal to 1 indicates that the real-time expression is sufficiently close to the preset expression for the user to be considered to have made it. Therefore, the preset interface can be displayed when the user makes, or over-performs, the preset expression.
When the preset expression includes a squinting expression and an aiming expression, the interface displayed at this point can be the visual-focus interface at the maximum allowed magnification, or an aiming interface (which may, for example, include a crosshair or an aiming window). In other embodiments of the disclosure, when the preset expression includes other expressions, the preset interface can also be another display interface; those skilled in the art can implement this according to the actual situation.
Meanwhile, when the preset expression includes a squinting expression and the ratio is less than 1, the focal length of the game display interface can be adjusted according to the ratio. Here, a ratio less than 1 is taken to mean that the user's facial muscle movement has not reached the movement range of the preset expression; for example, when the preset expression is a squint, the user's degree of squinting may not have reached that of the preset squint. In this case, the focal length of the game display interface can be adjusted according to the user's degree of squinting: the more the user squints — the closer to the preset squint — the longer the focal length of the game display interface, so that the user can see a relatively farther scene; the less the user squints — the closer to the normal-state expression — the shorter the focal length, approaching the game display interface seen under the user's normal expression.
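One way to realize this focal-length adjustment is an interpolation consistent with the behavior described — the neutral expression shows the initial focal length, and a fully performed squint shows the maximum. The exact mapping is not fixed by the text, so this is an illustrative choice:

```python
def squint_focal_length(K2, K0, K1, J0, J1):
    """Map the real-time parameter K2, between the neutral value K0 and the
    preset-squint value K1, onto a focal length between J0 (initial) and
    J1 (maximum). Values outside the calibrated range are clamped."""
    if K1 == K0:
        return J0
    t = (K2 - K0) / (K1 - K0)   # 0 at the neutral expression, 1 at the preset squint
    t = max(0.0, min(1.0, t))   # degree >= 1 keeps showing the maximum J1
    return J0 + t * (J1 - J0)
```

The clamp reproduces the behavior described for Figs. 4A–4B below: over-squinting beyond the preset expression still shows the maximum focal length J1.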
Because people subconsciously squint to see distant or unclear objects more clearly, the above method can judge whether the user has made a squinting expression — and thus whether the user needs to view a farther target — by detecting changes in the muscles around the user's eyes. By establishing a proportional relation between the user's degree of squinting and the focal length of the game display interface, the user can control the game display interface through subconscious facial muscle actions, and thereby see a display that better matches their needs. Similarly, when the user makes an aiming expression or another preset expression, the above method can make the game display interface show a game aiming interface or another display interface that meets the user's needs; the disclosure places no particular limitation on this.
Embodiments of the present disclosure are described in detail below through a specific embodiment.
Fig. 2 is a schematic diagram of the facial feature points specified by the MPEG-4 standard.
With reference to Fig. 2, the MPEG-4 FAP set includes 84 facial feature points, which can be named by position. In this embodiment, the iris diameter, the distance between the eye centers, and the eye-nose distance can be taken as the feature values that best characterize the user's eye expression, and feature points that express these values are selected — for example, feature points 3.10, 3.14, 5.4, 4.4, 4.6, and 3.12.
Fig. 3 is a schematic diagram of the facial feature parameters in an exemplary embodiment of the disclosure.
With reference to Fig. 3, a facial expression parameter K can be set from the lengths between several groups of feature points (taking the right eye as an example): L1 (3.10 & 3.14), L2 (5.4 & 4.4), and L3 (4.6 & 3.12). For example, the facial expression parameter K is defined as K = a1*L1 + a2*L2 + a3*L3, where a1, a2, and a3 are constants used to weight the three lengths.
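With hypothetical weights and coordinates, the parameter K can be computed from the three lengths like this (the weight values and point positions are assumptions; the embodiment only states that a1–a3 are constants):

```python
import math

def length(p, q):
    """Euclidean distance between two feature points."""
    return math.dist(p, q)

def facial_parameter(L1, L2, L3, a1=0.5, a2=0.3, a3=0.2):
    """K = a1*L1 + a2*L2 + a3*L3, a weighted sum of the three lengths."""
    return a1 * L1 + a2 * L2 + a3 * L3

# Example right-eye lengths from hypothetical feature-point coordinates:
L1 = length((0.0, 0.0), (0.0, 1.0))   # between points 3.10 & 3.14
L2 = length((0.2, 0.5), (0.8, 0.5))   # between points 5.4 & 4.4
L3 = length((0.5, 0.9), (0.5, 0.1))   # between points 4.6 & 3.12
K = facial_parameter(L1, L2, L3)
```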
Accordingly, the user's face model can first be obtained from the light reflected by the user's face using a reverse-modeling method such as surface reconstruction, and the feature points described above are located on the face model. The user is then prompted to make a normal expression; the coordinates of the feature points are recorded as the first coordinates, and the lengths between the feature points are calculated from the first coordinates to obtain the first facial expression parameter K0.
Next, the user can be prompted to make the preset expression — for example, a squint. The second coordinates of the feature points are obtained, and the lengths between the feature points are calculated from the second coordinates to obtain the second facial expression parameter K1. Obtaining the second coordinates can include judging whether the expression presented by the user's facial muscles is stable — for example, whether each feature-point coordinate has remained stable for longer than a preset time — to confirm that the user has made the intended preset facial expression (for example, a squint) as prompted.
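The stability check — each feature point holding still for longer than a preset time — could be sketched as follows; the snapshot format and thresholds are assumptions for illustration:

```python
def is_expression_stable(coord_history, window, tolerance):
    """coord_history: list of (timestamp, [(x, y), ...]) feature-point
    snapshots, newest last. Returns True if every tracked point has moved
    less than `tolerance` over the trailing `window` seconds."""
    if not coord_history:
        return False
    t_end, latest = coord_history[-1]
    for t, snapshot in reversed(coord_history):
        if t_end - t > window:
            return True   # whole window covered without large motion
        for (x, y), (lx, ly) in zip(snapshot, latest):
            if (x - lx) ** 2 + (y - ly) ** 2 > tolerance ** 2:
                return False
    return False  # history does not yet span the required window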
The feature points are then tracked and located, and the lengths between them are calculated to obtain the user's real-time facial expression parameter K2, from which the user's preset expression degree — for example, the degree of squinting — is obtained.
When the preset facial expression is a squint, the initial display focal length of the visual range of the game display interface can be set as J0 and the maximum allowed focal length as J1, with (K1/K0) = a(J1/J0), where the value of the constant a is determined by (K1/K0)/(J1/J0).
From this, the correspondence J = K/a between the focal length J of the game display interface and the user's facial expression parameter K is obtained. Since the real-time facial expression parameter can be expressed as K2/K0, the real-time display focal length J2 of the game display interface is determined by calculation, and the game display interface is adjusted according to J2.
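Taking the relation J = K/a at face value, with K the normalized real-time parameter K2/K0 and J measured as a multiple of J0, a sketch of the calculation might be (this is one reading of the formula, and it assumes the parameter K grows as the expression is performed):

```python
def calibrate_a(K0, K1, J0, J1):
    """The constant a, fixed so that the preset expression maps to the
    maximum focal length: (K1/K0) = a * (J1/J0)."""
    return (K1 / K0) / (J1 / J0)

def realtime_focal_length(K2, K0, a, J0, J1):
    """J = K/a with K = K2/K0, treated as a multiple of the initial focal
    length J0 and clamped at the maximum J1 (cf. Figs. 4A-4B)."""
    return min(J1, (K2 / K0) / a * J0)
```

By construction, K2 = K1 lands exactly on the maximum focal length J1, and any larger parameter is clamped there.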
Fig. 4A shows a game display interface whose focal length is the initial display focal length J0 in an exemplary embodiment of the disclosure. Fig. 4B shows a game display interface whose focal length is the real-time display focal length J2.
With reference to Figs. 4A-4B: when the user makes a normal expression, K2 = K0 and J2 = J0, and the game display interface with focal length J0 is shown; when the preset expression degree is less than 1, J2 < J1, and the game display interface with focal length J2 is shown; when the preset expression degree is greater than or equal to 1, J2 >= J1, and the maximum focal length J1 is still shown.
Fig. 5A is a schematic diagram of a normal display interface in an exemplary embodiment of the disclosure. Fig. 5B is a schematic diagram of an aiming display interface in an exemplary embodiment of the disclosure.
With reference to Figs. 5A-5B, when the preset expression is an aiming expression, the user can be prompted to make the aiming expression after the first facial expression parameter K0 is obtained. When the user's facial muscles are judged to have been stable for at least a preset time (for example, 0.5 s), the second coordinates of the user's facial feature points are obtained, giving the second facial expression parameter K1. It should be noted that, in this embodiment, obtaining the second facial expression parameter K1 also includes obtaining the focal position F of the user's sight line on the game display interface.
When the user's real-time facial expression parameter indicates that the aiming expression has been made, an aiming interface centered on the focal position F with a preset radius can be displayed. Further, when the third facial expression parameter is judged to drop suddenly from K1 back to K0 within a preset time, the game display interface is restored to the normal interface. The aiming interface can be displayed in various forms — for example, a rectangle, a target shape, or a crosshair mark — and the disclosure places no limitation on this.
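The aiming behavior could be gathered into a small controller sketch. The class shape, the dismissal tolerance, and the assumption that the parameter rises with the aiming expression are all illustrative; the degree convention (K2/K1 >= 1 means the expression has been made) follows the method's stated rule:

```python
class AimController:
    """Shows an aiming interface centred on the gaze focus F while the aiming
    expression is held; restores the normal interface when the parameter
    drops suddenly from K1 back to K0 within a preset time window."""

    def __init__(self, K0, K1, drop_window=0.5):
        self.K0, self.K1 = K0, K1
        self.drop_window = drop_window   # seconds; 0.5 s in the embodiment
        self.aiming = False
        self._last_aim_time = None

    def update(self, t, K2, focus):
        """Feed one (time, real-time parameter, gaze-focus) sample; returns
        ("aim", focus) while aiming, otherwise ("normal", None)."""
        if K2 / self.K1 >= 1.0:          # degree >= 1: aiming expression made
            self.aiming = True
            self._last_aim_time = t
        elif self.aiming and abs(K2 - self.K0) <= 0.05 * self.K0:
            if t - self._last_aim_time <= self.drop_window:
                self.aiming = False      # sudden return to neutral: dismiss
        return ("aim", focus) if self.aiming else ("normal", None)
```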
Corresponding to the above method embodiment, the disclosure also provides a game interaction device for performing the above method embodiment.
Fig. 6 schematically shows a block diagram of a game interaction device in an exemplary embodiment of the disclosure. With reference to Fig. 6, the game interaction device 200 includes a parameter setting module 202, an expression monitoring module 204, an expression judging module 206, and an interface adjusting module 208.
The parameter setting module 202 is used to obtain a preset expression parameter of the user under a preset expression. In an exemplary embodiment of the disclosure, the preset facial expression can include a squinting expression and an aiming expression.
The expression monitoring module 204 is used to obtain the real-time expression parameter of the user.
The expression judging module 206 is used to judge the preset expression degree of the user; the preset expression degree characterizes the closeness between the real-time expression and the preset expression.
The interface adjusting module 208 is used to adjust the game display interface according to the preset expression degree.
Since the functions of the device 200 have been described in detail in the corresponding method embodiment, they are not repeated here.
The game interaction method and apparatus of the disclosure reverse-model the user's face, track and locate the user's facial feature points, judge the user's facial expression, and adjust the game display interface according to the user's natural expression, adapting to the user's intent without requiring active control by the user. Compared with traditional game interaction modes, they have the advantages of fast interaction speed and natural interaction, greatly improving the user's gaming experience.
Other embodiments of the disclosure will readily occur to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any modifications, uses, or adaptations of the disclosure that follow its general principles and include common knowledge or conventional techniques in the art not disclosed herein. The specification and embodiments are to be considered exemplary only, with the true scope and spirit of the disclosure indicated by the appended claims.
Claims (10)
1. A game interaction method, characterized by comprising:
obtaining a preset expression parameter of a user under a preset expression;
obtaining a real-time expression parameter of the user;
judging a preset expression degree of the user, the preset expression degree characterizing the closeness between the real-time expression and the preset expression;
adjusting a game display interface according to the preset expression degree.
2. The game interaction method according to claim 1, wherein obtaining the preset expression parameter of the user comprises:
obtaining first coordinates of facial feature points of the user in a normal state;
obtaining second coordinates of the facial feature points of the user under the preset expression;
obtaining the preset expression parameter from the first coordinates and the second coordinates.
3. The game interaction method according to claim 2, wherein obtaining the real-time expression parameter of the user comprises:
tracking the facial feature points to obtain third coordinates of the facial feature points;
obtaining the real-time expression parameter from the first coordinates and the third coordinates.
4. The game interaction method according to claim 1, wherein the preset facial expression comprises a squinting expression and an aiming expression.
5. The game interaction method according to claim 4, wherein obtaining the preset expression parameter of the user comprises obtaining a gaze focus position of the user in the normal state.
6. The game interaction method according to claim 4, wherein the user's preset expression degree is judged by the ratio of the real-time expression parameter to the preset expression parameter.
7. The game interaction method according to claim 6, wherein when the ratio is greater than or equal to 1, a preset interface is displayed.
8. The game interaction method according to claim 6, wherein when the ratio is less than 1, a focal length of the game display interface is adjusted according to the ratio.
9. A game interaction device, characterized in that it comprises:
a parameter setting module, configured to obtain a preset expression parameter of a user under a preset expression;
an expression monitoring module, configured to obtain a real-time expression parameter of the user;
an expression judging module, configured to judge the user's preset expression degree; and
an interface adjusting module, configured to adjust a game display interface according to the preset expression degree.
10. The game interaction device according to claim 9, wherein the preset expression comprises a squinting expression and an aiming expression.
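Claim 9 structures the device as four cooperating modules. A minimal class sketch of that structure, with each module reduced to one method; all internals (the ratio-based degree, the zoom string) are assumptions carried over from claims 6 to 8, not details given in claim 9 itself:

```python
class GameInteractionDevice:
    """Sketch of the four modules of claim 9 as methods of one class."""

    def __init__(self):
        self.preset_param = None

    def set_parameter(self, preset_param):
        # Parameter setting module: store the preset expression parameter.
        self.preset_param = preset_param

    def monitor_expression(self, frame_param):
        # Expression monitoring module: pass through the real-time parameter.
        return frame_param

    def judge_expression(self, realtime_param):
        # Expression judging module: degree as the ratio of claim 6 (assumed).
        return realtime_param / self.preset_param

    def adjust_interface(self, degree):
        # Interface adjusting module: preset interface at full degree,
        # otherwise a zoom action (assumed realization of claims 7-8).
        return "preset_interface" if degree >= 1 else f"zoom:{degree}"
```

Usage would mirror the method claims: set the preset parameter once during calibration, then feed each monitored frame through `judge_expression` and `adjust_interface`.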
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611041552.2A CN106708257A (en) | 2016-11-23 | 2016-11-23 | Game interaction method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106708257A true CN106708257A (en) | 2017-05-24 |
Family
ID=58933820
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611041552.2A Pending CN106708257A (en) | 2016-11-23 | 2016-11-23 | Game interaction method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106708257A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101593465A (en) * | 2009-07-06 | 2009-12-02 | 北京派瑞根科技开发有限公司 | Electronics with expression shape change is drawn |
CN103971131A (en) * | 2014-05-13 | 2014-08-06 | 华为技术有限公司 | Preset facial expression recognition method and device |
CN104270253A (en) * | 2014-10-21 | 2015-01-07 | 中国建设银行股份有限公司 | Method, devices and system for user identity authentication |
CN105739688A (en) * | 2016-01-21 | 2016-07-06 | 北京光年无限科技有限公司 | Man-machine interaction method and device based on emotion system, and man-machine interaction system |
CN106095112A (en) * | 2016-06-24 | 2016-11-09 | 联想(北京)有限公司 | A kind of information processing method and device |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107583277A (en) * | 2017-09-06 | 2018-01-16 | 合肥庆响网络科技有限公司 | Wearable game interaction device and interactive system |
CN107665074A (en) * | 2017-10-18 | 2018-02-06 | 维沃移动通信有限公司 | A kind of color temperature adjusting method and mobile terminal |
CN109542230A (en) * | 2018-11-28 | 2019-03-29 | 北京旷视科技有限公司 | Image processing method, device, electronic equipment and storage medium |
CN111507143A (en) * | 2019-01-31 | 2020-08-07 | 北京字节跳动网络技术有限公司 | Expression image effect generation method and device and electronic equipment |
US12020469B2 (en) | 2019-01-31 | 2024-06-25 | Beijing Bytedance Network Technology Co., Ltd. | Method and device for generating image effect of facial expression, and electronic device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6977134B2 (en) | Field of view (FOV) aperture of virtual reality (VR) content on head-mounted display | |
US9943755B2 (en) | Device for identifying and tracking multiple humans over time | |
US10802580B2 (en) | Technique for controlling virtual image generation system using emotional states of user | |
JP7362806B2 (en) | Information processing device, control method for information processing device, information processing system and program | |
Ranganathan et al. | Perception-action coupling and anticipatory performance in baseball batting | |
KR101855639B1 (en) | Camera navigation for presentations | |
JP5943913B2 (en) | User tracking feedback | |
CN109086726A (en) | A kind of topography's recognition methods and system based on AR intelligent glasses | |
CN112198959A (en) | Virtual reality interaction method, device and system | |
US20130095924A1 (en) | Enhancing a sport using an augmented reality display | |
CN106708257A (en) | Game interaction method and device | |
BRPI1011193B1 (en) | method and system for providing assistance with respect to a gesture performed by the user | |
WO2013033842A1 (en) | System and method for using eye gaze information to enhance interactions | |
CN111726518A (en) | System for capturing images and camera device | |
CN113709411A (en) | Sports auxiliary training system of MR intelligent glasses based on eye movement tracking technology | |
KR20190096988A (en) | Information processing apparatus, information processing method, and program | |
CN108829233A (en) | A kind of exchange method and device | |
WO2019210087A1 (en) | Methods, systems, and computer readable media for testing visual function using virtual mobility tests | |
Li et al. | Evaluation of the fine motor skills of children with DCD using the digitalised visual‐motor tracking system | |
WO2021163334A1 (en) | Adaptive virtual rehabilitation | |
EP4083854A1 (en) | System and method of head mounted display personalisation | |
CN104318228A (en) | Method for acquiring optimal visual field through head-mounted video recording device | |
Duchowski | Serious gaze | |
Yang et al. | Bimanual natural user interaction for 3D modelling application using stereo computer vision | |
Zhang et al. | Attention guided deep imitation learning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 2017-05-24