CN106570921A - Cartoon character expression display method and system - Google Patents


Info

Publication number
CN106570921A
CN106570921A
Authority
CN
China
Prior art keywords
target object
coordinate
cartoon
facial expression
expression image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201611018952.1A
Other languages
Chinese (zh)
Inventor
李青菲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Genius Technology Co Ltd
Original Assignee
Guangdong Genius Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Genius Technology Co Ltd filed Critical Guangdong Genius Technology Co Ltd
Priority to CN201611018952.1A priority Critical patent/CN106570921A/en
Publication of CN106570921A publication Critical patent/CN106570921A/en
Pending legal-status Critical Current

Links

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00: Animation
    • G06T13/80: 2D [Two Dimensional] animation, e.g. using sprites
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention is applicable to the technical field of computers and provides a cartoon character expression display method and system. The method comprises the following steps: the dynamic coordinate of a target object is acquired, the dynamic coordinate being the coordinate of the target object while it moves; an offset angle is calculated from the dynamic coordinate of the target object and the center coordinate of the cartoon character's eyeball, the offset angle being the angle through which the eyeball of the cartoon character moves with the target object; and a first expression image matching the offset angle is displayed in the game interface. According to the invention, the dynamic coordinate of the target object is acquired in real time, the offset angle is calculated relative to the center coordinate of the cartoon character's eyeball, and an expression image is matched to the offset angle, so that the eyeball of the cartoon character rotates with the moving direction of the target object, enhancing the interactivity and fun of the game.

Description

Cartoon character expression display method and system
Technical field
The invention belongs to the technical field of computers, and more particularly relates to a cartoon character expression display method and system.
Background art
In many games running on mobile terminal devices, a cartoon character is usually provided. When the user performs different operations during the game, the eyeball of the cartoon character does not move and its expression stays fixed. For example, in a "little bear eats fruit" game, a cartoon bear is shown with various fruits in front of it as target objects. Following the fruit name shown on the interface, the user selects the corresponding fruit, clicks the target object and moves it into the bear's mouth. While the user moves the target object, however, the eyeball of the cartoon bear cannot rotate to follow the direction in which the user moves it, and the expression remains static. The stiff expression of the cartoon character makes the game visually unappealing and unrealistic, fails to create interaction with the user, and so loses interest and playability.
Summary of the invention
The object of the present invention is to provide a cartoon character expression display method and system, aiming to solve the problem in the prior art that the eyeball of a cartoon character cannot rotate to follow the direction in which the user moves a target object, which results in a poor visual effect and prevents any interaction with the user.
In one aspect, the present invention provides a cartoon character expression display method, the method comprising the following steps:
acquiring the dynamic coordinate of a target object, the dynamic coordinate being the coordinate of the target object while it moves;
calculating an offset angle from the dynamic coordinate of the target object and the center coordinate of the cartoon character's eyeball, the offset angle being the angle through which the eyeball of the cartoon character moves with the target object; and
matching a corresponding first expression image according to the offset angle, and displaying the first expression image in the game interface.
In another aspect, the present invention provides a cartoon character expression display system, the system comprising:
a dynamic coordinate acquisition unit, configured to acquire the dynamic coordinate of a target object, the dynamic coordinate being the coordinate of the target object while it moves;
an offset angle calculation unit, configured to calculate an offset angle from the dynamic coordinate of the target object and the center coordinate of the cartoon character's eyeball, the offset angle being the angle through which the eyeball of the cartoon character moves with the target object; and
a first expression image display unit, configured to match a corresponding first expression image according to the offset angle and display the first expression image in the game interface.
In the embodiments of the present invention, the dynamic coordinate of the target object is acquired in real time, the offset angle is calculated from it and the center coordinate of the cartoon character's eyeball, and a corresponding expression image is matched according to the offset angle, so that the eyeball of the cartoon character rotates with the moving direction of the target object, enhancing the interactivity and fun of the game.
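The pipeline above can be sketched in code. The following Python sketch is illustrative only and is not the patented implementation: the function names, the degree convention, and the interval-keyed database shape are assumptions made for the example.

```python
import math

def update_expression(target_xy, eyeball_center_xy, expression_db):
    """Match an expression image to the offset angle between the moving
    target object and the fixed eyeball center (hypothetical sketch)."""
    x1, y1 = target_xy            # dynamic coordinate of the target object
    x2, y2 = eyeball_center_xy    # center coordinate of the cartoon eyeball
    # Offset angle from the ordinate and abscissa differences, via arctangent.
    offset_angle = math.degrees(math.atan2(y2 - y1, x2 - x1))
    # Return the first expression image whose angle interval covers the angle.
    for (lo, hi), image in expression_db:
        if lo <= offset_angle < hi:
            return image
    return None
```

A caller would invoke `update_expression((x1, y1), (x2, y2), db)` on every touch-move event and draw the returned image into the game interface.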
Description of the drawings
Fig. 1 is a flowchart of the cartoon character expression display method provided by embodiment one of the present invention;
Fig. 2 is a flowchart of the cartoon character expression display method provided by embodiment two of the present invention;
Fig. 3 is a schematic structural diagram of the cartoon character expression display system provided by embodiment three of the present invention; and
Fig. 4 is a schematic structural diagram of the cartoon character expression display system provided by embodiment four of the present invention.
Detailed description of the embodiments
To make the objects, technical solutions and advantages of the present invention clearer, the present invention is further described below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the present invention and are not intended to limit it.
The implementation of the present invention is described in detail below in conjunction with specific embodiments:
Embodiment one:
Fig. 1 shows a flowchart of the cartoon character expression display method provided by embodiment one of the present invention. For ease of explanation, only the parts related to this embodiment are shown. The details are as follows:
In step S101, the dynamic coordinate of the target object is acquired.
In the embodiments of the present invention, a cartoon character is set up in a game, and target objects are set around the cartoon character to interact with it, together forming a game scene. One or more target objects can be set as the game requires. The user moves a target object through touch operations; as the target object follows the user's touch in the game interface, its coordinate changes continuously, forming the dynamic coordinate, i.e. the coordinate of the target object while it moves.
Further, the touch operation issued by the user is detected within the region where the target object is located;
the target object is moved according to the touch operation; and
the dynamic coordinate of the target object is acquired in real time.
Specifically, the target object can be placed in a preset region of the game interface, so the touch operation issued by the user only needs to be detected in real time within the region where the target object is located. When the user clicks within that region and a sliding touch operation is detected on the screen, the target object is moved to follow the slide while its dynamic coordinate is acquired; denote the dynamic coordinate as (x1, y1).
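As a rough illustration of the region test and drag tracking just described, the sketch below assumes a simple axis-aligned rectangular region and a generic drag callback; the class and method names are not from the patent.

```python
class TargetObject:
    """Hypothetical target object occupying a preset rectangular region."""

    def __init__(self, x, y, width, height):
        self.x, self.y = x, y
        self.width, self.height = width, height

    def contains(self, px, py):
        # Only touches inside the target object's region are considered.
        return (self.x <= px < self.x + self.width
                and self.y <= py < self.y + self.height)

    def on_drag(self, px, py):
        # Move the object with the sliding touch and return the new
        # dynamic coordinate (x1, y1).
        self.x, self.y = px, py
        return (self.x, self.y)
```

A touch handler would first check `contains(px, py)` on touch-down, then feed each touch-move event to `on_drag` to obtain the dynamic coordinate in real time.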
In step S102, the offset angle is calculated from the dynamic coordinate of the target object and the center coordinate of the cartoon character's eyeball.
In the embodiments of the present invention, the center coordinate of the cartoon character's eyeball is a fixed coordinate value, denoted (x2, y2). From it and the dynamic coordinate (x1, y1) of the target object, the offset angle can be calculated using a trigonometric function; the offset angle is the angle through which the eyeball of the cartoon character moves with the target object.
Further, the center coordinate of the cartoon character's eyeball is obtained;
the difference of the ordinates and the difference of the abscissas of the dynamic coordinate of the target object and the center coordinate of the cartoon character's eyeball are calculated respectively; and
the ratio of the ordinate difference to the abscissa difference is calculated, from which the offset angle is obtained.
Specifically, the center coordinate (x2, y2) of the cartoon character's eyeball is obtained; the difference of the ordinates of the dynamic coordinate of the target object and the center coordinate of the cartoon character's eyeball is calculated as y2 - y1, and the difference of the abscissas is calculated as x2 - x1; the ratio of the ordinate difference y2 - y1 to the abscissa difference x2 - x1, namely (y2 - y1)/(x2 - x1), is then calculated, from which the offset angle is obtained.
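Under this convention, the ratio (y2 - y1)/(x2 - x1) is the tangent of the offset angle, and a trigonometric function (the arctangent) recovers the angle itself. A minimal sketch follows, assuming a result in degrees and using `atan2` so that the case x2 = x1 needs no special handling:

```python
import math

def offset_angle(dynamic_xy, center_xy):
    """Offset angle from the ordinate difference and abscissa difference
    (sketch; the degree convention is an assumption)."""
    x1, y1 = dynamic_xy    # dynamic coordinate of the target object
    x2, y2 = center_xy     # center coordinate of the cartoon eyeball
    dy = y2 - y1           # difference of the ordinates
    dx = x2 - x1           # difference of the abscissas
    # atan2(dy, dx) is the arctangent of dy/dx, resolved to the right quadrant.
    return math.degrees(math.atan2(dy, dx))
```

Using `atan2` rather than dividing dy by dx directly also avoids a division-by-zero error when the target object is directly above or below the eyeball center.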
In step S103, a corresponding first expression image is matched according to the offset angle, and the first expression image is displayed in the game interface.
In the embodiments of the present invention, a corresponding first expression image is matched according to the calculated offset angle and displayed in the game interface, so that the eyeball of the cartoon character rotates with the moving direction of the target object.
Further, mapping relations between different offset angles and the corresponding first expression images are formed in advance to build a first expression database;
the corresponding first expression image is matched in the first expression database according to the offset angle; and
the first expression image is displayed in the game interface.
Specifically, a first expression database is built in advance, which stores first expression images in mapping relation with offset angles. After the offset angle is calculated, the corresponding first expression image can be matched in the first expression database according to the offset angle and displayed in the game interface.
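One plausible shape for such a first expression database is a table keyed by angle intervals; the intervals and file names below are invented for illustration and are not part of the patent.

```python
# Hypothetical first expression database: offset-angle intervals (degrees)
# mapped to eyeball images. Intervals and file names are made up.
FIRST_EXPRESSION_DB = [
    ((-45.0, 45.0), "eyes_right.png"),
    ((45.0, 135.0), "eyes_up.png"),
    ((-135.0, -45.0), "eyes_down.png"),
]

def match_first_expression(angle_deg):
    """Look up the first expression image covering the given offset angle."""
    for (lo, hi), image in FIRST_EXPRESSION_DB:
        if lo <= angle_deg < hi:
            return image
    # Angles in (135, 180] and [-180, -135) fall through to 'left'.
    return "eyes_left.png"
```

Coarse intervals keep the database small: the eyeball only needs as many images as there are visually distinct gaze directions.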
In the embodiments of the present invention, the dynamic coordinate of the target object is acquired in real time, the offset angle is calculated from it and the center coordinate of the cartoon character's eyeball, and a corresponding expression image is matched according to the offset angle, so that the eyeball of the cartoon character rotates with the moving direction of the target object, enhancing the interactivity and fun of the game.
Embodiment two:
Fig. 2 shows a flowchart of the cartoon character expression display method provided by embodiment two of the present invention. For ease of explanation, only the parts related to this embodiment are shown. The details are as follows:
In step S201, the dynamic coordinate of the target object is acquired.
In step S202, the offset angle is calculated from the dynamic coordinate of the target object and the center coordinate of the cartoon character's eyeball.
In step S203, a corresponding first expression image is matched according to the offset angle, and the first expression image is displayed in the game interface.
In step S204, the dynamic coordinate of the target object is compared with a preset game rule to obtain a comparison result.
In step S205, a corresponding second expression image is matched according to the comparison result, and the second expression image is displayed in the game interface.
In the embodiments of the present invention, the first expression image is the eyeball image of the cartoon character, and the distance between the eyeball and the eye socket is set as actually required; for example, if the eyeball of the cartoon character in the game spans 6 pixels, the distance between the center point of the eyeball and the eye socket is set to 6 pixels. The dynamic coordinate of the target object is compared with a preset game rule. For example, the preset game rule may require that a target object of a specified type be selected and thrown into the mouth of the cartoon character. When the user issues a touch operation, it can be determined whether the selected target object matches the type specified in the preset game rule, and from the dynamic coordinate of the target object it can be determined whether it has been moved into the mouth of the cartoon character, thereby obtaining a comparison result. The comparison result distinguishes correct operations from faulty operations; a corresponding second expression image is matched according to the comparison result and displayed in the game interface. For example, when the dynamic coordinate shows that the target object has been moved into the mouth of the cartoon character, a second expression image of the cartoon character opening its mouth to eat is matched; when the target object is of the type specified in the game rule, a second expression image of the cartoon character showing a happy, praising, kiss-blowing or laughing expression is matched; otherwise, a second expression image of an angry, disgusted, grimacing or tongue-out expression is matched.
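The comparison step might look like the following sketch, in which the rule format, the mouth region, and the image names are all assumptions made for the example:

```python
def match_second_expression(target_name, target_xy, rule):
    """Compare the dynamic coordinate with a preset game rule and pick a
    second expression image (hypothetical sketch).

    rule = {"wanted": <object name>, "mouth": (x, y, w, h)}
    """
    mx, my, mw, mh = rule["mouth"]
    x, y = target_xy
    in_mouth = mx <= x < mx + mw and my <= y < my + mh
    if in_mouth:
        # The object was moved into the cartoon character's mouth.
        return "open_mouth_eating.png"
    if target_name == rule["wanted"]:
        # Correct operation: the selected object matches the rule.
        return "happy.png"
    # Faulty operation: wrong object selected.
    return "disgusted.png"
```

In a real game loop this check would run alongside the eyeball update, so the character both follows the object with its eyes and reacts when the move completes.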
Those of ordinary skill in the art will appreciate that all or part of the steps of the methods in the above embodiments can be completed by instructing related hardware through a program, and the program can be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk or an optical disc.
Embodiment three:
Fig. 3 shows a schematic structural diagram of the cartoon character expression display system provided by embodiment three of the present invention. For ease of explanation, only the parts related to this embodiment are shown. In the embodiments of the present invention, the cartoon character expression display system comprises a dynamic coordinate acquisition unit 31, an offset angle calculation unit 32 and a first expression image display unit 33, wherein:
the dynamic coordinate acquisition unit 31 is configured to acquire the dynamic coordinate of a target object, the dynamic coordinate being the coordinate of the target object while it moves.
In the embodiments of the present invention, a cartoon character is set up in a game, and target objects are set around the cartoon character to interact with it, together forming a game scene. One or more target objects can be set as the game requires. The user moves a target object through touch operations; as the target object follows the user's touch in the game interface, its coordinate changes continuously, forming the dynamic coordinate, i.e. the coordinate of the target object while it moves.
Further, the dynamic coordinate acquisition unit 31 includes:
a detection unit 311, configured to detect the touch operation issued by the user within the region where the target object is located;
a moving unit 312, configured to move the target object according to the touch operation; and
a dynamic coordinate acquisition unit 313, configured to acquire the dynamic coordinate of the target object in real time.
Specifically, the target object can be placed in a preset region of the game interface, so the touch operation issued by the user only needs to be detected in real time within the region where the target object is located. When the user clicks within that region and a sliding touch operation is detected on the screen, the target object is moved to follow the slide while its dynamic coordinate is acquired; denote the dynamic coordinate as (x1, y1).
The offset angle calculation unit 32 is configured to calculate the offset angle from the dynamic coordinate of the target object and the center coordinate of the cartoon character's eyeball, the offset angle being the angle through which the eyeball of the cartoon character moves with the target object.
In the embodiments of the present invention, the center coordinate of the cartoon character's eyeball is a fixed coordinate value, denoted (x2, y2). From it and the dynamic coordinate (x1, y1) of the target object, the offset angle can be calculated using a trigonometric function; the offset angle is the angle through which the eyeball of the cartoon character moves with the target object.
Further, the offset angle calculation unit 32 includes:
a center coordinate acquisition unit 321, configured to obtain the center coordinate of the cartoon character's eyeball;
a difference calculation unit 322, configured to calculate respectively the difference of the ordinates and the difference of the abscissas of the dynamic coordinate of the target object and the center coordinate of the cartoon character's eyeball; and
a ratio calculation unit 323, configured to calculate the ratio of the ordinate difference to the abscissa difference to obtain the offset angle.
Specifically, the center coordinate (x2, y2) of the cartoon character's eyeball is obtained; the difference of the ordinates of the dynamic coordinate of the target object and the center coordinate of the cartoon character's eyeball is calculated as y2 - y1, and the difference of the abscissas is calculated as x2 - x1; the ratio of the ordinate difference y2 - y1 to the abscissa difference x2 - x1, namely (y2 - y1)/(x2 - x1), is then calculated, from which the offset angle is obtained.
The first expression image display unit 33 is configured to match a corresponding first expression image according to the offset angle and display the first expression image in the game interface.
In the embodiments of the present invention, a corresponding first expression image is matched according to the calculated offset angle and displayed in the game interface, so that the eyeball of the cartoon character rotates with the moving direction of the target object.
Further, the first expression image display unit 33 includes:
a database unit 331, configured to form mapping relations between different offset angles and the corresponding first expression images in advance, so as to build a first expression database;
a matching unit 332, configured to match the corresponding first expression image in the first expression database according to the offset angle; and
a first display unit 333, configured to display the first expression image in the game interface.
Specifically, a first expression database is built in advance, which stores first expression images in mapping relation with offset angles. After the offset angle is calculated, the corresponding first expression image can be matched in the first expression database according to the offset angle and displayed in the game interface.
In the embodiments of the present invention, the dynamic coordinate of the target object is acquired in real time, the offset angle is calculated from it and the center coordinate of the cartoon character's eyeball, and a corresponding expression image is matched according to the offset angle, so that the eyeball of the cartoon character rotates with the moving direction of the target object, enhancing the interactivity and fun of the game.
Embodiment four:
Fig. 4 shows a schematic structural diagram of the cartoon character expression display system provided by embodiment four of the present invention. For ease of explanation, only the parts related to this embodiment are shown. In the embodiments of the present invention, the cartoon character expression display system comprises a dynamic coordinate acquisition unit 41, an offset angle calculation unit 42, a first expression image display unit 43, a comparison unit 44 and a second expression image display unit 45, wherein:
the dynamic coordinate acquisition unit 41 is configured to acquire the dynamic coordinate of a target object, the dynamic coordinate being the coordinate of the target object while it moves;
the offset angle calculation unit 42 is configured to calculate the offset angle from the dynamic coordinate of the target object and the center coordinate of the cartoon character's eyeball, the offset angle being the angle through which the eyeball of the cartoon character moves with the target object;
the first expression image display unit 43 is configured to match a corresponding first expression image according to the offset angle and display the first expression image in the game interface;
the comparison unit 44 is configured to compare the dynamic coordinate of the target object with a preset game rule to obtain a comparison result; and
the second expression image display unit 45 is configured to match a corresponding second expression image according to the comparison result and display the second expression image in the game interface.
In the embodiments of the present invention, the first expression image is the eyeball image of the cartoon character, and the distance between the eyeball and the eye socket is set as actually required; for example, if the eyeball of the cartoon character in the game spans 6 pixels, the distance between the center point of the eyeball and the eye socket is set to 6 pixels. The dynamic coordinate of the target object is compared with a preset game rule. For example, the preset game rule may require that a target object of a specified type be selected and thrown into the mouth of the cartoon character. When the user issues a touch operation, it can be determined whether the selected target object matches the type specified in the preset game rule, and from the dynamic coordinate of the target object it can be determined whether it has been moved into the mouth of the cartoon character, thereby obtaining a comparison result. The comparison result distinguishes correct operations from faulty operations; a corresponding second expression image is matched according to the comparison result and displayed in the game interface. For example, when the dynamic coordinate shows that the target object has been moved into the mouth of the cartoon character, a second expression image of the cartoon character opening its mouth to eat is matched; when the target object is of the type specified in the game rule, a second expression image of the cartoon character showing a happy, praising, kiss-blowing or laughing expression is matched; otherwise, a second expression image of an angry, disgusted, grimacing or tongue-out expression is matched.
In the embodiments of the present invention, each unit of the cartoon character expression display system can be realized by a corresponding hardware or software unit; each unit can be an independent software or hardware unit, or the units can be integrated into one software or hardware unit, which is not intended to limit the present invention. For the specific implementation of each unit of the system, reference is made to the description of embodiment one above, which is not repeated here.
The above are only preferred embodiments of the present invention and are not intended to limit the present invention. Any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.

Claims (10)

1. A cartoon character expression display method, characterized in that the method comprises the following steps:
acquiring the dynamic coordinate of a target object, the dynamic coordinate being the coordinate of the target object while it moves;
calculating an offset angle from the dynamic coordinate of the target object and the center coordinate of the cartoon character's eyeball, the offset angle being the angle through which the eyeball of the cartoon character moves with the target object; and
matching a corresponding first expression image according to the offset angle, and displaying the first expression image in the game interface.
2. the method for claim 1, it is characterised in that the step of obtaining the dynamic coordinate of target object, including:
In the region of the target object, the touch control operation that user sends is detected;
According to the touch control operation, the movement target object;
The dynamic coordinate of the target object is obtained in real time.
3. the method for claim 1, it is characterised in that the dynamic coordinate and cartoon figure according to the target object The step of centre coordinate of eyeball, calculating deviation angle, including:
Obtain the centre coordinate of cartoon figure's eyeball;
Calculate respectively the dynamic coordinate of the target object and the centre coordinate of cartoon figure's eyeball vertical coordinate difference with And the difference of abscissa;
The ratio of the difference of the vertical coordinate and the difference of the abscissa is calculated, deviation angle is obtained.
4. the method for claim 1, it is characterised in that corresponding facial expression image is matched according to the deviation angle, and The step of first facial expression image being shown in interface, including:
Mapping relations are formed with corresponding first facial expression image previously according to different deviation angles, the first expression data is set up Storehouse;
Corresponding first facial expression image is matched in the first expression data storehouse according to the deviation angle;
First facial expression image is included in the interface.
5. the method for claim 1, it is characterised in that methods described also includes:
The dynamic coordinate of the target object is compared with default game rule, comparison result is obtained;
Corresponding second facial expression image is matched according to comparison result, and second facial expression image is shown in interface.
6. A cartoon character expression display system, characterized in that the system comprises:
a dynamic coordinate acquisition unit, configured to acquire the dynamic coordinate of a target object, the dynamic coordinate being the coordinate of the target object while it moves;
an offset angle calculation unit, configured to calculate an offset angle from the dynamic coordinate of the target object and the center coordinate of the cartoon character's eyeball, the offset angle being the angle through which the eyeball of the cartoon character moves with the target object; and
a first expression image display unit, configured to match a corresponding first expression image according to the offset angle and display the first expression image in the game interface.
7. The system of claim 6, characterized in that the dynamic coordinate acquisition unit comprises:
a detection unit, configured to detect, within the region where the target object is located, the touch operation issued by the user;
a moving unit, configured to move the target object according to the touch operation; and
a dynamic coordinate acquisition unit, configured to acquire the dynamic coordinate of the target object in real time.
8. The system of claim 6, characterized in that the offset angle calculation unit comprises:
a center coordinate acquisition unit, configured to obtain the center coordinate of the cartoon character's eyeball;
a difference calculation unit, configured to calculate respectively the difference of the ordinates and the difference of the abscissas of the dynamic coordinate of the target object and the center coordinate of the cartoon character's eyeball; and
a ratio calculation unit, configured to calculate the ratio of the ordinate difference to the abscissa difference to obtain the offset angle.
9. The system of claim 6, characterized in that the first expression image display unit comprises:
a database unit, configured to form mapping relations between different offset angles and the corresponding first expression images in advance, so as to build a first expression database;
a matching unit, configured to match the corresponding first expression image in the first expression database according to the offset angle; and
a first display unit, configured to display the first expression image in the game interface.
10. The system as claimed in claim 6, characterized in that the system further comprises:
Comparing unit, configured to compare the dynamic coordinate of the target object with a preset game rule to obtain a comparison result; and
Second expression image display unit, configured to match a corresponding second expression image according to the comparison result, and to display the second expression image in the game interface.
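Claim 10's comparison step can be sketched as below, assuming the "preset game rule" is a simple rectangular goal region that the target object's dynamic coordinate either hits or misses. The region bounds and image names are assumptions for illustration only.

```python
# Hypothetical preset game rule: a rectangular goal region,
# given as (x_min, y_min, x_max, y_max).
GOAL_REGION = (100, 100, 200, 200)

def second_expression(coord, region=GOAL_REGION):
    """Compare the target object's dynamic coordinate with the rule and
    match a second expression image from the comparison result."""
    x, y = coord
    x_min, y_min, x_max, y_max = region
    hit = x_min <= x <= x_max and y_min <= y <= y_max  # comparison result
    return "happy.png" if hit else "sad.png"
```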
CN201611018952.1A 2016-11-18 2016-11-18 Cartoon character expression display method and system Pending CN106570921A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611018952.1A CN106570921A (en) 2016-11-18 2016-11-18 Cartoon character expression display method and system


Publications (1)

Publication Number Publication Date
CN106570921A true CN106570921A (en) 2017-04-19

Family

ID=58543004

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611018952.1A Pending CN106570921A (en) 2016-11-18 2016-11-18 Cartoon character expression display method and system

Country Status (1)

Country Link
CN (1) CN106570921A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107765987A (en) * 2017-11-03 2018-03-06 北京密境和风科技有限公司 User interaction method and device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102331792A (en) * 2011-06-24 2012-01-25 天津市亚安科技电子有限公司 Method and system for controlling preset positions of a pan-tilt head
CN103020983A (en) * 2012-09-12 2013-04-03 深圳先进技术研究院 Human-computer interaction device and method used for target tracking
CN103365434A (en) * 2013-08-02 2013-10-23 崔一郎 Method for aiming with mobile terminal
CN103777760A (en) * 2014-02-26 2014-05-07 北京百纳威尔科技有限公司 Method and device for switching screen display direction
CN104883557A (en) * 2015-05-27 2015-09-02 世优(北京)科技有限公司 Real time holographic projection method, device and system



Similar Documents

Publication Publication Date Title
US9952820B2 (en) Augmented reality representations across multiple devices
RU2648573C2 (en) Allocation of machine learning resource
CN102520574B (en) Time-of-flight depth imaging
US9292083B2 (en) Interacting with user interface via avatar
TWI469813B (en) Tracking groups of users in motion capture system
US8351651B2 (en) Hand-location post-process refinement in a tracking system
US9824260B2 (en) Depth image processing
US9886094B2 (en) Low-latency gesture detection
KR101741864B1 (en) Recognizing user intent in motion capture system
US20150070263A1 (en) Dynamic Displays Based On User Interaction States
US8619198B1 (en) Adjusting frame rates for video applications
CN108875539B (en) Expression matching method, device and system and storage medium
TWI758869B (en) Interactive object driving method, apparatus, device, and computer readable storage meidum
US11657627B2 (en) Focusing regions of interest using dynamic object detection for textual information retrieval
US20220030179A1 (en) Multilayer three-dimensional presentation
Gillies et al. Eye movements and attention for behavioural animation
US11024094B2 (en) Methods and apparatus to map a virtual environment to a physical environment
US20110085018A1 (en) Multi-User Video Conference Using Head Position Information
WO2022026603A1 (en) Object recognition neural network training using multiple data sources
CN106570921A (en) Cartoon character expression display method and system
US20230030260A1 (en) Systems and methods for improved player interaction using augmented reality
US20230334790A1 (en) Interactive reality computing experience using optical lenticular multi-perspective simulation
US20230334792A1 (en) Interactive reality computing experience using optical lenticular multi-perspective simulation
US20230334791A1 (en) Interactive reality computing experience using multi-layer projections to create an illusion of depth
US11074452B1 (en) Methods and systems for generating multiscale data representing objects at different distances from a virtual vantage point

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170419