CN107526443A - Augmented reality method, device, system, electronic equipment and storage medium - Google Patents

Augmented reality method, device, system, electronic equipment and storage medium

Info

Publication number
CN107526443A
CN107526443A (application CN201710915128.4A)
Authority
CN
China
Prior art keywords
image
target object
augmented reality
touch command
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710915128.4A
Other languages
Chinese (zh)
Inventor
郭佳楠
乔立
焦弟琴
余芝兰
张梁予笑
朱铭恩
陈小清
厉源
李源
甘彧
侯雯佩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Kingsoft Internet Security Software Co Ltd
Original Assignee
Beijing Kingsoft Internet Security Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Kingsoft Internet Security Software Co Ltd filed Critical Beijing Kingsoft Internet Security Software Co Ltd
Priority to CN201710915128.4A priority Critical patent/CN107526443A/en
Publication of CN107526443A publication Critical patent/CN107526443A/en
Pending legal-status Critical Current


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485: Scrolling or panning
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses an augmented reality method, device, system, electronic device, and storage medium in the field of computer technology that can increase user participation in augmented reality and improve the user experience. The augmented reality method comprises the following steps: acquiring a scene image through a camera and displaying the scene image on a screen; and receiving a touch command and generating a virtual image on the scene image according to the touch command. The method is suitable for augmented reality scenarios.

Description

Augmented reality method, apparatus, system, electronic device, and storage medium
Technical field
The present invention relates to the field of computer technology, and more particularly to an augmented reality method, apparatus, system, electronic device, and storage medium.
Background technology
Augmented reality (AR) technology, first proposed around 1990, is a technology that "seamlessly" integrates real-world information with virtual-world information. For example, in an AR implementation based on a mobile phone display, the real-world picture captured by the phone camera can be input to the phone system, synthesized with a virtual image generated in the phone system, and output to the phone display. The user then sees the final augmented reality scene on the phone screen.
However, the virtual images currently used in augmented reality are mostly ready-made images provided by the service provider: after the phone camera captures the real-world image, the virtual image is projected onto the screen according to a preset program. Throughout the augmented reality process the user's participation is low and human-computer interaction is weak, which greatly reduces the user experience.
Summary of the invention
In view of this, embodiments of the present invention provide an augmented reality method, apparatus, system, electronic device, and storage medium that can increase user participation and improve the user experience.
In a first aspect, an embodiment of the present invention provides an augmented reality method, including: acquiring a scene image through a camera and displaying the scene image on a screen; and receiving a touch command and generating a virtual image on the scene image according to the touch command.
With reference to the first aspect, in a first implementation of the first aspect, the scene image includes an image of a target object; and generating the virtual image on the scene image according to the touch command includes: generating, on the scene image according to the touch command, a virtual shooting image directed at the target object.
With reference to the first implementation of the first aspect, in a second implementation of the first aspect, generating the virtual shooting image directed at the target object on the scene image according to the touch command includes: generating a shooting trajectory of the virtual shot on the scene image according to the touch command; determining, according to the shooting trajectory and the motion state of the target object, whether the virtual shot hits the target object on the screen; and, in the case that the virtual shot hits the target object on the screen, superimposing a corresponding hit effect on the image of the target object.
With reference to the second implementation of the first aspect, in a third implementation of the first aspect, generating the shooting trajectory of the virtual shot on the scene image according to the touch command includes: determining a shooting direction and an ammunition flying speed of the virtual shot according to the touch command; and generating the shooting trajectory of the virtual shot on the scene image according to the shooting direction and the ammunition flying speed.
With reference to the third implementation of the first aspect, in a fourth implementation of the first aspect, the touch command includes a sliding direction of a finger on the screen, and the shooting direction of the virtual shot is determined by the sliding direction.
With reference to the third implementation of the first aspect, in a fifth implementation of the first aspect, the touch command includes at least one of the following: a touch duration on the screen, a pressing force, and a sliding distance; and the ammunition flying speed of the virtual shot is determined according to at least one of the touch duration on the screen, the pressing force, and the sliding distance.
With reference to the third implementation of the first aspect, in a sixth implementation of the first aspect, before determining the shooting direction and ammunition flying speed of the virtual shot according to the touch command, the method further includes: selecting a weapon used for the virtual shot, where different types of weapons have different flying speeds and shooting ranges; and determining the shooting direction and ammunition flying speed of the virtual shot according to the touch command includes: determining the shooting direction of the virtual shot according to the touch command and adjusting the flying speed of the weapon.
With reference to any one of the second to sixth implementations of the first aspect, in a seventh implementation of the first aspect, determining, according to the shooting trajectory and the motion state of the target object, whether the virtual shot hits the target object on the screen includes: detecting whether the ammunition collides with the image of the target object before the ammunition flies off the screen; and determining, according to the detection result, whether the virtual shot hits the target object on the screen.
With reference to any one of the first to sixth implementations of the first aspect, in an eighth implementation of the first aspect, the method further includes: displaying avatars of other users who participate in virtual shooting at the target object; and, when any of the other users virtually shoots at the image of the target object, identifying the weapon used by that user on that user's avatar and displaying the firing effect of the weapon.
With reference to the eighth implementation of the first aspect, in a ninth implementation of the first aspect, the method further includes: displaying the number of hits scored by other users on the target object and/or score ranking information.
With reference to the first aspect, in a tenth implementation of the first aspect, the scene image includes an image of a target object; and generating the virtual image on the scene image according to the touch command includes: generating, on the scene image according to the touch command, a virtual trap image directed at the target object.
With reference to the tenth implementation of the first aspect, in an eleventh implementation of the first aspect, generating the virtual trap image directed at the target object on the scene image according to the touch command includes: generating the virtual trap at a position on the scene image beyond a preset distance from the target object according to the touch command; determining, after a preset duration, whether the target object has fallen into the virtual trap; and, in the case that the target object has fallen into the virtual trap, superimposing a corresponding falling-in effect on the image of the target object.
In a second aspect, embodiments of the present invention further provide an augmented reality apparatus, including: an acquiring unit, configured to acquire a scene image through a camera and display the scene image on a screen; and a generating unit, configured to receive a touch command and generate a virtual image on the scene image according to the touch command.
With reference to the second aspect, in a first implementation of the second aspect, the scene image includes an image of a target object; and the generating unit includes a shooting generation module, configured to generate a virtual shooting image directed at the target object on the scene image according to the touch command.
With reference to the first implementation of the second aspect, in a second implementation of the second aspect, the shooting generation module includes: a first generation submodule, configured to generate a shooting trajectory of the virtual shot on the scene image according to the touch command; a first determination submodule, configured to determine, according to the shooting trajectory and the motion state of the target object, whether the virtual shot hits the target object on the screen; and a first superimposing submodule, configured to superimpose a corresponding hit effect on the image of the target object when the virtual shot hits the target object on the screen.
With reference to the second implementation of the second aspect, in a third implementation of the second aspect, the first generation submodule is specifically configured to: determine a shooting direction and an ammunition flying speed of the virtual shot according to the touch command; and generate the shooting trajectory of the virtual shot on the scene image according to the shooting direction and the ammunition flying speed.
With reference to the second or third implementation of the second aspect, in a fourth implementation of the second aspect, the first determination submodule is specifically configured to: detect whether the ammunition collides with the image of the target object before the ammunition flies off the screen; and determine, according to the detection result, whether the virtual shot hits the target object on the screen.
With reference to the first to third implementations of the second aspect, in a fifth implementation of the second aspect, the augmented reality apparatus further includes a display unit, configured to: display avatars of other users who participate in virtual shooting at the target object; and, when any of the other users virtually shoots at the image of the target object, identify the weapon used by that user on that user's avatar and display the firing effect of the weapon.
With reference to the fifth implementation of the second aspect, in a sixth implementation of the second aspect, the display unit is further configured to display the number of hits scored by other users on the target object and/or score ranking information.
With reference to the second aspect, in a seventh implementation of the second aspect, the scene image includes an image of a target object; and the generating unit includes a trap generation module, configured to generate a virtual trap image directed at the target object on the scene image according to the touch command.
With reference to the seventh implementation of the second aspect, in an eighth implementation of the second aspect, the trap generation module includes: a second generation submodule, configured to generate the virtual trap at a position on the scene image beyond a preset distance from the target object according to the touch command; a second determination submodule, configured to determine, after a preset duration, whether the target object has fallen into the virtual trap; and a second superimposing submodule, configured to superimpose a corresponding falling-in effect on the image of the target object when the target object falls into the virtual trap.
In a third aspect, embodiments of the present invention further provide an augmented reality system, including a server and at least two of any of the augmented reality apparatuses provided by the embodiments of the present invention. The server is connected to each augmented reality apparatus and is configured to: receive the virtual image data sent by each augmented reality apparatus; analyze and process the virtual image data; and feed the analysis and processing results back to each augmented reality apparatus.
In a fourth aspect, embodiments of the present invention further provide an electronic device, including: a housing, a processor, a memory, a circuit board, and a power supply circuit, where the circuit board is arranged inside a space enclosed by the housing, and the processor and the memory are arranged on the circuit board; the power supply circuit is configured to supply power to each circuit or component of the electronic device; the memory is configured to store executable program code; and the processor runs a program corresponding to the executable program code by reading the executable program code stored in the memory, so as to perform any augmented reality method provided by the embodiments of the present invention.
In a fifth aspect, embodiments of the present invention further provide a computer-readable storage medium storing one or more programs which can be executed by one or more processors to implement any augmented reality method provided by the embodiments of the present invention.
In a sixth aspect, embodiments of the present invention further provide an application program which, when executed, implements any augmented reality method provided by the embodiments of the present invention.
With the augmented reality method, apparatus, system, electronic device, and storage medium provided by the embodiments of the present invention, a scene image can be acquired through a camera and displayed on a screen, and a touch command can then be received and a virtual image generated on the scene image according to the touch command. In this way, the user not only observes the situation on the screen in time and gives feedback through the touch command, but the corresponding virtual image is also generated from that feedback and displayed on the screen. This greatly increases the user's participation in augmented reality, strengthens the application of human-computer interaction to augmented reality, and effectively improves the user experience.
Brief description of the drawings
In order to describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Apparently, the accompanying drawings in the following description show merely some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from these accompanying drawings without creative effort.
Fig. 1 is a flowchart of an augmented reality method provided by an embodiment of the present invention;
Fig. 2 is a detailed flowchart of an augmented reality method provided by an embodiment of the present invention;
Fig. 3 is a schematic diagram of the effect corresponding to the augmented reality method provided by the embodiment shown in Fig. 2;
Fig. 4 is a schematic structural diagram of an augmented reality apparatus provided by an embodiment of the present invention;
Fig. 5 is a schematic structural diagram of an electronic device provided by an embodiment of the present invention.
Detailed description of embodiments
The embodiments of the present invention are described in detail below with reference to the accompanying drawings.
It should be understood that the described embodiments are only some rather than all of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the protection scope of the present invention.
In a first aspect, an embodiment of the present invention provides an augmented reality method that allows the user to actively participate in the augmented reality environment, thereby effectively improving the user experience.
The augmented reality method provided by the embodiments of the present invention aims to combine augmented reality with human-computer interaction and can be applied to any system having an image-capturing device (such as a camera) and a human-computer interface (such as a touch screen). In particular, with the rapid development of mobile terminals and the mobile Internet, the augmented reality method provided by the embodiments of the present invention is especially suitable for mobile terminals equipped with a camera and a touch screen.
Specifically, the mobile terminal can display the real-world picture captured by the camera on its display screen. The user can react to the image seen on the display screen and input a corresponding control instruction to the mobile terminal, and the mobile terminal can superimpose a corresponding virtual image on the original scene on the screen according to the user's intention. The user can thus take a greater part in the realization of augmented reality, which greatly improves the user experience.
Fig. 1 is a flowchart of an augmented reality method provided by an embodiment of the present invention. As shown in Fig. 1, the augmented reality method of this embodiment may include:
S11: acquiring a scene image through a camera and displaying the scene image on a screen;
The camera may simply capture images of the scene and display them on the screen in real time without saving the image data (similar to focusing before taking a photo), or it may record the scene in video form, displaying it on the screen in real time while saving the image data (similar to recording a video). Optionally, the screen may be an ordinary screen with only a display function or a touch screen combining display and touch input functions; the embodiments of the present invention are not limited in this respect.
S12: receiving a touch command and generating a virtual image on the scene image according to the touch command.
Optionally, the received touch command may be, for example, one gesture or a combination of gestures such as tapping, pressing, and sliding, and the display effect produced by the touch command on the screen also varies and may be determined according to the captured scene image. Text or doodles may be entered through the touch command, or an operation button may be triggered by it, and so on; the embodiments of the present invention are not limited in this respect.
The generated virtual image refers to an image newly generated in the terminal according to the received touch command. It differs both from the scene image converted from the real world captured by the camera and from ready-made images stored in advance in the terminal or on a server. Generating the virtual image on the scene image means superimposing, at certain positions on the scene image captured by the camera, the image newly generated according to the received touch command, thereby achieving an augmented reality effect that is closer to the user's needs.
With the augmented reality method provided by the embodiments of the present invention, a scene image can be acquired through a camera and displayed on a screen, and a touch command can then be received and a virtual image generated on the scene image according to the touch command. In this way, the user not only observes the situation on the screen in time and gives feedback through the touch command, but the corresponding virtual image is also generated from that feedback and displayed on the screen, which greatly increases the user's participation in augmented reality, strengthens the application of human-computer interaction to augmented reality, and effectively improves the user experience.
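As an illustration only, and not part of the original disclosure, the basic flow of steps S11 and S12 can be sketched in Kotlin as follows; every type and function name here is a hypothetical placeholder rather than an actual implementation of the method.

```kotlin
// Minimal sketch of the S11/S12 loop: show a camera frame and, when a touch
// command arrives, superimpose a newly generated virtual image on that frame.
// All types below are hypothetical stand-ins for real camera/UI APIs.

data class Frame(val widthPx: Int, val heightPx: Int)            // scene image from the camera
data class Touch(val x: Float, val y: Float)                     // a received touch command
data class VirtualImage(val x: Float, val y: Float, val kind: String)

fun generateVirtualImage(touch: Touch): VirtualImage =
    // S12: the virtual image is newly generated from the touch command rather
    // than taken from a ready-made asset; here it is simply anchored at the touch point.
    VirtualImage(touch.x, touch.y, kind = "user-generated")

fun renderAugmentedFrame(frame: Frame, overlays: List<VirtualImage>) {
    // S11 + S12: display the scene image and superimpose the virtual images on it.
    println("frame ${frame.widthPx}x${frame.heightPx} with ${overlays.size} overlay(s)")
}

fun main() {
    val overlays = mutableListOf<VirtualImage>()
    val frame = Frame(1080, 1920)                                // one captured scene image
    val touch: Touch? = Touch(540f, 960f)                        // a touch command, if any
    touch?.let { overlays += generateVirtualImage(it) }
    renderAugmentedFrame(frame, overlays)
}
```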
For example, when the camera captures someone walking ahead in the rain, a virtual umbrella can be generated for that person through a touch command; when a fast-passing vehicle is captured, a tracking command can be launched on the screen through a touch command to see whether it can catch up with the vehicle on the screen; when the captured scene image shows falling leaves, a container can be generated at a certain position on the screen through a touch command, and after a while one can check how many fallen leaves the container has collected; and when a slowly crawling snail is captured, one or more virtual snares can be set beyond a preset distance from the snail through touch commands, and after a while one can check whether the snail has passed through the snares and can keep crawling. These examples show that generating new virtual images through touch commands allows the user to participate in the augmented reality scene to a greater degree and adds fun, thereby greatly improving the user experience.
The augmented reality method provided by the embodiments of the present invention is described in detail below by taking different application scenarios as examples.
In an embodiment of the present invention, the scene image acquired by the camera in step S11 may include an image of a target object. The target object may be part or all of a human body, such as a person's face or upper body. The target object may be stationary or in motion. By scanning the target object, the camera can record its features so that the target object can be recognized in subsequent processing.
Displaying the scene image acquired by the camera on the screen allows the user to view it at any time, and also allows the user to react to the scene shown on the screen and give feedback through touch commands, so that the user participates better in the scene activity.
Specifically, in step S12, generating the virtual image on the scene image according to the touch command may include: generating a virtual shooting image directed at the target object on the scene image according to the touch command. Here, the virtual shooting image depicts, in the scene image displayed on the screen, virtual ammunition being launched at the target object from a position at some distance from it. Optionally, the virtual ammunition may be a virtual spitball, a virtual brick, a virtual cannonball, and so on.
The virtual shooting image may be generated in many ways: the shooting trajectory may be displayed, or only the shooting result may be shown without displaying the trajectory; the embodiments of the present invention are not limited in this respect. Optionally, in an embodiment of the present invention, generating the virtual shooting image directed at the target object on the scene image according to the touch command may include:
generating a shooting trajectory of the virtual shot on the scene image according to the touch command;
determining, according to the shooting trajectory and the motion state of the target object, whether the virtual shot hits the target object on the screen; and
in the case that the virtual shot hits the target object on the screen, superimposing a corresponding hit effect on the image of the target object.
That is, in this embodiment, in order to present the shooting scene vividly, after the touch command is received a virtual shooting trajectory can be generated according to the touch command and displayed on the scene image. Displaying the virtual shooting trajectory may specifically include the following steps:
determining a shooting direction and an ammunition flying speed of the virtual shot according to the touch command; and
generating the shooting trajectory of the virtual shot on the scene image according to the shooting direction and the ammunition flying speed.
In an embodiment of the present invention, the touch command may include a sliding direction of a finger on the screen, and the shooting direction of the virtual shot may be determined by that sliding direction. Optionally, the launch point of the virtual ammunition may either be preset on the screen or specified on the fly by the touch command; likewise, the ammunition flying speed may be a default value or specified by the touch command.
In another embodiment of the present invention, the touch command may include one or more of: a touch duration on the screen, a pressing force, and a sliding distance. The ammunition flying speed of the virtual shot may then be determined from one or more of the touch duration on the screen, the pressing force, and the sliding distance. For example, the longer the touch duration, the faster the ammunition flies, and the shorter the duration, the slower it flies; the greater the pressing force, the faster the ammunition flies, and the smaller the force, the slower it flies; the farther the finger slides on the screen, the faster the ammunition flies, and the shorter the sliding distance, the slower it flies. Of course, in other embodiments of the present invention, different weights may also be assigned to the touch duration, pressing force, and sliding distance so as to determine the ammunition flying speed jointly.
It should be noted that the ammunition flying speed of the virtual shot in the embodiments of the present invention refers to the flying speed of the virtual ammunition on the screen. The speed may be a value relative to the screen size, such as one tenth of the screen length per second, or an absolute value, such as a certain number of pixels per second; the embodiments of the present invention are not limited in this respect.
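As a minimal sketch of this idea, and not the patent's own formula, the shooting direction and ammunition speed could be derived roughly as follows, assuming the touch duration, pressing force, and sliding distance have already been normalized to the range 0..1; the weights and base speed are illustrative values only.

```kotlin
import kotlin.math.hypot

// Hypothetical summary of one touch gesture; all fields are normalized by the caller.
data class TouchGesture(
    val dx: Float, val dy: Float,        // swipe vector on the screen, in pixels
    val durationNorm: Float,             // touch duration, 0..1
    val pressureNorm: Float,             // pressing force, 0..1
    val distanceNorm: Float              // sliding distance, 0..1
)

// The shooting direction is the unit vector of the swipe (fourth implementation).
fun shootingDirection(g: TouchGesture): Pair<Float, Float> {
    val len = hypot(g.dx, g.dy)
    return if (len == 0f) 0f to -1f else (g.dx / len) to (g.dy / len)
}

// Ammunition speed as a weighted combination of duration, pressure and distance,
// expressed relative to the screen length per second (fifth implementation).
fun ammunitionSpeed(
    g: TouchGesture,
    baseScreensPerSec: Float = 0.1f,     // illustrative base: 1/10 of the screen per second
    wDuration: Float = 0.3f,
    wPressure: Float = 0.3f,
    wDistance: Float = 0.4f
): Float {
    val boost = wDuration * g.durationNorm + wPressure * g.pressureNorm + wDistance * g.distanceNorm
    return baseScreensPerSec * (1f + boost) // longer, harder, or farther gestures fly faster
}
```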
Furthermore, since the target of the virtual shot is the target object after imaging, the motion state of the target object on the screen depends on how the target object in the real world moves relative to the camera. For example, when the target object in the real world moves toward the camera, the target object also moves in the same direction, its image becomes larger and larger, and it takes up an increasing proportion of the screen; when the target object in the real world moves away from the camera, its image becomes smaller and smaller and takes up a decreasing proportion of the screen. As long as the target object remains within the shooting range of the camera, the camera can recognize and track it; if the target object moves out of the camera's shooting range, it needs to be scanned again.
Further, in an embodiment of the present invention, a variety of weapons may be provided for the virtual shot. A default weapon may be used, or one of the weapons may be selected according to the touch command to shoot at the image of the target object, which further improves the fun.
Specifically, before determining the shooting direction and ammunition flying speed of the virtual shot according to the touch command, the augmented reality method provided by the embodiments of the present invention may further include: selecting a weapon used for the virtual shot, where different types of weapons have different flying speeds and shooting ranges. On this basis, determining the shooting direction and ammunition flying speed of the virtual shot according to the touch command may specifically include:
determining the shooting direction of the virtual shot according to the touch command and adjusting the flying speed of the weapon.
For example, when a spitball is selected as the weapon shooting at the image of the target object, the corresponding ammunition flying speed is 10 pixels per second; if the sliding distance of the touch gesture on the screen exceeds a preset distance threshold, the flying speed may be raised somewhat on that basis, for example to 12 pixels per second. Other flying speeds are adjusted similarly and are not described one by one here. In the virtual shot, the flight trajectory of the ammunition may be a straight line or a curve such as a parabola so as to simulate a real shooting scene more realistically, where the equation of the straight line or parabola may be determined according to the touch command.
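Purely as an illustration of a straight-line or parabolic trajectory determined from the shooting direction and speed, and not the patent's own equation, the trajectory can be sampled into screen points as follows; the gravity constant and sampling rate are assumed values.

```kotlin
data class Point(val x: Float, val y: Float)

// Sample the on-screen flight path of the virtual ammunition.
// dirX/dirY: unit shooting direction; speedPxPerSec: ammunition flying speed.
// gravityPxPerSec2 > 0 bends the path into a parabola; 0 keeps it a straight line.
fun sampleTrajectory(
    startX: Float, startY: Float,
    dirX: Float, dirY: Float,
    speedPxPerSec: Float,
    gravityPxPerSec2: Float = 0f,        // assumed constant for the parabolic variant
    durationSec: Float = 2f,
    stepSec: Float = 1f / 30f            // one sample per displayed frame at 30 fps
): List<Point> {
    val points = mutableListOf<Point>()
    var t = 0f
    while (t <= durationSec) {
        val x = startX + dirX * speedPxPerSec * t
        val y = startY + dirY * speedPxPerSec * t + 0.5f * gravityPxPerSec2 * t * t
        points += Point(x, y)
        t += stepSec
    }
    return points
}
```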
It will be appreciated that although the touch command controlling the virtual shot is the user's real-time reaction to what is shown on the screen, the target object may move on its own, so the virtual shot will not necessarily hit the image of the target object on the screen. It is therefore necessary to further determine, according to the shooting trajectory and the motion state of the target object, whether the virtual shot hits the target object on the screen.
In an embodiment of the present invention, determining, according to the shooting trajectory and the motion state of the target object, whether the virtual shot hits the target object on the screen may include:
detecting whether the ammunition collides with the image of the target object before the ammunition flies off the screen; and
determining, according to the detection result, whether the virtual shot hits the target object on the screen.
It should be understood that the flight trajectory of the ammunition is already determined when the touch command is issued, i.e. it is known where on the screen the ammunition will be at each moment. Therefore, based on the camera's tracking of the target object, it is only necessary to determine whether the ammunition and the target object appear at the same screen position at the same moment in order to decide whether the ammunition collides with the image of the target object.
It should be noted that "appearing at the same screen position" in the embodiments of the present invention means that the image of the ammunition and the image of the target object overlap, fully or partially, during their motion. If they do, it is determined that the ammunition collides with the image of the target object; otherwise, it is determined that no collision occurs. Weapons whose ammunition covers a larger area therefore have a greater probability of colliding with the image of the target object under the same conditions, and a target object closer to the camera is easier to hit, which is consistent with shooting scenes in the real world and makes the augmented reality scene more realistic.
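One way to realize the "same screen position" test described above is a simple bounding-box overlap check at each sampled moment of the flight. The sketch below assumes rectangular image regions and per-frame sampling; it is an illustration, not text from the patent.

```kotlin
// Axis-aligned rectangle describing the on-screen image of the ammunition or the target.
data class Box(val left: Float, val top: Float, val right: Float, val bottom: Float)

fun overlaps(a: Box, b: Box): Boolean =
    a.left < b.right && b.left < a.right && a.top < b.bottom && b.top < a.bottom

// The ammunition hits if, before it flies off the screen, its image overlaps the
// tracked image of the target object at the same moment.
fun virtualShotHits(
    ammoBoxAt: (Float) -> Box,       // ammunition box at time t, from the precomputed trajectory
    targetBoxAt: (Float) -> Box,     // target box at time t, from camera tracking
    onScreenAt: (Float) -> Boolean,  // false once the ammunition has flown off the screen
    durationSec: Float,
    stepSec: Float = 1f / 30f
): Boolean {
    var t = 0f
    while (t <= durationSec && onScreenAt(t)) {
        if (overlaps(ammoBoxAt(t), targetBoxAt(t))) return true
        t += stepSec
    }
    return false
}
```

A larger ammunition box naturally overlaps the target more often, matching the observation above that bigger ammunition is more likely to score a hit.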
Further, while one terminal virtually shoots at the target object, multiple other users may also virtually shoot at the same target object from other terminals. For any one of these terminals, the augmented reality method provided by the embodiments of the present invention may further include:
displaying avatars of other users who participate in virtual shooting at the target object; and
when any of the other users virtually shoots at the image of the target object, identifying the weapon used by that user on that user's avatar and displaying the firing effect of the weapon.
For example, in an embodiment of the present invention, the avatars of two other users, Robert and Gloria, who virtually shoot at the same target object together may also be displayed. If Gloria hits the image of the target object with a "grenade", the weapon "grenade" is marked on Gloria's avatar to indicate that she used it to shoot at the target object, and a "grenade" flying from Gloria's avatar toward the image of the target object is shown on the screen, helping the user check the shooting situation of other opponents or teammates.
Further, the number of hits scored by other users on the target object and/or score ranking information may also be displayed, so that competition and cooperation enhance the fun and further improve the user experience.
Further, besides the virtual shooting application scenario described above, the augmented reality method provided by the embodiments of the present invention can also be embodied in other scenarios.
In another exemplary embodiment of the present invention, the scene image acquired by the camera in step S11 likewise includes an image of a target object. The difference is that, in step S12, generating the virtual image on the scene image according to the touch command may include: generating a virtual trap image directed at the target object on the scene image according to the touch command. Here, the virtual trap image refers to a variously shaped virtual container or virtual obstacle set, in the scene image displayed on the screen, at a position beyond a preset distance from the target object, waiting for the target object to "pass through" or become "stranded" in it.
Specifically, generating the virtual trap image directed at the target object on the scene image according to the touch command may include:
generating the virtual trap at a position on the scene image beyond a preset distance from the target object according to the touch command;
determining, after a preset duration, whether the target object has fallen into the virtual trap; and
in the case that the target object has fallen into the virtual trap, superimposing a corresponding falling-in effect on the image of the target object.
For example, in an embodiment of the present invention, the camera captures two small goldfish in a fish tank, and the scene in the fish tank is displayed on the screen. A touch command is then received, and a virtual fishing net (i.e. a virtual trap) is cast at a certain position in the fish tank according to the touch command; the shape and size of the virtual net may be set according to the touch command. The camera can lock onto and track the two goldfish, and when a goldfish swims to the position of the virtual net it is determined that the goldfish has fallen into the net and a fish has been caught.
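A minimal sketch of the trap check in this embodiment, assuming a circular trap region and a tracked target position supplied by the camera; the structure names and the blocking delay are illustrative stand-ins, not part of the patent.

```kotlin
// Circular virtual trap placed beyond a preset distance from the target object.
data class Trap(val centerX: Float, val centerY: Float, val radius: Float)

fun insideTrap(trap: Trap, x: Float, y: Float): Boolean {
    val dx = x - trap.centerX
    val dy = y - trap.centerY
    return dx * dx + dy * dy <= trap.radius * trap.radius
}

// After the preset duration, decide whether the tracked target has fallen into the trap;
// if so, the caller would superimpose the corresponding falling-in effect on its image.
fun checkTrapAfterDelay(
    trap: Trap,
    trackedTargetCenter: () -> Pair<Float, Float>,  // current target position from camera tracking
    presetDurationMs: Long
): Boolean {
    Thread.sleep(presetDurationMs)                  // stand-in for a timer in a real UI loop
    val (x, y) = trackedTargetCenter()
    return insideTrap(trap, x, y)
}
```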
The augmented reality method provided by the present invention is further described below by means of a specific embodiment.
An embodiment of the present invention provides an augmented reality method based on face tracking: the user can track a target face in real time through the phone camera and launch the provided weapons at the target face on the phone screen to see whether the target face can be hit. Multiple users can simultaneously scan the same target face and launch weapons at it, and ranking is performed according to the number of hits counted by the APP.
Fig. 2 is a flowchart of an augmented reality method provided by an embodiment of the present invention, and Fig. 3 is a schematic diagram of the corresponding augmented reality effect. With reference to Fig. 2 and Fig. 3, the augmented reality method provided by this embodiment may specifically include the following steps:
S201: the APP is opened, and the phone camera is aimed at the target face for scanning and recognition.
S202: after recognition succeeds, a row of weapons appears at the bottom of the screen; the phone system can fetch the selectable weapons from a server or locally, and the user can select the item to use by sliding left or right with a touch command.
S203: the touch command with which the user slides a weapon is received, the weapon is armed, and ammunition is launched to virtually shoot at the target face.
S204: according to the shooting trajectory of the virtual shot and the motion state of the target face, it is determined whether the virtual shot hits the target face on the screen.
S205: if it hits, a corresponding hit effect is superimposed on the target face, for example corresponding facial-expression feedback, and the number of hits is recorded.
Optionally, if the target face is hit repeatedly, it can give stronger feedback. If the target face walks out of the screen, it is rescanned and the time progress bar is paused.
Optionally, when another user hits the same target face at the same time, that user's avatar and the weapon launched from the avatar's position are displayed.
S206: when the virtual shooting time expires, the number of hits is uploaded to the server.
S207: the server calculates the ranking according to the number of hits; the higher the number of hits, the higher the ranking.
S208: the user ranking and score fed back by the server are received.
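A minimal sketch of the server-side ranking in steps S206 to S208, assuming the hit counts have already been uploaded; the data structures and names are hypothetical.

```kotlin
data class HitReport(val userId: String, val hits: Int)
data class RankEntry(val rank: Int, val userId: String, val hits: Int)

// S207: rank users by hit count, higher counts first; S208 would then send each
// user their own rank and score back to the terminal.
fun rankByHits(reports: List<HitReport>): List<RankEntry> =
    reports.sortedByDescending { it.hits }
        .mapIndexed { index, r -> RankEntry(rank = index + 1, userId = r.userId, hits = r.hits) }

fun main() {
    val reports = listOf(HitReport("Robert", 7), HitReport("Gloria", 12), HitReport("user3", 9))
    rankByHits(reports).forEach { println("#${it.rank} ${it.userId}: ${it.hits} hits") }
}
```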
In a second aspect, an embodiment of the present invention accordingly provides an augmented reality apparatus that can increase the user's participation in augmented reality and strengthen the application of human-computer interaction to augmented reality.
As shown in Fig. 4, the augmented reality apparatus provided by this embodiment includes:
an acquiring unit 31, configured to acquire a scene image through a camera and display the scene image on a screen; and
a generating unit 32, configured to receive a touch command and generate a virtual image on the scene image according to the touch command.
With the augmented reality apparatus provided by the embodiments of the present invention, a scene image can be acquired through a camera and displayed on a screen, and a touch command can then be received and a virtual image generated on the scene image according to the touch command. In this way, the user not only observes the situation on the screen in time and gives feedback through the touch command, but the corresponding virtual image is also generated from that feedback and displayed on the screen, which greatly increases the user's participation in augmented reality, strengthens the application of human-computer interaction to augmented reality, and effectively improves the user experience.
Optionally, the scene image includes an image of a target object, and the generating unit 32 may include a shooting generation module, configured to generate a virtual shooting image directed at the target object on the scene image according to the touch command.
Optionally, the shooting generation module may include:
a first generation submodule, configured to generate a shooting trajectory of the virtual shot on the scene image according to the touch command;
a first determination submodule, configured to determine, according to the shooting trajectory and the motion state of the target object, whether the virtual shot hits the target object on the screen; and
a first superimposing submodule, configured to superimpose a corresponding hit effect on the image of the target object when the virtual shot hits the target object on the screen.
Optionally, the first generation submodule is specifically configured to:
determine a shooting direction and an ammunition flying speed of the virtual shot according to the touch command; and
generate the shooting trajectory of the virtual shot according to the shooting direction and the ammunition flying speed.
Optionally, the first determination submodule is specifically configured to:
detect whether the ammunition collides with the image of the target object before the ammunition flies off the screen; and
determine, according to the detection result, whether the virtual shot hits the target object on the screen.
Further, the augmented reality apparatus may further include a display unit, configured to: display avatars of other users who participate in virtual shooting at the target object; and, when any of the other users virtually shoots at the image of the target object, identify the weapon used by that user on that user's avatar and display the firing effect of the weapon.
Optionally, the display unit may be further configured to display the number of hits scored by other users on the target object and/or score ranking information.
Optionally, the scene image includes an image of a target object, and the generating unit 32 includes a trap generation module, configured to generate a virtual trap image directed at the target object on the scene image according to the touch command.
Optionally, the trap generation module may include:
a second generation submodule, configured to generate the virtual trap at a position on the scene image beyond a preset distance from the target object according to the touch command;
a second determination submodule, configured to determine, after a preset duration, whether the target object has fallen into the virtual trap; and
a second superimposing submodule, configured to superimpose a corresponding falling-in effect on the image of the target object when the target object falls into the virtual trap.
In a third aspect, an embodiment of the present invention accordingly provides an augmented reality system that can increase the user's participation in augmented reality and strengthen the application of human-computer interaction to augmented reality.
The augmented reality system provided by the embodiments of the present invention includes a server and at least two of the augmented reality apparatuses provided by the foregoing embodiments. The server is connected to each augmented reality apparatus and is configured to:
receive the virtual image data sent by each augmented reality apparatus;
analyze and process the virtual image data; and
feed the analysis and processing results back to each augmented reality apparatus.
The augmented reality system provided by the embodiments of the present invention, by combining human-computer interaction with augmented reality, brings game and real-world scenes closer together and extends technologies such as image recognition toward entertainment. Through multi-user participation and continuous sharing, multi-user online social AR scenarios are realized, which greatly improves the user experience.
In a fourth aspect, an embodiment of the present invention accordingly provides an electronic device that can increase the user's participation in augmented reality and strengthen the application of human-computer interaction to augmented reality.
As shown in Fig. 5, an electronic device provided by an embodiment of the present invention may include: a housing 41, a processor 42, a memory 43, a circuit board 44, and a power supply circuit 45, where the circuit board 44 is arranged inside a space enclosed by the housing 41, and the processor 42 and the memory 43 are arranged on the circuit board 44; the power supply circuit 45 is configured to supply power to each circuit or component of the electronic device; the memory 43 is configured to store executable program code; and the processor 42 runs a program corresponding to the executable program code by reading the executable program code stored in the memory 43, so as to perform the augmented reality method described in any of the foregoing embodiments.
For the specific execution of the above steps by the processor 42 and the steps further performed by the processor 42 by running the executable program code, reference may be made to the description of the foregoing embodiments, and details are not repeated here.
The electronic device exists in various forms, including but not limited to:
(1) Mobile communication devices: such devices are characterized by mobile communication functions and have voice and data communication as their main goal. This type of terminal includes smartphones (e.g. iPhone), multimedia phones, feature phones, low-end phones, and the like.
(2) Ultra-mobile personal computer devices: such devices belong to the category of personal computers, have computing and processing functions, and generally also have mobile Internet access. This type of terminal includes PDA, MID, and UMPC devices, such as the iPad.
(3) Portable entertainment devices: such devices can display and play multimedia content. This type of device includes audio and video players (e.g. iPod), handheld devices, e-book readers, smart toys, and portable in-vehicle navigation devices.
(4) Servers: devices that provide computing services. A server is composed of a processor, hard disk, memory, system bus, and so on, and is similar in architecture to a general-purpose computer; however, since it needs to provide highly reliable services, it has higher requirements in terms of processing capability, stability, reliability, security, scalability, and manageability.
(5) Other electronic devices with data interaction functions.
In a fifth aspect, embodiments of the present invention further provide a computer-readable storage medium storing one or more programs which can be executed by one or more processors to implement any augmented reality method provided by the foregoing embodiments. The corresponding technical effects can also be achieved; they have been described in detail above and are not repeated here.
In a sixth aspect, embodiments of the present invention further provide an application program which, when executed, implements any augmented reality method provided by the embodiments of the present invention. The corresponding technical effects can also be achieved; they have been described in detail above and are not repeated here.
It should be noted that, in this document, relational terms such as first and second are only used to distinguish one entity or operation from another entity or operation, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or device that includes a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes the element.
The embodiments in this specification are described in a related manner; for identical or similar parts the embodiments may refer to one another, and each embodiment focuses on its differences from the other embodiments.
In particular, the apparatus embodiments are described relatively simply because they are substantially similar to the method embodiments; for relevant parts, reference may be made to the description of the method embodiments.
For convenience of description, the above apparatus is described in terms of various units/modules divided by function. Of course, when implementing the present invention, the functions of the units/modules may be realized in one or more pieces of software and/or hardware.
A person of ordinary skill in the art can understand that all or part of the processes in the methods of the above embodiments can be implemented by a computer program instructing relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the processes of the embodiments of the above methods. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
The above are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any change or replacement readily conceivable by a person skilled in the art within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the scope of the claims.

Claims (10)

  1. An augmented reality method, characterized by comprising:
    acquiring a scene image through a camera and displaying the scene image on a screen; and
    receiving a touch command and generating a virtual image on the scene image according to the touch command.
  2. The method according to claim 1, characterized in that the scene image includes an image of a target object; and
    generating the virtual image on the scene image according to the touch command comprises: generating, on the scene image according to the touch command, a virtual shooting image directed at the target object.
  3. The method according to claim 2, characterized in that generating the virtual shooting image directed at the target object on the scene image according to the touch command comprises:
    generating a shooting trajectory of the virtual shot on the scene image according to the touch command;
    determining, according to the shooting trajectory and the motion state of the target object, whether the virtual shot hits the target object on the screen; and
    in the case that the virtual shot hits the target object on the screen, superimposing a corresponding hit effect on the image of the target object.
  4. The method according to claim 3, characterized in that generating the shooting trajectory of the virtual shot on the scene image according to the touch command comprises:
    determining a shooting direction and an ammunition flying speed of the virtual shot according to the touch command; and
    generating the shooting trajectory of the virtual shot on the scene image according to the shooting direction and the ammunition flying speed.
  5. The method according to claim 4, characterized in that the touch command includes a sliding direction of a finger on the screen, and the shooting direction of the virtual shot is determined by the sliding direction.
  6. The method according to claim 4, characterized in that the touch command includes at least one of the following: a touch duration on the screen, a pressing force, and a sliding distance; and
    the ammunition flying speed of the virtual shot is determined according to at least one of the touch duration on the screen, the pressing force, and the sliding distance.
  7. An augmented reality apparatus, characterized by comprising:
    an acquiring unit, configured to acquire a scene image through a camera and display the scene image on a screen; and
    a generating unit, configured to receive a touch command and generate a virtual image on the scene image according to the touch command.
  8. An augmented reality system, characterized by comprising a server and at least two augmented reality apparatuses according to claim 7;
    the server is connected to each augmented reality apparatus and is configured to:
    receive virtual image data sent by each augmented reality apparatus;
    analyze and process the virtual image data; and
    feed analysis and processing results back to each augmented reality apparatus.
  9. An electronic device, characterized in that the electronic device comprises: a housing, a processor, a memory, a circuit board, and a power supply circuit, wherein the circuit board is arranged inside a space enclosed by the housing, and the processor and the memory are arranged on the circuit board; the power supply circuit is configured to supply power to each circuit or component of the electronic device; the memory is configured to store executable program code; and the processor runs a program corresponding to the executable program code by reading the executable program code stored in the memory, so as to perform the augmented reality method according to any one of claims 1 to 6.
  10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores one or more programs which can be executed by one or more processors to implement the augmented reality method according to any one of claims 1 to 6.
CN201710915128.4A 2017-09-29 2017-09-29 Augmented reality method, device, system, electronic equipment and storage medium Pending CN107526443A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710915128.4A CN107526443A (en) 2017-09-29 2017-09-29 Augmented reality method, device, system, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710915128.4A CN107526443A (en) 2017-09-29 2017-09-29 Augmented reality method, device, system, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN107526443A true CN107526443A (en) 2017-12-29

Family

ID=60684120

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710915128.4A Pending CN107526443A (en) 2017-09-29 2017-09-29 Augmented reality method, device, system, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN107526443A (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103369288A (en) * 2012-03-29 2013-10-23 深圳市腾讯计算机系统有限公司 Instant communication method based on network video and system thereof
CN106383587A (en) * 2016-10-26 2017-02-08 腾讯科技(深圳)有限公司 Augmented reality scene generation method, device and equipment

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108304075A (en) * 2018-02-11 2018-07-20 亮风台(上海)信息科技有限公司 A kind of method and apparatus carrying out human-computer interaction in augmented reality equipment
CN108391174A (en) * 2018-03-22 2018-08-10 乐蜜有限公司 Living broadcast interactive method, apparatus and electronic equipment
CN108391174B (en) * 2018-03-22 2021-08-20 卓米私人有限公司 Live broadcast interaction method and device and electronic equipment
CN108519815B (en) * 2018-03-26 2020-10-02 Oppo广东移动通信有限公司 Augmented reality-based vehicle control method and device, storage medium and electronic equipment
CN108519815A (en) * 2018-03-26 2018-09-11 广东欧珀移动通信有限公司 Control method for vehicle, device, storage medium based on augmented reality and electronic equipment
CN108829479A (en) * 2018-06-01 2018-11-16 北京市商汤科技开发有限公司 Information processing method and device, storage medium
CN108958475A (en) * 2018-06-06 2018-12-07 阿里巴巴集团控股有限公司 virtual object control method, device and equipment
CN109086097A (en) * 2018-07-03 2018-12-25 百度在线网络技术(北京)有限公司 A kind of starting method, apparatus, server and the storage medium of small routine
CN110162258A (en) * 2018-07-03 2019-08-23 腾讯数码(天津)有限公司 The processing method and processing device of individual scene image
CN109086097B (en) * 2018-07-03 2023-02-28 百度在线网络技术(北京)有限公司 Method and device for starting small program, server and storage medium
CN110837764A (en) * 2018-08-17 2020-02-25 广东虚拟现实科技有限公司 Image processing method and device, electronic equipment and visual interaction system
CN110083266A (en) * 2019-04-02 2019-08-02 上海墨案智能科技有限公司 Information processing method, device and storage medium
CN110083266B (en) * 2019-04-02 2022-05-13 上海墨案智能科技有限公司 Information processing method, device and storage medium
WO2020244649A1 (en) * 2019-06-06 2020-12-10 深圳市道通智能航空技术有限公司 Obstacle avoidance method and apparatus, and electronic device
CN111638793A (en) * 2020-06-04 2020-09-08 浙江商汤科技开发有限公司 Aircraft display method and device, electronic equipment and storage medium
CN111638793B (en) * 2020-06-04 2023-09-01 浙江商汤科技开发有限公司 Display method and device of aircraft, electronic equipment and storage medium
CN111632377A (en) * 2020-06-08 2020-09-08 浙江商汤科技开发有限公司 Shooting track display method and device, electronic equipment and storage medium
CN111617471A (en) * 2020-06-08 2020-09-04 浙江商汤科技开发有限公司 Virtual shooting display method and device, electronic equipment and storage medium
CN114584681A (en) * 2020-11-30 2022-06-03 北京市商汤科技开发有限公司 Target object motion display method and device, electronic equipment and storage medium
US20220254116A1 (en) 2021-02-09 2022-08-11 Beijing Zitiao Network Technology Co., Ltd. Display method based on augmented reality, device, storage medium and program product
EP4071725A4 (en) * 2021-02-09 2023-07-05 Beijing Zitiao Network Technology Co., Ltd. Augmented reality-based display method and device, storage medium, and program product
US11763533B2 (en) 2021-02-09 2023-09-19 Beijing Zitiao Network Technology Co., Ltd. Display method based on augmented reality, device, storage medium and program product
CN113359988A (en) * 2021-06-03 2021-09-07 北京市商汤科技开发有限公司 Information display method and device, computer equipment and storage medium
CN113359985A (en) * 2021-06-03 2021-09-07 北京市商汤科技开发有限公司 Data display method and device, computer equipment and storage medium

Similar Documents

Publication Publication Date Title
CN107526443A (en) Augmented reality method, device, system, electronic equipment and storage medium
US11045727B2 (en) Opposition trigger of moment clips
CN109034397B (en) Model training method and device, computer equipment and storage medium
CN108958475B (en) Virtual object control method, device and equipment
US11745103B2 (en) Methods for providing customized camera views in virtualized environments based on touch-based user input
US20170192500A1 (en) Method and electronic device for controlling terminal according to eye action
US20200368620A1 (en) Systems and Methods for Customized Camera Views and Customizable Objects in Virtualized Environments
US8957858B2 (en) Multi-platform motion-based computer interactions
US20140071069A1 (en) Techniques for touch and non-touch user interaction input
TWI610247B (en) Method of identifying, capturing, presenting, and processing photos during play of a game and computer readable storage device therefor
CN105324736A (en) Techniques for touch and non-touch user interaction input
JP6450875B1 (en) GAME PROGRAM, GAME METHOD, AND INFORMATION PROCESSING DEVICE
CN114155605B (en) Control method, device and computer storage medium
CN109529340A (en) Virtual object control method, device, electronic equipment and storage medium
CN113544697A (en) Analyzing athletic performance with data and body posture to personalize predictions of performance
CN105468249A (en) Intelligent interaction system and control method therefor
CN111013135A (en) Interaction method, device, medium and electronic equipment
JP2019098163A (en) Game program, method, and information processor
US20220394194A1 (en) Computer-readable recording medium, computer apparatus, and control method
JP6791520B2 (en) Game controls, game systems, and programs
CN113873162A (en) Shooting method, shooting device, electronic equipment and readable storage medium
CN115278082A (en) Video shooting method, video shooting device and electronic equipment
CN117959710A (en) Game data processing method, device, equipment and medium
CN115866352A (en) Interaction method, interaction device, electronic equipment, readable storage medium and program product
CN116966564A (en) Method and device for playing back action trace, storage medium and computer equipment

Legal Events

Date Code Title Description
PB01 Publication
CB03 Change of inventor or designer information

Inventor after: Guo Jianan

Inventor after: Gan Yu

Inventor after: Hou Wenpei

Inventor after: Qiao Li

Inventor after: Jiao Diqin

Inventor after: Yu Zhilan

Inventor after: Gong Xue

Inventor after: Zhu Mingen

Inventor after: Chen Xiaoqing

Inventor after: Li Yuan

Inventor before: Guo Jianan

Inventor before: Gan Yu

Inventor before: Hou Wenpei

Inventor before: Qiao Li

Inventor before: Jiao Diqin

Inventor before: Yu Zhilan

Inventor before: Zhangliang Yuxiao

Inventor before: Zhu Mingen

Inventor before: Chen Xiaoqing

Inventor before: Li Yuan

SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20171229
