CN109189302A - Control method and device for an AR virtual model - Google Patents
Control method and device for an AR virtual model
- Publication number
- CN109189302A CN201810993135.0A
- Authority
- CN
- China
- Prior art keywords
- plane
- virtual model
- motion track
- scene
- target object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Processing Or Creating Images (AREA)
Abstract
The present invention provides a control method and device for an AR virtual model. The method includes: obtaining a control command, where the control command is used to control a first virtual model in a first AR scene to move along a motion track; determining a first plane on which the first virtual model lies in the first AR scene; determining, according to the control command, the motion track of the first virtual model in the first plane; and controlling the first virtual model to move in the first plane along the motion track. With the control method and device for an AR virtual model provided by the present invention, the motion track of the first virtual model in the plane on which it lies can be determined from the control command, and the first virtual model can be controlled to move in that plane along the motion track, so that the first virtual model moves within a plane of the first AR scene, thereby improving the sense of immersion the AR scene gives the user while the AR virtual model is being controlled.
Description
Technical field
The present invention relates to the technical field of augmented reality (Augmented Reality, AR), and in particular to a control method and device for an AR virtual model.
Background
Augmented reality (Augmented Reality, AR for short) is a technology that uses information supplied by a computer system to enhance the user's perception of the real world. AR technology superimposes computer-generated virtual models onto the real scene the user can see, through an AR display device such as AR glasses, so that the user sees the real scene and the virtual models in it at the same time, thereby enhancing the user's perception of reality.
In the prior art, in some games or applications the user can operate different controls on the interface of a mobile phone or on other input devices to control an AR virtual model to move, flip and so on in the AR scene, which increases the user's interaction with the AR scene.
However, when a virtual model in an AR scene is shown to the user in the prior art, the model itself is usually treated as a first-person viewpoint, or a fixed viewpoint of the AR scene is used. With such a first-person or fixed viewpoint, when the user controls the AR virtual model to move or turn, the model can only move within that fixed viewpoint. As a result, the AR virtual model the user sees often appears to "float" or "hang" in the AR scene, the movement does not match the way the user actually intends to operate the model, and the AR scene cannot give the user a good sense of immersion. How to improve the immersion of the AR scene for the user while the AR virtual model is being controlled is therefore a technical problem to be solved urgently.
Summary of the invention
The present invention provides a control method and device for an AR virtual model, so as to determine, according to a control command, the motion track of a first virtual model in the first plane on which it lies, and to control the first virtual model to move in the first plane along the motion track, so that the first virtual model moves within a plane of the first AR scene, thereby improving the immersion of the AR scene for the user while the AR virtual model is being controlled.
A first aspect of the present invention provides a control method for an AR virtual model, comprising:
obtaining a control command, where the control command is used to control a first virtual model in a first AR scene to move along a motion track;
determining a first plane on which the first virtual model lies in the first AR scene;
determining, according to the control command, the motion track of the first virtual model in the first plane; and
controlling the first virtual model to move in the first plane along the motion track.
In an embodiment of the first aspect of the present invention, obtaining the control command comprises:
obtaining a motion track of a target object in a second plane, where the second plane is the plane on which the target object lies;
and determining, according to the control command, the motion track of the first virtual model in the first plane comprises:
taking the motion track of the target object in the second plane as the motion track of the first virtual model in the first plane.
In an embodiment of the first aspect of the present invention, taking the motion track of the target object in the second plane as the motion track of the first virtual model in the first plane comprises:
determining the motion track of the target object in the second plane; and
mapping the motion track of the target object in the second plane to the first plane to obtain the motion track of the first virtual model in the first plane.
In an embodiment of the first aspect of the present invention, mapping the motion track of the target object in the second plane to the first plane to obtain the motion track of the first virtual model in the first plane comprises:
mapping the motion track of the target object in the second plane through a projection matrix mapping, a view matrix mapping and an imaging plane mapping to obtain the motion track of the first virtual model in the first plane.
In an embodiment of the first aspect of the present invention, obtaining the motion track of the first virtual model in the first plane after the projection matrix mapping, the view matrix mapping and the imaging plane mapping comprises:
converting the two-dimensional vector coordinates of the motion track of the target object in the second plane into imaging-screen coordinates and projecting them onto a far plane and a near plane;
using the projection matrix and the view matrix, connecting rays respectively from the start point and the end point of the two-dimensional vector between the far plane and the near plane; and
determining the line where the rays coincide with the imaging plane as the motion track of the first virtual model in the first plane.
In an embodiment of the first aspect of the present invention, determining the first plane on which the first virtual model lies in the first AR scene comprises:
determining the first plane on which the first virtual model lies in the first AR scene according to a simultaneous localization and mapping (SLAM) algorithm.
In an embodiment of the first aspect of the present invention, controlling the first virtual model to move in the first plane along the motion track comprises:
determining, according to the motion track, the position of the first virtual model in the first plane in each video frame of the first AR scene; and
playing each video frame of the first AR scene.
In an embodiment of the first aspect of the present invention, the motion track of the first virtual model in the first plane includes a direction and an angle of rotation of the first virtual model in the first plane;
and controlling the first virtual model to move in the first plane along the motion track further comprises:
controlling the first virtual model to rotate in the first plane according to the motion track.
In an embodiment of the first aspect of the present invention, the second plane is a display screen used to display the first AR scene, and the target object is the user's finger or a functional control on the display screen.
A second aspect of the present invention provides a control device for an AR virtual model, comprising:
a receiver, configured to obtain a control command, where the control command is used to control a first virtual model in a first AR scene to move along a motion track;
a processor, configured to determine a first plane on which the first virtual model lies in the first AR scene, and further configured to determine, according to the control command, the motion track of the first virtual model in the first plane; and
an AR display, configured to control the first virtual model to move in the first plane along the motion track.
In an embodiment of the second aspect of the present invention, the receiver is specifically configured to obtain a motion track of a target object in a second plane, where the second plane is the plane on which the target object lies; and the processor is specifically configured to take the motion track of the target object in the second plane as the motion track of the first virtual model in the first plane.
In an embodiment of the second aspect of the present invention, the processor is specifically configured to determine the motion track of the target object in the second plane, and to map the motion track of the target object in the second plane to the first plane to obtain the motion track of the first virtual model in the first plane.
In an embodiment of the second aspect of the present invention, the processor is specifically configured to map the motion track of the target object in the second plane through a projection matrix mapping, a view matrix mapping and an imaging plane mapping to obtain the motion track of the first virtual model in the first plane.
In an embodiment of the second aspect of the present invention, the processor is specifically configured to: convert the two-dimensional vector coordinates of the motion track of the target object in the second plane into imaging-screen coordinates and project them onto a far plane and a near plane; use the projection matrix and the view matrix to connect rays respectively from the start point and the end point of the two-dimensional vector between the far plane and the near plane; and determine the line where the rays coincide with the imaging plane as the motion track of the first virtual model in the first plane.
In an embodiment of the second aspect of the present invention, the processor is specifically configured to determine the first plane on which the first virtual model lies in the first AR scene according to a simultaneous localization and mapping (SLAM) algorithm.
In an embodiment of the second aspect of the present invention, the AR display is specifically configured to: determine, according to the motion track, the position of the first virtual model in the first plane in each video frame of the first AR scene; and play each video frame of the first AR scene.
In an embodiment of the second aspect of the present invention, the motion track of the first virtual model in the first plane includes a direction and an angle of rotation of the first virtual model in the first plane; and the AR display is specifically configured to control the first virtual model to rotate in the first plane according to the motion track.
In an embodiment of the second aspect of the present invention, the second plane is a display screen used to display the first AR scene, and the target object is the user's finger or a functional control on the display screen.
A third aspect of the embodiments of the present application provides a control device for an AR virtual model, comprising a processor and a memory, where the memory is configured to store a program and the processor is configured to call the program stored in the memory, so as to perform any method of the first aspect of the present application.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium storing program code which, when executed, performs any method described in the first aspect of the present application.
In summary, the present invention provides a control method and device for an AR virtual model. The method includes: obtaining a control command, where the control command is used to control a first virtual model in a first AR scene to move along a motion track; determining a first plane on which the first virtual model lies in the first AR scene; determining, according to the control command, the motion track of the first virtual model in the first plane; and controlling the first virtual model to move in the first plane along the motion track. With the control method and device provided by the present invention, the motion track of the first virtual model in the plane on which it lies can be determined from the control command, and the first virtual model can be controlled to move in that plane along the motion track, so that the first virtual model moves within a plane of the first AR scene, thereby improving the immersion of the AR scene for the user while the AR virtual model is being controlled.
Brief description of the drawings
To explain the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an embodiment of a display structure of an AR virtual model in the prior art;
Fig. 2 is a schematic diagram of an embodiment of a display structure of an AR virtual model in the prior art;
Fig. 3 is a schematic display-structure diagram of an embodiment of a control method for an AR virtual model in the prior art;
Fig. 4 is a schematic display-structure diagram of an embodiment of a control method for an AR virtual model in the prior art;
Fig. 5 is a schematic display-structure diagram of an embodiment of a control method for an AR virtual model in the prior art;
Fig. 6 is a schematic flowchart of an embodiment of a control method for an AR virtual model according to the present invention;
Fig. 7 is a schematic display-structure diagram of an embodiment of obtaining a control instruction according to the present invention;
Fig. 8 is a schematic display-structure diagram of an embodiment of obtaining a control instruction according to the present invention;
Fig. 9 is a schematic display-structure diagram of an embodiment of obtaining a control instruction according to the present invention;
Fig. 10 is a schematic display-structure diagram of an embodiment of a control method for an AR virtual model according to the present invention;
Fig. 11 is a schematic display-structure diagram of an embodiment of a control method for an AR virtual model according to the present invention;
Fig. 12 is a schematic display-structure diagram of an embodiment of a control method for an AR virtual model according to the present invention;
Fig. 13 is a schematic display-structure diagram of an embodiment of obtaining a control instruction according to the present invention;
Fig. 14 is a schematic display-structure diagram of an embodiment of obtaining a control instruction according to the present invention;
Fig. 15 is a schematic display-structure diagram of a virtual model according to the present invention;
Fig. 16 is a schematic structural diagram of determining the motion track of the first virtual model in the first plane according to the present invention;
Fig. 17 is a schematic flowchart of an embodiment of a control method for an AR virtual model according to the present invention;
Fig. 18 is a schematic structural diagram of an embodiment of a control device for an AR virtual model according to the present invention;
Fig. 19 is a schematic structural diagram of an embodiment of a control device for an AR virtual model according to the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention will be described below clearly and completely with reference to the drawings in the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present invention.
The terms "first", "second", "third", "fourth" and the like (if present) in the description, the claims and the above drawings are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data used in this way are interchangeable where appropriate, so that the embodiments of the present invention described herein can, for example, be implemented in an order other than those illustrated or described herein. In addition, the terms "comprise" and "have" and any variations thereof are intended to cover a non-exclusive inclusion; for example, a process, method, system, product or device that contains a series of steps or units is not necessarily limited to the steps or units expressly listed, but may include other steps or units that are not expressly listed or are inherent to the process, method, product or device.
Fig. 1 is a schematic diagram of an embodiment of a display structure of an AR virtual model in the prior art. As shown in Fig. 1, in the existing display structure, when the AR scene containing a virtual model is presented to the user through an AR display device, the virtual model itself is usually taken as the first-person viewpoint. As shown in Fig. 1, at the center of the AR display interface 1 there is a virtual model 3 of the AR scene, and the virtual model 3 is a virtual character in the AR scene. In this existing display mode, whatever content or planes the AR scene contains, the virtual model is always shown at the center of the AR display interface 1. As a result, in the AR scene shown in Fig. 1 the real scene presented to the user is a desk 2, and the plane of the desktop of the desk 2 is not parallel to the plane of the AR display device. If the virtual model 3 is still shown at the center of the AR display interface 1, the virtual model 3 will appear to "float" or "hang" in front of the desk 2 in the AR scene seen by the user.
Fig. 2 is a schematic diagram of an embodiment of a display structure of an AR virtual model in the prior art. In the prior art shown in Fig. 2, the display structure of Fig. 1 has been improved to some extent: the virtual model is no longer shown at the center of the AR display interface 1 of the AR display device, but is placed on a specific plane in the AR display interface 1. For example, as shown in Fig. 2, in the AR scene of the AR display interface 1 the virtual model is no longer at the center of the AR scene, but lies on the plane of the desktop of the desk 2 in the AR scene. This improves the sense of immersion when the user views the AR scene.
However, when the display structure of the AR virtual model shown in Fig. 2 is used, how to control the movement of the virtual model still needs to be solved. In some games or applications the user can control an AR virtual model to move, flip and so on in the AR scene by operating the movement of controls on the interface of a mobile phone, which increases the user's interaction with the AR scene. But when the AR virtual model shown in Fig. 2 is displayed on a specific plane of the AR scene, and different AR scenes have different planes, the control method for AR virtual models provided in the prior art can still only move the model in the plane where the control operated by the user lies. This does not match the way the user actually intends to move the AR virtual model on the specific plane and cannot give the user a good sense of immersion in the AR scene. This is illustrated below with the display structures shown in Figs. 3 to 5, where Fig. 3, Fig. 4 and Fig. 5 are schematic display-structure diagrams of embodiments of a control method for an AR virtual model in the prior art. Specifically, in the display structure shown in Fig. 3, in the AR scene of the AR display interface 1 the virtual model is no longer at the center of the AR scene, but lies on the plane of the desktop of the desk 2 in the AR scene. When the user needs to control the virtual model to move in the AR scene, a control provided, for example, by the AR display interface 1 is needed to move the virtual model in different directions in the AR scene. The four possible moving directions the user may operate are shown in the figure; for example, if the control is the virtual model 3 itself, the user can move it in the four directions of up, down, left and right shown in Fig. 3 by touching the virtual model 3. More specifically, in Fig. 4, taking the user operating the virtual model 3 to move to the right as an example, when the user touches the virtual model 3 on the AR display interface 1 and slides to the right, the virtual model moves to the right following the sliding direction of the user's operation. As shown in Fig. 5, the virtual model 3 moves to the right in the direction indicated by the user. However, when the user interacts with the virtual model in the AR scene, the plane the user can operate on is usually only the plane of the AR display interface, while the virtual model may lie on a different plane in the AR scene; the existing interaction method for controlling a virtual model can still only move the virtual model in the plane of the AR display interface. In the display structure shown in Fig. 5 this causes the virtual model 3 not to move on the desktop of the desk 2 in its AR scene, but to translate within the AR display interface 1, giving the user the visual illusion that the virtual model is "drifting". This does not match the way the user actually intends to move the AR virtual model on a specific plane such as the desktop of the desk 2, and cannot give the user a good sense of immersion in the AR scene.
Therefore, to overcome the above technical problems in the prior art, the present invention determines, from the user's control command, the motion track of the first virtual model in the first plane on which it lies, and controls the first virtual model to move in the first plane along the motion track, so that the first virtual model moves within a plane of the first AR scene, thereby improving the immersion of the AR scene for the user while the AR virtual model is being controlled.
The technical solution of the present invention is described in detail below with specific embodiments. The following specific embodiments can be combined with each other, and the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 6 is a schematic flowchart of an embodiment of a control method for an AR virtual model according to the present invention. As shown in Fig. 6, the control method for an AR virtual model provided by this embodiment includes:
S101: obtaining a control command, where the control command is used to control a first virtual model in a first AR scene to move along a motion track.
S102: determining a first plane on which the first virtual model lies in the first AR scene.
S103: determining, according to the control command, the motion track of the first virtual model in the first plane.
S104: controlling the first virtual model to move in the first plane along the motion track.
In the application scenario of this embodiment, the executing entity of this method embodiment can be any AR display device, which can be any electronic device with an AR display function, such as a mobile phone, a tablet computer, a desktop computer or AR glasses. Through the AR display device the user can view the AR scene and the virtual models in it, and can control a virtual model to move in the AR scene by operating the movement of a target object, so that the user interacts with the AR scene by operating the target object.
Specifically, the AR display device acting as the executing entity of this embodiment first obtains, through S101, the control instruction used to control the first virtual model in the first AR scene to move along a motion track. Here the first AR scene is the AR scene the AR display device is showing through its display interface when the control command is obtained, and the first AR scene contains at least one virtual model; the virtual model that the control instruction can control is the first virtual model mentioned above. For example, in the embodiment shown in Fig. 7, the first AR scene includes the desk 2 being shown in the AR display interface 1 of the AR display device and the first virtual model 3 on the desk, which is a character. The control command obtained in S101 can be used to control the movement of the virtual model 3 in the AR scene shown in Fig. 7. Alternatively, before the control command is obtained, an instruction command may first be received, indicating that the virtual model 3 is to be moved according to the obtained control command.
Then, in S102, the AR display device determines the first plane on which the first virtual model lies in the first AR scene. For example, in the embodiment shown in Fig. 7, the plane on which the first virtual model 3 lies is determined to be the plane of the desktop of the desk 2 in the first AR scene. Optionally, a simultaneous localization and mapping (Simultaneous Localization and Mapping, SLAM) algorithm can be used in this step to determine the first plane on which the first virtual model lies in the first AR scene. Specifically, the planes present in the scene can be determined by performing image recognition on the video images acquired by the AR display device for the display interface 1 and identifying the planes contained in the images according to the SLAM algorithm; the specific image-processing implementation and algorithm are the same as in the prior art, their principles are not repeated here, and details not shown can be found in the common knowledge of the field.
Then, in S103, the AR display device determines, according to the control command obtained in S101, the motion track of the first virtual model in the first plane obtained in S102.
To implement S103 in this embodiment, optionally, the control command obtained in the aforementioned S101 may include a motion track of a target object in a second plane, where the second plane is the plane on which the target object lies. For example, Fig. 8 and Fig. 9 are schematic display-structure diagrams of embodiments of obtaining a control instruction according to the present invention. Fig. 8 and Fig. 9 show how the motion track of the target object in the second plane is obtained when the target object is a functional control 11 shown on the screen or the user's finger 12, where the second plane is the display interface 1 of the AR display device used to display the first AR scene.
Specifically, in the embodiment shown in Fig. 8, the target object is the functional control 11 shown on the screen, which the user can move in different directions by touching it. In the embodiment of Fig. 8, when the AR display device detects that the user operates the functional control 11 to move to the right, that is, moves the control 11 in the plane of the display interface 1, the motion track of the target object in the second plane is determined from the movement of the functional control 11 as the vector 31, which points from point A, where the virtual model 3 is located in the figure, toward point B. It should be noted that the vector 31 obtained here is the projection, at the virtual model, of the direction in which the functional control 11 actually moved; that is, the functional control 11 actually moved a certain distance to the right, and that distance, mapped to the position of the functional control 11 on the display interface 1, yields the above vector 31. In each embodiment of the present application a vector is used to describe the motion track quantitatively; the motion track could also be described in other ways, such as a straight line or a line segment, which this embodiment does not limit.
Alternatively, in the embodiment shown in Fig. 9, the target object is the user's finger 12. The user's finger starts from point A, where the virtual model 3 is located, and slides to point B. After determining the movement of the user's finger 12 through the touch sensor on the display interface 1, the AR display device obtains the vector 31 representing the motion track of the user's finger 12 in the second plane, pointing from point A, where the virtual model 3 is located in the figure, toward point B. Either of the embodiments shown in Fig. 8 and Fig. 9 can serve as the way of obtaining the motion track of the target object in the second plane in S101; the AR display device may implement either of the two, or may allow the user to obtain the motion track in the second plane both by operating the functional control 11 on the display interface 1 and with the user's finger 12.
Optionally, after the motion track of the target object in the second plane is obtained through S101 as in the above embodiments, S103 specifically includes: taking the motion track of the target object in the second plane as the motion track of the first virtual model in the first plane; and in the subsequent S104 the first virtual model is controlled to move in the first plane along the motion track. Here, the motion track in the second plane can be mapped to the first plane by way of a mapping, so as to obtain the motion track of the first virtual model in the first plane. That is, S103 may specifically include: determining the motion track of the target object in the second plane; and mapping the motion track of the target object in the second plane to the first plane to obtain the motion track of the first virtual model in the first plane.
The above processes S103 and S104 are illustrated below with Figs. 10 to 12, where Fig. 10, Fig. 11 and Fig. 12 are schematic display-structure diagrams of embodiments of the control method for an AR virtual model according to the present invention.
Specifically, the embodiment shown in Fig. 10 shows several possible motion tracks of the first virtual model in the first plane, which may be used to indicate, according to the control command obtained in S101, the motion track of the first virtual model 3 in the first plane 21 obtained in S102 on which the first virtual model 3 lies; the four directions of up, down, left and right on the desktop 21 in the figure are only examples.
More specifically, the embodiment shown in Fig. 11 shows the motion track of the first virtual model in the first plane determined by the AR display device from the motion track of the target object in the second plane shown in Fig. 8 or Fig. 9. The motion track of the target object in the second plane obtained by the AR display device in Fig. 8 or Fig. 9 is the rightward vector 31, and this vector lies in the second plane, the plane of the display screen 1. In Fig. 11, however, the first virtual model 3 lies in the first plane 21, and the obtained control command also instructs the first virtual model 3 to move in the first plane 21. Therefore, in S103 the AR display device needs to determine the motion track of the first virtual model in the first plane from the obtained motion track of the target object in the second plane, that is, to take the motion track of the target object in the second plane as the motion track of the first virtual model in the first plane. The motion track of the first virtual model 3 in the first plane 21 finally determined through S103 is the vector 32 along the first plane 21 shown in Fig. 11. This vector is not parallel to the rightward vector 31 in Fig. 8 and Fig. 9, but is a rightward vector in the first plane 21 where the desktop of the desk 2 lies: it is the vector 32 obtained by mapping the vector 31 of the motion track in the second plane shown in Fig. 8 and Fig. 9 to the first plane on which the first virtual model lies.
Finally, Fig. 12 shows the result obtained after the AR display device executes S104: the first virtual model 3 moves in the first plane, from its position in Fig. 11, along the vector 32 determined in Fig. 11, from the start point of the vector 32 to its end point, so that the first virtual model 3 is finally moved in the first plane 21 according to the control command. Optionally, a possible implementation of S104 in the above embodiment is to determine, according to the motion track, the position of the first virtual model 3 in the first plane 21 in each video frame of the first AR scene, and, after the positions of the first virtual model 3 in all video frames have been determined, to play each video frame of the first AR scene, where the position of the first virtual model 3 differs from frame to frame and changes gradually along the vector 32, as illustrated in the sketch below.
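As an illustration of this per-frame placement, the following minimal sketch (in Python with NumPy, with illustrative names) assumes the mapped track is a straight segment on the first plane and that the movement is spread evenly over a fixed number of video frames:

```python
import numpy as np

def positions_per_frame(track_start, track_end, num_frames):
    """Interpolate the model's position on the first plane for each video frame."""
    start = np.asarray(track_start, dtype=float)
    end = np.asarray(track_end, dtype=float)
    steps = max(num_frames - 1, 1)
    # The position changes a little from frame to frame, so playback shows a gradual move along the vector 32.
    return [start + (end - start) * (i / steps) for i in range(num_frames)]

# Example: move from the start of the mapped vector to its end over 30 frames (values are illustrative).
frames = positions_per_frame(track_start=[0.0, 0.0, 0.0], track_end=[0.2, 0.0, 0.1], num_frames=30)
```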
Optionally, in addition, in the above embodiments, Fig. 13 is a schematic display-structure diagram of an embodiment of obtaining a control instruction according to the present invention. In an embodiment as shown in Fig. 13, the motion track of the first virtual model 3 in the first plane 21 includes a direction and an angle of rotation of the first virtual model 3 in the first plane 21. For example, in Fig. 13 the control command is the arc-shaped vector 33 of the motion track of the target object in the second plane; after it is mapped to the first plane, the motion track of the first virtual model 3 in the first plane 21 is determined to be a rotation of the first virtual model 3 in the first plane 21 according to the rotation direction and rotation angle of the mapped vector. S104 then specifically further includes: controlling the first virtual model 3 to rotate in the first plane 21 according to the motion track. Specifically, in Fig. 13 the first virtual model 3 rotates counter-clockwise by 180 degrees according to the rotation direction and rotation angle of the vector of the motion track.
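The in-plane rotation described for Fig. 13 can be sketched as a rotation about the normal of the first plane through the model's pivot point, using Rodrigues' rotation formula. This is only an assumed reduction of the arc-shaped command to an axis and an angle; the helper names are illustrative:

```python
import numpy as np

def rotate_in_plane(vertices, plane_normal, angle_rad, pivot):
    """Rotate model vertices about the plane normal, through the model's pivot point."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    k = np.array([[0.0, -n[2], n[1]],
                  [n[2], 0.0, -n[0]],
                  [-n[1], n[0], 0.0]])          # skew-symmetric cross-product matrix of the axis
    r = np.eye(3) + np.sin(angle_rad) * k + (1.0 - np.cos(angle_rad)) * (k @ k)  # Rodrigues' formula
    v = np.asarray(vertices, dtype=float)
    p = np.asarray(pivot, dtype=float)
    return (v - p) @ r.T + p

# Example: the counter-clockwise 180-degree turn of Fig. 13, assuming a horizontal first plane (y up).
# rotated = rotate_in_plane(model_vertices, plane_normal=[0.0, 1.0, 0.0], angle_rad=np.pi, pivot=model_origin)
```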
Optionally, in the previous embodiments the motion track of the first virtual model was only illustrated as a straight line; similarly, the motion track of the first virtual model in the first plane can also be a curve. For example, Fig. 14 is a schematic display-structure diagram of an embodiment of obtaining a control instruction according to the present invention. The vector 34 shown in Fig. 14 is the motion track of the first virtual model 3 in the first plane 21 determined in S103 by the AR display device according to the control command, where the motion track of the target object in the second plane in the control command is an irregular curve, and the vector 34 obtained by mapping the motion track in the second plane to the first plane 21, that is, the motion track of the first virtual model 3 in the first plane 21, is also an irregular curve.
In summary, the control method for an AR virtual model provided by this embodiment can determine, according to the control command, the motion track of the first virtual model in the first plane on which it lies, and control the first virtual model to move in the first plane along the motion track, so that the first virtual model moves within a plane of the first AR scene, thereby improving the immersion of the AR scene for the user while the AR virtual model is being controlled.
Optionally, this embodiment also provides a possible implementation for mapping the vector 31 of the motion track of the target object in the second plane to the vector 32 obtained on the first plane on which the first virtual model lies: the motion track of the target object in the second plane is passed through a projection matrix mapping, a view matrix mapping and an imaging plane mapping to obtain the motion track of the first virtual model in the first plane. The method specifically includes: converting the two-dimensional vector coordinates of the motion track of the target object in the second plane into imaging-screen coordinates and projecting them onto a far plane and a near plane; using the projection matrix and the view matrix, connecting rays respectively from the start point and the end point of the two-dimensional vector between the far plane and the near plane of the imaging-screen coordinates; and determining the line where the rays coincide with the imaging plane as the motion track of the first virtual model in the first plane.
This is illustrated below with Figs. 15 and 16 as examples, where Fig. 15 is a schematic display-structure diagram of the virtual model according to the present invention and Fig. 16 is a schematic structural diagram of determining the motion track of the first virtual model in the first plane according to the present invention. Fig. 15 shows the display structure of the virtual model of the present invention: the first virtual model 3 and the imaging plane 43 of the first plane on which the first virtual model is to be displayed are mapped through the projection matrix and the view matrix to obtain the display structure shown in Fig. 15. After the projection matrix and the view matrix convert them into imaging-screen coordinates, the first virtual model 3 and the imaging plane 43 lie, in the imaging-screen position, between the far plane 41 and the near plane 42. The view matrix in this embodiment describes the position of the first virtual model 3 to be displayed relative to the observer, the position of the observer being the origin where the dashed lines intersect in the figure. The above view matrix can be obtained by regarding all the models in the world as one large model and multiplying the left side of all model matrices by a model matrix representing the transformation of the whole world. The projection matrix transforms the vertices in the view coordinate system onto a plane, mostly in the manner of a perspective projection simulated between the far plane 41 and the near plane 42 shown in Fig. 15, in which distant objects appear smaller and nearby objects appear larger. The display structure used in this embodiment is the same as in the prior art; for the definitions of the view matrix, the projection matrix and the imaging-screen coordinates, details not shown here can be found in the common knowledge of the field. This embodiment is mainly concerned with Fig. 16: first, the two-dimensional vector coordinates of the motion track of the target object in the second plane are converted into imaging-screen coordinates, yielding the vector 411 on the far plane 41 and the projected vector 421 on the near plane 42. Then rays can be connected, respectively, from the start point and the end point of the vectors 411 and 421 along the vectors, the start point of the rays being the observer's position in the figure above; the coincidence line 431 of the rays with the imaging plane 43 is the motion track of the first virtual model 3 in the imaging plane, that is, the motion track of the first virtual model 3 in the first plane 21.
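A minimal sketch of this mapping, written in Python with NumPy under common conventions (4x4 view and projection matrices, OpenGL-style NDC depth running from -1 at the near plane to +1 at the far plane), is given below. The helper names and the representation of the first plane by a point and a normal are assumptions for illustration, not part of the claimed method:

```python
import numpy as np

def unproject(screen_pt, depth_ndc, inv_view_proj, viewport_w, viewport_h):
    """Map a 2D screen point to a world-space point at the given NDC depth (-1 = near, +1 = far)."""
    ndc = np.array([2.0 * screen_pt[0] / viewport_w - 1.0,
                    1.0 - 2.0 * screen_pt[1] / viewport_h,   # flip y: screen origin is the top-left corner
                    depth_ndc,
                    1.0])
    world = inv_view_proj @ ndc
    return world[:3] / world[3]                               # perspective divide

def screen_point_to_plane(screen_pt, view, proj, viewport, plane_point, plane_normal):
    """Cast a ray through the touched screen point and intersect it with the first plane."""
    n = np.asarray(plane_normal, dtype=float)
    inv_vp = np.linalg.inv(proj @ view)
    p_near = unproject(screen_pt, -1.0, inv_vp, *viewport)    # projection of the point onto the near plane
    p_far = unproject(screen_pt, 1.0, inv_vp, *viewport)      # projection of the point onto the far plane
    ray_dir = p_far - p_near
    denom = n @ ray_dir
    if abs(denom) < 1e-6:                                     # ray parallel to the plane: no intersection
        return None
    t = n @ (np.asarray(plane_point, dtype=float) - p_near) / denom
    return p_near + t * ray_dir

def map_track_to_plane(track_2d, view, proj, viewport, plane_point, plane_normal):
    """Map each point of the screen-space motion track (second plane) onto the first plane."""
    return [screen_point_to_plane(p, view, proj, viewport, plane_point, plane_normal) for p in track_2d]
```

Applying this to the start point and the end point of the vector 31 gives the two end points of a vector on the first plane, corresponding to the coincidence line 431 and the vector 32 described above.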
Alternatively, this embodiment can also determine the motion track in the first plane in other ways, for example by establishing a mapping relation between the vectors of the motion track in the second plane and the vectors of the motion track in the first plane. For example, the four direction vectors shown in Fig. 3 are motion tracks of the target object in the second plane that the user may operate, and they can be mapped respectively to four vectors in the first plane as shown in Fig. 8; when a motion-track vector in the second plane as in Fig. 3 is detected, the mapping relation is looked up to determine the motion track in the first plane. The four vectors included in Fig. 8 are only examples; in practice there may be mapping relations for more vectors. The correspondence of the four directions can be determined according to the different directional features of the first plane; for example, in Fig. 8 the four edges of the desk in the first plane correspond to the four edges of the display screen in the second plane, while the edge-to-edge correspondence can be adjusted according to the angle between the first plane and the second plane. For instance, each of the four directions of the first plane should make an angle of less than 45 degrees with the corresponding direction of the second plane; when the angle is greater than 45 degrees, the mapping relation can be rotated and adjusted so that the corresponding four directions of the first plane and the four directions of the second plane differ by less than 45 degrees.
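One way to read this alternative, sketched below under that assumption (the candidate directions, the projection helper and the threshold handling are all illustrative), is as a lookup that picks the in-plane direction whose screen projection makes the smallest angle with the operated direction, which keeps the angle within 45 degrees whenever four evenly spaced candidates are available:

```python
import numpy as np

def pick_plane_direction(input_dir_2d, plane_dirs_3d, project_to_screen):
    """Choose the in-plane direction whose 2D screen projection is closest in angle to the input direction."""
    v = np.asarray(input_dir_2d, dtype=float)
    v = v / np.linalg.norm(v)
    best_dir, best_cos = None, -np.inf
    for d in plane_dirs_3d:
        p = np.asarray(project_to_screen(d), dtype=float)     # screen projection of the candidate direction
        p = p / np.linalg.norm(p)
        cos_angle = float(v @ p)                              # cosine of the angle between input and projection
        if cos_angle > best_cos:
            best_dir, best_cos = d, cos_angle
    return best_dir
```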
Further optionally, since the user may be moving at any time while using the AR display device, the AR display device can, for each frame of the displayed AR video images, determine in real time the first plane on which the first virtual model currently lies through the above S101 to S104, and determine and adjust at any time the motion track of the first virtual model in the current first plane.
Fig. 17 is a schematic flowchart of an embodiment of the control method for an AR virtual model according to the present invention. The embodiment shown in Fig. 17 provides a flow of an AR virtual model control method that combines the above embodiments, which includes: (1) the AR algorithm provides SLAM tracking capability, that is, the ability to recognize planes in the real environment and to track the actual position and pose of the current device camera, providing the technical basis for the AR scene role interaction method; (2) a virtual model is placed according to the plane information provided by the AR algorithm in the virtual rendering space, and the virtual game role starts to be controlled through operation input to play the AR game, the screen operation input including a game joystick, keys, screen touch, sound, phone vibration and so on; (3) the input controlling the change of the role's position and orientation is treated as a two-dimensional vector in screen space and passed into the script environment to be associated with the corresponding virtual role animation, where the two-dimensional vector is mapped through the projection matrix mapping, the view matrix mapping and the plane mapping so that it finally falls on the motion plane where the role actually lies; (4) step (3) yields the rotation and motion vector of the role in the real three-dimensional space, and the game script drives the virtual role frame by frame to move and rotate accordingly and plays the corresponding model animation.
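Steps (1) to (4) can be strung together into a simple per-frame driver, sketched below with the helpers from the earlier sketches. The `ar_session` and `model` objects and their methods are assumptions standing in for whichever SLAM layer and rendering engine is used; they are not an API of the invention:

```python
def drive_model(ar_session, model, touch_track_2d, num_frames=30):
    """Per frame: map the screen-space input onto the model's plane and step the model along it."""
    plane = ar_session.detected_plane()                  # (1) plane recognized by SLAM tracking
    view, proj, viewport = ar_session.camera()           # (1) current camera pose and projection
    track_3d = map_track_to_plane(touch_track_2d, view, proj, viewport,
                                  plane.point, plane.normal)   # (3) screen vector mapped onto the plane
    positions = positions_per_frame(track_3d[0], track_3d[-1], num_frames)
    for pos in positions:                                 # (4) drive the role frame by frame
        model.set_position(pos)
        model.play_walk_animation()                       # play the corresponding model animation
        ar_session.render_frame()
```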
Fig. 18 is a schematic structural diagram of an embodiment of a control device for an AR virtual model according to the present invention. As shown in Fig. 18, the control device 18 for an AR virtual model provided by this embodiment includes an obtaining module 1801, a processing module 1802 and a display module 1803. The obtaining module 1801 is configured to obtain a control command, where the control command is used to control a first virtual model in a first AR scene to move along a motion track; the processing module 1802 is configured to determine a first plane on which the first virtual model lies in the first AR scene, and is further configured to determine, according to the control command, the motion track of the first virtual model in the first plane; and the display module 1803 is configured to control the first virtual model to move in the first plane along the motion track.
The control device for an AR virtual model provided by this embodiment can be used to execute the control method for an AR virtual model of the embodiment shown in Fig. 6; its implementation and principles are the same and are not repeated here.
Optionally, in the above embodiment, the obtaining module 1801 is specifically configured to obtain a motion track of a target object in a second plane, where the second plane is the plane on which the target object lies; and the processing module 1802 is specifically configured to take the motion track of the target object in the second plane as the motion track of the first virtual model in the first plane.
Optionally, in the above embodiment, the processing module 1802 is specifically configured to determine the motion track of the target object in the second plane, and to map the motion track of the target object in the second plane to the first plane to obtain the motion track of the first virtual model in the first plane.
Optionally, in the above embodiment, the processing module 1802 is specifically configured to map the motion track of the target object in the second plane through a projection matrix mapping, a view matrix mapping and an imaging plane mapping to obtain the motion track of the first virtual model in the first plane.
Optionally, in the above embodiment, the processing module 1802 is specifically configured to: convert the two-dimensional vector coordinates of the motion track of the target object in the second plane into imaging-screen coordinates and project them onto a far plane and a near plane; use the projection matrix and the view matrix to connect rays respectively from the start point and the end point of the two-dimensional vector between the far plane and the near plane; and determine the line where the rays coincide with the imaging plane as the motion track of the first virtual model in the first plane.
Optionally, in the above embodiment, the processing module 1802 is specifically configured to determine the first plane on which the first virtual model lies in the first AR scene according to a simultaneous localization and mapping (SLAM) algorithm.
Optionally, in the above embodiment, the display module 1803 is specifically configured to determine, according to the motion track, the position of the first virtual model in the first plane in each video frame of the first AR scene, and to play each video frame of the first AR scene.
Optionally, in the above embodiment, the motion track of the first virtual model in the first plane includes a direction and an angle of rotation of the first virtual model in the first plane; and the display module 1803 is specifically configured to control the first virtual model to rotate in the first plane according to the motion track.
Optionally, in the above embodiment, the second plane is a display screen used to display the first AR scene, and the target object is the user's finger or a functional control on the display screen.
The control device for an AR virtual model provided by this embodiment can be used to execute the control method for an AR virtual model shown in the foregoing embodiments; its implementation and principles are the same and are not repeated here.
It should be noted that the division into modules in the embodiments of the present application is schematic and is only a division by logical function; there may be other division manners in actual implementation. The functional modules in the embodiments of the present application may be integrated into one processing module, or each module may exist physically alone, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module.
If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it can be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) or a processor to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk or an optical disc.
The above embodiments may be implemented in whole or in part by software, hardware, firmware or any combination thereof. When implemented by software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network or another programmable device. The computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server or data center to another website, computer, server or data center by wire (such as coaxial cable, optical fiber or digital subscriber line (DSL)) or wirelessly (such as infrared, radio or microwave). The computer-readable storage medium may be any usable medium accessible to the computer, or a data storage device such as a server or data center integrating one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, hard disk or magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)), and so on.
Fig. 19 is a schematic structural diagram of an embodiment of a control device for an AR virtual model according to the present invention. As shown in Fig. 19, the control device 19 for an AR virtual model provided by this embodiment includes a receiver 1901, a processor 1902 and an AR display 1903. The receiver 1901 is configured to obtain a control command, where the control command is used to control a first virtual model in a first AR scene to move along a motion track; the processor 1902 is configured to determine a first plane on which the first virtual model lies in the first AR scene, and is further configured to determine, according to the control command, the motion track of the first virtual model in the first plane; and the AR display 1903 is configured to control the first virtual model to move in the first plane along the motion track.
The control device for an AR virtual model provided by this embodiment can be used to execute the control method for an AR virtual model of the embodiment shown in Fig. 6; its implementation and principles are the same and are not repeated here.
Optionally, in the above embodiment, the receiver 1901 is specifically configured to obtain a motion track of a target object in a second plane, where the second plane is the plane on which the target object lies; and the processor 1902 is specifically configured to take the motion track of the target object in the second plane as the motion track of the first virtual model in the first plane.
Optionally, in the above embodiment, the processor 1902 is specifically configured to determine the motion track of the target object in the second plane, and to map the motion track of the target object in the second plane to the first plane to obtain the motion track of the first virtual model in the first plane.
Optionally, in the above embodiment, the processor 1902 is specifically configured to map the motion track of the target object in the second plane through a projection matrix mapping, a view matrix mapping and an imaging plane mapping to obtain the motion track of the first virtual model in the first plane.
Optionally, in the above embodiment, the processor 1902 is specifically configured to: convert the two-dimensional vector coordinates of the motion track of the target object in the second plane into imaging-screen coordinates and project them onto a far plane and a near plane; use the projection matrix and the view matrix to connect rays respectively from the start point and the end point of the two-dimensional vector between the far plane and the near plane; and determine the line where the rays coincide with the imaging plane as the motion track of the first virtual model in the first plane.
Optionally, in the above embodiment, the processor 1902 is specifically configured to determine the first plane on which the first virtual model lies in the first AR scene according to a simultaneous localization and mapping (SLAM) algorithm.
Optionally, in the above embodiment, the AR display 1903 is specifically configured to determine, according to the motion track, the position of the first virtual model in the first plane in each video frame of the first AR scene, and to play each video frame of the first AR scene.
Optionally, in the above embodiment, the motion track of the first virtual model in the first plane includes a direction and an angle of rotation of the first virtual model in the first plane; and the AR display 1903 is specifically configured to control the first virtual model to rotate in the first plane according to the motion track.
Optionally, in the above embodiment, the second plane is a display screen used to display the first AR scene, and the target object is the user's finger or a functional control on the display screen.
The control device for an AR virtual model provided by this embodiment can be used to execute the control method for an AR virtual model shown in the foregoing embodiments; its implementation and principles are the same and are not repeated here.
The present invention also provides an electronic-device-readable storage medium, including a program which, when run on an electronic device, causes the electronic device to execute the control method for an AR virtual model described in any of the above embodiments.
An embodiment of the present invention also provides an electronic device, comprising a processor and a memory for storing instructions executable by the processor, where the processor is configured to execute, via the executable instructions, the control method for an AR virtual model in any of the above embodiments.
An embodiment of the invention also provides a program product, which includes a computer program (execution instructions) stored in a readable storage medium. At least one processor of an encoding device can read the computer program from the readable storage medium, and the at least one processor executes the computer program so that the encoding device implements the control method for an AR virtual model provided by the foregoing embodiments.
The above is only a preferred embodiment of the present invention and is not intended to limit the present invention in any form. Any simple modification, equivalent change, or refinement made to the above embodiments in accordance with the technical spirit of the present invention still falls within the scope of the technical solutions of the present invention.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above method embodiments may be implemented by hardware related to program instructions. The aforementioned program may be stored in a computer-readable storage medium; when the program is executed, the steps of the above method embodiments are performed. The aforementioned storage medium includes various media capable of storing program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that they may still modify the technical solutions described in the foregoing embodiments or make equivalent replacements of some or all of the technical features; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.
Claims (20)
1. A control method for an augmented reality (AR) dummy model, characterized by comprising:
obtaining a control command, wherein the control command is used to control a first dummy model in a first AR scene to move according to a motion track;
determining a first plane in which the first dummy model is located in the first AR scene;
determining, according to the control command, the motion track of the first dummy model in the first plane; and
controlling the first dummy model to move in the first plane according to the motion track.
2. The method according to claim 1, characterized in that
the obtaining a control command comprises:
obtaining a motion track of a target object in a second plane, wherein the second plane is the plane in which the target object is located; and
the determining, according to the control command, the motion track of the first dummy model in the first plane comprises:
using the motion track of the target object in the second plane as the motion track of the first dummy model in the first plane.
3. The method according to claim 2, characterized in that the using the motion track of the target object in the second plane as the motion track of the first dummy model in the first plane comprises:
determining the motion track of the target object in the second plane; and
mapping the motion track of the target object in the second plane to the first plane to obtain the motion track of the first dummy model in the first plane.
4. The method according to claim 3, characterized in that the mapping the motion track of the target object in the second plane to the first plane to obtain the motion track of the first dummy model in the first plane comprises:
subjecting the motion track of the target object in the second plane to a projection-matrix mapping, a view-matrix mapping, and an imaging-plane mapping to obtain the motion track of the first dummy model in the first plane.
5. The method according to claim 4, characterized in that the subjecting the motion track of the target object in the second plane to the projection-matrix mapping, the view-matrix mapping, and the imaging-plane mapping to obtain the motion track of the first dummy model in the first plane comprises:
converting the two-dimensional vector coordinates of the motion track of the target object in the second plane into screen coordinates and projecting them onto a far plane and a near plane;
connecting rays between the far plane and the near plane from the start point and the end point of the two-dimensional vector, respectively, by using the projection matrix and the view matrix; and
determining the line where the rays coincide with the imaging plane as the motion track of the first dummy model in the first plane.
6. The method according to any one of claims 1-5, characterized in that the determining the first plane in which the first dummy model is located in the first AR scene comprises:
determining, according to a simultaneous localization and mapping (SLAM) algorithm, the first plane in which the first dummy model is located in the first AR scene.
7. The method according to any one of claims 1-5, characterized in that the controlling the first dummy model to move in the first plane according to the motion track comprises:
determining, according to the motion track, the position of the first dummy model in the first plane in each video frame of the first AR scene; and
playing each video frame of the first AR scene.
8. The method according to any one of claims 1-5, characterized in that the motion track of the first dummy model in the first plane includes a direction and an angle of rotation of the first dummy model in the first plane; and
the controlling the first dummy model to move in the first plane according to the motion track further comprises:
controlling the first dummy model to rotate in the first plane according to the motion track.
9. The method according to any one of claims 1-5, characterized in that
the second plane is a display screen that displays the first AR scene; and
the target object is the user's finger or a functional control on the display screen.
10. A control device for an augmented reality (AR) dummy model, characterized by comprising:
a receiver, configured to obtain a control command, wherein the control command is used to control a first dummy model in a first AR scene to move according to a motion track;
a processor, configured to determine a first plane in which the first dummy model is located in the first AR scene, the processor being further configured to determine, according to the control command, the motion track of the first dummy model in the first plane; and
an AR display, configured to control the first dummy model to move in the first plane according to the motion track.
11. The device according to claim 10, characterized in that
the receiver is specifically configured to obtain a motion track of a target object in a second plane, wherein the second plane is the plane in which the target object is located; and
the processor is specifically configured to use the motion track of the target object in the second plane as the motion track of the first dummy model in the first plane.
12. The device according to claim 11, characterized in that the processor is specifically configured to:
determine the motion track of the target object in the second plane; and
map the motion track of the target object in the second plane to the first plane to obtain the motion track of the first dummy model in the first plane.
13. The device according to claim 12, characterized in that the processor is specifically configured to:
subject the motion track of the target object in the second plane to a projection-matrix mapping, a view-matrix mapping, and an imaging-plane mapping to obtain the motion track of the first dummy model in the first plane.
14. The device according to claim 13, characterized in that the processor is specifically configured to:
convert the two-dimensional vector coordinates of the motion track of the target object in the second plane into screen coordinates and project them onto a far plane and a near plane;
connect rays between the far plane and the near plane from the start point and the end point of the two-dimensional vector, respectively, by using the projection matrix and the view matrix; and
determine the line where the rays coincide with the imaging plane as the motion track of the first dummy model in the first plane.
15. The device according to any one of claims 10-14, characterized in that the processor is specifically configured to determine, according to a simultaneous localization and mapping (SLAM) algorithm, the first plane in which the first dummy model is located in the first AR scene.
16. The device according to any one of claims 10-14, characterized in that the AR display is specifically configured to:
determine, according to the motion track, the position of the first dummy model in the first plane in each video frame of the first AR scene; and
play each video frame of the first AR scene.
17. The device according to any one of claims 10-14, characterized in that the motion track of the first dummy model in the first plane includes a direction and an angle of rotation of the first dummy model in the first plane; and
the AR display is specifically configured to control the first dummy model to rotate in the first plane according to the motion track.
18. The device according to any one of claims 10-14, characterized in that
the second plane is a display screen that displays the first AR scene; and
the target object is the user's finger or a functional control on the display screen.
19. A control device for an augmented reality (AR) dummy model, characterized by comprising: a processor, a memory, and a computer program; wherein the computer program is stored in the memory and is configured to be executed by the processor, and the computer program includes instructions for executing the method according to any one of claims 1-9.
20. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program, and the computer program causes a server to execute the method according to any one of claims 1-9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810993135.0A CN109189302B (en) | 2018-08-29 | 2018-08-29 | Control method and device of AR virtual model |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109189302A (en) | 2019-01-11 |
CN109189302B CN109189302B (en) | 2021-04-06 |
Family
ID=64917072
Family Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810993135.0A Active CN109189302B (en) | 2018-08-29 | 2018-08-29 | Control method and device of AR virtual model |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109189302B (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102915171A (en) * | 2011-08-04 | 2013-02-06 | 王振兴 | Moving trajectory generation method |
CN103996215A (en) * | 2013-11-05 | 2014-08-20 | 深圳市云立方信息科技有限公司 | Method and apparatus for realizing conversion from virtual view to three-dimensional view |
CN106886285A (en) * | 2017-01-20 | 2017-06-23 | 西安电子科技大学 | A kind of historical relic interactive system and operating method based on virtual reality |
CN107291266A (en) * | 2017-06-21 | 2017-10-24 | 腾讯科技(深圳)有限公司 | The method and apparatus that image is shown |
CN107564089A (en) * | 2017-08-10 | 2018-01-09 | 腾讯科技(深圳)有限公司 | Three dimensional image processing method, device, storage medium and computer equipment |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111984172A (en) * | 2020-07-15 | 2020-11-24 | 北京城市网邻信息技术有限公司 | Furniture moving method and device |
CN111984171A (en) * | 2020-07-15 | 2020-11-24 | 北京城市网邻信息技术有限公司 | Method and device for generating furniture movement track |
CN112148197A (en) * | 2020-09-23 | 2020-12-29 | 北京市商汤科技开发有限公司 | Augmented reality AR interaction method and device, electronic equipment and storage medium |
CN112337097A (en) * | 2020-10-27 | 2021-02-09 | 网易(杭州)网络有限公司 | Game simulation method and device |
WO2022088523A1 (en) * | 2020-11-02 | 2022-05-05 | 网易(杭州)网络有限公司 | Object moving method and apparatus, and storage medium and electronic apparatus |
CN112672185A (en) * | 2020-12-18 | 2021-04-16 | 脸萌有限公司 | Augmented reality-based display method, device, equipment and storage medium |
CN112672185B (en) * | 2020-12-18 | 2023-07-07 | 脸萌有限公司 | Augmented reality-based display method, device, equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN109189302B (en) | 2021-04-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11861059B2 (en) | System and method for generating a virtual reality scene based on individual asynchronous motion capture recordings | |
CN109189302A (en) | The control method and device of AR dummy model | |
JP6643357B2 (en) | Full spherical capture method | |
US9818228B2 (en) | Mixed reality social interaction | |
US11010958B2 (en) | Method and system for generating an image of a subject in a scene | |
US11282264B2 (en) | Virtual reality content display method and apparatus | |
US9829996B2 (en) | Operations in a three dimensional display system | |
US10762694B1 (en) | Shadows for inserted content | |
US11765335B2 (en) | Synthetic stereoscopic content capture | |
CN107211117A (en) | The second eyes viewport is synthesized using interweaving | |
US20140317575A1 (en) | Zero Parallax Drawing within a Three Dimensional Display | |
JP7353782B2 (en) | Information processing device, information processing method, and program | |
US20120114200A1 (en) | Addition of immersive interaction capabilities to otherwise unmodified 3d graphics applications | |
CN116057577A (en) | Map for augmented reality | |
US20170103562A1 (en) | Systems and methods for arranging scenes of animated content to stimulate three-dimensionality | |
KR101741149B1 (en) | Method and device for controlling a virtual camera's orientation | |
Ali et al. | 3D VIEW: Designing of a Deception from Distorted View-dependent Images and Explaining interaction with virtual World. | |
Ohta et al. | Photo-based Desktop Virtual Reality System Implemented on a Web-browser | |
CN111679806A (en) | Play control method and device, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |
| EE01 | Entry into force of recordation of patent licensing contract | Application publication date: 20190111; Assignee: Beijing Intellectual Property Management Co.,Ltd.; Assignor: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY Co.,Ltd.; Contract record no.: X2023110000098; Denomination of invention: Control Method and Device of AR Virtual Model; Granted publication date: 20210406; License type: Common License; Record date: 20230822 |