CN109189302B - Control method and device of AR virtual model - Google Patents

Control method and device of AR virtual model

Info

Publication number
CN109189302B
Authority
CN
China
Prior art keywords
plane
virtual model
moving track
scene
target object
Prior art date
Legal status: Active
Application number
CN201810993135.0A
Other languages
Chinese (zh)
Other versions
CN109189302A
Inventor
张岩 (Zhang Yan)
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201810993135.0A
Publication of CN109189302A
Application granted
Publication of CN109189302B

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/0484 — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0488 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/006 — Mixed reality (manipulating 3D models or images for computer graphics)

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a control method and apparatus for an AR virtual model. The method includes: acquiring a control command for controlling a first virtual model in a first AR scene to move along a movement trajectory; determining a first plane on which the first virtual model is located in the first AR scene; determining, according to the control command, the movement trajectory of the first virtual model on the first plane; and controlling the first virtual model to move on the first plane along the movement trajectory. Because the trajectory is determined on the plane where the model actually resides, the first virtual model moves on a plane within the first AR scene, improving the user's sense of immersion when the AR virtual model is controlled.

Description

Control method and device of AR virtual model
Technical Field
The invention relates to the technical field of augmented reality (AR), and in particular to a method and an apparatus for controlling an AR virtual model.
Background
Augmented reality (AR) is a technology that enhances a user's perception of the real world with information supplied by a computer system. By superimposing a computer-generated virtual model onto the real scene visible through an AR display device such as AR glasses, AR lets the user see the real scene and the virtual model in it at the same time, thereby augmenting the user's perception of reality.
In the prior art, in some games and applications, a user can control an AR virtual model to move, turn, and perform similar operations in an AR scene by operating controls on a mobile phone interface or other input devices, increasing the interaction between the user and the AR scene.
However, when a virtual model in an AR scene is displayed to the user in the prior art, the model itself is usually taken as a first-person view, or a fixed view of the AR scene is adopted. Under such a first-person or fixed view, even though the user can control the AR virtual model to move or turn, the model can only move relative to that fixed view. As a result, the AR virtual model the user sees may appear to be in a "floating" or "suspended" state in the AR scene; the movement does not match the way the user actually intends to operate the model, and no good sense of AR immersion is provided. How to improve the immersion effect of the AR scene when controlling the AR virtual model is therefore a pressing technical problem.
Disclosure of Invention
The invention provides a control method and apparatus for an AR virtual model that determine, according to a control command, the movement trajectory of a first virtual model on the first plane where the model is located, and control the first virtual model to move on that plane along the trajectory. The first virtual model thus moves on a plane within the first AR scene, improving the user's sense of immersion when the AR virtual model is controlled.
The first aspect of the present invention provides a method for controlling an AR virtual model, including:
acquiring a control command, wherein the control command is used for controlling a first virtual model in a first AR scene to move according to a moving track;
determining a first plane in which the first virtual model resides in the first AR scene;
determining the moving track of the first virtual model on the first plane according to the control command;
and controlling the first virtual model to move on the first plane according to the movement track.
In an embodiment of the first aspect of the present invention, the obtaining the control command includes:
acquiring a moving track of a target object on a second plane, wherein the second plane is a plane where the target object is located;
the determining the trajectory of the first virtual model in the first plane according to the control command includes:
and taking the moving track of the target object on the second plane as the moving track of the first virtual model on the first plane.
In an embodiment of the first aspect of the present invention, the taking the moving track of the target object in the second plane as the moving track of the first virtual model in the first plane includes:
determining a moving track of the target object on the second plane;
and mapping the moving track of the target object on the second plane to the first plane to obtain the moving track of the first virtual model on the first plane.
In an embodiment of the first aspect of the present invention, the mapping the moving track of the target object on the second plane to the first plane to obtain the moving track of the first virtual model on the first plane includes:
and after the moving track of the target object on the second plane is subjected to projection matrix mapping, view matrix mapping and imaging plane mapping, the moving track of the first virtual model on the first plane is obtained.
In an embodiment of the first aspect of the present invention, after the moving trajectory of the target object on the second plane is subjected to projection matrix mapping, view matrix mapping, and imaging plane mapping, obtaining the moving trajectory of the first virtual model on the first plane includes:
converting the two-dimensional vector coordinates of the moving track of the target object on the second plane into image screen coordinates and projecting the image screen coordinates onto a far plane and a near plane;
connecting rays between the far plane and the near plane along the starting point and the end point of the two-dimensional vector by using a projection matrix and a view matrix respectively;
and determining a coincidence line of the ray and the imaging plane as a moving track of the first virtual model in the first plane.
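The three steps above amount to standard screen-ray unprojection followed by a ray-plane intersection. The following is a sketch of the underlying mathematics, assuming an OpenGL-style clip space in which the near and far planes sit at NDC depths -1 and +1; the symbols below are this sketch's notation, not the patent's.

For a screen point $(x, y)$ in a viewport of size $w \times h$, the normalized device coordinates are

$$x_{ndc} = \frac{2x}{w} - 1, \qquad y_{ndc} = 1 - \frac{2y}{h}.$$

With projection matrix $P$ and view matrix $V$, the corresponding points on the near and far planes are recovered (after perspective division) as

$$\mathbf{p}_{near} = (PV)^{-1}(x_{ndc}, y_{ndc}, -1, 1)^{T}, \qquad \mathbf{p}_{far} = (PV)^{-1}(x_{ndc}, y_{ndc}, +1, 1)^{T}.$$

The ray through them has origin $\mathbf{o} = \mathbf{p}_{near}$ and direction $\mathbf{d} = \mathbf{p}_{far} - \mathbf{p}_{near}$; intersecting it with a first plane through point $\mathbf{p}_0$ with normal $\mathbf{n}$ gives

$$t = \frac{(\mathbf{p}_0 - \mathbf{o}) \cdot \mathbf{n}}{\mathbf{d} \cdot \mathbf{n}}, \qquad \mathbf{q} = \mathbf{o} + t\,\mathbf{d} \quad (\mathbf{d} \cdot \mathbf{n} \neq 0).$$

Applying this to the start point and the end point of the two-dimensional vector yields the two endpoints of the movement trajectory on the first plane.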
In an embodiment of the first aspect of the present invention, the determining a first plane in which the first virtual model is located in the first AR scene includes:
and determining a first plane of the first virtual model in the first AR scene according to a synchronous positioning and map reconstruction SLAM algorithm.
In an embodiment of the first aspect of the present invention, the controlling the first virtual model to move on the first plane according to the movement trajectory includes:
determining, according to the movement trajectory, the position of the first virtual model on the first plane in each frame of video image of the first AR scene; and
playing each frame of video image of the first AR scene.
In an embodiment of the first aspect of the present invention, a moving track of the first virtual model in the first plane includes a direction and an angle of rotation of the first virtual model in the first plane;
then the controlling the first virtual model to move on the first plane according to the movement trajectory further includes:
and controlling the first virtual model to rotate on the first plane according to the moving track.
In an embodiment of the first aspect of the present invention, the second plane is a display screen for displaying the first AR scene;
the target object is a user finger or a function control on the display screen.
A second aspect of the present invention provides an AR virtual model control apparatus, including:
the receiver is used for acquiring a control command, and the control command is used for controlling a first virtual model in a first AR scene to move according to a moving track;
a processor to determine a first plane in which the first virtual model resides in the first AR scene;
the processor is further used for determining a moving track of the first virtual model on the first plane according to the control command;
and the AR display is used for controlling the first virtual model to move on the first plane according to the moving track.
In an embodiment of the second aspect of the present invention, the receiver is specifically configured to obtain a moving track of a target object on a second plane, where the second plane is a plane where the target object is located;
the processor is specifically configured to use a movement trajectory of the target object in the second plane as a movement trajectory of the first virtual model in the first plane.
In an embodiment of the second aspect of the present invention, the processor is specifically configured to,
determining a moving track of the target object on the second plane;
and mapping the moving track of the target object on the second plane to the first plane to obtain the moving track of the first virtual model on the first plane.
In an embodiment of the second aspect of the present invention, the processor is specifically configured to,
and after the moving track of the target object on the second plane is subjected to projection matrix mapping, view matrix mapping and imaging plane mapping, the moving track of the first virtual model on the first plane is obtained.
In an embodiment of the second aspect of the present invention, the processor is specifically configured to,
converting the two-dimensional vector coordinates of the moving track of the target object on the second plane into image screen coordinates and projecting the image screen coordinates onto a far plane and a near plane;
connecting rays between the far plane and the near plane along the starting point and the end point of the two-dimensional vector by using a projection matrix and a view matrix respectively;
and determining a coincidence line of the ray and the imaging plane as a moving track of the first virtual model in the first plane.
In an embodiment of the second aspect of the present invention, the processor is specifically configured to determine, according to a simultaneous localization and mapping (SLAM) algorithm, a first plane in which the first virtual model is located in the first AR scene.
In an embodiment of the second aspect of the present invention, the AR display is specifically configured to:
determine, according to the movement trajectory, the position of the first virtual model on the first plane in each frame of video image of the first AR scene; and
play each frame of video image of the first AR scene.
In an embodiment of the second aspect of the present invention, the moving track of the first virtual model in the first plane includes the direction and angle of rotation of the first virtual model in the first plane;
the display is specifically configured to control the first virtual model to rotate on the first plane according to the movement trajectory.
In an embodiment of the second aspect of the present invention, the second plane is a display screen for displaying the first AR scene;
the target object is a user finger or a function control on the display screen.
In a third aspect, an embodiment of the present application provides a control apparatus for an AR virtual model, including: a processor and a memory; the memory is used for storing programs; the processor is configured to call a program stored in the memory to perform the method according to any one of the first aspect of the present application.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing program code, which when executed, performs the method according to any one of the first aspect of the present application.
In summary, the present invention provides a method and an apparatus for controlling an AR virtual model. The method includes: acquiring a control command for controlling a first virtual model in a first AR scene to move along a movement trajectory; determining a first plane on which the first virtual model is located in the first AR scene; determining, according to the control command, the movement trajectory of the first virtual model on the first plane; and controlling the first virtual model to move on the first plane along the trajectory. Because the trajectory is determined on the plane where the model actually resides, the first virtual model moves on a plane within the first AR scene, improving the user's sense of immersion when the AR virtual model is controlled.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a diagram illustrating an embodiment of a display structure of a prior art AR virtual model;
FIG. 2 is a diagram illustrating an embodiment of a display structure of a prior art AR virtual model;
FIG. 3 is a schematic diagram of a display structure of an embodiment of a control method for an AR virtual model in the prior art;
FIG. 4 is a schematic diagram of a display structure of an embodiment of a control method for an AR virtual model in the prior art;
FIG. 5 is a schematic diagram of a display structure of an embodiment of a control method for an AR virtual model in the prior art;
FIG. 6 is a flowchart illustrating an embodiment of a method for controlling an AR virtual model according to the present invention;
FIG. 7 is a diagram illustrating a display structure for acquiring a control command according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a display structure for acquiring a control command according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of a display structure for acquiring a control command according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of a display structure of an embodiment of a method for controlling an AR virtual model according to the present invention;
FIG. 11 is a schematic diagram of a display structure of an embodiment of a method for controlling an AR virtual model according to the present invention;
FIG. 12 is a schematic diagram of a display structure of an embodiment of a method for controlling an AR virtual model according to the present invention;
FIG. 13 is a diagram illustrating a display structure for acquiring control commands according to an embodiment of the present invention;
FIG. 14 is a diagram illustrating a display structure for acquiring control commands according to an embodiment of the present invention;
FIG. 15 is a schematic view of a display structure of a virtual model according to the present invention;
FIG. 16 is a schematic structural diagram illustrating a movement trajectory of a first virtual model in a first plane according to the present invention;
FIG. 17 is a flowchart illustrating a method for controlling an AR virtual model according to an embodiment of the present invention;
FIG. 18 is a schematic structural diagram of an embodiment of a control apparatus for an AR virtual model according to the present invention;
FIG. 19 is a schematic structural diagram of an embodiment of a control apparatus for an AR virtual model according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
FIG. 1 is a diagram illustrating an embodiment of a display structure of an AR virtual model in the prior art. As shown in fig. 1, when an AR scene containing a virtual model is presented to the user by an AR display device, the virtual model itself is usually taken as the first-person view. At the central location of the AR display interface 1 is the virtual model 3 of the AR scene, here a virtual character. Regardless of what content or which planes exist in the AR scene, the existing display simply places the virtual model at the center of the AR display interface 1. In the AR scene shown in fig. 1, the real scene presented to the user is a table 2, and the plane of the tabletop of the table 2 is not parallel to the plane of the AR display device. If the virtual model 3 is nonetheless displayed at the center of the AR display interface 1, the virtual model 3 appears to be in a "floating" or "suspended" state in front of the table 2 in the AR scene the user sees.
FIG. 2 is a diagram illustrating an embodiment of a display structure of an AR virtual model in the prior art. The prior art shown in fig. 2 improves the display structure of fig. 1 to a certain extent: the virtual model is no longer displayed at the central position of the AR display interface 1 of the AR display device, but is placed on a specific plane within the AR display interface 1. For example, as shown in fig. 2, in the AR scene of the AR display interface 1 the virtual model is located on the plane of the tabletop of the table 2 rather than at the center of the scene. This improves the user's sense of immersion when viewing the AR scene.
However, the display structure of the AR virtual model shown in fig. 2 still leaves the problem of how to control the movement of the virtual model. In some games and applications, the user controls the AR virtual model to move, turn, and so on in the AR scene by moving controls on the mobile phone interface, increasing the interaction between the user and the AR scene. But because the display of fig. 2 places the model on a specific plane, and the planes differ between AR scenes, the prior-art control method can only move the virtual model on the plane where the user-operated control is located; it cannot realize the movement the user actually intends on a specific plane, and cannot provide a good sense of AR immersion. This is illustrated with the display structures of fig. 3 to fig. 5, each of which is a schematic diagram of a display structure of an embodiment of a prior-art control method for an AR virtual model. In the display structure of fig. 3, the virtual model in the AR scene of the AR display interface 1 is located on the plane of the tabletop of the table 2 rather than at the center of the scene. When the user needs to move the virtual model in the AR scene, this is done, for example, by moving a control provided by the AR display interface 1 in different directions. Fig. 3 shows four possible moving directions of a control the user may operate; here the control is the virtual model 3 itself, and by touching the virtual model 3 the user can move it in the four directions up, down, left, and right shown in fig. 3. Taking the rightward movement in fig. 4 as an example, when the user touches the virtual model 3 on the AR display interface 1 and slides to the right, the virtual model moves to the right following the sliding direction. In fig. 5 the virtual model 3 has moved to the right as indicated by the user. However, when the virtual model interacts with the user, the only plane the user can operate on is usually the plane of the AR display interface, and when the virtual model lies on a different plane in the AR scene, the existing interaction can only move it within the plane of the AR display interface. In the display structure of fig. 5, the virtual model 3 therefore does not move on the tabletop of the table 2 on which it sits, but within the AR display interface 1, creating for the user the visual illusion that the model "drifts". This neither realizes the movement the user actually intends on a specific plane such as the tabletop of the table 2, nor provides a good sense of AR immersion.
Therefore, to overcome these problems of the prior art, the present invention determines, from the user's control command, the movement trajectory of the first virtual model on the first plane where the model is located, and controls the first virtual model to move on the first plane along that trajectory. The first virtual model thus moves on a plane within the first AR scene, improving the user's sense of immersion when the AR virtual model is controlled.
The technical solution of the present invention will be described in detail below with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 6 is a flowchart illustrating a control method of an AR virtual model according to an embodiment of the present invention. As shown in fig. 6, the method for controlling an AR virtual model provided in this embodiment includes:
s101: and acquiring a control command, wherein the control command is used for controlling the first virtual model in the first AR scene to move according to the moving track.
S102: a first plane in which the first virtual model resides in the first AR scene is determined.
S103: and determining the moving track of the first virtual model in the first plane according to the control command.
S104: and controlling the first virtual model to move on the first plane according to the moving track.
In the application scenario of this embodiment, the execution subject of the method may be any AR display device, i.e., any electronic device with an AR display function, such as a mobile phone or AR glasses. The user views the AR scene and the virtual model in it through the AR display device, and can control the virtual model to move in the AR scene by moving an operated target object, thereby interacting with the AR scene.
Specifically, the AR display apparatus, as the execution subject of this embodiment, first acquires through S101 a control command for controlling the first virtual model in the first AR scene to move along a movement trajectory. The first AR scene is the AR scene being displayed on the device's display interface when the control command is acquired; it contains at least one virtual model, and the virtual model controllable by the control command is the first virtual model. For example, in the embodiment shown in fig. 7, the first AR scene includes a table 2 being displayed in the AR display interface 1 of the AR display device, and the first virtual model 3 on the table is a character. The control command acquired in S101 may be used to control the movement of the virtual model 3 in the AR scene as shown in fig. 7; alternatively, an instruction command may be received before the control command, indicating that the virtual model 3 is to move according to the subsequently acquired control command.
Subsequently, in S102, the AR display device determines the first plane on which the first virtual model is located in the first AR scene. For example, in the embodiment shown in fig. 7, the plane on which the first virtual model 3 is located is determined to be the plane of the tabletop of the table 2 in the first AR scene. Optionally, this step may determine the first plane according to a simultaneous localization and mapping (SLAM) algorithm: the video image in the display interface 1 acquired by the AR display device is subjected to image recognition processing, and the planes contained in the image are determined by the SLAM algorithm. The concrete image processing and algorithm are the same as in the prior art and are not described in detail here; for details not shown, reference may be made to common general knowledge in the art. An illustrative sketch is given below.
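The patent does not name a concrete SLAM implementation. Purely as an illustration, the following Kotlin sketch uses ARCore, whose plane detection is SLAM-based, to find a tracked horizontal plane under a screen position such as the one where the model sits; the function name findFirstPlane is an assumption of this sketch.

```kotlin
import com.google.ar.core.Frame
import com.google.ar.core.Plane
import com.google.ar.core.TrackingState

// Illustrative sketch: the embodiment requires a SLAM-determined plane but
// names no library; ARCore's SLAM-backed hit test is one way to obtain it.
fun findFirstPlane(frame: Frame, screenX: Float, screenY: Float): Plane? =
    frame.hitTest(screenX, screenY)
        .mapNotNull { hit -> hit.trackable as? Plane }
        .firstOrNull { plane ->
            // e.g. the tabletop of fig. 7: a tracked, upward-facing surface
            plane.type == Plane.Type.HORIZONTAL_UPWARD_FACING &&
                plane.trackingState == TrackingState.TRACKING
        }
```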
Subsequently, in S103, the AR display device determines, according to the control command acquired in S101, the movement trajectory of the first virtual model on the first plane determined in S102.
To implement S103 in this embodiment, the control command acquired in S101 may optionally include a movement trajectory of a target object on a second plane, the second plane being the plane on which the target object is located. Fig. 8 and fig. 9 are schematic diagrams of display structures of embodiments of acquiring a control command according to the present invention. They show how the movement trajectory of the target object on the second plane is acquired when the target object is, respectively, the functionality control 11 on the display screen or the user's finger 12; the second plane is the display interface 1 on which the AR display apparatus displays the first AR scene.
Specifically, in the embodiment shown in fig. 8, the target object is the functionality control 11 on the display screen, which the user can move in different directions by touch. When the AR display apparatus detects that the user moves the functionality control 11 to the right, i.e., moves it within the plane of the display interface 1, it determines from this movement a vector 31 describing the movement trajectory of the target object on the second plane; in the figure, the vector 31 points from the point A where the virtual model 3 is located to a point B. Note that the vector 31 is the projection, at the virtual model, of the actual movement of the functionality control 11: the control actually moves rightward by a certain distance, and mapping that distance from the control's position on the display interface 1 yields the vector 31. The embodiments of the present application use vectors to describe movement trajectories quantitatively; other descriptions, such as straight lines or line segments, may equally be used, and this embodiment does not limit this.
Alternatively, in the embodiment shown in fig. 9, the target object is the user's finger 12, which slides from the point A where the virtual model 3 is located to a point B. The AR display device detects the motion of the finger 12 through the touch sensor of the display interface 1 and obtains a vector 31 representing the movement trajectory of the finger 12 on the second plane, pointing from the point A where the virtual model 3 is located to the point B. Either of the embodiments of fig. 8 and fig. 9 may serve as the way the movement trajectory of the target object on the second plane is acquired in S101; the AR display device may implement either one, or support both, letting the user produce the trajectory on the display interface 1 either by operating the functionality control 11 or with the finger 12. A minimal sketch of the touch-capture side is given below.
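A minimal sketch of the finger-driven case, using the standard Android MotionEvent API; the class name DragTracker and the choice of reporting the vector only when the finger lifts are assumptions of this sketch, not part of the patent.

```kotlin
import android.view.MotionEvent

// Assumed helper: accumulates one drag on the second plane (the display
// interface) and reports it as the 2D vector from touch-down A to lift B.
class DragTracker {
    private var startX = 0f
    private var startY = 0f

    /** Returns the screen-space drag vector (dx, dy) on ACTION_UP, else null. */
    fun onTouchEvent(event: MotionEvent): Pair<Float, Float>? =
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {
                startX = event.x
                startY = event.y
                null
            }
            MotionEvent.ACTION_UP ->
                Pair(event.x - startX, event.y - startY)   // the vector 31
            else -> null
        }
}
```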
Optionally, after the movement trajectory of the target object on the second plane has been acquired in S101 as above, S103 specifically includes: taking the movement trajectory of the target object on the second plane as the movement trajectory of the first virtual model on the first plane, and then controlling, in the following S104, the first virtual model to move on the first plane along that trajectory. The trajectory on the second plane can be mapped onto the first plane to obtain the trajectory of the first virtual model on the first plane. That is, S103 may specifically include: determining the movement trajectory of the target object on the second plane; and mapping that trajectory onto the first plane to obtain the movement trajectory of the first virtual model on the first plane.
The above processes S103 and S104 are illustrated in fig. 10 to fig. 12, each of which is a schematic diagram of a display structure of an embodiment of the control method of the AR virtual model according to the present invention.
Specifically, the embodiment shown in fig. 10 shows several possible movement trajectories of the first virtual model on the first plane. The control command acquired in S101 may indicate the movement trajectory of the first virtual model 3 on the first plane 21 determined in S102; the four directions up, down, left, and right on the tabletop plane 21 in the figure are only examples.
More specifically, the embodiment shown in fig. 11 shows the movement trajectory of the first virtual model on the first plane as determined by the AR display apparatus from the movement trajectory of the target object on the second plane of fig. 8 or fig. 9. The trajectory finally acquired by the AR display device in fig. 8 or fig. 9 is the rightward vector 31. That vector 31 is relative to the second plane of the display screen 1, whereas the first virtual model 3 in fig. 11 lies on the first plane 21, and the acquired control command instructs the first virtual model 3 to move on the first plane 21. In S103 the AR display apparatus therefore determines the trajectory of the first virtual model on the first plane from the acquired trajectory of the target object on the second plane, i.e., takes the latter as the former. The movement trajectory of the first virtual model 3 on the first plane 21 finally determined through S103 is, as shown in fig. 11, a vector 32 along the first plane 21: it is not parallel to the rightward vector 31 of fig. 8 and fig. 9, but is a rightward vector within the first plane 21 of the tabletop of the table 2. It is obtained by mapping the vector 31 of the second-plane trajectory of fig. 8 and fig. 9 onto the first plane where the first virtual model is located.
Finally, fig. 12 shows the result after the AR display device executes S104: the first virtual model 3 moves on the first plane from its position in fig. 11, from the start of the vector 32 determined in fig. 11 to its end, so that the first virtual model 3 moves on the first plane 21 according to the control command. Optionally, one possible implementation of S104 is to determine, according to the movement trajectory, the position of the first virtual model 3 on the first plane 21 in each frame of video image of the first AR scene, and, after the positions in all frames have been determined, to play each frame of video image of the first AR scene; the position of the first virtual model 3 differs from frame to frame and advances gradually along the vector 32, as sketched below.
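A sketch of this per-frame placement, with an assumed minimal vector type; the linear interpolation along the mapped vector is the substance, everything else is scaffolding.

```kotlin
// Assumed minimal 3D vector type for the sketch.
data class Vec3(val x: Float, val y: Float, val z: Float) {
    operator fun plus(o: Vec3) = Vec3(x + o.x, y + o.y, z + o.z)
    operator fun times(s: Float) = Vec3(x * s, y * s, z * s)
}

// Position of the model on the first plane in each frame of the first AR
// scene: the model advances linearly along the mapped vector 32 (S104).
fun positionsPerFrame(start: Vec3, vector32: Vec3, frameCount: Int): List<Vec3> =
    (0 until frameCount).map { i ->
        val t = if (frameCount <= 1) 1f else i.toFloat() / (frameCount - 1)
        start + vector32 * t
    }
```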
In addition, optionally, fig. 13 is a schematic diagram of a display structure of an embodiment of acquiring a control command according to the present invention. In the embodiment shown in fig. 13, the movement trajectory of the first virtual model 3 on the first plane 21 includes a direction and an angle of rotation of the first virtual model 3 on the first plane 21. In fig. 13 the control command carries an arc-shaped vector 33 as the movement trajectory of the target object on the second plane; after mapping onto the first plane, the movement trajectory of the first virtual model 3 on the first plane 21 is determined to be a rotation according to the direction and angle of the mapped vector. S104 then further includes: controlling the first virtual model 3 to rotate on the first plane 21 according to the movement trajectory. Specifically, in fig. 13 the first virtual model 3 is rotated counterclockwise by 180 degrees according to the direction and angle of the trajectory vector; a sketch of deriving such an angle is given after this paragraph.
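A sketch of extracting a rotation direction and angle from an arc-shaped drag, under the assumption that the drag direction at the start and at the end of the arc are available as 2D vectors; note that on Android screens the y axis grows downward, so the sign convention may need flipping.

```kotlin
import kotlin.math.atan2

// Signed angle (degrees) between the drag direction at the start of the arc
// and at its end; positive = counterclockwise in a y-up coordinate system.
fun signedRotationDegrees(fromX: Float, fromY: Float, toX: Float, toY: Float): Float {
    val cross = fromX * toY - fromY * toX   // sin component
    val dot = fromX * toX + fromY * toY     // cos component
    return Math.toDegrees(atan2(cross, dot).toDouble()).toFloat()
}
```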
Alternatively, the foregoing embodiments use a straight-line trajectory of the first virtual model only as an illustrative example; the movement trajectory of the first virtual model on the first plane may equally be a curve. For example, fig. 14 is a schematic diagram of a display structure of an embodiment of acquiring a control command according to the present invention. The vector 34 shown in fig. 14 is the movement trajectory of the first virtual model 3 on the first plane 21 determined by the AR display device from the control command in S103: here the trajectory of the target object on the second plane is an irregular curve, and mapping it onto the first plane 21 yields the vector 34, likewise an irregular curve, as the trajectory of the first virtual model 3 on the first plane 21.
In summary, the control method for the AR virtual model provided in this embodiment can determine, according to the control command, the movement trajectory of the first virtual model in the first plane where the first virtual model is located, and control the first virtual model to move in the first plane according to the movement trajectory, so that the first virtual model can move on the plane in the first AR scene, and the immersion effect of the AR scene on the user is improved when the AR virtual model is controlled.
Optionally, this embodiment further provides a possible implementation of mapping the vector 31 of the target object's trajectory on the second plane to the vector 32 on the first plane where the first virtual model is located: the movement trajectory of the first virtual model on the first plane is obtained after the trajectory of the target object on the second plane undergoes projection matrix mapping, view matrix mapping, and imaging plane mapping. Specifically, this includes: converting the two-dimensional vector coordinates of the target object's trajectory on the second plane into image screen coordinates and projecting them onto a far plane and a near plane; using the projection matrix and the view matrix, connecting rays between the far plane and the near plane through the start point and the end point of the two-dimensional vector, respectively; and determining the line where the rays meet the imaging plane as the movement trajectory of the first virtual model on the first plane.
This is explained with fig. 15 and fig. 16, where fig. 15 is a schematic diagram of a display structure of the virtual model according to the present invention and fig. 16 is a schematic diagram of determining the movement trajectory of the first virtual model on the first plane according to the present invention. Fig. 15 shows the display structure of the virtual model of the present invention: the first virtual model 3 to be displayed and the imaging plane 43 of the first plane on which it is located are mapped by the projection matrix and the view matrix to obtain the structure shown in fig. 15. After the first virtual model 3 and the imaging plane 43 are converted into image screen coordinates through the projection matrix and the view matrix, their image-screen positions lie between the far plane 41 and the near plane 42. The view matrix in this embodiment encodes the position of the first virtual model 3 relative to the observer, the observer being placed at the origin of the view coordinate system (the intersection of the dashed lines in the figure). Conceptually, all models in the world may be regarded as one large model; left-multiplying every model matrix by a matrix representing the transformation of the whole world yields the view matrix. The projection matrix converts vertices in the view coordinate system onto a plane; with perspective projection as in fig. 15, objects between the far plane 41 and the near plane 42 appear smaller when far and larger when near. The display structure used in this embodiment is the same as in the prior art, and for the definitions of the view matrix, the projection matrix, and the screen coordinates not illustrated here, reference may be made to common general knowledge in the art. This embodiment focuses on fig. 16. First, the two-dimensional vector coordinates of the target object's trajectory on the second plane are converted into image screen coordinates, giving a vector 411 on the far plane 41 and a vector 421 projected onto the near plane 42. Rays can then be connected through the vector 411 and the vector 421 at the start point and the end point of the vector, respectively, with the observer's position as the ray origin; the line 431 where the rays meet the imaging plane 43 is the movement trajectory of the first virtual model 3 on the imaging plane, i.e., on the first plane 21. A code sketch of this unprojection follows.
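The following Kotlin sketch puts the fig. 16 pipeline together using android.opengl.Matrix (column-major, OpenGL conventions); the function name screenPointToPlane and the representation of the first plane by a point and a normal are assumptions of this sketch, not the patent's. Applying it to the start point A and the end point B of the drag vector 31 yields the two endpoints of the vector 32 on the first plane.

```kotlin
import android.opengl.Matrix

// Sketch of fig. 16: unproject one screen point onto the near and far planes,
// connect the ray, and intersect it with the first plane (point + normal).
fun screenPointToPlane(
    xPx: Float, yPx: Float, viewportW: Int, viewportH: Int,
    projection: FloatArray, view: FloatArray,        // 4x4, column-major
    planePoint: FloatArray, planeNormal: FloatArray  // first plane, world space
): FloatArray? {
    val pv = FloatArray(16)
    val inv = FloatArray(16)
    Matrix.multiplyMM(pv, 0, projection, 0, view, 0)
    if (!Matrix.invertM(inv, 0, pv, 0)) return null

    // screen pixels -> normalized device coordinates (y axis flipped)
    val xNdc = 2f * xPx / viewportW - 1f
    val yNdc = 1f - 2f * yPx / viewportH

    fun unproject(zNdc: Float): FloatArray {
        val out = FloatArray(4)
        Matrix.multiplyMV(out, 0, inv, 0, floatArrayOf(xNdc, yNdc, zNdc, 1f), 0)
        return floatArrayOf(out[0] / out[3], out[1] / out[3], out[2] / out[3])
    }
    val near = unproject(-1f)   // point on the near plane 42
    val far = unproject(1f)     // point on the far plane 41

    // ray from the near-plane point toward the far-plane point
    val dir = floatArrayOf(far[0] - near[0], far[1] - near[1], far[2] - near[2])
    val denom = dir[0] * planeNormal[0] + dir[1] * planeNormal[1] + dir[2] * planeNormal[2]
    if (denom == 0f) return null            // ray parallel to the first plane
    val t = ((planePoint[0] - near[0]) * planeNormal[0] +
             (planePoint[1] - near[1]) * planeNormal[1] +
             (planePoint[2] - near[2]) * planeNormal[2]) / denom
    return floatArrayOf(near[0] + t * dir[0],
                        near[1] + t * dir[1],
                        near[2] + t * dir[2])
}
```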
Alternatively, this embodiment may determine the trajectory on the first plane in another way, for example by establishing a mapping between vectors of second-plane trajectories and vectors of first-plane trajectories. For example, the vectors in the four directions shown in fig. 3 are trajectories the user may produce on the second plane, and each of the four may be mapped to one of the four first-plane vectors shown in fig. 10. When a second-plane trajectory vector as in fig. 3 is detected, the mapping is queried to obtain the first-plane trajectory. The four vectors are only an example; a mapping over more vectors may exist in practice. The correspondence between the four directions may be fixed according to directional features of the first plane; for example, the four sides of the table holding the first plane may correspond to the four sides of the display screen holding the second plane. The correspondence may also be adjusted according to the angle between the first plane and the second plane: the angle between each first-plane direction and its corresponding second-plane direction should be less than 45 degrees, and when it exceeds 45 degrees the correspondence is rotated accordingly so that it again falls below 45 degrees. A sketch of this lookup-table approach follows.
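A sketch of the lookup-table variant under assumed representations (an enum for the four screen directions, first-plane vectors stored as float triples); snapping a free-form drag to the nearest of the four screen directions is this sketch's simplification, not a step the patent prescribes.

```kotlin
import kotlin.math.abs

// Four screen drag directions on the second plane.
enum class ScreenDir { UP, DOWN, LEFT, RIGHT }

// Snap a drag vector (dx, dy) to the nearest of the four directions.
// Note: screen y grows downward, hence DOWN for positive dy.
fun snapToScreenDir(dx: Float, dy: Float): ScreenDir =
    if (abs(dx) >= abs(dy)) {
        if (dx >= 0f) ScreenDir.RIGHT else ScreenDir.LEFT
    } else {
        if (dy >= 0f) ScreenDir.DOWN else ScreenDir.UP
    }

// Query the pre-established mapping from screen directions to vectors on the
// first plane (e.g. RIGHT -> the rightward vector along the tabletop).
fun planeVectorFor(dir: ScreenDir, mapping: Map<ScreenDir, FloatArray>): FloatArray? =
    mapping[dir]
```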
Further, since the user may be moving at any time while using the AR display apparatus, the apparatus may, for each frame of the AR video image it displays, determine the first plane of the first virtual model in real time through S101 to S104 above, and adjust the model's movement trajectory on the current first plane accordingly.
Fig. 17 is a flowchart of a control method of an AR virtual model according to an embodiment of the present invention. The embodiment shown in fig. 17 gives a flow combining the AR virtual model control methods of the foregoing embodiments: (1) The AR algorithm provides SLAM tracking capability, i.e., recognition of planes in the real environment and tracking of the real position and pose of the current device camera, giving the technical basis for AR-scene character interaction. (2) Using the plane information provided by the AR algorithm, a virtual model is placed in the virtual rendering space, and control of the virtual game character in the AR game begins with operation input; screen operation input includes a game joystick, keys, screen touch, sound, phone vibration, and so on. (3) The input controlling the change of the character's position and angle is equivalent to a two-dimensional vector in screen space, which is passed to the script environment and associated with the corresponding virtual-character animation; the two-dimensional vector finally lands on the movement plane of the actual character through projection matrix mapping, view matrix mapping, and plane mapping. (4) The rotation and motion vectors of the character in real three-dimensional space are obtained, and the game script drives the virtual character, frame by frame, to move, rotate, and play the corresponding model animation.
Fig. 18 is a schematic structural diagram of a control apparatus of an AR virtual model according to an embodiment of the present invention. As shown in fig. 18, the control device 18 of the AR virtual model according to the present embodiment includes: an acquisition module 1801, a processing module 1802, and a display module 1803. The obtaining module 1801 is configured to obtain a control command, where the control command is used to control a first virtual model in a first AR scene to move according to a movement trajectory; the processing module 1802 is configured to determine a first plane in which the first virtual model is located in the first AR scene; the processing module 1802 is further configured to determine a moving trajectory of the first virtual model in the first plane according to the control command; the display module 1803 is configured to control the first virtual model to move in the first plane according to the movement track.
The control apparatus of the AR virtual model provided in this embodiment may be used to execute the control method of the AR virtual model according to the embodiment shown in fig. 6, and the specific implementation manner and principle thereof are the same and will not be described again.
Optionally, in the foregoing embodiment, the obtaining module 1801 is specifically configured to obtain a moving trajectory of the target object on a second plane, where the second plane is a plane where the target object is located; the processing module 1802 is specifically configured to use a moving trajectory of the target object in the second plane as a moving trajectory of the first virtual model in the first plane.
Optionally, in the foregoing embodiment, the processing module 1802 is specifically configured to determine a moving track of the target object on the second plane; and mapping the moving track of the target object on the second plane to the first plane to obtain the moving track of the first virtual model on the first plane.
Optionally, in the foregoing embodiment, the processing module 1802 is specifically configured to obtain a moving trajectory of the first virtual model in the first plane after the moving trajectory of the target object in the second plane is subjected to projection matrix mapping, view matrix mapping, and imaging plane mapping.
Optionally, in the above embodiment, the processing module 1802 is specifically configured to convert the two-dimensional vector coordinates of the movement trajectory of the target object on the second plane into image screen coordinates and project the image screen coordinates onto the far plane and the near plane; connecting rays between the far plane and the near plane along the starting point and the end point of the two-dimensional vector by using a projection matrix and a view matrix respectively; and determining a coincidence line of the ray and the imaging plane as a moving track of the first virtual model in the first plane.
Optionally, in the above embodiment, the processing module 1802 is specifically configured to determine, according to a simultaneous localization and mapping (SLAM) algorithm, the first plane in which the first virtual model is located in the first AR scene.
Optionally, in the foregoing embodiment, the display module 1803 is specifically configured to determine, according to the movement track, a position of the first virtual model in each frame of the video image of the first AR scene on the first plane; each frame of video image of the first AR scene is played.
Optionally, in the above embodiment, the moving track of the first virtual model in the first plane includes the direction and angle of rotation of the first virtual model in the first plane; the display module 1803 is specifically configured to control the first virtual model to rotate on the first plane according to the moving track.
Optionally, in the above embodiment, the second plane is a display screen for displaying the first AR scene; the target object is a user finger or a function control on the display screen.
The control apparatus of the AR virtual model provided in this embodiment may be used to execute the control method of the AR virtual model shown in the foregoing embodiments, and the specific implementation manner and principle thereof are the same and will not be described again.
It should be noted that, in the embodiment of the present application, the division of the module is schematic, and is only one logic function division, and there may be another division manner in actual implementation. Each functional module in the embodiments of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing over the prior art, may be embodied wholly or partly in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
In the above embodiments, the implementation may be realized wholly or partly by software, hardware, firmware, or any combination thereof. When software is used, the implementation may take the form, wholly or partly, of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced wholly or partly. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example from one website, computer, server, or data center to another by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk (SSD)), among others.
FIG. 19 is a schematic structural diagram of an embodiment of a control apparatus for an AR virtual model according to the present invention. As shown in fig. 19, the control device 19 of the AR virtual model according to the present embodiment includes: a receiver 1901, a processor 1902, and an AR display 1903. The receiver 1901 is configured to obtain a control command, where the control command is used to control a first virtual model in a first AR scene to move according to a movement trajectory; the processor 1902 is configured to determine a first plane in which the first virtual model resides in the first AR scene; the processor 1902 is further configured to determine a moving trajectory of the first virtual model in the first plane according to the control command; the AR display 1903 is used to control the first virtual model to move in the first plane according to the movement trajectory.
The control apparatus of the AR virtual model provided in this embodiment may be used to execute the control method of the AR virtual model of the embodiment shown in FIG. 6; its implementation and principle are the same and are not repeated here.
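As a rough illustration of how these three components cooperate, the following minimal Python sketch (all class, method, and parameter names here are illustrative assumptions, not taken from the patent or from any real AR SDK) shows a receiver handing a 2D track to a processor that maps it onto the first plane, and a display stepping the model along the result:

    class ARModelController:
        # Illustrative receiver -> processor -> display pipeline; map_to_plane
        # stands in for the projection/view/imaging-plane mapping detailed below.
        def __init__(self, plane_point, plane_normal, map_to_plane):
            self.plane_point = plane_point      # first plane, e.g. found via SLAM
            self.plane_normal = plane_normal
            self.map_to_plane = map_to_plane    # 2D screen point -> 3D plane point

        def on_control_command(self, screen_track):
            # Receiver role: accept the 2D moving track of the target object.
            # Processor role: map each 2D point onto the first plane.
            world_track = [self.map_to_plane(p, self.plane_point, self.plane_normal)
                           for p in screen_track]
            # Display role: move the model along the mapped track, point by point.
            for position in world_track:
                self.render_model_at(position)

        def render_model_at(self, position):
            print("model at", position)         # placeholder for real AR rendering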
Optionally, in the foregoing embodiment, the receiver 1901 is specifically configured to acquire a moving track of the target object on a second plane, where the second plane is the plane where the target object is located; the processor 1902 is specifically configured to use the moving track of the target object in the second plane as the moving track of the first virtual model in the first plane.
Optionally, in the above embodiment, the processor 1902 is specifically configured to determine the moving track of the target object in the second plane, and to map the moving track of the target object in the second plane onto the first plane to obtain the moving track of the first virtual model in the first plane.
Optionally, in the foregoing embodiment, the processor 1902 is specifically configured to obtain the moving track of the first virtual model in the first plane by subjecting the moving track of the target object in the second plane to projection matrix mapping, view matrix mapping, and imaging plane mapping.
Optionally, in the above embodiment, the processor 1902 is specifically configured to: convert the two-dimensional vector coordinates of the moving track of the target object in the second plane into image screen coordinates and project them onto the far plane and the near plane; cast rays between the far plane and the near plane from the start point and the end point of the two-dimensional vector, using the projection matrix and the view matrix; and determine the line where the rays intersect the imaging plane as the moving track of the first virtual model in the first plane.
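As one way to make this concrete, the sketch below (a minimal Python illustration assuming an OpenGL-style clip volume with NDC z in [-1, 1] and column-vector matrices; the function names are ours, not the patent's) unprojects a screen point onto the near and far planes with the inverse of the combined projection and view matrices, then intersects the resulting ray with the plane the model sits on:

    import numpy as np

    def unproject(screen_pt, viewport, view, proj):
        # Pixel coordinates -> normalized device coordinates (NDC).
        x = 2.0 * screen_pt[0] / viewport[0] - 1.0
        y = 1.0 - 2.0 * screen_pt[1] / viewport[1]   # screen y grows downward
        inv = np.linalg.inv(proj @ view)             # undo projection, then view

        def to_world(ndc_z):
            p = inv @ np.array([x, y, ndc_z, 1.0])
            return p[:3] / p[3]                      # perspective divide

        # Points on the near plane (z = -1) and far plane (z = +1).
        return to_world(-1.0), to_world(1.0)

    def ray_plane_hit(near_pt, far_pt, plane_point, plane_normal):
        # Intersect the near->far ray with the plane the model sits on;
        # returns None when the ray is parallel to the plane.
        d = far_pt - near_pt
        denom = plane_normal @ d
        if abs(denom) < 1e-9:
            return None
        t = plane_normal @ (plane_point - near_pt) / denom
        return near_pt + t * d

Applying this to the start point and the end point of the two-dimensional vector yields the two endpoints of the model's moving track on the first plane.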
Optionally, in the above embodiment, the processor 1902 is specifically configured to determine, according to a simultaneous localization and mapping (SLAM) algorithm, the first plane in which the first virtual model is located in the first AR scene.
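The patent does not spell out the SLAM internals. As one hedged illustration of the final step only, once a SLAM system has produced 3D feature points lying on a surface, the supporting plane can be estimated by a least-squares fit such as the following (a sketch assuming the points arrive as an (N, 3) numpy array):

    import numpy as np

    def fit_plane(points):
        # Least-squares plane through a cloud of SLAM feature points: the
        # plane normal is the singular vector with the smallest singular value.
        centroid = points.mean(axis=0)
        _, _, vt = np.linalg.svd(points - centroid)
        normal = vt[-1] / np.linalg.norm(vt[-1])
        return centroid, normal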
Optionally, in the above embodiment, the AR display 1903 is specifically configured to determine, according to the movement track, the position of the first virtual model on the first plane in each frame of video image of the first AR scene, and to play each frame of video image of the first AR scene.
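One plausible way to realize this per-frame placement (again a sketch under our own assumptions rather than the patent's code) is to interpolate the model's anchor along the mapped track according to the frame index and render the model there before each frame is played:

    def position_at_frame(track, frame_idx, total_frames):
        # Linearly interpolate along `track` (a list of 3D numpy points on
        # the first plane) for video frame `frame_idx` of `total_frames`.
        if len(track) < 2:
            return track[0]
        t = frame_idx / max(total_frames - 1, 1)   # progress in [0, 1]
        seg = t * (len(track) - 1)                 # fractional segment index
        i = min(int(seg), len(track) - 2)
        frac = seg - i
        return (1.0 - frac) * track[i] + frac * track[i + 1]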
Optionally, in the above embodiment, the moving track of the first virtual model in the first plane includes the direction and angle of rotation of the first virtual model in the first plane;
the AR display 1903 is specifically configured to control the first virtual model to rotate on the first plane according to the movement trajectory.
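Where the track encodes a rotation, one simple convention (an assumption of this sketch; the patent does not fix a particular mapping) is to read the direction of rotation from the sign of the horizontal drag and the angle from its length:

    def drag_to_rotation(start, end, degrees_per_pixel=0.5):
        # Map a 2D drag on the screen to a signed rotation angle about the
        # first plane's normal: the sign selects the rotation direction and
        # degrees_per_pixel is an assumed tuning constant, not a patent value.
        return (end[0] - start[0]) * degrees_per_pixel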
Optionally, in the above embodiment, the second plane is a display screen for displaying the first AR scene; the target object is a user's finger or a function control on the display screen.
The control apparatus of the AR virtual model provided in this embodiment may be used to execute the control method of the AR virtual model shown in the foregoing embodiments; its implementation and principle are the same and are not repeated here.
The present invention also provides an electronic-device-readable storage medium including a program that, when run on an electronic device, causes the electronic device to execute the control method of the AR virtual model according to any one of the above embodiments.
An embodiment of the present invention further provides an electronic device, including: a processor; and a memory for storing executable instructions for the processor; wherein the processor is configured to execute the control method of the AR virtual model in any of the above embodiments via execution of the executable instructions.
An embodiment of the present invention also provides a program product, including a computer program (i.e., executable instructions) stored in a readable storage medium. At least one processor of the electronic device may read the computer program from the readable storage medium, and the at least one processor executes the computer program so that the electronic device implements the control method of the AR virtual model provided in the foregoing embodiments.
The above description covers only preferred embodiments of the present invention and is not intended to limit the invention in any way; any simple modifications, equivalent changes, and alterations made to the above embodiments in accordance with the technical essence of the present invention still fall within the scope of the technical solutions of the present invention.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be implemented by hardware related to program instructions. The program may be stored in a computer-readable storage medium; when executed, it performs the steps of the above method embodiments. The aforementioned storage medium includes any medium that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
Finally, it should be noted that the above embodiments are intended only to illustrate, not to limit, the technical solutions of the present invention. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents, and such modifications or replacements do not remove the essence of the corresponding technical solutions from the scope of the technical solutions of the embodiments of the present invention.

Claims (14)

1. A control method of an Augmented Reality (AR) virtual model is characterized by comprising the following steps:
acquiring a moving track of a target object on a second plane, wherein the second plane is the plane where the target object is located, and the moving track serves as a control command for controlling a first virtual model in a first AR scene to move according to the moving track;
determining a first plane in which the first virtual model resides in the first AR scene;
taking the moving track of the target object on the second plane as the moving track of the first virtual model on the first plane;
controlling the first virtual model to move on the first plane according to the moving track;
the taking the moving track of the target object in the second plane as the moving track of the first virtual model in the first plane includes:
determining a moving track of the target object on the second plane;
and obtaining the moving track of the first virtual model on the first plane after subjecting the moving track of the target object on the second plane to projection matrix mapping, view matrix mapping, and imaging plane mapping.
2. The method according to claim 1, wherein the obtaining the moving track of the first virtual model in the first plane after the moving track of the target object in the second plane is subjected to projection matrix mapping, view matrix mapping, and imaging plane mapping comprises:
converting the two-dimensional vector coordinates of the moving track of the target object on the second plane into image screen coordinates and projecting the image screen coordinates onto a far plane and a near plane;
casting rays between the far plane and the near plane from the starting point and the end point of the two-dimensional vector by using a projection matrix and a view matrix, respectively;
and determining a line where the rays intersect the imaging plane as the moving track of the first virtual model in the first plane.
3. The method of claim 1 or 2, wherein said determining a first plane in which the first virtual model resides in the first AR scene comprises:
and determining the first plane of the first virtual model in the first AR scene according to a simultaneous localization and mapping (SLAM) algorithm.
4. The method according to claim 1 or 2, wherein the controlling the first virtual model to move on the first plane according to the moving track comprises:
determining, according to the movement track, the position of the first virtual model on the first plane in each frame of video image of the first AR scene;
playing each frame of video image of the first AR scene.
5. The method according to claim 1 or 2, wherein the movement track of the first virtual model in the first plane comprises a direction and an angle of rotation of the first virtual model in the first plane;
the controlling the first virtual model to move on the first plane according to the moving track further comprises:
and controlling the first virtual model to rotate on the first plane according to the moving track.
6. The method according to claim 1 or 2,
the second plane is a display screen for displaying the first AR scene;
the target object is a user finger or a function control on the display screen.
7. An apparatus for controlling an Augmented Reality (AR) virtual model, comprising:
the receiver is used for acquiring a moving track of a target object on a second plane, wherein the second plane is the plane where the target object is located, and the moving track serves as a control command for controlling a first virtual model in a first AR scene to move according to the moving track;
a processor, configured to use a moving track of the target object in the second plane as a moving track of the first virtual model in the first plane;
the processor is further used for determining a moving track of the first virtual model on the first plane according to the control command;
the AR display is used for controlling the first virtual model to move on the first plane according to the moving track;
the processor is specifically configured to:
determine the moving track of the target object on the second plane;
and obtain the moving track of the first virtual model in the first plane after subjecting the moving track of the target object on the second plane to projection matrix mapping, view matrix mapping, and imaging plane mapping.
8. The apparatus of claim 7, wherein the processor is specifically configured to:
convert the two-dimensional vector coordinates of the moving track of the target object on the second plane into image screen coordinates and project the image screen coordinates onto a far plane and a near plane;
cast rays between the far plane and the near plane from the starting point and the end point of the two-dimensional vector by using a projection matrix and a view matrix, respectively;
and determine a line where the rays intersect the imaging plane as the moving track of the first virtual model in the first plane.
9. The apparatus of claim 7 or 8, wherein the processor is specifically configured to determine the first plane in which the first virtual model resides in the first AR scene according to a simultaneous localization and mapping (SLAM) algorithm.
10. The apparatus according to claim 7 or 8, wherein the AR display is specifically configured to:
determine, according to the movement track, the position of the first virtual model on the first plane in each frame of video image of the first AR scene; and
play each frame of video image of the first AR scene.
11. The apparatus according to claim 7 or 8, wherein the moving track of the first virtual model in the first plane comprises the direction and angle of rotation of the first virtual model in the first plane;
the AR display is specifically configured to control the first virtual model to rotate on the first plane according to the moving track.
12. The apparatus according to claim 7 or 8,
the second plane is a display screen for displaying the first AR scene;
the target object is a user finger or a function control on the display screen.
13. An apparatus for controlling an Augmented Reality (AR) virtual model, comprising: a processor, a memory, and a computer program; wherein the computer program is stored in the memory and configured to be executed by the processor, the computer program comprising instructions for performing the method of any of claims 1-6.
14. A computer-readable storage medium, characterized in that it stores a computer program that causes a server to execute the method of any one of claims 1-6.
CN201810993135.0A 2018-08-29 2018-08-29 Control method and device of AR virtual model Active CN109189302B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810993135.0A CN109189302B (en) 2018-08-29 2018-08-29 Control method and device of AR virtual model

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810993135.0A CN109189302B (en) 2018-08-29 2018-08-29 Control method and device of AR virtual model

Publications (2)

Publication Number Publication Date
CN109189302A CN109189302A (en) 2019-01-11
CN109189302B true CN109189302B (en) 2021-04-06

Family

ID=64917072

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810993135.0A Active CN109189302B (en) 2018-08-29 2018-08-29 Control method and device of AR virtual model

Country Status (1)

Country Link
CN (1) CN109189302B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111984172A (en) * 2020-07-15 2020-11-24 北京城市网邻信息技术有限公司 Furniture moving method and device
CN111984171A (en) * 2020-07-15 2020-11-24 北京城市网邻信息技术有限公司 Method and device for generating furniture movement track
CN112148197A (en) * 2020-09-23 2020-12-29 北京市商汤科技开发有限公司 Augmented reality AR interaction method and device, electronic equipment and storage medium
CN112337097A (en) * 2020-10-27 2021-02-09 网易(杭州)网络有限公司 Game simulation method and device
CN112230836B (en) * 2020-11-02 2022-05-27 网易(杭州)网络有限公司 Object moving method and device, storage medium and electronic device
CN112672185B (en) * 2020-12-18 2023-07-07 脸萌有限公司 Augmented reality-based display method, device, equipment and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102915130B (en) * 2011-08-04 2016-03-23 王振兴 Motion track bearing calibration and motion track production method
CN103996215A (en) * 2013-11-05 2014-08-20 深圳市云立方信息科技有限公司 Method and apparatus for realizing conversion from virtual view to three-dimensional view
CN106886285A (en) * 2017-01-20 2017-06-23 西安电子科技大学 A kind of historical relic interactive system and operating method based on virtual reality
CN109101120B (en) * 2017-06-21 2021-09-28 腾讯科技(深圳)有限公司 Method and device for displaying image
CN107564089B (en) * 2017-08-10 2022-03-01 腾讯科技(深圳)有限公司 Three-dimensional image processing method, device, storage medium and computer equipment

Also Published As

Publication number Publication date
CN109189302A (en) 2019-01-11

Similar Documents

Publication Publication Date Title
CN109189302B (en) Control method and device of AR virtual model
US9990759B2 (en) Offloading augmented reality processing
US10423234B2 (en) Facilitate user manipulation of a virtual reality environment
US20170206419A1 (en) Visualization of physical characteristics in augmented reality
US20180345144A1 (en) Multiple Frame Distributed Rendering of Interactive Content
US20150185825A1 (en) Assigning a virtual user interface to a physical object
US20150187137A1 (en) Physical object discovery
US20150187108A1 (en) Augmented reality content adapted to changes in real world space geometry
CN108776544B (en) Interaction method and device in augmented reality, storage medium and electronic equipment
US10334222B2 (en) Focus-based video loop switching
US11107184B2 (en) Virtual object translation
US20240078703A1 (en) Personalized scene image processing method, apparatus and storage medium
JP2024502407A (en) Display methods, devices, devices and storage media based on augmented reality
US20160239095A1 (en) Dynamic 2D and 3D gestural interfaces for audio video players capable of uninterrupted continuity of fruition of audio video feeds
US11423549B2 (en) Interactive body-driven graphics for live video performance
JP2020523668A (en) System and method for configuring virtual camera
CN112987924A (en) Method, apparatus, device and storage medium for device interaction
CN112473138B (en) Game display control method and device, readable storage medium and electronic equipment
CN110264568B (en) Three-dimensional virtual model interaction method and device
CN111973984A (en) Coordinate control method and device for virtual scene, electronic equipment and storage medium
US11948257B2 (en) Systems and methods for augmented reality video generation
CN116630483A (en) Polygonal adsorption method, polygonal adsorption device, computer-readable storage medium, and electronic device
CN114681918A (en) Virtual camera control method and device, electronic equipment and storage medium
CN117596377A (en) Picture push method, device, electronic equipment, storage medium and program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20190111

Assignee: Beijing Intellectual Property Management Co.,Ltd.

Assignor: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY Co.,Ltd.

Contract record no.: X2023110000098

Denomination of invention: Control Method and Device of AR Virtual Model

Granted publication date: 20210406

License type: Common License

Record date: 20230822
