CN108399638B - Augmented reality interaction method and device based on mark and electronic equipment - Google Patents


Info

Publication number
CN108399638B
Authority
CN
China
Prior art keywords
terminal
coordinate system
relative position
mark
posture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810128421.0A
Other languages
Chinese (zh)
Other versions
CN108399638A (en)
Inventor
叶祖霈
林鸿运
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dream Bloom Technology Co ltd
Beijing Iqiyi Intelligent Technology Co ltd
Original Assignee
Chongqing IQIYI Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing IQIYI Intelligent Technology Co Ltd filed Critical Chongqing IQIYI Intelligent Technology Co Ltd
Priority to CN201810128421.0A priority Critical patent/CN108399638B/en
Publication of CN108399638A publication Critical patent/CN108399638A/en
Application granted granted Critical
Publication of CN108399638B publication Critical patent/CN108399638B/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/006: Mixed reality

Abstract

The embodiment of the invention provides a marker-based augmented reality interaction method, a marker-based augmented reality interaction device and electronic equipment, wherein the method is applied to a first terminal in an augmented reality interaction system and comprises the following steps: acquiring a first image containing a marker; determining at least four corner points of the marker in the first image according to a preset corner point detection algorithm, and determining the corner point coordinates of each corner point; determining the relative position posture between the first terminal and the marker based on the corner point coordinates of each corner point; acquiring the relative position posture between each second terminal and the marker; determining the relative position posture between the first terminal and each second terminal according to these two relative position postures; and presenting, in a second image acquired by the first terminal, the projection coordinates of each second terminal on the second image based on the relative position posture between the first terminal and each second terminal. The embodiment of the invention enables each terminal to display the motion states of the other terminals in its own scene, thereby realizing real-time augmented reality interaction among a plurality of terminals.

Description

Augmented reality interaction method and device based on mark and electronic equipment
Technical Field
The invention relates to the technical field of computers, in particular to a marker-based augmented reality interaction method and device and electronic equipment.
Background
At present, intelligent devices with camera and network communication functions are very popular, and with improvements in camera quality and network speed, applications capable of processing images and videos in real time and communicating over a network, such as augmented reality, have appeared on these devices. Augmented reality has many application scenarios: for example, it can be used to locate and identify human faces in live webcasts, so that viewers watching the live webcast see decorations or text superimposed around the broadcaster's head; it can also be used to acquire image data on one device through a camera, analyze the images to calculate the relative position and posture between the device and a certain marker pattern, and superimpose an object at the position of the pattern, so that the user feels that the superimposed object is attached to the real scene.
With the further development of the augmented reality technology, more and more users want to interact with other users based on the augmented reality technology through the smart device, that is, the users want to acquire the position or action information of other users through the augmented reality application in the smart device.
However, in the process of implementing the invention, the inventors found that the prior art has at least the following problem: existing applications that use augmented reality for interaction generally cannot enable multiple devices to perform augmented reality interaction in the same virtual space.
Disclosure of Invention
The embodiment of the invention aims to provide a marker-based augmented reality interaction method, a marker-based augmented reality interaction device and electronic equipment, so as to realize real-time augmented reality interaction among a plurality of terminals. The specific technical scheme is as follows:
in order to achieve the above object, in a first aspect, the present invention provides a marker-based augmented reality interaction method, applied to a first terminal in an augmented reality interaction system, where the augmented reality interaction system further includes at least one second terminal, and the method includes:
acquiring a first image containing a marker;
determining at least four corner points of the mark in the first image according to a preset corner point detection algorithm, and determining corner point coordinates of each corner point;
determining a relative position and posture between the first terminal and the mark based on the corner point coordinates of each corner point;
acquiring the relative position and posture between each second terminal and the mark;
determining the relative position and posture between the first terminal and each second terminal according to the relative position and posture between the first terminal and the mark and the relative position and posture between each second terminal and the mark;
and presenting the projection coordinates of each second terminal on the second image in the second image acquired by the first terminal based on the relative position and posture between the first terminal and each second terminal.
Optionally, the determining corner coordinates of each of the corner points includes:
determining a first coordinate of each corner point in the coordinate system corresponding to the first terminal at the current moment and a second coordinate of each corner point in a pre-established marker coordinate system; wherein,
the mark coordinate system is as follows: the x-axis corresponds to a first direction of the mark, the y-axis corresponds to a second direction of the mark, the first direction is perpendicular to the second direction, and the z-axis is perpendicular to a plane defined by the x-axis and the y-axis; the origin of the mark coordinate system is the central point of the mark;
the coordinate system corresponding to the first terminal is as follows: the x-axis corresponds to a third direction of the first terminal, the y-axis corresponds to a fourth direction of the first terminal, the third direction being perpendicular to the fourth direction; and the origin of the coordinate system corresponding to the first terminal is the central point of the first terminal.
Optionally, the determining a relative position and orientation between the first terminal and the marker based on the corner coordinates of each of the corners includes:
determining, based on the corner point coordinates of each corner point, a rotation matrix and a translation vector from the pre-established marker coordinate system to the coordinate system corresponding to the first terminal at the current moment by a Perspective-n-Point (PnP) method;
the acquiring the relative position and posture between each second terminal and the mark comprises:
acquiring a rotation matrix and a translation vector from the pre-established marker coordinate system to the coordinate system corresponding to each second terminal at the current moment; wherein,
the coordinate system corresponding to each second terminal is as follows: the x-axis corresponds to a fifth direction of each of the second terminals, the y-axis corresponds to a sixth direction of each of the second terminals, and the fifth direction is perpendicular to the sixth direction; the origin of the coordinate system corresponding to each second terminal is the central point of each second terminal;
determining the relative position and posture between the first terminal and each second terminal according to the relative position and posture between the first terminal and the mark and the relative position and posture between each second terminal and the mark comprises:
and determining the rotation matrix and the translation vector from the coordinate system corresponding to each second terminal at the current moment to the coordinate system corresponding to the first terminal at the current moment according to the rotation matrix and the translation vector from the pre-established mark coordinate system to the coordinate system corresponding to the first terminal at the current moment and the rotation matrix and the translation vector from the pre-established mark coordinate system to the coordinate system corresponding to each second terminal at the current moment.
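Expressed concretely, this composition of the two marker-relative poses can be sketched as follows. This is a minimal illustration, not from the patent, with hypothetical helper names and made-up example poses. Writing a marker-frame point as X1 = R1·Xm + t1 for the first terminal and X2 = R2·Xm + t2 for a second terminal, eliminating Xm gives R12 = R1·R2^T and t12 = t1 - R12·t2:

```python
# Sketch: composing marker-relative poses to obtain the second-to-first
# terminal pose. A point X_m in the marker frame maps to terminal frames as:
#   X_1 = R1 @ X_m + t1   (first terminal)
#   X_2 = R2 @ X_m + t2   (second terminal)
# Eliminating X_m: R_12 = R1 * R2^T,  t_12 = t1 - R_12 * t2.

def mat_mul(a, b):
    """Multiply two 3x3 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def mat_vec(a, v):
    """Apply a 3x3 matrix to a 3-vector."""
    return [sum(a[i][k] * v[k] for k in range(3)) for i in range(3)]

def transpose(a):
    """Transpose of a 3x3 matrix (the inverse of a rotation matrix)."""
    return [[a[j][i] for j in range(3)] for i in range(3)]

def compose_relative_pose(r1, t1, r2, t2):
    """Pose of the second terminal's frame expressed in the first terminal's frame."""
    r12 = mat_mul(r1, transpose(r2))          # R_12 = R1 * R2^T
    rt2 = mat_vec(r12, t2)
    t12 = [t1[i] - rt2[i] for i in range(3)]  # t_12 = t1 - R_12 * t2
    return r12, t12

# Example: first terminal axis-aligned with the marker and 2 units away,
# second terminal rotated 90 degrees about z and 1 unit away.
R1 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
R2 = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]
r12, t12 = compose_relative_pose(R1, [0, 0, 2], R2, [0, 0, 1])
```

As a consistency check, the marker origin (at t2 = [0, 0, 1] in the second terminal's frame) maps through (r12, t12) to [0, 0, 2], which is exactly t1.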
Optionally, the acquiring the relative position and posture between each second terminal and the mark includes:
and receiving the relative position and posture between each second terminal and the mark, which are sent by each second terminal.
Optionally, before the step of presenting, in a second image acquired by the first terminal, projection coordinates of each second terminal on the second image based on the relative position posture between the first terminal and each second terminal, the method further includes:
acquiring a target coordinate of any point on each second terminal in the coordinate system corresponding to that second terminal at the current moment;
the presenting, in a second image acquired by the first terminal, projection coordinates of each second terminal on the second image based on the relative position and posture between the first terminal and each second terminal includes:
and determining target projection coordinates of the target coordinates on the second image based on the relative position and posture between the first terminal and each second terminal, and presenting the target projection coordinates in the second image acquired by the first terminal.
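The projection step described above can be sketched with a simple pinhole model. The model and the intrinsic values below are illustrative assumptions, not taken from the patent:

```python
def project_point(point_cam, fx, fy, u0, v0):
    """Project a 3D point expressed in the first terminal's camera frame onto
    image pixels with a pinhole model: u = fx*X/Z + u0, v = fy*Y/Z + v0."""
    x, y, z = point_cam
    if z <= 0:
        return None  # point is behind the camera, so it is not visible
    return (fx * x / z + u0, fy * y / z + v0)

# A target point on a second terminal, already expressed in the first
# terminal's frame (e.g. via the rotation matrix and translation vector
# between the two terminals); intrinsics here are hypothetical.
uv = project_point((0.5, 0.25, 2.0), fx=800.0, fy=800.0, u0=320.0, v0=240.0)
```

The resulting (u, v) pair is what would be drawn into the second image to mark the second terminal's position.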
In a second aspect, an embodiment of the present invention provides an augmented reality interaction apparatus based on a marker, which is applied to a first terminal in an augmented reality interaction system, where the augmented reality interaction system further includes at least one second terminal, and the apparatus includes:
a first acquisition unit configured to acquire a first image including a mark;
a first determining unit, configured to determine at least four corner points of the mark in the first image according to a preset corner point detection algorithm, and determine corner point coordinates of each of the corner points;
a second determining unit, configured to determine a relative position and orientation between the first terminal and the marker based on corner coordinates of each of the corners;
a second acquisition unit configured to acquire a relative position and orientation between each of the second terminals and the marker;
a third determining unit, configured to determine a relative position and posture between the first terminal and each of the second terminals according to a relative position and posture between the first terminal and the marker and a relative position and posture between each of the second terminals and the marker;
and the presenting unit is used for presenting the projection coordinates of each second terminal on the second image in the second image acquired by the first terminal based on the relative position posture between the first terminal and each second terminal.
Optionally, the first determining unit is specifically configured to determine a first coordinate of each corner point in the coordinate system corresponding to the first terminal at the current time and a second coordinate of each corner point in a pre-established marker coordinate system; wherein,
the mark coordinate system is as follows: the x-axis corresponds to a first direction of the mark, the y-axis corresponds to a second direction of the mark, the first direction is perpendicular to the second direction, and the z-axis is perpendicular to a plane defined by the x-axis and the y-axis; the origin of the mark coordinate system is the central point of the mark;
the coordinate system corresponding to the first terminal is as follows: the x-axis corresponds to a third direction of the first terminal, the y-axis corresponds to a fourth direction of the first terminal, the third direction being perpendicular to the fourth direction; and the origin of the coordinate system corresponding to the first terminal is the central point of the first terminal.
Optionally, the second determining unit is specifically configured to determine, based on the corner point coordinates of each corner point, a rotation matrix and a translation vector from the pre-established marker coordinate system to the coordinate system corresponding to the first terminal at the current time by a PnP method;
the second obtaining unit is specifically configured to obtain a rotation matrix and a translation vector from the pre-established marker coordinate system to the coordinate system corresponding to each second terminal at the current time; wherein,
the coordinate system corresponding to each second terminal is as follows: the x-axis corresponds to a fifth direction of each of the second terminals, the y-axis corresponds to a sixth direction of each of the second terminals, and the fifth direction is perpendicular to the sixth direction; the origin of the coordinate system corresponding to each second terminal is the central point of each second terminal;
the third determining unit is specifically configured to determine the rotation matrix and the translation vector from the coordinate system corresponding to each second terminal at the current time to the coordinate system corresponding to the first terminal at the current time, according to the rotation matrix and the translation vector from the pre-established marker coordinate system to the coordinate system corresponding to the first terminal at the current time and the rotation matrix and the translation vector from the pre-established marker coordinate system to the coordinate system corresponding to each second terminal at the current time.
Optionally, the second obtaining unit is specifically configured to receive the relative position and posture between each second terminal and the mark, where the relative position and posture is sent by each second terminal.
Optionally, the apparatus further includes a third obtaining unit;
the third obtaining unit is configured to, before the presenting unit presents the projection coordinates of each second terminal on the second image in the second image acquired by the first terminal based on the relative position posture between the first terminal and each second terminal, acquire a target coordinate of any point on each second terminal in the coordinate system corresponding to that second terminal at the current time;
the presenting unit is specifically configured to determine target projection coordinates of the target coordinates on the second image based on the relative position and posture between the first terminal and each of the second terminals, and present the target projection coordinates in the second image acquired by the first terminal.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor, a communication interface, a memory, and a communication bus, where the processor, the communication interface, and the memory communicate with one another through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps for marker-based augmented reality interaction as described above in relation to the first aspect when executing the program stored in the memory.
In a fourth aspect, embodiments of the present invention provide a computer-readable storage medium having stored therein instructions, which, when executed on a computer, cause the computer to perform the method steps of marker-based augmented reality interaction as described in the first aspect above.
In a fifth aspect, embodiments of the present invention provide a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method steps of marker-based augmented reality interaction as described above in the first aspect.
According to the augmented reality interaction method and device based on the marker and the electronic equipment, a first terminal in an augmented reality interaction system firstly obtains a first image containing the marker; determining at least four corner points of the mark in the first image according to a preset corner point detection algorithm, and determining corner point coordinates of each corner point; then, based on the corner point coordinates of each corner point, determining the relative position posture between the first terminal and the mark; acquiring relative position gestures between each second terminal and the marker in the augmented reality interaction system; finally, determining the relative position and posture between the first terminal and each second terminal according to the relative position and posture between the first terminal and the mark and the relative position and posture between each second terminal and the mark; and presenting the projection coordinates of each second terminal on the second image in the second image acquired by the first terminal based on the relative position and posture between the first terminal and each second terminal. The embodiment of the invention can enable each terminal to display the motion states of other terminals in the scene of the terminal, thereby realizing real-time augmented reality interaction of a plurality of terminals.
Of course, it is not necessary for any product or method of practicing the invention to achieve all of the above-described advantages at the same time.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below.
Fig. 1 is a flowchart of an augmented reality interaction method based on a marker according to an embodiment of the present invention;
fig. 2 is a flowchart of another augmented reality interaction method based on markers according to an embodiment of the present invention;
fig. 3 is a flowchart of another augmented reality interaction method based on markers according to an embodiment of the present invention;
fig. 4 is a flowchart of another augmented reality interaction method based on markers according to an embodiment of the present invention;
fig. 5 is a flowchart of another augmented reality interaction method based on markers according to an embodiment of the present invention;
fig. 6 is a block diagram of an augmented reality interaction apparatus based on a marker according to an embodiment of the present invention;
FIG. 7 is a block diagram of another augmented reality interaction device based on markers according to an embodiment of the present invention;
fig. 8 is a structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention.
The augmented reality interaction method based on the marker provided by the embodiment of the invention can be applied to a first terminal in an augmented reality interaction system, and the system also comprises at least one second terminal. The first terminal and each second terminal generally have an image display function and can display images with augmented reality effects and the like. The first terminal may be a desktop computer, a portable computer, an intelligent mobile terminal, or the like; the second terminal may likewise be a desktop computer, a portable computer, an intelligent mobile terminal, or the like.
Fig. 1 is a flowchart of a marker-based augmented reality interaction method according to an embodiment of the present invention, where the method includes the following steps:
s101, a first image containing a mark is acquired.
The mark may be a planar pattern having at least four corner points, such as a square with four corner points. The mark may also be a three-dimensional figure having at least four corner points, such as a triangular pyramid with four corner points. It should be noted that, whether the mark is a planar pattern or a three-dimensional figure, it must not have rotational ambiguity; that is, each of the at least four corner points must correspond uniquely to a position on the mark pattern itself. In practice, the mark may be placed on any plane.
In this embodiment, the first terminal may include a camera, and the first image including the marker is acquired in real time by the camera. The first terminal can also be externally connected with equipment with an image acquisition function, and a first image containing the mark is acquired in real time through the equipment.
In practical applications, the first terminal may obtain only one first image containing the mark, and further determine the corner point of the mark in the first image. In order to make the result of the augmented reality interaction more accurate, the first terminal may also acquire a plurality of images including the marker, and use the image which is most clear and is most favorable for determining the corner point as the first image, so as to further determine the corner point of the marker in the first image.
S102, determining at least four corner points of the mark in the first image according to a preset corner point detection algorithm, and determining the corner point coordinates of each corner point.
In this embodiment, the purpose of determining at least four corner points of the mark in the first image is to further determine the relative position posture between the first terminal and the mark from the determined corner points. Specifically, the relative position posture may include a rotation matrix and a translation vector from the marker coordinate system to the coordinate system corresponding to the first terminal, so that the position change relation and the posture change relation between the first terminal and the mark are represented by the translation vector and the rotation matrix, respectively. The marker coordinate system and the coordinate system corresponding to the first terminal are described in detail under step S103 below. Descriptions of the relative position posture in the other steps below may refer to this description of the relative position posture.
The preset corner detection algorithm may be any existing corner detection algorithm, such as the Harris corner detection algorithm. The pixel points and the pixel coordinates of at least four corner points of the mark in the first image can be determined through the preset corner point detection algorithm.
For example, suppose the pixel coordinate system corresponding to the first image is defined as: the origin is located at the upper-left corner of the first image plane, with the u-axis pointing right and the v-axis pointing down. The pixel coordinates in the first image of the four corner points a1, a2, a3 and a4 of the mark are determined by the preset corner point detection algorithm to be a1(100, 100), a2(150, 100), a3(100, 150) and a4(150, 150), respectively.
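A toy illustration consistent with the example coordinates above. This is deliberately not the Harris algorithm: for a filled, axis-aligned square marker in a binary image, the four corners are simply the extremes of the marker pixels, and all names here are hypothetical:

```python
def square_corner_pixels(image):
    """Toy corner finder for a filled, axis-aligned square marker in a binary
    image: the four corners are the pixel extremes of the marker region.
    A real detector (e.g. Harris) handles arbitrary orientation and noise."""
    ys = [r for r, row in enumerate(image) if any(row)]
    xs = [c for row in image for c, v in enumerate(row) if v]
    u_min, u_max, v_min, v_max = min(xs), max(xs), min(ys), max(ys)
    # order: a1 top-left, a2 top-right, a3 bottom-left, a4 bottom-right
    return [(u_min, v_min), (u_max, v_min), (u_min, v_max), (u_max, v_max)]

# 200x200 binary image with a white square spanning pixels 100..150,
# matching the example corner coordinates in the text
img = [[1 if 100 <= u <= 150 and 100 <= v <= 150 else 0 for u in range(200)]
       for v in range(200)]
corners = square_corner_pixels(img)
```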
On the basis of the augmented reality interaction method based on the marker shown in fig. 1, as shown in fig. 2, in an optional embodiment of the present invention, step S102 may be:
s1021, determining at least four corner points marked in the first image according to a preset corner point detection algorithm, and determining a first coordinate of each corner point in a coordinate system corresponding to the first terminal at the current moment and a second coordinate of each corner point in a pre-established marking coordinate system.
The above-mentioned marker coordinate system may be: the x-axis corresponds to a first direction of the mark, the y-axis corresponds to a second direction of the mark, the first direction is perpendicular to the second direction, and the z-axis is perpendicular to a plane defined by the x-axis and the y-axis; the origin of the marker coordinate system is the center point of the marker. Further, the marker coordinate system may be a right-hand coordinate system.
If the mark is a planar pattern, the center point of the mark is the plane geometric center of the planar pattern; the first direction of the mark may be any direction in the plane of the planar pattern; and the second direction of the mark is the direction perpendicular to that direction within the plane of the planar pattern.
If the mark is a three-dimensional figure, the center point of the mark is the solid geometric center of the three-dimensional figure; the first direction of the mark may be any direction in any plane containing that geometric center; and the second direction of the mark is the direction perpendicular to that direction within that plane.
The coordinate system corresponding to the first terminal may be: the x-axis corresponds to a third direction of the first terminal, the y-axis corresponds to a fourth direction of the first terminal, and the third direction is perpendicular to the fourth direction; the origin of the coordinate system corresponding to the first terminal is the central point of the first terminal.
In practical applications, the first terminal is movable, so that at different times, the coordinate system corresponding to the first terminal may be different. Specifically, at the current moment, if the first terminal acquires the first image through the camera, the central point of the first terminal may be an intersection point of the optical axis of the camera and the first image plane; if the first terminal collects the first image through the external equipment with the image collecting function, the central point of the first terminal can be the intersection point of the optical axis of the camera of the external equipment and the first image plane.
In this embodiment, the third direction and the fourth direction of the first terminal may be parallel to the u axis and the v axis of the pixel coordinate system corresponding to the first image, respectively.
It should be noted that the second coordinate of each corner point in the marker coordinate system is specifically: the coordinate, in the marker coordinate system, of the corner point of the mark in the real scene that corresponds to that corner point in the image.
For example, at the current time, the first coordinate of the corner point a1 marked in the first image in the coordinate system corresponding to the first terminal is determined to be (5, 5); at the same time, the second coordinate of the corner point a1 in the marker coordinate system is determined to be (-10, 10, 0), i.e. the coordinate of the corresponding corner point a1 marked in the real scene by the corner point a1 in the marker coordinate system is determined to be (-10, 10, 0). Of course, the accuracy of the coordinates of the corner points of the marks in the first image in the coordinate system corresponding to the first terminal and each axis in the mark coordinate system may be set as required, for example, to be accurate to millimeters.
In an implementation manner of this embodiment, the coordinates of the corresponding corner point marked in the real scene by each corner point in the mark coordinate system can be obtained through measurement by an existing measurement method.
In another implementation manner of this embodiment, according to a pixel coordinate corresponding to each corner point in the first image, a first coordinate in a coordinate system corresponding to the first terminal at the current time of each corner point may be determined by the following formula:
$$
\begin{bmatrix} u_i \\ v_i \\ 1 \end{bmatrix} =
\begin{bmatrix} 1/dx & s' & u_0 \\ 0 & 1/dy & v_0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} x_i \\ y_i \\ 1 \end{bmatrix}
$$

wherein (u_i, v_i) is the pixel coordinate of the i-th corner point in the first image; dx and dy are the physical sizes of each pixel of the first image along the x-axis and y-axis, respectively, of the coordinate system corresponding to the first terminal at the current moment; (x_i, y_i) is the first coordinate of the i-th corner point in the coordinate system corresponding to the first terminal at the current moment; (u_0, v_0) is the pixel coordinate, in the first image, of the origin of the coordinate system corresponding to the first terminal at the current moment; i = 1, …, N, where N ≥ 4 is the number of corner points; and s' is the skew factor caused by non-orthogonality of the coordinate axes of the coordinate system corresponding to the first terminal.
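For illustration, a minimal sketch of this pixel-to-first-coordinate conversion, assuming zero skew (s' = 0) so the relation reduces to u = x/dx + u_0 and v = y/dy + v_0; all numeric values below are hypothetical, not taken from the patent:

```python
def pixel_to_terminal(u, v, dx, dy, u0, v0):
    """Recover the first coordinate (x, y) of a corner point in the coordinate
    system corresponding to the first terminal from its pixel coordinate (u, v),
    assuming zero skew (s' = 0):  u = x/dx + u0,  v = y/dy + v0."""
    return ((u - u0) * dx, (v - v0) * dy)

# Hypothetical values: 0.1 mm square pixels, principal point at pixel (125, 125)
x, y = pixel_to_terminal(100, 100, dx=0.1, dy=0.1, u0=125, v0=125)
```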
S103, determining the relative position and posture between the first terminal and the mark based on the corner point coordinates of each corner point.
The corner coordinates of each corner point are determined through step S102, which means that the coordinate correspondence relationship of at least four corner points of the mark in the mark coordinate system and the coordinate system corresponding to the first terminal at the current time is determined. In step S103, a relative position and orientation between the first terminal and the mark may be further determined according to the coordinate correspondence of the at least four corner points. Specifically, a rotation matrix and a translation vector between a coordinate system corresponding to the first terminal at the current moment and the mark coordinate system can be further determined.
On the basis of the augmented reality interaction method based on the marker shown in fig. 1, as shown in fig. 3, in an optional embodiment of the present invention, step S103 may be:
and S1031, based on the corner Point coordinates of each corner Point, determining a rotation matrix and a translation vector of a coordinate system corresponding to the first terminal from the pre-established mark coordinate system to the current moment through a Perspective-n-Point (PnP) method.
The PnP method is a classic method in the field of machine vision: it determines the relative position and posture between a camera and an object from n feature points on the object, specifically a rotation matrix and a translation vector between the camera and the object. When n ≥ 4, the problem has one or more solutions; it should be noted that when multiple solutions exist, only one of them is the correct one. Multiple solutions arise because the cost function has multiple local minima: if the initial values of the 2D-3D coordinate correspondences of the n feature points are not set properly, the result may converge to an incorrect minimum.
The coordinate conversion relationship from the mark coordinate system determined by the PnP method to the coordinate system corresponding to the first terminal at the current time based on the corner point coordinates of each corner point is as follows:
$$ Z_c \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} R & t \end{bmatrix} \begin{bmatrix} X_m \\ Y_m \\ Z_m \\ 1 \end{bmatrix} $$

wherein Z_c is the scale factor corresponding to the first terminal; (x, y) are the coordinates of the point P in the coordinate system corresponding to the first terminal at the current moment; (X_m, Y_m, Z_m) are the coordinates of the point P in the mark coordinate system; R is the rotation matrix from the mark coordinate system to the coordinate system corresponding to the first terminal at the current moment, a 3 × 3 orthonormal matrix whose determinant is 1; and t is the translation vector from the mark coordinate system to the coordinate system corresponding to the first terminal at the current moment, a 3-dimensional vector.
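As a sketch of how step S1031 can be realized for a planar mark (all corner points have Z_m = 0), the pose can be recovered from a homography estimated from the at least four corner correspondences. This is one concrete PnP variant for illustration, not necessarily the method used in the patent; in practice a library routine such as OpenCV's solvePnP would typically be used. The pure-NumPy sketch below assumes normalized image coordinates (intrinsics already removed):

```python
import numpy as np

def planar_pnp(marker_pts, image_pts):
    """Recover the rotation matrix R and translation vector t of the camera
    relative to a planar marker (all marker points have Z_m = 0) from at
    least four point correspondences, via a DLT homography estimate.
    `marker_pts`: (X_m, Y_m) pairs; `image_pts`: normalized image coords."""
    # Stack the two DLT equations contributed by each correspondence.
    A = []
    for (X, Y), (x, y) in zip(marker_pts, image_pts):
        A.append([X, Y, 1, 0, 0, 0, -x * X, -x * Y, -x])
        A.append([0, 0, 0, X, Y, 1, -y * X, -y * Y, -y])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)          # homography, up to scale and sign
    # For a plane, H ~ [r1 r2 t]; the columns r1, r2 are unit vectors.
    lam = 1.0 / np.linalg.norm(H[:, 0])
    if H[2, 2] < 0:                   # choose the sign that puts the marker
        lam = -lam                    # in front of the camera (t_z > 0)
    r1, r2, t = lam * H[:, 0], lam * H[:, 1], lam * H[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    # Snap R to the nearest orthonormal matrix with determinant 1.
    U, _, Vt2 = np.linalg.svd(R)
    R = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt2)]) @ Vt2
    return R, t
```

The final SVD step enforces exactly the property stated above: R is a 3 × 3 orthonormal matrix whose determinant is 1.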
And S104, acquiring the relative position and posture between each second terminal and the mark.
In order to enable each second terminal in the augmented reality interaction system to perform augmented reality interaction with the first terminal, the first terminal may obtain a relative position and posture between each second terminal and the marker, so as to further determine the relative position and posture between the first terminal and each second terminal.
On the basis of the augmented reality interaction method based on the marker shown in fig. 1, as shown in fig. 3, in an optional embodiment of the present invention, step S104 may be:
and S1041, acquiring a rotation matrix and a translation vector of a coordinate system corresponding to each second terminal from a pre-established marking coordinate system to the current moment.
The coordinate system corresponding to each second terminal is as follows: the x-axis corresponds to a fifth direction of the respective second terminal, the y-axis corresponds to a sixth direction of the respective second terminal, and the fifth direction is perpendicular to the sixth direction; the origin of the coordinate system corresponding to each second terminal is the central point of each second terminal.
In a specific implementation manner, each second terminal may determine, in advance, a rotation matrix and a translation vector from the marked coordinate system to a coordinate system corresponding to the second terminal at the current time with reference to step S103.
In practical applications, each second terminal is movable, and thus the coordinate system corresponding to each second terminal may be different at different time. Specifically, at the current moment, if each second terminal acquires a second image containing the mark through the camera, the central point of each second terminal may be an intersection point of the optical axis of the camera and the plane of the second image acquired by each second terminal; if each second terminal acquires a second image containing the mark through the external equipment with the image acquisition function, the central point of each second terminal can be the intersection point of the optical axis of the camera of the external equipment and the plane of the acquired second image.
In this embodiment, the fifth direction and the sixth direction of each second terminal may be parallel to the u axis and the v axis of the pixel coordinate system corresponding to the respectively acquired second image. The pixel coordinate system corresponding to each second image may be set with reference to the pixel coordinate system corresponding to the first image.
The following is a coordinate conversion relationship from the marked coordinate system acquired by the first terminal to the coordinate system corresponding to each second terminal at the current time:
$$ Z_{c,j} \begin{bmatrix} x_j \\ y_j \\ 1 \end{bmatrix} = \begin{bmatrix} R_j & t_j \end{bmatrix} \begin{bmatrix} X_m \\ Y_m \\ Z_m \\ 1 \end{bmatrix} $$

wherein Z_{c,j} is the scale factor corresponding to the jth second terminal; (x_j, y_j) are the coordinates of the point P in the coordinate system corresponding to the jth second terminal at the current moment; (X_m, Y_m, Z_m) are the coordinates of the point P in the mark coordinate system; R_j is the rotation matrix from the mark coordinate system to the coordinate system corresponding to the jth second terminal at the current moment, a 3 × 3 orthonormal matrix whose determinant is 1; t_j is the translation vector from the mark coordinate system to the coordinate system corresponding to the jth second terminal at the current moment, a 3-dimensional vector; and j = 1, …, M, where M is the number of second terminals and M ≥ 1.
On the basis of the augmented reality interaction method based on the marker shown in fig. 1, as shown in fig. 4, in an optional embodiment of the present invention, step S104 may be:
and S1042, receiving the relative position and posture between each second terminal and the mark, which are sent by each second terminal.
Specifically, communication modules may be provided in the first terminal and each of the second terminals, and each of the second terminals may transmit the relative position posture between each of the second terminals and the marker to the communication module of the first terminal through the respective communication module in a wired or wireless communication manner, so that the first terminal shares the relative position posture between each of the second terminals and the marker. Wherein, the respective second terminals can determine the relative position and posture between the respective second terminals and the mark in advance with reference to steps S101-S103.
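A minimal sketch of this sharing step follows. The JSON message layout is an assumption for illustration — the patent specifies only that the relative position posture is transmitted through the communication modules, not a wire format:

```python
import json
import numpy as np

def encode_pose(terminal_id, R, t):
    """Pack a second terminal's pose relative to the marker into a JSON
    message for the first terminal (message layout is an assumption)."""
    return json.dumps({"id": terminal_id,
                       "R": np.asarray(R).tolist(),
                       "t": np.asarray(t).tolist()})

def decode_pose(msg):
    """Unpack a pose message received over the communication module."""
    d = json.loads(msg)
    return d["id"], np.array(d["R"]), np.array(d["t"])
```

Any transport (wired or wireless, as the embodiment notes) can carry such messages; only the encode/decode pair must agree.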
And S105, determining the relative position and posture between the first terminal and each second terminal according to the relative position and posture between the first terminal and the mark and the relative position and posture between each second terminal and the mark.
And determining the relative position posture between the first terminal and each second terminal, and further displaying the position information of the second terminals on the first terminal, thereby realizing the augmented reality interaction of the first terminal and each second terminal.
On the basis of the augmented reality interaction method based on the marker shown in fig. 1, as shown in fig. 3, in an alternative embodiment of the present invention, step S105 may be:
S1051, according to the rotation matrix and the translation vector of the coordinate system corresponding to the first terminal from the pre-established mark coordinate system to the current time, and the rotation matrix and the translation vector of the coordinate system corresponding to each second terminal from the pre-established mark coordinate system to the current time, determining the rotation matrix and the translation vector between the coordinate system corresponding to the first terminal and the coordinate system corresponding to each second terminal at the current time.
The following is a coordinate conversion relationship between the coordinate system corresponding to the first terminal and the coordinate systems corresponding to the second terminals at the current moment:
$$ Z_{c,j} \begin{bmatrix} x_j \\ y_j \\ 1 \end{bmatrix} = R_{1j}\, Z_c \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} + t_{1j} $$

wherein Z_{c,j} is the scale factor corresponding to the jth second terminal; Z_c is the scale factor corresponding to the first terminal; (x, y) are the coordinates of the point P in the coordinate system corresponding to the first terminal at the current moment; (x_j, y_j) are the coordinates of the point P in the coordinate system corresponding to the jth second terminal at the current moment; R_{1j} is the rotation matrix from the coordinate system corresponding to the first terminal to the coordinate system corresponding to the jth second terminal at the current moment, a 3 × 3 orthonormal matrix whose determinant is 1; t_{1j} is the translation vector from the coordinate system corresponding to the first terminal to the coordinate system corresponding to the jth second terminal at the current moment, a 3-dimensional vector; and j = 1, …, M, where M is the number of second terminals and M ≥ 1. The rotation matrix and translation vector are obtained from the two marker-relative poses as

$$ R_{1j} = R_j R^{-1}, \qquad t_{1j} = t_j - R_j R^{-1} t, $$

wherein (·)^{-1} denotes the matrix inverse.
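The composition in step S1051 can be sketched directly in code. This is a minimal NumPy illustration; the relations R_1j = R_j·R⁻¹ and t_1j = t_j − R_j·R⁻¹·t follow from chaining the marker→first-terminal transform with the inverse of the marker→jth-terminal transform:

```python
import numpy as np

def relative_pose(R, t, R_j, t_j):
    """Given the marker->first-terminal pose (R, t) and the marker->jth
    second-terminal pose (R_j, t_j), return the first-terminal->jth
    second-terminal rotation R_1j and translation t_1j."""
    R_inv = R.T                      # for a rotation matrix, inverse == transpose
    R_1j = R_j @ R_inv
    t_1j = t_j - R_1j @ t
    return R_1j, t_1j
```

Consistency check: for any point with marker coordinates X_m, the identity R_1j·(R·X_m + t) + t_1j = R_j·X_m + t_j must hold, i.e. transforming through the first terminal's frame agrees with transforming directly from the mark coordinate system.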
And S106, based on the relative position and posture between the first terminal and each second terminal, presenting the projection coordinates of each second terminal on the second image in the second image acquired by the first terminal.
Step S105 determines the relative position and posture between the first terminal and each second terminal, that is, the coordinate correspondence of any point in space between the coordinate system corresponding to the first terminal and the coordinate system corresponding to each second terminal at the current moment. The projection coordinates of each second terminal on the second image can therefore be determined, and once a second terminal is projected onto the second image, its position can be displayed on the first terminal.
The manner in which the first terminal acquires the second image may refer to the manner in which the first terminal acquires the first image in step S101.
In this embodiment, the second image may specifically have the same display size and display position as the first image, so that the pixel coordinate system corresponding to the second image and the pixel coordinate system corresponding to the first image may be completely the same. Therefore, the projection coordinates of each second terminal on the second image are specifically: and coordinates of each second terminal in the coordinate system corresponding to the first terminal.
In a specific implementation manner, an origin of a coordinate system corresponding to the second terminal may be used as a reference point of the second terminal in the real scene, and based on the relative position and posture between the first terminal and each second terminal, a coordinate of the reference point in the coordinate system corresponding to the first terminal, which corresponds to a coordinate of the reference point in the coordinate system corresponding to the second terminal, is determined, and then the determined coordinate is a projection coordinate of the second terminal on the second image.
Further, each second terminal may send, to the first terminal, the coordinates of the respective reference point in the coordinate system corresponding to the terminal, so that the first terminal may calculate the projection coordinates of the reference point of each second terminal on the second image according to the relative position and posture between the first terminal and each second terminal.
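A sketch of this projection step follows. The pinhole intrinsic matrix K of the first terminal, and the convention that (R_1j, t_1j) map first-terminal coordinates to jth-second-terminal coordinates, are assumptions made for illustration:

```python
import numpy as np

def reference_point_projection(R_1j, t_1j, K):
    """Pixel coordinates, in the first terminal's image, of the origin of the
    jth second terminal's coordinate system.  Since (R_1j, t_1j) map
    first-terminal coordinates to jth-terminal coordinates, the jth origin
    expressed in the first terminal's frame is -R_1j^T t_1j."""
    p_cam = -R_1j.T @ t_1j           # reference point in first-terminal frame
    uvw = K @ p_cam                  # homogeneous pixel coordinates
    return uvw[0] / uvw[2], uvw[1] / uvw[2]
```

The returned (u, v) is the projection coordinate at which the second terminal's reference point is presented in the second image.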
According to the augmented reality interaction method based on the marker, a first terminal in an augmented reality interaction system firstly obtains a first image containing the marker; determining at least four corner points marked in the first image according to a preset corner point detection algorithm, and determining corner point coordinates of each corner point; then, determining the relative position posture between the first terminal and the mark based on the corner point coordinates of each corner point; acquiring relative position postures between each second terminal and the marker in the augmented reality interaction system; finally, determining the relative position posture between the first terminal and each second terminal according to the relative position posture between the first terminal and the mark and the relative position posture between each second terminal and the mark; and presenting the projection coordinates of each second terminal on the second image in the second image acquired by the first terminal based on the relative position posture between the first terminal and each second terminal. The embodiment of the invention can enable each terminal to display the motion states of other terminals in the scene of the terminal, thereby realizing real-time augmented reality interaction of a plurality of terminals.
Fig. 5 is a flowchart of another augmented reality interaction method based on markers according to an embodiment of the present invention, where the method includes the following steps:
S201, a first image containing a mark is acquired.
S202, determining at least four corner points marked in the first image according to a preset corner point detection algorithm, and determining corner point coordinates of each corner point.
S203, determining the relative position posture between the first terminal and the mark based on the corner point coordinates of each corner point.
S204, acquiring the relative position and posture between each second terminal and the mark.
And S205, determining the relative position and posture between the first terminal and each second terminal according to the relative position and posture between the first terminal and the mark and the relative position and posture between each second terminal and the mark.
In this embodiment, the steps S201 to S205 may be respectively identical to the steps S101 to S105 in the embodiment shown in fig. 1, and are not described herein again.
And S206, acquiring the target coordinates of any point on each second terminal in the coordinate system corresponding to the terminal at the current moment.
In practical applications, the user of the first terminal may select, as needed, any point on a second terminal displayed on the first terminal. In this embodiment, any point on the second terminal may be any point of the second terminal itself, or any point on the body of the user of the second terminal. For example, if the second terminal is a pair of Augmented Reality (AR) glasses, any point on the second terminal may be the center point of the AR glasses. As another example, if the user of the second terminal is an athlete, any point on the second terminal may be the center point of the athlete's right hand.
The second terminal may send the target coordinates of any point on the terminal in the coordinate system corresponding to the terminal to the first terminal.
S207, determining target projection coordinates of the target coordinates on the second image based on the relative position and posture of the first terminal and each second terminal, and presenting the target projection coordinates in the second image acquired by the first terminal.
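Steps S206-S207 can be sketched analogously to the reference-point case: the target coordinates received from a second terminal are transformed into the first terminal's frame and then projected. The pinhole intrinsic matrix K and the direction of the (R_1j, t_1j) transform (first terminal → jth second terminal) are assumptions for illustration:

```python
import numpy as np

def target_projection(p_j, R_1j, t_1j, K):
    """Project target coordinates p_j, given in the jth second terminal's
    coordinate system, into the first terminal's second image.  The
    first->jth transform (R_1j, t_1j) is inverted to express p_j in the
    first terminal's frame before applying the intrinsic matrix K."""
    p_cam = R_1j.T @ (np.asarray(p_j, float) - t_1j)
    uvw = K @ p_cam
    return uvw[0] / uvw[2], uvw[1] / uvw[2]
```

With p_j set to the origin of the jth coordinate system, this reduces to the reference-point projection of the previous embodiment.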
According to the augmented reality interaction method based on the marker, a first terminal in an augmented reality interaction system firstly obtains a first image containing the marker; determining at least four corner points marked in the first image according to a preset corner point detection algorithm, and determining corner point coordinates of each corner point; then, determining the relative position posture between the first terminal and the mark based on the corner point coordinates of each corner point; acquiring relative position postures between each second terminal and the marker in the augmented reality interaction system; finally, determining the relative position posture between the first terminal and each second terminal according to the relative position posture between the first terminal and the mark and the relative position posture between each second terminal and the mark; and presenting the projection coordinates of each second terminal on the second image in the second image acquired by the first terminal based on the relative position posture between the first terminal and each second terminal. The embodiment of the invention can enable each terminal to display the motion states of other terminals in the scene of the terminal, thereby realizing real-time augmented reality interaction of a plurality of terminals.
Fig. 6 is a structural diagram of an augmented reality interaction apparatus based on a marker according to an embodiment of the present invention, where the apparatus is applied to a first terminal in an augmented reality interaction system, the augmented reality interaction system further includes at least one second terminal, and the apparatus includes: a first acquisition unit 301, a first determination unit 302, a second determination unit 303, a second acquisition unit 304, a third determination unit 305, and a presentation unit 306; wherein,
a first acquisition unit 301 for acquiring a first image containing a marker;
a first determining unit 302, configured to determine at least four corner points marked in the first image according to a preset corner point detection algorithm, and determine corner point coordinates of each corner point;
a second determining unit 303, configured to determine a relative position and orientation between the first terminal and the mark based on the corner coordinates of each corner point;
a second acquiring unit 304, configured to acquire a relative position and posture between each second terminal and the mark;
a third determination unit 305 configured to determine a relative position posture between the first terminal and each of the second terminals according to the relative position posture between the first terminal and the marker and the relative position posture between each of the second terminals and the marker;
and a presenting unit 306, configured to present, in a second image acquired by the first terminal, projection coordinates of each second terminal on the second image based on the relative position and posture between the first terminal and each second terminal.
According to the augmented reality interaction device based on the marker, the first terminal firstly obtains a first image containing the marker; determining at least four corner points of the mark in the first image according to a preset corner point detection algorithm, and determining corner point coordinates of each corner point; then determining the relative position posture between the first terminal and the mark based on the corner point coordinates of each corner point; acquiring relative position gestures between each second terminal and the marker in the augmented reality interaction system; finally, determining the relative position and posture between the first terminal and each second terminal according to the relative position and posture between the first terminal and the mark and the relative position and posture between each second terminal and the mark; and presenting the projection coordinates of each second terminal on the second image in the second image acquired by the first terminal based on the relative position posture between the first terminal and each second terminal. The embodiment of the invention can enable each terminal to display the motion states of other terminals in the scene of the terminal, thereby realizing real-time augmented reality interaction of a plurality of terminals.
In an implementation manner, the first determining unit 302 is specifically configured to determine a first coordinate of each corner point in a coordinate system corresponding to the first terminal at the current time and a second coordinate of each corner point in a pre-established marker coordinate system; wherein,
the mark coordinate system is as follows: the x-axis corresponds to a first direction of the mark, the y-axis corresponds to a second direction of the mark, the first direction is perpendicular to the second direction, and the z-axis is perpendicular to a plane defined by the x-axis and the y-axis; the origin of the marked coordinate system is the central point of the mark;
the coordinate system corresponding to the first terminal is as follows: the x-axis corresponds to a third direction of the first terminal, the y-axis corresponds to a fourth direction of the first terminal, and the third direction is perpendicular to the fourth direction; the origin of the coordinate system corresponding to the first terminal is the central point of the first terminal.
In an implementation manner, the second determining unit 303 is specifically configured to determine, based on corner point coordinates of each corner point, a rotation matrix and a translation vector of a coordinate system corresponding to the first terminal from a pre-established marker coordinate system to a current time by using a PnP method;
a second obtaining unit 304, specifically configured to obtain a rotation matrix and a translation vector of a coordinate system corresponding to each second terminal from a pre-established marker coordinate system to a current time; wherein,
the coordinate system corresponding to each second terminal is as follows: the x-axis corresponds to a fifth direction of the respective second terminal, the y-axis corresponds to a sixth direction of the respective second terminal, and the fifth direction is perpendicular to the sixth direction; the origin of the coordinate system corresponding to each second terminal is the central point of each second terminal;
the third determining unit 305 is specifically configured to determine, according to the rotation matrix and the translation vector of the coordinate system corresponding to the first terminal from the pre-established labeled coordinate system to the current time and the rotation matrix and the translation vector of the coordinate system corresponding to each second terminal from the pre-established labeled coordinate system to the current time, the rotation matrix and the translation vector of the coordinate system corresponding to each second terminal from the current time to the coordinate system corresponding to the first terminal at the current time.
In one implementation, the second obtaining unit 304 is specifically configured to receive the relative position and orientation between each second terminal and the marker, which is sent by each second terminal.
Fig. 7 is a structural diagram of another augmented reality interaction device based on a marker according to an embodiment of the present invention, where the device is applied to a first terminal in an augmented reality interaction system, the augmented reality interaction system further includes at least one second terminal, and the device includes: a first acquisition unit 401, a first determination unit 402, a second determination unit 403, a second acquisition unit 404, a third determination unit 405, a third acquisition unit 406, and a presentation unit 407; the first obtaining unit 401, the first determining unit 402, the second determining unit 403, the second obtaining unit 404, and the third determining unit 405 may be respectively identical to the first obtaining unit 301, the first determining unit 302, the second determining unit 303, the second obtaining unit 304, and the third determining unit 305 in the embodiment shown in fig. 6, and are not repeated here.
In this embodiment, the third obtaining unit 406 is configured to obtain, before the presenting unit presents, in the second image obtained by the first terminal, the projection coordinates of each second terminal on the second image based on the relative position and posture between the first terminal and each second terminal, a target coordinate of any point on each second terminal in the coordinate system corresponding to the terminal at the current time.
A presenting unit 407, configured to determine, based on the relative position and posture between the first terminal and each of the second terminals, target projection coordinates of the target coordinates on the second image, and present the target projection coordinates in the second image acquired by the first terminal.
According to the augmented reality interaction device based on the marker, the first terminal firstly obtains a first image containing the marker; determining at least four corner points of the mark in the first image according to a preset corner point detection algorithm, and determining corner point coordinates of each corner point; then determining the relative position posture between the first terminal and the mark based on the corner point coordinates of each corner point; acquiring relative position gestures between each second terminal and the marker in the augmented reality interaction system; finally, determining the relative position and posture between the first terminal and each second terminal according to the relative position and posture between the first terminal and the mark and the relative position and posture between each second terminal and the mark; and presenting the projection coordinates of each second terminal on the second image in the second image acquired by the first terminal based on the relative position posture between the first terminal and each second terminal. The embodiment of the invention can enable each terminal to display the motion states of other terminals in the scene of the terminal, thereby realizing real-time augmented reality interaction of a plurality of terminals.
An embodiment of the present invention further provides an electronic device, as shown in fig. 8, which includes a processor 501, a communication interface 502, a memory 503 and a communication bus 504, where the processor 501, the communication interface 502 and the memory 503 complete mutual communication through the communication bus 504,
a memory 503 for storing a computer program;
the processor 501, when executing the program stored in the memory 503, implements the following steps:
acquiring a first image containing a marker;
determining at least four corner points marked in the first image according to a preset corner point detection algorithm, and determining corner point coordinates of each corner point;
determining the relative position posture between the first terminal and the mark based on the corner point coordinates of each corner point;
acquiring the relative position and posture between each second terminal and the mark;
determining the relative position posture between the first terminal and each second terminal according to the relative position posture between the first terminal and the mark and the relative position posture between each second terminal and the mark;
and presenting the projection coordinates of each second terminal on the second image in the second image acquired by the first terminal based on the relative position posture between the first terminal and each second terminal.
In the electronic device provided by the embodiment of the invention, the processor enables the first terminal in the augmented reality interaction system to firstly acquire the first image containing the marker by executing the program stored in the memory; determining at least four corner points marked in the first image according to a preset corner point detection algorithm, and determining corner point coordinates of each corner point; then, determining the relative position posture between the first terminal and the mark based on the corner point coordinates of each corner point; acquiring relative position postures between each second terminal and the marker in the augmented reality interaction system; finally, determining the relative position posture between the first terminal and each second terminal according to the relative position posture between the first terminal and the mark and the relative position posture between each second terminal and the mark; and presenting the projection coordinates of each second terminal on the second image in the second image acquired by the first terminal based on the relative position posture between the first terminal and each second terminal. The embodiment of the invention can enable each terminal to display the motion states of other terminals in the scene of the terminal, thereby realizing real-time augmented reality interaction of a plurality of terminals.
The communication bus 504 mentioned above for the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus 504 may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 8, but this is not intended to represent only one bus or type of bus.
The communication interface 502 is used for communication between the above-described electronic apparatus and other apparatuses.
The Memory 503 may include a Random Access Memory (RAM) or a non-volatile Memory (non-volatile Memory), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The processor 501 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In yet another embodiment of the present invention, a computer-readable storage medium is further provided, which has instructions stored therein, and when the instructions are executed on a computer, the instructions cause the computer to perform any one of the above-mentioned tag-based augmented reality interaction methods.
When the instruction stored in the computer-readable storage medium provided by the embodiment of the invention runs on a computer, a first terminal in an augmented reality interaction system firstly acquires a first image containing a marker; determining at least four corner points marked in the first image according to a preset corner point detection algorithm, and determining corner point coordinates of each corner point; then, determining the relative position posture between the first terminal and the mark based on the corner point coordinates of each corner point; acquiring relative position postures between each second terminal and the marker in the augmented reality interaction system; finally, determining the relative position posture between the first terminal and each second terminal according to the relative position posture between the first terminal and the mark and the relative position posture between each second terminal and the mark; and presenting the projection coordinates of each second terminal on the second image in the second image acquired by the first terminal based on the relative position posture between the first terminal and each second terminal. The embodiment of the invention can enable each terminal to display the motion states of other terminals in the scene of the terminal, thereby realizing real-time augmented reality interaction of a plurality of terminals.
In yet another embodiment, a computer program product containing instructions is provided which, when run on a computer, causes the computer to perform the marker-based augmented reality interaction method of any one of the above embodiments.
When the computer program product containing instructions provided by the embodiment of the invention runs on a computer, a first terminal in an augmented reality interaction system first acquires a first image containing a marker; determines at least four corner points of the marker in the first image according to a preset corner-point detection algorithm, and determines the corner coordinates of each corner point; then determines the relative position and posture between the first terminal and the marker based on those corner coordinates; acquires the relative position and posture between each second terminal in the augmented reality interaction system and the marker; and finally determines the relative position and posture between the first terminal and each second terminal from the relative position and posture between the first terminal and the marker and that between each second terminal and the marker, and presents, in a second image acquired by the first terminal, the projection coordinates of each second terminal on that image based on the relative position and posture between the first terminal and each second terminal. The embodiment of the invention thus enables each terminal to display the motion states of the other terminals in its own scene, realizing real-time augmented reality interaction among multiple terminals.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one web site, computer, server, or data center to another by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid-state drive (SSD)), among others.
It is noted that, herein, relational terms such as "first" and "second" are used solely to distinguish one entity or action from another and do not necessarily require or imply any actual such relationship or order between those entities or actions. Moreover, the terms "comprises," "comprising," and any variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by "comprises a," without further limitation, does not preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises that element.
The embodiments in this specification are described in an interrelated manner; identical or similar parts of the embodiments may be cross-referenced, and each embodiment focuses on its differences from the others. In particular, the apparatus, electronic device, storage medium, and computer program product embodiments are described relatively briefly because they are substantially similar to the method embodiments; for relevant details, refer to the corresponding parts of the method embodiment descriptions.
The above description covers only preferred embodiments of the present invention and is not intended to limit its scope. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.

Claims (11)

1. An augmented reality interaction method based on a marker, applied to a first terminal in an augmented reality interaction system, wherein the augmented reality interaction system further comprises at least one second terminal, and the method comprises the following steps:
acquiring a first image containing a marker;
determining at least four corner points of the mark in the first image according to a preset corner point detection algorithm, and determining corner point coordinates of each corner point;
determining a relative position and posture between the first terminal and the mark based on the corner point coordinates of each corner point;
acquiring the relative position and posture between each second terminal and the mark;
determining the relative position and posture between the first terminal and each second terminal according to the relative position and posture between the first terminal and the mark and the relative position and posture between each second terminal and the mark;
and presenting the projection coordinates of each second terminal on the second image in the second image acquired by the first terminal based on the relative position and posture between the first terminal and each second terminal.
2. The method according to claim 1, wherein said determining corner coordinates of each of said corner points comprises:
determining a first coordinate of each corner point in a coordinate system corresponding to the first terminal at the current moment and a second coordinate of each corner point in a pre-established marker coordinate system; wherein:
the mark coordinate system is as follows: the x-axis corresponds to a first direction of the mark, the y-axis corresponds to a second direction of the mark, the first direction is perpendicular to the second direction, and the z-axis is perpendicular to a plane defined by the x-axis and the y-axis; the origin of the mark coordinate system is the central point of the mark;
the coordinate system corresponding to the first terminal is as follows: the x-axis corresponds to a third direction of the first terminal, the y-axis corresponds to a fourth direction of the first terminal, the third direction being perpendicular to the fourth direction; and the origin of the coordinate system corresponding to the first terminal is the central point of the first terminal.
3. The method according to claim 2, wherein said determining a relative position and orientation between said first terminal and said marker based on corner coordinates of each of said corner points comprises:
determining, based on the corner point coordinates of each corner point, a rotation matrix and a translation vector from the pre-established marker coordinate system to the coordinate system corresponding to the first terminal at the current moment by a perspective-n-point (PnP) method;
the acquiring the relative position and posture between each second terminal and the mark comprises:
acquiring a rotation matrix and a translation vector from the pre-established marker coordinate system to the coordinate system corresponding to each second terminal at the current moment; wherein:
the coordinate system corresponding to each second terminal is as follows: the x-axis corresponds to a fifth direction of each of the second terminals, the y-axis corresponds to a sixth direction of each of the second terminals, and the fifth direction is perpendicular to the sixth direction; the origin of the coordinate system corresponding to each second terminal is the central point of each second terminal;
determining the relative position and posture between the first terminal and each second terminal according to the relative position and posture between the first terminal and the mark and the relative position and posture between each second terminal and the mark comprises:
and determining the rotation matrix and the translation vector from the coordinate system corresponding to each second terminal at the current moment to the coordinate system corresponding to the first terminal at the current moment, according to the rotation matrix and the translation vector from the pre-established marker coordinate system to the coordinate system corresponding to the first terminal at the current moment and the rotation matrix and the translation vector from the pre-established marker coordinate system to the coordinate system corresponding to each second terminal at the current moment.
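The composition step above — combining the marker-to-first-terminal pose with the marker-to-second-terminal pose to obtain the second-to-first-terminal pose — reduces to a few lines of matrix algebra. A hedged NumPy sketch, with poses written as a rotation matrix `R` and translation vector `t` mapping marker coordinates into each terminal's frame (names are illustrative, not from the patent):

```python
import numpy as np

def relative_pose(R_a, t_a, R_b, t_b):
    """Given the marker->A pose (R_a, t_a) and marker->B pose (R_b, t_b),
    return the B->A pose (R_ab, t_ab).

    From X_a = R_a X_m + t_a and X_b = R_b X_m + t_b it follows that
    X_a = (R_a R_b^T) X_b + (t_a - R_a R_b^T t_b)."""
    R_ab = R_a @ R_b.T
    t_ab = t_a - R_ab @ t_b
    return R_ab, t_ab
```

Because both terminals express their poses in the one shared marker coordinate system, this composition needs no direct measurement between the terminals; each side only ever estimates its own pose relative to the marker.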
4. The method of claim 1, wherein the acquiring of the relative position and posture between each second terminal and the marker comprises:
and receiving the relative position and posture between each second terminal and the mark, which are sent by each second terminal.
5. The method of claim 1, wherein prior to the step of presenting the projected coordinates of each of the second terminals on the second image in the second image acquired by the first terminal based on the relative position and posture between the first terminal and each of the second terminals, the method further comprises:
acquiring a target coordinate of any point on each second terminal in a coordinate system corresponding to the terminal at the current moment;
the presenting, in a second image acquired by the first terminal, projection coordinates of each second terminal on the second image based on the relative position and posture between the first terminal and each second terminal includes:
and determining target projection coordinates of the target coordinates on the second image based on the relative position and posture between the first terminal and each second terminal, and presenting the target projection coordinates in the second image acquired by the first terminal.
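The presentation step described above — mapping a target point given in a second terminal's coordinate system into the first terminal's image — is an ordinary pinhole projection once the relative pose is known. A small illustrative sketch; the intrinsic matrix `K` and the frame conventions are our assumptions:

```python
import numpy as np

def project_point(X_b, R_ab, t_ab, K):
    """Project a point given in the second terminal's coordinate system
    into the first terminal's image, using the relative pose (R_ab, t_ab)
    and the first terminal's intrinsic matrix K."""
    X_a = R_ab @ X_b + t_ab   # second-terminal frame -> first-terminal frame
    if X_a[2] <= 0:
        raise ValueError("point lies behind the first terminal's camera")
    p = K @ X_a               # homogeneous image coordinates
    return p[:2] / p[2]       # pixel coordinates (u, v)
```

For instance, with an identity relative pose and a point one metre straight ahead, the projection lands on the principal point of the example intrinsics.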
6. An augmented reality interaction device based on a marker, which is applied to a first terminal in an augmented reality interaction system, wherein the augmented reality interaction system further comprises at least one second terminal, and the device comprises:
a first acquisition unit configured to acquire a first image including a mark;
a first determining unit, configured to determine at least four corner points of the mark in the first image according to a preset corner point detection algorithm, and determine corner point coordinates of each of the corner points;
a second determining unit, configured to determine a relative position and orientation between the first terminal and the marker based on corner coordinates of each of the corners;
a second acquisition unit configured to acquire a relative position and orientation between each of the second terminals and the marker;
a third determining unit, configured to determine a relative position and posture between the first terminal and each of the second terminals according to a relative position and posture between the first terminal and the marker and a relative position and posture between each of the second terminals and the marker;
and the presenting unit is used for presenting the projection coordinates of each second terminal on the second image in the second image acquired by the first terminal based on the relative position posture between the first terminal and each second terminal.
7. The apparatus of claim 6, wherein
the first determining unit is specifically configured to determine a first coordinate of each corner point in a coordinate system corresponding to the first terminal at the current moment and a second coordinate of each corner point in a pre-established marker coordinate system; wherein:
the mark coordinate system is as follows: the x-axis corresponds to a first direction of the mark, the y-axis corresponds to a second direction of the mark, the first direction is perpendicular to the second direction, and the z-axis is perpendicular to a plane defined by the x-axis and the y-axis; the origin of the mark coordinate system is the central point of the mark;
the coordinate system corresponding to the first terminal is as follows: the x-axis corresponds to a third direction of the first terminal, the y-axis corresponds to a fourth direction of the first terminal, the third direction being perpendicular to the fourth direction; and the origin of the coordinate system corresponding to the first terminal is the central point of the first terminal.
8. The apparatus of claim 7, wherein
the second determining unit is specifically configured to determine, based on the corner point coordinates of each corner point, a rotation matrix and a translation vector from the pre-established marker coordinate system to the coordinate system corresponding to the first terminal at the current moment by a perspective-n-point (PnP) method;
the second obtaining unit is specifically configured to acquire a rotation matrix and a translation vector from the pre-established marker coordinate system to the coordinate system corresponding to each second terminal at the current moment; wherein:
the coordinate system corresponding to each second terminal is as follows: the x-axis corresponds to a fifth direction of each of the second terminals, the y-axis corresponds to a sixth direction of each of the second terminals, and the fifth direction is perpendicular to the sixth direction; the origin of the coordinate system corresponding to each second terminal is the central point of each second terminal;
the third determining unit is specifically configured to determine the rotation matrix and the translation vector from the coordinate system corresponding to each second terminal at the current moment to the coordinate system corresponding to the first terminal at the current moment, according to the rotation matrix and the translation vector from the pre-established marker coordinate system to the coordinate system corresponding to the first terminal at the current moment and the rotation matrix and the translation vector from the pre-established marker coordinate system to the coordinate system corresponding to each second terminal at the current moment.
9. The apparatus according to claim 6, wherein the second obtaining unit is specifically configured to receive the relative position and orientation between each of the second terminals and the marker, which is sent by each of the second terminals.
10. The apparatus of claim 6, further comprising a third obtaining unit;
the third obtaining unit is configured to acquire a target coordinate of any point on each second terminal in the coordinate system corresponding to that terminal at the current moment, before the presenting unit presents, in a second image acquired by the first terminal, the projection coordinates of each second terminal on the second image based on the relative position and posture between the first terminal and each second terminal;
the presenting unit is specifically configured to determine target projection coordinates of the target coordinates on the second image based on the relative position and posture between the first terminal and each of the second terminals, and present the target projection coordinates in the second image acquired by the first terminal.
11. An electronic device, characterized by comprising a processor, a communication interface, a memory, and a communication bus, wherein the processor, the communication interface, and the memory communicate with one another through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of any one of claims 1 to 5 when executing a program stored in the memory.
CN201810128421.0A 2018-02-08 2018-02-08 Augmented reality interaction method and device based on mark and electronic equipment Active CN108399638B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810128421.0A CN108399638B (en) 2018-02-08 2018-02-08 Augmented reality interaction method and device based on mark and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810128421.0A CN108399638B (en) 2018-02-08 2018-02-08 Augmented reality interaction method and device based on mark and electronic equipment

Publications (2)

Publication Number Publication Date
CN108399638A CN108399638A (en) 2018-08-14
CN108399638B true CN108399638B (en) 2021-07-20

Family

ID=63096299

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810128421.0A Active CN108399638B (en) 2018-02-08 2018-02-08 Augmented reality interaction method and device based on mark and electronic equipment

Country Status (1)

Country Link
CN (1) CN108399638B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109700550B (en) * 2019-01-22 2020-06-26 雅客智慧(北京)科技有限公司 Augmented reality method and device for dental surgery

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102142055A (en) * 2011-04-07 2011-08-03 上海大学 True three-dimensional design method based on augmented reality interactive technology
CN103942796A (en) * 2014-04-23 2014-07-23 清华大学 High-precision projector and camera calibration system and method
CN105021184A (en) * 2015-07-08 2015-11-04 西安电子科技大学 Pose estimation system and method for visual carrier landing navigation on mobile platform
CN105188516A (en) * 2013-03-11 2015-12-23 奇跃公司 System and method for augmented and virtual reality
CN106774870A (en) * 2016-12-09 2017-05-31 武汉秀宝软件有限公司 A kind of augmented reality exchange method and system
CN107038722A (en) * 2016-02-02 2017-08-11 深圳超多维光电子有限公司 A kind of equipment localization method and device
CN107466008A (en) * 2013-10-22 2017-12-12 华为终端(东莞)有限公司 The information presentation method and mobile terminal of a kind of mobile terminal

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100927009B1 (en) * 2008-02-04 2009-11-16 광주과학기술원 Haptic interaction method and system in augmented reality
WO2016077798A1 (en) * 2014-11-16 2016-05-19 Eonite Perception Inc. Systems and methods for augmented reality preparation, processing, and application

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102142055A (en) * 2011-04-07 2011-08-03 上海大学 True three-dimensional design method based on augmented reality interactive technology
CN105188516A (en) * 2013-03-11 2015-12-23 奇跃公司 System and method for augmented and virtual reality
CN107466008A (en) * 2013-10-22 2017-12-12 华为终端(东莞)有限公司 The information presentation method and mobile terminal of a kind of mobile terminal
CN103942796A (en) * 2014-04-23 2014-07-23 清华大学 High-precision projector and camera calibration system and method
CN105021184A (en) * 2015-07-08 2015-11-04 西安电子科技大学 Pose estimation system and method for visual carrier landing navigation on mobile platform
CN107038722A (en) * 2016-02-02 2017-08-11 深圳超多维光电子有限公司 A kind of equipment localization method and device
CN106774870A (en) * 2016-12-09 2017-05-31 武汉秀宝软件有限公司 A kind of augmented reality exchange method and system

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
"Interactive e-learning system using pattern recognition and augmented reality";Sang Hwa Lee et al.;《IEEE Transactions on Consumer Electronics》;20090531;第55卷(第2期);第883-890页 *
"Outdoor Augmented Reality: State of the Art and Issues";Imane Zendjebil et al.;《10th ACM/IEEE Virtual Reality International Conference (VRIC 2008)》;20080430;第177-187页 *
"基于投影图像自适应校正方法的智能投影系统研究";朱博;《中国博士学位论文全文数据库信息科技辑》;20140615(第06期);第1-150页 *
"基于自然特征点的移动增强现实系统研究";陈智翔;《中国优秀硕士学位论文全文数据库信息科技辑》;20131015(第10期);第1-56页 *
"增强现实摄像机-IMU相对姿态的自动标定研究";姜广浩;《中国优秀硕士学位论文全文数据库信息科技辑》;20150315(第03期);第1-54页 *
《增强现实交互技术的研究》;李丹等;《北京地区高校研究生学术交流会通信与信息技术会议》;20081231;第447-451页 *

Also Published As

Publication number Publication date
CN108399638A (en) 2018-08-14

Similar Documents

Publication Publication Date Title
CN107223269B (en) Three-dimensional scene positioning method and device
US11394950B2 (en) Augmented reality-based remote guidance method and apparatus, terminal, and storage medium
US10964049B2 (en) Method and device for determining pose of camera
US10924729B2 (en) Method and device for calibration
TWI587205B (en) Method and system of three - dimensional interaction based on identification code
CN110782499B (en) Calibration method and calibration device for augmented reality equipment and terminal equipment
US20190096092A1 (en) Method and device for calibration
CN111459269B (en) Augmented reality display method, system and computer readable storage medium
US10769811B2 (en) Space coordinate converting server and method thereof
KR20140054590A (en) Method and apparatus for providing camera calibration
CN111161398B (en) Image generation method, device, equipment and storage medium
CN110807814A (en) Camera pose calculation method, device, equipment and storage medium
CN108430032B (en) Method and equipment for realizing position sharing of VR/AR equipment
CN110807431A (en) Object positioning method and device, electronic equipment and storage medium
CN115830135A (en) Image processing method and device and electronic equipment
CN108399638B (en) Augmented reality interaction method and device based on mark and electronic equipment
CN112017242B (en) Display method and device, equipment and storage medium
CN113763478A (en) Unmanned vehicle camera calibration method, device, equipment, storage medium and system
CN110415196A (en) Method for correcting image, device, electronic equipment and readable storage medium storing program for executing
CN113793392A (en) Camera parameter calibration method and device
CN113706692A (en) Three-dimensional image reconstruction method, three-dimensional image reconstruction device, electronic device, and storage medium
CN112652056A (en) 3D information display method and device
CN113034615A (en) Equipment calibration method for multi-source data fusion and related device
CN111223139A (en) Target positioning method and terminal equipment
CN112446928B (en) External parameter determining system and method for shooting device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 100176 305-9, floor 3, building 6, courtyard 10, KEGU 1st Street, Beijing Economic and Technological Development Zone, Daxing District, Beijing (Yizhuang group, high-end industrial zone, Beijing Pilot Free Trade Zone)

Patentee after: Beijing dream bloom Technology Co.,Ltd.

Address before: 100176 305-9, floor 3, building 6, courtyard 10, KEGU 1st Street, Beijing Economic and Technological Development Zone, Daxing District, Beijing (Yizhuang group, high-end industrial zone, Beijing Pilot Free Trade Zone)

Patentee before: Beijing iqiyi Intelligent Technology Co.,Ltd.

CP03 Change of name, title or address

Address after: 100176 305-9, floor 3, building 6, courtyard 10, KEGU 1st Street, Beijing Economic and Technological Development Zone, Daxing District, Beijing (Yizhuang group, high-end industrial zone, Beijing Pilot Free Trade Zone)

Patentee after: Beijing iqiyi Intelligent Technology Co.,Ltd.

Address before: 401133 room 208, 2 / F, 39 Yonghe Road, Yuzui Town, Jiangbei District, Chongqing

Patentee before: CHONGQING IQIYI INTELLIGENT TECHNOLOGY Co.,Ltd.

PP01 Preservation of patent right

Effective date of registration: 20231009

Granted publication date: 20210720

PD01 Discharge of preservation of patent

Date of cancellation: 20231129

Granted publication date: 20210720
