CN112631431A - AR (augmented reality) glasses pose determination method, device and equipment and storage medium - Google Patents


Info

Publication number
CN112631431A
CN112631431A
Authority
CN
China
Prior art keywords
position information
glasses
coordinate system
marker
detected
Prior art date
Legal status
Granted
Application number
CN202110004365.1A
Other languages
Chinese (zh)
Other versions
CN112631431B (en)
Inventor
游立锦
钟家跃
潘东
李海峰
Current Assignee
Hangzhou Guangli Technology Co ltd
Original Assignee
Hangzhou Guangli Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Guangli Technology Co ltd
Priority to CN202110004365.1A
Publication of CN112631431A
Application granted
Publication of CN112631431B
Legal status: Active
Anticipated expiration


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30244Camera pose


Abstract

The invention provides a method, a device, equipment and a storage medium for determining the pose of AR glasses, so that the determined pose is more accurate. The AR glasses are located inside a movable compartment in which a camera is arranged, and N1 first markers are arranged on the AR glasses. The method comprises the following steps: detecting first markers in an image currently acquired by a first camera; when the number of detected first markers reaches M1, acquiring observed position information of each detected first marker in the image; acquiring first position information of each detected first marker in a first coordinate system, the first coordinate system being the coordinate system applied by the AR glasses; and calculating a current first conversion relation according to the first position information and the observed position information of each detected first marker. The first conversion relation is the conversion from the first coordinate system to a second coordinate system applied by the first camera, and is used for representing the pose of the AR glasses relative to the first camera.

Description

AR (augmented reality) glasses pose determination method, device and equipment and storage medium
Technical Field
The invention relates to the technical field of positioning head-mounted intelligent devices, and in particular to a pose determination method, device, equipment and storage medium for AR glasses.
Background
Augmented reality (AR) is a technology that seamlessly fuses virtual information with the real world: by overlaying and displaying content of the virtual space in real space, the sense of reality is enhanced. The technology draws widely on multimedia, three-dimensional modelling, real-time tracking and registration, intelligent interaction, sensing and other techniques. Computer-generated virtual information such as text, images, three-dimensional models, music and video is simulated and then applied to the real world, where the two kinds of information complement each other, thereby 'augmenting' the real world.
The application scenes of AR glasses fall into outdoor scenes, indoor scenes and movable compartments, and the AR glasses generally need to be positioned in every scene. In an outdoor scene, the position of the AR glasses can be obtained through a GPS carried by the glasses. In an indoor scene, the AR glasses can obtain pose information by performing inside-out SLAM (a spatial positioning method) with built-in sensors such as cameras and inertial measurement units.
At present, inside-out SLAM with internal sensors is also used to obtain pose information in the movable-compartment scene. However, when the compartment is in motion, the light field inside the compartment is affected by the illumination outside, the motion state of objects inside the compartment is affected by the speed of the compartment, and the environment outside the compartment moves relative to the glasses. All of these reduce the accuracy of the pose information obtained by inside-out SLAM with internal sensors, so accurate pose information of the AR glasses cannot be obtained when this approach is applied to the movable-compartment scene.
Disclosure of Invention
The invention provides a method, a device, equipment and a storage medium for determining the pose of AR glasses, so that the determined pose is more accurate.
The invention provides a pose determination method for AR glasses, wherein the AR glasses are located inside a movable compartment, and at least one camera is arranged in the movable compartment; N1 first markers are arranged on the AR glasses, and N1 is greater than 1; the method comprises the following steps:
detecting a first marker from an image currently acquired by a first camera, wherein the first camera is any one camera arranged in the movable carriage body;
acquiring observation position information of each detected first marker in the image when the number of detected first markers reaches M1; the M1 is greater than 1 and less than or equal to N1;
acquiring first position information of a detected first marker in a first coordinate system, wherein the first coordinate system is a coordinate system applied to the AR glasses;
calculating a current first conversion relation according to the first position information and the observation position information of each detected first marker; the first conversion relation is a conversion relation from the first coordinate system to a second coordinate system applied by the first camera, and the first conversion relation is used for representing a pose relation of the AR glasses relative to the first camera.
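The first conversion relation above is a rigid transform (rotation plus translation) from the glasses coordinate system to the camera coordinate system. The following minimal sketch, which is illustrative and not part of the patent disclosure, shows the forward model with hypothetical marker positions, rotation and translation; in practice the rotation and translation are the unknowns solved from the observations:

```python
import numpy as np

# Hypothetical first markers, expressed in the glasses (first) coordinate system.
markers_glasses = np.array([
    [0.05,  0.02, 0.0],
    [-0.05, 0.02, 0.0],
    [0.0,  -0.03, 0.01],
])

# Assumed first rotation matrix (30 degrees about Z) and first translation vector.
theta = np.deg2rad(30.0)
R = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])
t = np.array([0.1, -0.05, 2.0])  # glasses roughly 2 m in front of the camera

# First conversion relation applied to each marker: p_camera = R @ p_glasses + t
markers_camera = markers_glasses @ R.T + t

# All converted points must lie in front of the camera (positive Z).
print(markers_camera[:, 2])  # all close to 2.0
```

With the observed image positions of the same markers, the unknown R and t could then be solved by any perspective-n-point style method.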
In accordance with one embodiment of the present invention,
the first transformation relationship comprises a first rotation matrix and a first translation matrix of the AR glasses from the first coordinate system to the second coordinate system;
the calculating a current first conversion relationship according to the first position information and the observation position information of each detected first marker includes:
acquiring a first initial rotation matrix and a first initial translation matrix which are constructed aiming at the first coordinate system and the second coordinate system, wherein the first initial rotation matrix and the first initial translation matrix have unknowns to be solved;
for each detected first marker, converting first position information of the first marker based on the first initial rotation matrix and the first initial translation matrix to obtain estimated position information of the first marker in the second coordinate system;
and solving the first initial rotation matrix and the first initial translation matrix based on the estimated position information and the observed position information of each detected first marker to obtain the first rotation matrix and the first translation matrix.
According to one embodiment of the present invention, solving the first initial rotation matrix and the first initial translation matrix based on the estimated position information and the observed position information of each detected first marker includes:
converting the observation position information to a first designated plane of the second coordinate system according to the internal reference calibrated by the first camera to obtain second position information;
projecting the estimated position information to the first designated plane to obtain third position information;
and solving the unknowns in the first initial rotation matrix and the first initial translation matrix based on the relationship that the second position information and the third position information of each detected first marker are equal.
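The "equal on the first designated plane" constraint can be pictured with the standard pinhole model: the observed pixel is mapped onto the z = 1 plane of the camera frame via the calibrated intrinsics, while the estimated three-dimensional point is projected onto the same plane by dividing by its depth. A small illustrative sketch with hypothetical intrinsics (not taken from the patent):

```python
import numpy as np

# Hypothetical calibrated intrinsics of the first camera.
fx, fy = 800.0, 800.0   # focal lengths in pixels
cx, cy = 320.0, 240.0   # principal point

def pixel_to_normalized_plane(u, v):
    """Map an observed pixel (u, v) onto the z = 1 plane of the camera frame."""
    return np.array([(u - cx) / fx, (v - cy) / fy, 1.0])

def project_to_normalized_plane(p_cam):
    """Project an estimated 3-D point onto the same z = 1 plane."""
    return p_cam / p_cam[2]

# For an exact estimate the two mappings coincide, which is precisely the
# "second position information equals third position information" relationship.
p = np.array([0.4, -0.2, 2.0])
u = fx * p[0] / p[2] + cx
v = fy * p[1] / p[2] + cy
print(pixel_to_normalized_plane(u, v))  # equals project_to_normalized_plane(p)
```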
According to an embodiment of the present invention, after calculating the current first conversion relationship based on the first position information and the observed position information of each detected first marker, the method further includes:
calculating a second conversion relation according to the position relation between the first camera and other cameras arranged in the movable carriage body and the first conversion relation; the second conversion relationship is a conversion relationship between the first coordinate system applied by the AR glasses and the third coordinate system applied by the other cameras, and the second conversion relationship is used for representing the pose relationship of the AR glasses relative to the other cameras.
According to an embodiment of the present invention, after calculating the current first conversion relationship based on the first position information and the observed position information of each detected first marker, the method further includes:
and determining the situation of the AR glasses from the specified time to the current pose transformation according to the current first transformation relation and the first transformation relation obtained at the specified time.
The invention provides a pose determining method of AR glasses, wherein the AR glasses are arranged in a movable compartment, at least one transmitting device is arranged in the movable compartment, the transmitting device is used for transmitting an indication signal, an X-axis rotation laser signal and a Y-axis rotation laser signal, the indication signal is used for indicating the start of transmitting the X-axis rotation laser signal, the X-axis rotation laser signal is a linear laser signal rotating around a Y axis when the indication signal is transmitted, the Y-axis rotation laser signal is a linear laser signal rotating around the X axis when the X-axis rotation laser signal is transmitted, and the X axis and the Y axis are two different axes in a fourth coordinate system applied by the transmitting device; the AR glasses are provided with N2 detection pieces, wherein N2 is larger than 1; the method comprises the following steps:
acquiring a first moment at which a detection piece detects an indication signal transmitted by a first transmitting device, a second moment at which the X-axis rotation laser signal passes the detection piece, and a third moment at which the Y-axis rotation laser signal passes the detection piece; the first transmitting device is any one of the transmitting devices arranged in the movable compartment;
determining a first included angle corresponding to each detection piece based on the first moment and the second moment, and determining a second included angle corresponding to each detection piece based on the first moment and the third moment; the first included angle is an included angle between a scanning plane when the X-axis rotating laser signal passes through the corresponding detection piece and a second designated plane of the fourth coordinate system, and the second included angle is an included angle between a scanning plane when the Y-axis rotating laser signal passes through the corresponding detection piece and the second designated plane;
determining fourth position information of the detection piece projected to the second appointed plane based on the first included angle and the second included angle corresponding to the detection piece;
acquiring fifth position information of each detection piece in a first coordinate system, wherein the first coordinate system is a coordinate system applied to the AR glasses;
calculating a current third conversion relation according to the fourth position information and the fifth position information of each detection piece; the third transformation relation is a transformation relation between the first coordinate system and the fourth coordinate system, and the third transformation relation is used for representing a pose relation of the AR glasses relative to the first transmitting device.
According to an embodiment of the present invention, determining the first included angle corresponding to each detecting element based on the first time and the second time includes:
calculating a first time difference between the second time and the first time;
and determining a first included angle corresponding to the detection piece according to the first time difference and an X-axis rotation angular velocity corresponding to the X-axis rotation laser signal.
According to an embodiment of the present invention, determining the second included angle corresponding to each detecting element based on the first time and the third time includes:
calculating the time difference between the third moment and the first moment, and subtracting a set time length from that time difference to obtain a second time difference, wherein the set time length is the duration for which the first transmitting device transmits the X-axis rotation laser signal;
and determining a second included angle corresponding to the detection piece according to the second time difference and the Y-axis rotation angular velocity corresponding to the Y-axis rotation laser signal.
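Both included angles thus reduce to (angular velocity) x (elapsed time since the relevant sweep began). A small illustrative sketch with hypothetical sweep rates and timings (not taken from the patent):

```python
import numpy as np

# Hypothetical timing parameters of the first transmitting device.
omega_x = 2 * np.pi * 60.0      # X-axis sweep angular velocity (rad/s), 60 Hz
omega_y = 2 * np.pi * 60.0      # Y-axis sweep angular velocity (rad/s)
x_sweep_duration = 1.0 / 120.0  # set duration of the X-axis sweep (s)

def sweep_angles(t1, t2, t3):
    """Angles of the two scanning planes when they pass one detection piece.

    t1: first moment (indication signal detected)
    t2: second moment (X-axis rotation laser passes the detection piece)
    t3: third moment (Y-axis rotation laser passes the detection piece)
    """
    angle_x = omega_x * (t2 - t1)                     # first included angle
    angle_y = omega_y * (t3 - t1 - x_sweep_duration)  # second included angle
    return angle_x, angle_y

ax, ay = sweep_angles(0.0, 1.0 / 480.0, x_sweep_duration + 1.0 / 480.0)
print(np.rad2deg(ax), np.rad2deg(ay))  # both close to 45 degrees
```

Given the two angles, the detection piece's projection onto the second designated plane follows from the geometry of the two scanning planes (for example via their tangents).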
According to an embodiment of the present invention, the third transformation relationship comprises a second rotation matrix and a second translation matrix of the AR glasses from the first coordinate system to the fourth coordinate system;
calculating a current third conversion relation according to the fourth position information and the fifth position information of each detection piece, wherein the third conversion relation comprises the following steps:
acquiring a second initial rotation matrix and a second initial translation matrix which are constructed aiming at the first coordinate system and the fourth coordinate system, wherein the second initial rotation matrix and the second initial translation matrix have unknowns to be solved;
for each detection piece, converting fifth position information of the detection piece based on the second initial rotation matrix and the second initial translation matrix to obtain sixth position information of the detection piece in the fourth coordinate system;
and solving the second initial rotation matrix and the second initial translation matrix based on the equal relationship between the fourth position information and the sixth position information of each detection piece to obtain the second rotation matrix and the second translation matrix.
According to an embodiment of the present invention, after calculating the current third conversion relationship according to the fourth position information and the fifth position information of each detecting element, the method further includes:
calculating a fourth conversion relation according to the position relation between the first transmitting device and other transmitting devices arranged in the movable carriage body and the third conversion relation; the fourth conversion relation is a conversion relation between the first coordinate system applied by the AR glasses and a fifth coordinate system applied by the other transmitting device, and the fourth conversion relation is used for representing the pose relation of the AR glasses relative to the other transmitting device.
According to an embodiment of the present invention, after calculating the current third conversion relationship according to the fourth position information and the fifth position information of each detecting element, the method further includes:
and determining the situation of the AR glasses from the specified time to the current pose transformation according to the current third transformation relation and the third transformation relation obtained at the specified time.
The invention provides a pose determination method of AR glasses, wherein the AR glasses are positioned in a movable compartment, N3 second markers are arranged in the movable compartment, and N3 is greater than 1; the AR glasses have binocular cameras; the method comprises the following steps:
respectively detecting identical second markers from a first image currently acquired by a first eye camera and a second image currently acquired by a second eye camera in the binocular cameras;
determining seventh position information and eighth position information of the detected second marker in the first image and second image, respectively, when the number of detected second markers reaches M3, the M3 being greater than 1 and less than or equal to the N3;
determining a Z-axis coordinate of each detected second marker in a sixth coordinate system applied by a first eye camera based on the seventh position information and the eighth position information, the Z-axis coordinate representing the distance between the detected second marker and the first eye camera;
determining the current three-dimensional coordinate position of the detected second marker in the sixth coordinate system according to the seventh position information and the Z-axis coordinate;
and determining the current pose information of the AR glasses relative to the movable carriage according to the current three-dimensional coordinate position of each detected second marker in the sixth coordinate system.
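When the three-dimensional positions of the detected second markers are known both in the compartment's own layout and, via the steps above, in the sixth coordinate system, one standard way to recover the relative pose is rigid 3-D to 3-D alignment, for example the Kabsch (SVD-based) method. The sketch below uses hypothetical marker coordinates; the patent does not prescribe this particular solver:

```python
import numpy as np

def kabsch(P, Q):
    """Rigid transform (R, t) with Q = P @ R.T + t, via SVD (Kabsch method)."""
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = Q.mean(axis=0) - P.mean(axis=0) @ R.T
    return R, t

# Hypothetical second-marker positions: a known layout in the compartment frame,
# and the same markers as triangulated in the camera (sixth) coordinate system.
P = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
theta = np.deg2rad(20.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.3, -0.1, 1.5])
Q = P @ R_true.T + t_true

R_est, t_est = kabsch(P, Q)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))  # True True
```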
According to an embodiment of the present invention, determining the Z-axis coordinate of the detected second marker in the sixth coordinate system applied by the first eye camera based on the seventh position information and the eighth position information comprises:
projecting the seventh position information onto a third designated plane of the sixth coordinate system to obtain ninth position information;
projecting the eighth position information onto a fourth designated plane of a seventh coordinate system to obtain tenth position information, wherein the seventh coordinate system is a coordinate system applied by the second eye camera;
and determining the Z-axis coordinate of the detected second marker according to the conversion relation between the sixth coordinate system and the seventh coordinate system and the ninth position information and the tenth position information.
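For a rectified binocular pair, the Z-axis coordinate follows directly from the disparity between the two projections: with baseline B along the X axis, x_left - x_right = B / Z on the normalized image planes. A minimal illustrative sketch with a hypothetical baseline (not taken from the patent):

```python
import numpy as np

# Hypothetical rectified binocular pair: identical intrinsics, baseline along X.
baseline = 0.06  # metres between the first and second eye cameras

def depth_from_normalized(x_left, x_right):
    """Z coordinate from matched points on the two normalized image planes.

    For a rectified pair, x_left = X / Z and x_right = (X - baseline) / Z,
    so the disparity x_left - x_right equals baseline / Z.
    """
    return baseline / (x_left - x_right)

# A second marker 1.2 m in front of the first eye camera at X = 0.03 m.
X, Z = 0.03, 1.2
x_l, x_r = X / Z, (X - baseline) / Z
z_est = depth_from_normalized(x_l, x_r)
print(z_est)  # approximately 1.2
p3d = np.array([x_l * z_est, 0.0, z_est])  # back-projected 3-D position
```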
According to an embodiment of the present invention, after determining the current three-dimensional coordinate position of the detected second marker in the sixth coordinate system according to the seventh position information, the eighth position information and the Z-axis coordinate, the method further comprises:
and determining the change condition of the AR glasses from the specified time to the current pose according to the current three-dimensional coordinate position of the detected second marker and the three-dimensional coordinate position of the AR glasses in the sixth coordinate system at the specified time.
The invention provides a pose determination device of AR glasses, wherein the AR glasses are positioned in a movable compartment body, and at least one camera is arranged in the movable compartment body; n1 first markers are arranged on the AR glasses, and N1 is larger than 1; the device includes:
the first marker detection module is used for detecting a first marker from an image currently acquired by a first camera, and the first camera is any one camera arranged in the movable compartment body;
an observation position information acquiring module for acquiring observation position information of each of the detected first markers in the image when the number of detected first markers reaches M1; the M1 is greater than 1 and less than or equal to N1;
the first position information acquisition module is used for acquiring first position information of the detected first marker in a first coordinate system, wherein the first coordinate system is a coordinate system applied to the AR glasses;
the first pose determining module is used for calculating a current first conversion relation according to the first position information and the observation position information of each detected first marker; the first conversion relation is a conversion relation from the first coordinate system to a second coordinate system applied by the first camera, and the first conversion relation is used for representing a pose relation of the AR glasses relative to the first camera.
In accordance with one embodiment of the present invention,
the first transformation relationship comprises a first rotation matrix and a first translation matrix of the AR glasses from the first coordinate system to the second coordinate system;
the first pose determining module is specifically configured to, when calculating the current first conversion relationship according to the first position information and the observation position information of each detected first marker:
acquiring a first initial rotation matrix and a first initial translation matrix which are constructed aiming at the first coordinate system and the second coordinate system, wherein the first initial rotation matrix and the first initial translation matrix have unknowns to be solved;
for each detected first marker, converting first position information of the first marker based on the first initial rotation matrix and the first initial translation matrix to obtain estimated position information of the first marker in the second coordinate system;
and solving the first initial rotation matrix and the first initial translation matrix based on the estimated position information and the observed position information of each detected first marker to obtain the first rotation matrix and the first translation matrix.
According to an embodiment of the present invention, the first pose determination module, when solving the first initial rotation matrix and the first initial translation matrix based on the estimated position information and the observed position information of each detected first marker, is specifically configured to:
converting the observation position information to a first designated plane of the second coordinate system according to the internal reference calibrated by the first camera to obtain second position information;
projecting the estimated position information to the first designated plane to obtain third position information;
and solving the unknowns in the first initial rotation matrix and the first initial translation matrix based on the relationship that the second position information and the third position information of each detected first marker are equal.
According to an embodiment of the invention, the apparatus further comprises:
the first other pose information determining module is used for calculating a second conversion relation according to the position relation between the first camera and other cameras arranged in the movable carriage and the first conversion relation; the second conversion relationship is a conversion relationship between the first coordinate system applied by the AR glasses and the third coordinate system applied by the other cameras, and the second conversion relationship is used for representing the pose relationship of the AR glasses relative to the other cameras.
According to an embodiment of the invention, the apparatus further comprises:
and the first pose transformation determining module is used for determining the transformation condition of the AR glasses from the specified time to the current pose according to the current first transformation relation and the first transformation relation obtained at the specified time.
A fifth aspect of the present invention provides a pose determining apparatus for AR glasses, where the AR glasses are located inside a movable carriage, the movable carriage is provided with at least one transmitting device, the transmitting device is configured to transmit an indication signal, an X-axis rotation laser signal, and a Y-axis rotation laser signal, the indication signal is configured to indicate that an X-axis rotation laser signal starts to be transmitted, the X-axis rotation laser signal is a linear laser signal rotating around a Y-axis when the indication signal is transmitted, the Y-axis rotation laser signal is a linear laser signal rotating around an X-axis when the X-axis rotation laser signal is transmitted, and the X-axis and the Y-axis are two different axes in a fourth coordinate system applied by the transmitting device; the AR glasses are provided with N2 detection pieces, wherein N2 is larger than 1; the device includes:
the time acquisition module is used for acquiring a first moment at which a detection piece detects the indication signal transmitted by the first transmitting device, a second moment at which the X-axis rotation laser signal passes the detection piece, and a third moment at which the Y-axis rotation laser signal passes the detection piece; the first transmitting device is any one of the transmitting devices arranged in the movable compartment;
the included angle determining module is used for determining a first included angle corresponding to each detection piece based on the first moment and the second moment and determining a second included angle corresponding to each detection piece based on the first moment and the third moment; the first included angle is an included angle between a scanning plane when the X-axis rotating laser signal passes through the corresponding detection piece and a second designated plane of the fourth coordinate system, and the second included angle is an included angle between a scanning plane when the Y-axis rotating laser signal passes through the corresponding detection piece and the second designated plane;
the fourth position information determining module is used for determining fourth position information of the detection piece projected to the second designated plane based on the first included angle and the second included angle corresponding to the detection piece;
a fifth position information obtaining module, configured to obtain fifth position information of each detection element in a first coordinate system, where the first coordinate system is a coordinate system applied to the AR glasses;
the second pose determining module is used for calculating a current third conversion relation according to the fourth position information and the fifth position information of each detection piece; the third conversion relation is a conversion relation between the first coordinate system and the fourth coordinate system, and the third conversion relation is used for representing a pose relation of the AR glasses relative to the first transmitting device.
According to an embodiment of the present invention, when the included angle determining module determines the first included angle corresponding to each detecting element based on the first time and the second time, the included angle determining module is specifically configured to:
calculating a first time difference between the second time and the first time;
and determining a first included angle corresponding to the detection piece according to the first time difference and an X-axis rotation angular velocity corresponding to the X-axis rotation laser signal.
According to an embodiment of the present invention, when the included angle determining module determines the second included angle corresponding to each detecting element based on the first time and the third time, the included angle determining module is specifically configured to:
calculating the time difference between the third moment and the first moment, and subtracting a set time length from that time difference to obtain a second time difference, wherein the set time length is the duration for which the first transmitting device transmits the X-axis rotation laser signal;
and determining a second included angle corresponding to the detection piece according to the second time difference and the Y-axis rotation angular velocity corresponding to the Y-axis rotation laser signal.
According to an embodiment of the present invention, the third transformation relationship comprises a second rotation matrix and a second translation matrix of the AR glasses from the first coordinate system to the fourth coordinate system;
the second pose determining module is specifically configured to, when calculating a current third conversion relationship according to the fourth position information and the fifth position information of each detection piece:
acquiring a second initial rotation matrix and a second initial translation matrix which are constructed aiming at the first coordinate system and the fourth coordinate system, wherein the second initial rotation matrix and the second initial translation matrix have unknowns to be solved;
for each detection piece, converting fifth position information of the detection piece based on the second initial rotation matrix and the second initial translation matrix to obtain sixth position information of the detection piece in the fourth coordinate system;
and solving the second initial rotation matrix and the second initial translation matrix based on the equal relationship between the fourth position information and the sixth position information of each detection piece to obtain the second rotation matrix and the second translation matrix.
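Solving a rotation matrix and translation matrix from matched 3D point pairs, as in the three steps above, can be done in closed form. The sketch below uses the standard SVD-based (Kabsch) method as one possible realization; it is not asserted to be the exact solver of the embodiment.

```python
import numpy as np

def rigid_transform_3d(src, dst):
    """Find R, t such that dst ≈ R @ src + t, in the least-squares sense.

    src: Nx3 points in the source coordinate system (fifth position information)
    dst: Nx3 matching points in the target system (fourth position information)
    """
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    # cross-covariance of the centered point sets
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    # guard against a reflection solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t
```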
According to an embodiment of the invention, the apparatus further comprises:
the second other pose information determining module is used for calculating a fourth conversion relation according to the position relation between the first emitting device and other emitting devices arranged in the movable carriage body and the third conversion relation; the fourth conversion relation is a conversion relation between the first coordinate system applied by the AR glasses and a fifth coordinate system applied by the other transmitting device, and the fourth conversion relation is used for representing the pose relation of the AR glasses relative to the other transmitting device.
According to an embodiment of the invention, the apparatus further comprises:
and the second pose transformation determining module is used for determining the pose change of the AR glasses from a specified time to the current time according to the current third conversion relationship and the third conversion relationship obtained at the specified time.
A sixth aspect of the present invention provides a pose determination apparatus for AR glasses, where the AR glasses are located inside a movable compartment, N3 second markers are disposed in the movable compartment, and N3 is greater than 1; the AR glasses have binocular cameras; the device includes:
the second marker detection module is used for respectively detecting the same second markers from a first image currently acquired by a first eye camera and a second image currently acquired by a second eye camera in the binocular cameras;
seventh and eighth position information determination modules for determining seventh position information and eighth position information of the detected second markers in the first image and the second image, respectively, when the number of detected second markers reaches M3, where M3 is greater than 1 and less than or equal to N3;
a Z-axis coordinate determination module for determining, based on the seventh position information and the eighth position information, a Z-axis coordinate of the detected second marker in a sixth coordinate system applied by the first eye camera, the Z-axis coordinate representing the distance between the detected second marker and the first eye camera;
a three-dimensional coordinate position determining module, configured to determine, according to the seventh position information and the Z-axis coordinate, a current three-dimensional coordinate position of the detected second marker in the sixth coordinate system;
and the third pose determining module is used for determining the pose information of the AR glasses relative to the movable carriage according to the current three-dimensional coordinate position of each detected second marker in the sixth coordinate system.
According to an embodiment of the present invention, the Z-axis coordinate determination module, when determining the Z-axis coordinate of the detected second marker in the sixth coordinate system applied by the first eye camera based on the seventh position information and the eighth position information, is specifically configured to:
projecting the seventh position information onto a third designated plane of the sixth coordinate system to obtain ninth position information;
projecting the eighth position information onto a fourth designated plane of a seventh coordinate system to obtain tenth position information, wherein the seventh coordinate system is the coordinate system applied by the second eye camera;
and determining the Z-axis coordinate of the detected second marker according to the conversion relation between the sixth coordinate system and the seventh coordinate system and the ninth position information and the tenth position information.
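As an illustrative sketch of the module chain above, assume a rectified binocular pair in which the seventh coordinate system differs from the sixth only by a horizontal baseline; the Z-axis coordinate and then the full three-dimensional position follow from the two normalized projections (the ninth and tenth position information). The rectified-pair assumption and all names are the sketch's, not the embodiment's.

```python
def stereo_point(u_l, v_l, u_r, fx, fy, cx, cy, baseline):
    """Recover the 3D position of a second marker from a rectified stereo pair.

    (u_l, v_l): seventh position information, pixel in the first eye image
    u_r: horizontal pixel coordinate of the same marker in the second eye image
    baseline: distance between the two camera centers along the X axis
    """
    # project both observations onto the designated planes Z = 1
    x_l = (u_l - cx) / fx          # ninth position information (x component)
    y_l = (v_l - cy) / fy
    x_r = (u_r - cx) / fx          # tenth position information (x component)
    # for a pure horizontal baseline, normalized disparity = baseline / depth
    z = baseline / (x_l - x_r)     # Z-axis coordinate in the sixth system
    return x_l * z, y_l * z, z     # three-dimensional coordinate position
```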
According to an embodiment of the invention, the apparatus further comprises:
and the third pose change determining module is used for determining the pose change of the AR glasses from a specified time to the current time according to the current three-dimensional coordinate position of the detected second marker and the three-dimensional coordinate position of the AR glasses in the sixth coordinate system at the specified time.
A seventh aspect of the present invention provides an electronic device, including a processor and a memory; the memory stores a program that can be called by the processor; wherein the processor, when executing the program, implements the method for determining the pose of the AR glasses according to the foregoing embodiment.
An eighth aspect of the present invention provides a non-transitory electronic device-readable storage medium on which a program is stored, the program, when executed by a processor, implementing the pose determination method of the AR glasses according to the foregoing embodiments.
The invention has the following beneficial effects:
in the embodiments of the invention, a plurality of first markers are arranged on the AR glasses, at least one camera is arranged in the movable compartment, and the first markers are detected in the image collected by the camera. After first markers meeting the quantity requirement are detected, a first conversion relationship from the first coordinate system applied by the AR glasses to a second coordinate system applied by the camera can be determined according to the observed position information of the first markers in the image and the first position information of the first markers in the first coordinate system. The first conversion relationship can represent the pose relationship of the AR glasses relative to the camera; since the position relationship between the camera and the movable compartment is fixed, this determines the pose information of the AR glasses in the movable compartment, with the first markers serving as auxiliary devices. The whole process is essentially unaffected by the movement speed of the movable compartment, so high-precision positioning of the AR glasses can be achieved; the determined pose information is more accurate, providing accurate pose information for the AR glasses to realize stable and reliable augmented display applications.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a schematic structural view of a movable carriage provided with a camera according to an embodiment of the present invention;
FIG. 2 is a schematic structural view of AR glasses provided with a first marker according to an embodiment of the present invention;
fig. 3 is a schematic flowchart of a pose determination method for AR glasses according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a first coordinate system and a second coordinate system according to an embodiment of the invention;
fig. 5 is a flowchart illustrating a pose determination method for AR glasses according to another embodiment of the present invention;
FIG. 6 is a schematic diagram of determining a first angle and a second angle in accordance with one embodiment of the present invention;
fig. 7 is a schematic structural view of a movable carriage provided with a second marker according to an embodiment of the present invention;
fig. 8 is a flowchart illustrating a pose determination method for AR glasses according to another embodiment of the present invention;
FIG. 9 is a schematic illustration of determining Z-axis coordinates in accordance with an embodiment of the present invention;
fig. 10 is a block diagram of the configuration of the pose determination apparatus for AR glasses according to an embodiment of the present invention;
fig. 11 is a block diagram of a pose determination apparatus of AR glasses according to another embodiment of the present invention;
fig. 12 is a block diagram of a configuration of a pose determination apparatus of AR glasses according to still another embodiment of the present invention;
fig. 13 is a block diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The technical solution of the present invention will be described in detail below with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
The method for determining the pose of the AR glasses according to the embodiments of the invention can be applied in a movable-compartment scenario. The movable compartment may be a vehicle, and a wearer may wear the AR glasses while seated in the vehicle. With this pose determination method, the pose information of the AR glasses can be determined accurately no matter how fast the vehicle travels; the method is not affected by the moving speed.
In the embodiments of the present invention, the movable compartment may refer to a vehicle, or to other movable enclosures such as the cockpit or cabin of a subway or high-speed rail train; this is not specifically limited.
A first aspect of the embodiments of the present invention provides a method for determining a pose of AR glasses, and the method for determining a pose of AR glasses according to the first aspect of the embodiments of the present invention is described in detail below.
In the method for determining the pose of the AR glasses provided in the first aspect, the AR glasses are located inside a movable compartment, and at least one camera is disposed in the movable compartment; n1 first markers are arranged on the AR glasses, and N1 is larger than 1.
In the case where a plurality of cameras are provided in the movable compartment, the cameras may be distributed at different positions within it. Referring to fig. 1, the movable compartment 10 may be an automobile in which 4 cameras 21-24 are disposed, one at each of the four corners of the interior, so that the AR glasses can be captured by at least one camera no matter how the wearer turns, and the pose information of the AR glasses in the movable compartment can be determined. It is to be understood that the number of cameras provided in the movable compartment 10 is merely a preferred example, not a limitation; there may be more or fewer.
The plurality of first markers disposed on the AR glasses may be distributed at different locations on the AR glasses. Referring to fig. 2, the AR glasses 30 are provided with 9 first markers 41-49, which may be disposed at different positions on the outer side surface of the frame of the AR glasses 30 (the surface of the frame facing away from the wearer). The first markers 41-49 may be distributed symmetrically about the middle of the outer side surface; one possible distribution is shown in fig. 2, which should not be taken as a limitation. It is to be understood that the number of first markers provided on the AR glasses 30 is merely a preferred example, not a limitation; in practice there may be more or fewer.
Optionally, the AR glasses shown in fig. 2 may further include a binocular camera, and the binocular camera includes a first eye camera 31 and a second eye camera 32, which is not limited specifically. It is understood that the AR glasses may also have other components, such as a processor, a communication module, etc., which are not described in detail herein.
Alternatively, the first marker may be, for example, an infrared emitter, which may emit infrared light, for example in a blinking manner. The frequency bands of the infrared light emitted by different first markers may differ, so different first markers can be distinguished by frequency band; of course, they may also be distinguished in other ways, which is not specifically limited. It is to be understood that the first marker is not limited to an infrared emitter and may be any other marker that can be detected from an image.
In one embodiment, referring to fig. 3, the method includes the steps of:
s100: detecting a first marker from an image currently acquired by a first camera, wherein the first camera is any one camera arranged in the movable carriage body;
s200: acquiring observation position information of each detected first marker in the image when the number of detected first markers reaches M1; the M1 is greater than 1 and less than or equal to N1;
s300: acquiring first position information of a detected first marker in a first coordinate system, wherein the first coordinate system is a coordinate system applied to the AR glasses;
s400: calculating a current first conversion relation according to the first position information and the observation position information of each detected first marker; the first conversion relation is a conversion relation from the first coordinate system to a second coordinate system applied by the first camera, and the first conversion relation is used for representing a pose relation of the AR glasses relative to the first camera.
The execution subject of the pose determination method for the AR glasses may be the AR glasses themselves, specifically a processor in the AR glasses. Of course, the execution subject is not limited to the AR glasses; it may also be a camera on the movable compartment or another processing device, which is not specifically limited. In the following embodiments, the AR glasses are taken as the execution subject by way of example.
Each camera on the movable compartment can capture images at a certain capture period and send the captured images to the AR glasses for corresponding processing. Optionally, wireless communication can be implemented between the cameras and the AR glasses, with corresponding communication modules provided on both; the specific communication mode is not limited, and may be, for example, Bluetooth.
In step S100, a first marker is detected from an image currently captured by a first camera.
The first camera is any one camera arranged in the movable carriage body. For example, the first camera may be any one of the cameras 21 to 24 shown in fig. 1, and of course, images collected by the cameras 21 to 24 may all be sent to the AR glasses, and this embodiment is only explained by using one of the cameras, and the processing manner is the same for the other cameras.
After receiving the image currently captured by the first camera, the AR glasses may detect the first markers from the image and thereby determine how many first markers are present in it. Optionally, different first markers may be distinguished from one another during detection.
In step S200, when the number of detected first markers reaches M1, observation position information of each detected first marker in the image is acquired.
M1 is greater than 1 and less than or equal to N1. Optionally, M1 may be, for example, 4, 5 or 6, and is not particularly limited. The following description will be given taking M1 as 6 as an example.
When the number of detected first markers reaches 6, the 6 first markers in the image are sufficient for determining the pose information of the AR glasses. Therefore, the observed position information of each detected first marker in the image can be acquired; in contrast to the estimated position information introduced later, the observed position information is the position actually detected in the image rather than an estimated value.
Optionally, the first markers in the image may be detected by a neural network, which outputs the position information of each detected first marker; when the number of outputs reaches 6, the position information output by the neural network can be taken as the observed position information of each detected first marker in the image. The position information of each first marker may specifically be the position of its detection box, for example the position of the center point of the detection box, and is not specifically limited.
Of course, the detection method for the first markers in the image is not limited; other methods are also possible. Similarly, the manner of acquiring the observed position information is not limited to the above; in general, the position information of a first marker is obtained as part of detecting it.
In step S300, first position information of the detected first marker in the first coordinate system is acquired.
The first coordinate system is the coordinate system applied by the AR glasses. It may be a three-dimensional rectangular coordinate system, but is not limited thereto. For example, referring to fig. 4, G1 is the first coordinate system, consisting of the three axes X2-Y2-Z2, and m1 is one of the detected first markers.
Since all the first markers are fixed to the AR glasses, the position information of each first marker in the first coordinate system to which the AR glasses are applied can be predetermined, can be stored in a memory in the AR glasses, and can be retrieved from the memory when necessary.
For example, after detecting the first marker from the image, first position information of the detected first marker in the first coordinate system may be acquired from a memory of the AR glasses.
In step S400, a current first conversion relationship is calculated based on the first position information and the observed position information of each detected first marker.
The observed position information of a detected first marker is position information in the image, that is, in the image coordinate system. The image coordinate system is associated with the second coordinate system applied by the first camera, and this association can be determined from the calibrated intrinsic parameters of the first camera.
In the case where the first position information of each detected first marker in the first coordinate system and the observation position information in the image have been determined, the conversion relationship between the first coordinate system applied to the AR glasses and the second coordinate system applied to the first camera, that is, the first conversion relationship, can be determined.
With continued reference to FIG. 4, S1 is a second coordinate system that may consist of three directional axes X1-Y1-Z1.
The first transformation relationship may characterize a pose relationship of the AR glasses relative to the first camera, which is also used to characterize the pose relationship of the AR glasses relative to the movable carriage since the first camera is fixed to the movable carriage.
The first conversion relation can be directly used as the determined pose information of the AR glasses; certainly, a certain conversion may be performed, for example, the first conversion relationship is converted based on the relationship between the second coordinate system applied by the first camera and the coordinate system applied by the movable compartment, and the obtained result is used as the determined pose information of the AR glasses.
In one embodiment, the first transformation relationship includes a first rotation matrix and a first translation matrix of the AR glasses from the first coordinate system to the second coordinate system.
Based on the first rotation matrix and the first translation matrix, the rotation and translation of the AR glasses relative to the first camera, and hence relative to the movable compartment, can be determined; that is, the pose relationship can be determined.
Accordingly, in step S400, the calculating a current first conversion relationship according to the first position information and the observed position information of each detected first marker may include:
s401: acquiring a first initial rotation matrix and a first initial translation matrix which are constructed aiming at the first coordinate system and the second coordinate system, wherein the first initial rotation matrix and the first initial translation matrix have unknowns to be solved;
s402: for each detected first marker, converting first position information of the first marker based on the first initial rotation matrix and the first initial translation matrix to obtain estimated position information of the first marker in the second coordinate system;
s403: and solving the first initial rotation matrix and the first initial translation matrix based on the estimated position information and the observed position information of each detected first marker to obtain the first rotation matrix and the first translation matrix.
For example, with continued reference to fig. 4, assuming that the first position information of one of the detected first markers m1 in the first coordinate system is g1, the first initial rotation matrix is denoted by R1, the first initial translation matrix is denoted by T1, and accordingly, the estimated position information of m1 in the second coordinate system may be:
(Mx1, My1, Mz1) = R1 * g1 + T1;
where Mx1 is a coordinate value in the estimated position information along the X1 axis direction, My1 is a coordinate value in the estimated position information along the Y1 axis direction, and Mz1 is a coordinate value in the estimated position information along the Z1 axis direction.
Other estimated position information of the detected first marker can be represented in the same manner, and will not be described herein.
In this way, based on the relationship between the estimated position information and the observed position information of each detected first marker, a plurality of equations may be established, such that the unknowns in R1 and T1 may be calculated, and the first rotation matrix and the first translation matrix of the AR glasses from the first coordinate system to the second coordinate system, i.e., the first conversion relationship, may be determined by substituting the calculation results into the first initial rotation matrix R1 and the first initial translation matrix T1.
Optionally, in step S403, solving the first initial rotation matrix and the first initial translation matrix based on the estimated position information and the observed position information of each detected first marker may include the following steps:
s4031: converting the observation position information to a first designated plane of the second coordinate system according to the intrinsic parameters calibrated for the first camera to obtain second position information;
s4032: projecting the estimated position information to the first designated plane to obtain third position information;
s4033: and solving the unknowns in the first initial rotation matrix and the first initial translation matrix based on the relationship that the second position information and the third position information of each detected first marker are equal.
In step S4031, the observation position information is converted to the first designated plane of the second coordinate system according to the intrinsic parameters calibrated for the first camera, so as to obtain the second position information.
For example, with continued reference to fig. 4, suppose the observed position information of one detected first marker m1 in the image is (u1, v1). The first designated plane of the second coordinate system may be, for example, the plane Z1 = 1; in fig. 4, the first designated plane extends along the two directions XP-YP. The observed position information is converted to the first designated plane to obtain the second position information; the projection point is, for example, mp in fig. 4, and the second position information of mp on the first designated plane can be calculated according to a preset perspective projection model, as follows:
(xp1, yp1) = ((u1 - cx)/fx, (v1 - cy)/fy);
where cx, cy, fx and fy are intrinsic parameters calibrated for the first camera.
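A minimal sketch of this pixel-to-plane conversion and its inverse (the function names are illustrative):

```python
def pixel_to_normalized(u, v, fx, fy, cx, cy):
    """Project a pixel onto the camera's plane Z = 1 (second position info)."""
    return (u - cx) / fx, (v - cy) / fy

def normalized_to_pixel(xp, yp, fx, fy, cx, cy):
    """Inverse mapping, using the same calibrated intrinsic parameters."""
    return fx * xp + cx, fy * yp + cy
```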
In step S4032, the estimated location information is projected onto the first designated plane to obtain third location information.
Continuing with the example of fig. 4, the estimated position information of m1 is (Mx1, My1, Mz1) = R1 * g1 + T1. After this estimated position information is projected onto the first designated plane, the obtained third position information is (Mx1/Mz1, My1/Mz1, 1); the constant 1 in the Z-axis direction can be ignored in subsequent calculations.
In step S4033, the unknowns in the first initial rotation matrix and the first initial translation matrix are solved based on the relationship that the second position information and the third position information of each detected first marker are equal.
For a detected first marker, two equations may be established based on the second position information and the third position information, one being an equation between coordinate values with respect to the Xp-axis direction and the other being an equation between coordinate values with respect to the Yp-axis direction.
Optionally, R1 may be a 3×3 matrix, containing 9 unknowns, and T1 may be a 3×1 matrix, containing 3 unknowns.
Thus, R1 and T1 contain 12 unknowns in total, and 12 equations are required to solve them all. Since each detected first marker contributes two equations, when the number of detected first markers reaches 6, all unknowns in the first initial rotation matrix R1 and the first initial translation matrix T1 can be solved, thereby obtaining the required first rotation matrix and first translation matrix.
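After multiplying each equality through by the denominator Mz1, every equation becomes linear in the 12 unknowns, so the system can be solved as a direct linear transform (DLT). The sketch below is one possible realization, not asserted to be the embodiment's solver: the homogeneous system is solved up to scale via SVD, the scale is fixed using the unit norm of the third rotation row, and exact, non-degenerate (non-coplanar) marker positions are assumed.

```python
import numpy as np

def solve_pose_dlt(first_positions, pixels, fx, fy, cx, cy):
    """Solve the 12 unknowns of R1 (3x3) and T1 (3x1) from >= 6 markers.

    first_positions: marker coordinates in the first (glasses) coordinate system
    pixels: observed (u, v) positions of the same markers in the image
    """
    rows = []
    for (X, Y, Z), (u, v) in zip(first_positions, pixels):
        xp = (u - cx) / fx            # second position information
        yp = (v - cy) / fy
        # xp * (r31 X + r32 Y + r33 Z + t3) = r11 X + r12 Y + r13 Z + t1
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -xp * X, -xp * Y, -xp * Z, -xp])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -yp * X, -yp * Y, -yp * Z, -yp])
    A = np.asarray(rows, float)
    # null-space solution of the homogeneous system, defined up to scale
    p = np.linalg.svd(A)[2][-1].reshape(3, 4)
    p /= np.linalg.norm(p[2, :3])     # third row of R1 must have unit norm
    X0 = np.append(first_positions[0], 1.0)
    if p[2] @ X0 < 0:                 # markers must lie in front of the camera
        p = -p
    return p[:, :3], p[:, 3]          # R1, T1
```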
Of course, the unknowns in R1 and T1 are not limited to the above. When there is a known position constraint between the AR glasses and the first camera, the number of unknowns in R1 and T1 may be reduced accordingly, in which case the number of first markers that must be detected can also be reduced; this is not specifically limited.
In one embodiment, after calculating the current first conversion relationship based on the first position information and the observed position information of each of the detected first markers, the method further comprises the steps of:
step S500: calculating a second conversion relation according to the position relation between the first camera and other cameras arranged in the movable carriage body and the first conversion relation; the second conversion relationship is a conversion relationship between the first coordinate system applied by the AR glasses and the third coordinate system applied by the other cameras, and the second conversion relationship is used for representing the pose relationship of the AR glasses relative to the other cameras.
For example, referring to fig. 1, assuming that the camera 21 is a first camera, since the positional relationship of the cameras 21 to 24 is fixed, after the first conversion relationship is determined, the conversion relationship between the first coordinate system and the coordinate system applied by the cameras 22 to 24 can be determined according to the positional relationship between the cameras 22 to 24 and the camera 21 and the first conversion relationship.
In some cases, due to the turning of the wearer's head, not every camera can capture the required number of first markers. Therefore, after the pose relationship of the AR glasses relative to one camera is determined, the pose relationship of the AR glasses relative to the other cameras can be determined from the positional relationships between the cameras, even if no first markers can currently be observed within the fields of view of those other cameras.
In some cases, the pose relationship of the AR glasses may be determined with reference to one of the cameras, and thus, the pose relationship of the AR glasses determined based on the other cameras needs to be converted into the pose relationship of the AR glasses with respect to the one camera. Of course, this is only preferred and not particularly limited.
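Representing each conversion relationship as a 4x4 homogeneous transform, the second conversion relationship is obtained by chaining the first conversion relationship with the fixed, pre-calibrated camera-to-camera transform. A sketch (the function names are illustrative):

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a rotation matrix and translation vector into a 4x4 transform."""
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = t
    return M

def second_conversion(T_glasses_to_cam1, T_cam1_to_camk):
    """Chain the first conversion relationship (glasses -> camera 1) with the
    fixed inter-camera transform to obtain the glasses -> camera k relation."""
    return T_cam1_to_camk @ T_glasses_to_cam1
```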
In one embodiment, after calculating the current first conversion relationship based on the first position information and the observed position information of each of the detected first markers, the method further comprises the steps of:
s600: and determining the situation of the AR glasses from the specified time to the current pose transformation according to the current first transformation relation and the first transformation relation obtained at the specified time.
The specified time may be an initial time or a previous time.
Taking the specified time as the initial time as an example, the first conversion relationship obtained at the specified time serves as the reference pose information of the AR glasses, and the current first conversion relationship serves as the current pose information of the AR glasses; in this way, the motion trajectory and posture changes of the AR glasses can be determined always with the reference pose information of the initial time as the reference.
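With both conversion relationships expressed as 4x4 homogeneous transforms, the pose change can be computed as a relative transform. The convention below (glasses coordinates at the specified time mapped to glasses coordinates now) is one possible choice; other conventions differ in which side the inverse is taken on:

```python
import numpy as np

def pose_change(T_ref, T_cur):
    """Relative motion of the glasses between the specified time and now.

    T_ref, T_cur: 4x4 glasses-to-camera transforms at the two times.
    A point fixed in the compartment satisfies p_cam = T p_glasses, so the
    map from reference-time glasses coordinates to current ones is
    inv(T_cur) @ T_ref.
    """
    return np.linalg.inv(T_cur) @ T_ref
```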
In one embodiment, the AR glasses may be provided with an inertial measurement unit (IMU). The IMU can predict the pose of the AR glasses, for example by integration starting from the current first conversion relationship. Since the prediction frequency of the IMU can be higher than the image capture frequency of the camera, predicted pose information can be determined at intermediate times between two image frames, filling the gap where pose information cannot be determined from images alone; the frequency at which the AR glasses output pose information can thus be consistent with the prediction frequency of the IMU.
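A minimal Euler-integration step of such an IMU predictor might look as follows. This is a simplified sketch (no bias handling, gravity assumed already compensated in the accelerometer reading), not the embodiment's implementation:

```python
import numpy as np

def imu_predict(p, v, R_mat, accel, gyro, dt):
    """One dead-reckoning step between two camera frames.

    p, v: position and velocity of the glasses; R_mat: 3x3 orientation matrix;
    accel: gravity-compensated body-frame acceleration; gyro: angular rate;
    dt: IMU sample period.
    """
    wx, wy, wz = gyro
    # small-angle rotation update from the angular rate (first-order)
    Omega = np.array([[0, -wz, wy], [wz, 0, -wx], [-wy, wx, 0]])
    R_new = R_mat @ (np.eye(3) + Omega * dt)
    a_world = R_mat @ accel
    v_new = v + a_world * dt
    p_new = p + v * dt + 0.5 * a_world * dt * dt
    return p_new, v_new, R_new
```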
In the above embodiment, a plurality of first markers are disposed on the AR glasses and at least one camera is disposed in the movable compartment. The first markers are detected in the image captured by the camera; once the required number of first markers has been detected, a first conversion relationship from the first coordinate system applied by the AR glasses to the second coordinate system applied by the camera can be determined from the observed position information of the first markers in the image and their predetermined first position information in the first coordinate system. The first conversion relationship represents the pose relationship of the AR glasses relative to the camera; since the position relationship between the camera and the movable compartment is fixed, the pose information of the AR glasses within the movable compartment is thereby determined, with the plurality of first markers acting as auxiliary devices. The whole process is essentially unaffected by the movement speed of the movable compartment, enabling high-precision positioning of the AR glasses; the determined pose information is more accurate and provides stable, reliable augmented-reality display applications with accurate pose information.
A second aspect of the embodiments of the present invention provides a method for determining the pose of AR glasses, which is described in detail below.
In the pose determination method provided in the second aspect, the AR glasses are located inside a movable compartment in which at least one emitting device is disposed. The emitting device is configured to emit an indication signal, an X-axis rotation laser signal, and a Y-axis rotation laser signal. The indication signal indicates that emission of the X-axis rotation laser signal is about to start; the X-axis rotation laser signal is emitted once the indication signal has finished and consists of a line laser signal rotating around the Y-axis; the Y-axis rotation laser signal is emitted once the X-axis rotation laser signal has finished and consists of a line laser signal rotating around the X-axis. The X-axis and the Y-axis are two different axes of a fourth coordinate system applied by the emitting device. The AR glasses are provided with N2 detection elements, where N2 is greater than 1.
In the case where a plurality of emitting devices are provided within the movable compartment, they may be distributed at different locations within the compartment. For example, the movable compartment may be an automobile in which 4 emitting devices are disposed, distributed at the positions of the 4 cameras 21 to 24 shown in fig. 1; then, as the AR glasses rotate back and forth and left and right, the signals emitted by at least one emitting device can still be detected, so the pose information of the AR glasses in the movable compartment can be determined. It is to be understood that the number of emitting devices provided in the movable compartment is only a preferred example, not a limitation, and may in practice be more or fewer.
The plurality of detection elements provided on the AR glasses may be distributed at different locations of the AR glasses. For example, 9 detection elements may be disposed at different positions on the outer side of the frame of the AR glasses; the specific distribution may be that of the first markers 41 to 49 shown in fig. 2, but this should not be taken as a limitation, and the specific distribution positions are not limited. It is understood that the number of detection elements provided on the AR glasses here is only a preferred example, not a limitation, and may in practice be more or fewer.
Alternatively, the detection element may be a photosensitive sensor, and the indication signal emitted by the emitting device may be a surface light signal; any detection element on the AR glasses that lies within the surface light signal's emitting range can detect it.
The emitting device may emit the signals periodically, for example at a frequency of 60 Hz, i.e. 60 emissions per second; alternatively, it may emit only when triggered. The signals sent by the emitting device comprise, in order, the indication signal, the X-axis rotation laser signal, and the Y-axis rotation laser signal: the indication signal is sent first, the X-axis rotation laser signal is emitted once the indication signal has been sent, and the Y-axis rotation laser signal is emitted once the X-axis rotation laser signal has been sent.
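The per-cycle ordering can be sketched as a simple schedule. A minimal Python sketch, assuming illustrative durations for each signal (the patent fixes only the order, not the timing):

```python
def emission_schedule(cycle_start, ind_dur, x_dur, y_dur):
    """Return (signal name, start time) pairs for one emission cycle:
    indication signal first, then the X-axis sweep, then the Y-axis
    sweep, each starting when the previous one finishes."""
    t = cycle_start
    schedule = []
    for name, dur in (("indication", ind_dur),
                      ("x_rotation", x_dur),
                      ("y_rotation", y_dur)):
        schedule.append((name, t))
        t += dur
    return schedule
```

At a 60 Hz emission rate, `cycle_start` would advance by 1/60 s per cycle, and the three durations must sum to less than that period.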
The X-axis rotation laser signal and the Y-axis rotation laser signal will be described.
X-axis rotation laser signal: the emitting device may emit a line laser signal; the X-axis rotation laser signal is a line laser signal rotating around the Y-axis, with the line direction parallel to the Y-axis. The rotation angle may be, for example, 120 degrees, which is not particularly limited. Optionally, when the X-axis rotation laser signal is emitted, it may start from the direction of the X-axis and rotate around the Y-axis.
Y-axis rotation laser signal: the emitting device may emit a line laser signal; the Y-axis rotation laser signal is a line laser signal rotating around the X-axis, with the line direction parallel to the X-axis. The rotation angle may be, for example, 120 degrees, which is not particularly limited. Optionally, when the Y-axis rotation laser signal is emitted, it may start from the direction of the Y-axis and rotate around the X-axis.
Optionally, the ranges within which the emitting device emits the indication signal, the X-axis rotation laser signal, and the Y-axis rotation laser signal have an intersection, and the larger this intersection, the better. The emitting ranges of different emitting devices in the movable compartment may have no intersection, to avoid mutual interference; alternatively, their ranges may intersect but their emission times be staggered, for example one emitting device emits its signals first and, after all of them have been sent, another emitting device emits next. This is not particularly limited.
In one embodiment, referring to fig. 5, the method includes the steps of:
U100: acquiring a first time at which the detection element detects an indication signal emitted by a first emitting device, a second time at which the X-axis rotation laser signal passes the detection element, and a third time at which the Y-axis rotation laser signal passes the detection element; the first emitting device is any emitting device arranged in the movable compartment;
U200: determining a first included angle corresponding to each detection element based on the first time and the second time, and determining a second included angle corresponding to each detection element based on the first time and the third time; the first included angle is the angle between the scanning plane when the X-axis rotation laser signal passes the corresponding detection element and a second designated plane of the fourth coordinate system, and the second included angle is the angle between the scanning plane when the Y-axis rotation laser signal passes the corresponding detection element and the second designated plane;
U300: determining fourth position information of the detection element projected onto the second designated plane based on the first included angle and the second included angle corresponding to the detection element;
U400: acquiring fifth position information of each detection element in a first coordinate system, the first coordinate system being the coordinate system applied by the AR glasses;
U500: calculating a current third conversion relationship according to the fourth position information and the fifth position information of each detection element; the third conversion relationship is a conversion relationship between the first coordinate system and the fourth coordinate system and is used to represent the pose relationship of the AR glasses relative to the first emitting device.
The executing subject of this pose determination method may be the AR glasses themselves, specifically a processor in the AR glasses. Of course, the executing subject is not limited to the AR glasses and may be another processing device; this is not limited here. In the following embodiments, the AR glasses are taken as the executing subject by way of example.
In step U100, a first time when the detection piece detects the indication signal emitted by the first emitting device, a second time when the X-axis rotation laser signal passes through the detection piece, and a third time when the Y-axis rotation laser signal passes through the detection piece are obtained.
The emitting device in the movable compartment can emit the indication signal, the X-axis rotation laser signal, and the Y-axis rotation laser signal. As long as a detection element on the AR glasses is within the emitting range of the emitting device, it can detect the corresponding signals, record the times at which the signals are received, and send those times to the AR glasses.
These times are, specifically, the first time at which the detection element detects the indication signal, the second time at which it detects the X-axis rotation laser signal passing it, and the third time at which it detects the Y-axis rotation laser signal passing it.
The first emitting device is any one of the emitting devices arranged in the movable compartment.
Optionally, the detection element and the AR glasses (specifically, the processor and/or the memory) may be connected by a wire.
In step U200, a first included angle corresponding to each detection piece is determined based on the first time and the second time, and a second included angle corresponding to each detection piece is determined based on the first time and the third time.
Since the emission time of the indication signal is short, it can be ignored; neglecting also delays caused by the devices themselves, the first time can be regarded both as the time at which the emitting device emits the indication signal and as the time at which the X-axis rotation laser signal starts to be emitted.
That is, the X-axis rotation laser signal starts rotating at the first time; when it has rotated to the direction of the detection element, the detection element detects it passing, and this detection time serves as the second time.
The rotation angle of the X-axis rotation laser signal can thus be determined from the second time and the first time; correspondingly, the first included angle for the detection element that recorded those two times can also be determined.
In other words, a first included angle corresponding to each detection piece is determined based on the first time and the second time, and the first included angle is an included angle between a scanning plane when the X-axis rotation laser signal passes through the corresponding detection piece and a second designated plane of the fourth coordinate system.
For example, referring to fig. 6, G2 is the first coordinate system applied by the AR glasses, consisting of the three axes X4-Y4-Z4. S3 is the fourth coordinate system applied by the first emitting device, consisting of the three axes X3-Y3-Z3; O3 is the origin of the fourth coordinate system and can be taken as the emission point of the emitting device. The second designated plane may be the plane Z3 = 1, extending in the X3 and Y3 directions. m2 is a detection element on the AR glasses.
The scanning plane when the X-axis rotation laser signal passes the detection element m2 is the plane that passes through O3 in the direction of m2, contains the Y-axis, and intersects the second designated plane. In fig. 6 the first included angle between this scanning plane and the second designated plane is a; since the second designated plane is parallel to the X3-O3-Y3 plane, a is also the angle between the scanning plane and the X3-O3-Y3 plane. If the X-axis rotation laser signal starts rotating around the Y-axis from the direction of the X-axis, the included angle a is also the rotation angle of the X-axis rotation laser signal.
That is, an angle between a scanning plane when the X-axis rotation laser signal passes through the corresponding detecting member and a second designated plane of the fourth coordinate system, that is, a first angle, is related to a rotation angle of the X-axis rotation laser signal, and is specifically determined according to an angle at which the X-axis rotation laser signal starts to rotate.
Optionally, in step U200, determining a first included angle corresponding to each detecting element based on the first time and the second time may include the following steps:
U201: calculating a first time difference between the second time and the first time;
U202: determining a first included angle corresponding to the detection element according to the first time difference and the X-axis rotation angular velocity corresponding to the X-axis rotation laser signal.
Since the first time is the time at which the X-axis rotation laser signal starts to rotate (the time at which the indication signal is emitted is also, approximately, the time at which the X-axis rotation laser signal starts to be emitted), and the second time is the time at which the X-axis rotation laser signal passes the detection element, the first time difference is the length of time the X-axis rotation laser signal has rotated when it passes the detection element.
The rotation angular velocity of the X axis corresponding to the X axis rotation laser signal can be predetermined and can be preset in the AR glasses.
After the first time difference and the X-axis rotation angular velocity are determined, the first included angle corresponding to the detection element can be determined. Optionally, when the X-axis rotation laser signal starts rotating around the Y-axis from the direction of the X-axis, the product of the first time difference and the X-axis rotation angular velocity can be taken as the first included angle; in other cases, some adjustment of this product may be needed, which is not particularly limited.
The Y-axis rotation laser signal starts rotating when the emission of the X-axis rotation laser signal finishes; when the Y-axis rotation laser signal has rotated to the direction of the detection element, the detection element detects it passing, and this detection time serves as the third time.
Because the rotation duration of the X-axis rotation laser signal can be predetermined, the rotation angle of the Y-axis rotation laser signal can be determined from the third time and the first time; correspondingly, the second included angle for the detection element that recorded those times can also be determined.
In other words, a second included angle corresponding to each detecting element can be determined based on the first time and the third time, where the second included angle is an included angle between the scanning plane and the second designated plane when the Y-axis rotation laser signal emitted by the first emitting device passes through the corresponding detecting element.
For example, with continued reference to fig. 6, the scanning plane when the Y-axis rotation laser signal passes the detection element m2 is the plane that passes through O3 in the direction of m2, contains the X-axis, and intersects the second designated plane. In fig. 6 the second included angle between this scanning plane and the second designated plane is b; since the second designated plane is parallel to the X3-O3-Y3 plane, b is also the angle between the scanning plane and the X3-O3-Y3 plane. If the Y-axis rotation laser signal starts rotating around the X-axis from the direction of the Y-axis, the included angle b is also the rotation angle of the Y-axis rotation laser signal.
That is, an angle between a scanning plane of the Y-axis rotation laser signal passing through the corresponding detecting element and a second designated plane of the fourth coordinate system, that is, a second angle, is related to the rotation angle of the Y-axis rotation laser signal, and is determined according to the angle at which the Y-axis rotation laser signal starts to rotate.
Optionally, in step U200, determining a second included angle corresponding to each detecting element based on the first time and the third time may include the following steps:
U203: calculating the time difference between the third time and the first time, and subtracting a set duration from this time difference to obtain a second time difference, the set duration being the length of time the first emitting device takes to emit the X-axis rotation laser signal;
U204: determining a second included angle corresponding to the detection element according to the second time difference and the Y-axis rotation angular velocity corresponding to the Y-axis rotation laser signal.
Since the first time is the time at which the X-axis rotation laser signal starts to rotate, the interval up to the third time includes the entire emission of the X-axis rotation laser signal. The time difference between the third time and the first time therefore comprises the duration of the X-axis rotation laser signal emission plus the time taken by the Y-axis rotation laser signal to rotate to the detection element. Subtracting the set duration from this time difference, i.e. removing the duration of the X-axis rotation laser signal emission, yields the second time difference, which is the length of time the Y-axis rotation laser signal has rotated when it passes the detection element.
The Y-axis rotation angular velocity corresponding to the Y-axis rotation laser signal may be predetermined and may be preset in the AR glasses.
After the second time difference and the Y-axis rotation angular velocity are determined, the second included angle corresponding to the detection element can be determined. Optionally, when the Y-axis rotation laser signal starts rotating around the X-axis from the direction of the Y-axis, the product of the second time difference and the Y-axis rotation angular velocity can be taken as the second included angle; in other cases, some adjustment of this product may be needed, which is not particularly limited.
It is understood that the order of determining the first angle and the second angle is not limited.
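Steps U201 to U204 reduce to two time-difference products. A minimal Python sketch under the stated assumptions (each sweep starts exactly at its nominal angle, device delays are ignored); `omega_x`, `omega_y`, and `x_sweep_duration` are illustrative parameter names:

```python
def first_included_angle(t1, t2, omega_x):
    """U201-U202: the X-axis rotation laser has swept for (t2 - t1)
    seconds when it passes the detection element, so the first included
    angle is that duration times the X-axis rotation angular velocity."""
    return (t2 - t1) * omega_x

def second_included_angle(t1, t3, omega_y, x_sweep_duration):
    """U203-U204: the Y-axis sweep starts only after the X-axis sweep
    of known, preset duration finishes, so that duration is removed
    from (t3 - t1) before multiplying by the Y-axis angular velocity."""
    return (t3 - t1 - x_sweep_duration) * omega_y
```

Both angular velocities and the X-axis sweep duration would be preset constants stored in the AR glasses, as the text describes.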
In step U300, fourth position information of the detection piece projected to the second designated plane is determined based on the first included angle and the second included angle corresponding to the detection piece.
With continued reference to fig. 6, the point at which the detection element is projected onto the second designated plane is mc, i.e. the intersection of the line from the emission point of the emitting device (the coordinate origin O3) through the detection element position m2 with the second designated plane. The position information of mc is the fourth position information and can be determined from the first included angle a and the second included angle b.
In the case where the second designated plane is Z3 = 1, the fourth position information is obtained from a, b, and Z3 = 1 as: (x, y, z) = (1/tan a, 1/tan b, 1)
where tan is the tangent function, x is the coordinate value along the X3 axis, y the coordinate value along the Y3 axis, and z the coordinate value along the Z3 axis; the z value of 1 can be ignored in subsequent calculations.
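The projection of step U300 can be written directly from the two included angles. A minimal sketch assuming the second designated plane Z3 = 1 and the formula above:

```python
import math

def project_to_designated_plane(a, b):
    """Fourth position information: the detection element projected
    onto the plane Z3 = 1, i.e. (x, y, z) = (1/tan a, 1/tan b, 1)."""
    return (1.0 / math.tan(a), 1.0 / math.tan(b), 1.0)
```

A detection element whose two sweep angles are both 45 degrees, for example, projects to (1, 1, 1).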
In step U400, fifth position information of each detection piece in a first coordinate system is obtained, where the first coordinate system is a coordinate system to which the AR glasses are applied.
The first coordinate system is a coordinate system to which the AR glasses are applied. The first coordinate system may be a three-dimensional rectangular coordinate system, but is not limited thereto. For example, referring to FIG. 6, G2 is a first coordinate system consisting of three directional axes X4-Y4-Z4. m2 is one of the detecting members.
Since all the detecting members are fixed on the AR glasses, the position information of each detecting member in the first coordinate system to which the AR glasses are applied can be predetermined, and can be stored in a memory in the AR glasses and acquired from the memory when necessary.
For example, the fifth position information of each detection element in the first coordinate system may be acquired from the memory of the AR glasses.
In step U500, calculating a current third conversion relation according to the fourth position information and the fifth position information of each detection piece; the third transformation relation is a transformation relation between the first coordinate system and the fourth coordinate system, and the third transformation relation is used for representing a pose relation of the AR glasses relative to the first transmitting device.
The fourth position information is the position information of the detection element projected onto the second designated plane of the fourth coordinate system, and the fifth position information is the position information of the detection element in the first coordinate system; the current third conversion relationship can then be calculated based on the fourth and fifth position information of each detection element.
The third transformation relationship may characterize a pose relationship of the AR glasses relative to the first transmission device, and since the first transmission device is fixed to the movable carriage, the third transformation relationship is also used to characterize the pose relationship of the AR glasses relative to the movable carriage.
The third conversion relationship can be used directly as the determined pose information of the AR glasses; alternatively, a further conversion may be applied, for example converting the third conversion relationship based on the relationship between the fourth coordinate system applied by the first emitting device and the coordinate system applied by the movable compartment, and taking the result as the determined pose information of the AR glasses.
In one embodiment, the third transformation relationship includes a second rotation matrix and a second translation matrix of the AR glasses from the first coordinate system to the fourth coordinate system.
Based on the second rotation matrix and the second translation matrix, the rotation and translation of the AR glasses relative to the first emitting device can be determined, i.e. the rotation and translation, and hence the pose relationship, of the AR glasses relative to the movable compartment.
Correspondingly, in the step U500, calculating the current third conversion relationship according to the fourth position information and the fifth position information of each detecting element may include the following steps:
U501: acquiring a second initial rotation matrix and a second initial translation matrix constructed for the first coordinate system and the fourth coordinate system, the second initial rotation matrix and the second initial translation matrix containing unknowns to be solved;
U502: for each detection element, converting the fifth position information of the detection element based on the second initial rotation matrix and the second initial translation matrix to obtain sixth position information of the detection element in the fourth coordinate system;
U503: solving the second initial rotation matrix and the second initial translation matrix based on the relationship that the fourth position information and the sixth position information of each detection element are equal, to obtain the second rotation matrix and the second translation matrix.
With continued reference to fig. 6, assume that the fifth position information of one of the detection elements m2 in the first coordinate system is g2, the second initial rotation matrix is denoted R2, and the second initial translation matrix is denoted T2; accordingly, the sixth position information of m2 in the fourth coordinate system may be:
(Mx2, My2, Mz2) = R2 * g2 + T2
where Mx2 is the coordinate value of the sixth position information along the X3 axis, My2 the coordinate value along the Y3 axis, and Mz2 the coordinate value along the Z3 axis.
The estimated position information of the other detecting elements can be represented in the same manner, and will not be described in detail herein.
The above-described fourth position information is an observed value, and the sixth position information is an estimated value containing the R2 and T2 to be solved; theoretically the two should be equal.
Therefore, the second initial rotation matrix and the second initial translation matrix may be solved based on a relationship that the fourth position information and the sixth position information of each detection piece are equal.
For each detection element, two equations can be established from the fourth and sixth position information: one between the coordinate values along the X3 axis direction and one between the coordinate values along the Y3 axis direction.
Optionally, R2 may be a 3×3 matrix containing 9 unknowns, and T2 may be a 3×1 matrix containing 3 unknowns.
Thus R2 and T2 together contain 12 unknowns, and 12 equations are required to solve them all. In this case, when the number of detection elements within the emitting range of the first emitting device reaches 6, all unknowns in the second initial rotation matrix R2 and the second initial translation matrix T2 can be solved, yielding the required second rotation matrix and second translation matrix.
Of course, the unknowns in R2 and T2 are not limited to the above; when there is a known positional constraint between the AR glasses and the first emitting device, the number of unknowns in R2 and T2 can be correspondingly reduced, and the number of detection elements required within the emitting range of the first emitting device can then be reduced appropriately. This is not particularly limited.
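The equation system of steps U501 to U503 can be sketched as residual construction: each detection element contributes two equations comparing its observed projection (fourth position information) with the projection of R2*g2 + T2 onto the designated plane. Normalizing by the Mz component is one consistent reading of how a general 3-D point is compared against a point on Z3 = 1; treat this as an illustrative sketch rather than the patent's exact formulation, with `transform_point` and `residuals` being hypothetical helper names:

```python
def transform_point(r, t, g):
    """Sixth position information: R2 * g2 + T2, with r a 3x3 nested
    list, t a length-3 list, and g the fifth position information."""
    return [sum(r[i][k] * g[k] for k in range(3)) + t[i] for i in range(3)]

def residuals(r, t, detections):
    """Two residuals per detection element: the observed (x, y) on the
    designated plane minus the estimated point scaled onto Z3 = 1.
    With 12 unknowns in r and t, six detection elements supply the 12
    equations needed to solve them; a nonlinear least-squares solver
    would drive these residuals to zero."""
    res = []
    for g, (ox, oy) in detections:
        mx, my, mz = transform_point(r, t, g)
        res.append(ox - mx / mz)
        res.append(oy - my / mz)
    return res
```

When r and t are the true pose, every residual vanishes; for example, with the identity rotation, a translation of 2 along Z3, and a detection element at (1, 1, 0), the observed projection (0.5, 0.5) gives zero residuals.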
In one embodiment, after calculating the current third conversion relationship according to the fourth position information and the fifth position information of each detecting element, the method further comprises the following steps:
U600: calculating a fourth conversion relationship according to the third conversion relationship and the position relationship between the first emitting device and the other emitting devices arranged in the movable compartment; the fourth conversion relationship is a conversion relationship between the first coordinate system applied by the AR glasses and a fifth coordinate system applied by the other emitting device, and is used to represent the pose relationship of the AR glasses relative to that other emitting device.
In some cases, due to the wearer turning, not all detection elements lie within the emitting range of the same emitting device. Thus, once the pose relationship of the AR glasses relative to one emitting device has been determined, the pose relationship relative to the other emitting devices can be determined from the positional relationships between the devices, even if no detection element currently lies within those devices' emitting ranges.
In some cases, the pose relationship of the AR glasses may be expressed with reference to a single emitting device, in which case the pose relationships determined based on the other emitting devices need to be converted into the pose relationship of the AR glasses with respect to that reference device. Of course, this is only a preferred arrangement and is not particularly limited.
In one embodiment, after calculating the current third conversion relationship according to the fourth position information and the fifth position information of each detecting element, the method further comprises the following steps:
U700: determining the pose transformation of the AR glasses from the specified time to the current time according to the current third conversion relationship and the third conversion relationship obtained at the specified time.
The specified time may be an initial time or a previous time.
Taking the specified time as the initial time as an example, the third conversion relationship obtained at that time serves as the reference pose information of the AR glasses, and the current third conversion relationship serves as the current pose information of the AR glasses, so that the motion trajectory and attitude transformation of the AR glasses can be determined always with the reference pose information of the initial time as the reference.
In one embodiment, the AR glasses may be provided with an inertial measurement unit (IMU). The IMU can predict the pose of the AR glasses, for example by integrating its measurements starting from the current third conversion relationship. Since the IMU's prediction frequency may be higher than the emission frequency of the emitting device, predicted pose information can be determined at intermediate times between two emissions, filling the gap where pose information cannot be determined from the emitting device alone; the frequency at which the AR glasses output pose information can then match the IMU prediction frequency.
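Between two laser sweeps, the IMU can fill in pose estimates at its higher rate. A deliberately simplified one-axis dead-reckoning sketch, assuming bias-free accelerometer samples at a fixed IMU period (a real implementation would integrate full 3-D rotation and translation and correct for sensor bias):

```python
def predict_positions(p0, v0, accel_samples, dt):
    """Integrate accelerometer samples between two sweep-based pose
    fixes, emitting one predicted position per IMU sample."""
    p, v = p0, v0
    predictions = []
    for a in accel_samples:
        v += a * dt   # velocity update
        p += v * dt   # position update
        predictions.append(p)
    return predictions
```

Each sweep-based third conversion relationship then re-anchors `p0` and `v0`, so integration drift accumulates only over a single sweep interval.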
In the above embodiment, a plurality of detection elements are disposed on the AR glasses and at least one emitting device is disposed in the movable compartment. The emitting device sequentially emits the indication signal, the X-axis rotation laser signal, and the Y-axis rotation laser signal; from the first, second, and third times at which the detection elements respectively detect these signals, the fourth position information of each detection element on the second designated plane of the fourth coordinate system applied by the emitting device can be determined as the observed value of the detection element's position. Combined with the fifth position information of the detection elements in the first coordinate system applied by the AR glasses, the third conversion relationship between the first coordinate system and the fourth coordinate system can be determined, which represents the pose relationship of the AR glasses relative to the emitting device. Since the position relationship between the emitting device and the movable compartment is fixed, the pose information of the AR glasses within the movable compartment is thereby determined, with the detection elements acting as auxiliary devices. The whole process is essentially unaffected by the movement speed of the movable compartment, enabling high-precision positioning of the AR glasses; the determined pose information is more accurate and provides stable, reliable augmented-reality display applications with accurate pose information.
A third aspect of the embodiments of the present invention provides a method for determining a pose of AR glasses, and the method for determining a pose of AR glasses according to the third aspect of the embodiments of the present invention is described in detail below.
In the pose determination method of the AR glasses provided in the third aspect, the AR glasses are located inside a movable compartment, N3 second markers are disposed inside the movable compartment, N3 being greater than 1; and the AR glasses are provided with a binocular camera.
The binocular camera may be a camera originally provided on the AR glasses, so that the pose information of the AR glasses can be determined using existing components, saving equipment cost. For example, referring to fig. 2, the binocular camera may include a first eye camera 31 and a second eye camera 32.
The plurality of second markers arranged in the movable compartment may be distributed at different positions of the movable compartment. Referring to fig. 7, 9 second markers 51-59 may be arranged in the movable compartment 10, distributed on the windshield and the two side windows, so that however the AR glasses rotate back and forth or left and right, the binocular camera can capture a plurality of second markers and the pose information of the AR glasses in the movable compartment can be determined. It is to be understood that the number of second markers shown in the movable compartment 10 is only a preferred example and not a limitation; there may be more or fewer.
Alternatively, the second marker may be, for example, an infrared emitter, which may emit infrared light, for example in a blinking manner. The frequency bands of the infrared light emitted by different second markers may differ, so that different second markers can be distinguished by frequency band; of course, different second markers may also be distinguished in other manners, which is not specifically limited. It is to be understood that the second marker is not limited to an infrared emitter and may be any other marker that can be detected from an image.
In one embodiment, referring to fig. 8, the method may include the steps of:
t100: respectively detecting identical second markers from a first image currently acquired by a first eye camera and a second image currently acquired by a second eye camera in the binocular cameras;
t200: determining seventh position information and eighth position information of the detected second marker in the first image and second image, respectively, when the number of detected second markers reaches M3, the M3 being greater than 1 and less than or equal to the N3;
t300: determining a Z-axis coordinate of the detected second marker in a sixth coordinate system applied by the first eye camera based on the seventh and eighth position information, the Z-axis coordinate representing the distance between the detected second marker and the first eye camera;
t400: determining the current three-dimensional coordinate position of the detected second marker in the sixth coordinate system according to the seventh position information and the Z-axis coordinate;
t500: and determining the current pose information of the AR glasses relative to the movable carriage according to the current three-dimensional coordinate position of each detected second marker in the sixth coordinate system.
The executing subject of the pose determination method for the AR glasses may be the AR glasses, and specifically a processor in the AR glasses. Of course, the executing subject is not limited to the AR glasses and may be another processing device, which is not limited herein. In the following embodiments, the executing subject is taken to be the AR glasses as an example.
The binocular camera may capture images at a certain period. The first eye camera and the second eye camera may capture images synchronously and send the captured images to the AR glasses, more specifically to the processor of the AR glasses for processing.
In step T100, the same second marker is detected from a first image currently captured by a first eye camera and a second image currently captured by a second eye camera of the binocular cameras, respectively.
When receiving a first image currently acquired by the first eye camera and a second image currently acquired by the second eye camera, the AR glasses may detect second markers from the first image and the second image respectively, and when detecting the second markers, different second markers can be distinguished. In this embodiment, it is necessary to detect the same second marker in the first image and the second image, that is, a second marker that exists in both images.
In step T200, when the number of detected second markers reaches M3, seventh positional information and eighth positional information of the detected second markers in the first image and the second image, respectively, are determined.
The M3 is greater than 1 and less than or equal to the N3. Optionally, M3 may be, for example, 4, 5 or 6, and is not particularly limited. The following description will be given taking M3 as 4 as an example.
When the number of the same second markers reaches 4, the 4 same second markers in the first and second images are sufficient to determine the pose information of the AR glasses; therefore, the seventh position information and the eighth position information of the detected second markers in the first image and the second image, respectively, can be determined.
The seventh positional information and the eighth positional information herein are positional information at which the second marker is actually detected in the image, and are not estimated values.
Optionally, the second markers in the first image and the second image may be detected using a neural network, which may output the detected position information of the same second markers in the first image and the second image. When the number of the same second markers reaches M3, the position information of each of the same second markers output by the neural network may be acquired as the seventh position information of that marker in the first image and the eighth position information of that marker in the second image. The position information may specifically be the position information of a detection frame, for example of the center point of the detection frame, and is not specifically limited. Of course, the detection method of the second markers in the images is not limited; other methods are also possible.
In step T300, the Z-axis coordinate of the detected second marker in the sixth coordinate system applied by the first eye camera is determined based on the seventh position information and the eighth position information.
The Z-axis coordinate represents a distance between the detected second marker and the first eye camera.
The detected second markers here are the same second markers detected in step T100 above.
The above-mentioned Z-axis coordinates may be calculated based on the principle of binocular vision.
Optionally, in step T300, determining the Z-axis coordinate of the detected second marker in the sixth coordinate system applied by the first eye camera based on the seventh position information and the eighth position information may include the following steps:
t301: projecting the seventh position information onto a third designated plane of the sixth coordinate system to obtain ninth position information;
t302: projecting the eighth position information onto a fourth designated plane of a seventh coordinate system to obtain tenth position information, wherein the seventh coordinate system is the coordinate system applied by the second eye camera;
t303: and determining the Z-axis coordinate of the detected second marker according to the conversion relation between the sixth coordinate system and the seventh coordinate system and the ninth position information and the tenth position information.
For example, referring to fig. 9, G3 is the sixth coordinate system applied by the first eye camera and may be composed of three axes X5-Y5-Z5; the third designated plane may be, for example, the plane Z5 = 1. G4 is the seventh coordinate system applied by the second eye camera and may be composed of three axes X6-Y6-Z6; the fourth designated plane may be, for example, the plane Z6 = 1.
Assuming that m2 is one of the detected second markers, the point where m2 is projected onto the third designated plane is P0, and the position information of P0 is the ninth position information, which may be:
(p0x,p0y,1)=((u0-cx0)/fx0,(v0-cy0)/fy0,1)
wherein cx0, fx0, cy0 and fy0 are intrinsic parameters calibrated for the first eye camera, and (u0, v0) is the seventh position information of m2 in the first image.
Similarly, the point projected by m2 onto the fourth designated plane is P1, and the position information of P1 is tenth position information, which may be:
(p1x,p1y,1)=((u1-cx1)/fx1,(v1-cy1)/fy1,1)
wherein cx1, fx1, cy1 and fy1 are intrinsic parameters calibrated for the second eye camera, and (u1, v1) is the eighth position information of m2 in the second image.
Assuming that the position information of m2 in the sixth coordinate system and the seventh coordinate system is (x0, y0, z0) and (x1, y1, z1), respectively; the conversion relationship between the sixth coordinate system and the seventh coordinate system may include a rotation matrix R10 and a translation matrix T10 of the sixth coordinate system to the seventh coordinate system, and based on the conversion relationship and the ninth position information and the tenth position information, the following equation may be determined:
z1*(p1x,p1y,1)=R10*z0*(p0x,p0y,1)+T10
wherein z1*(p1x, p1y, 1) represents the three-dimensional position coordinates of m2 in the seventh coordinate system, z0*(p0x, p0y, 1) represents its three-dimensional position coordinates in the sixth coordinate system, and R10 and T10 are the rotation matrix and translation matrix obtained by calibration of the binocular camera. Cross-multiplying both sides of the above equation by (p1x, p1y, 1) yields the following equation:

(p1x,p1y,1)×z1*(p1x,p1y,1)=
(p1x,p1y,1)×R10*z0*(p0x,p0y,1)+(p1x,p1y,1)×T10

wherein (p1x, p1y, 1)×z1*(p1x, p1y, 1) = 0, so that:

z0*(p1x,p1y,1)×R10*(p0x,p0y,1)+(p1x,p1y,1)×T10=0

In the above equation, only z0 is unknown; solving it gives z0, which is the Z-axis coordinate of the detected second marker to be solved.
In step T400, a current three-dimensional coordinate position of the detected second marker in the sixth coordinate system is determined according to the seventh position information and the Z-axis coordinate.
Continuing with the previous example, after z0 is solved, z0*(p0x, p0y, 1) is the current three-dimensional coordinate position of the detected second marker in the sixth coordinate system.
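The projection and depth-solving steps above can be collected into a short sketch. This is illustrative only: the function and parameter names are assumptions, and the three redundant scalar equations in the single unknown z0 are resolved here by a least-squares choice not spelled out in the patent.

```python
import numpy as np

def triangulate_depth(uv0, uv1, K0, K1, R10, T10):
    """Recover the depth z0 of a marker in the first eye camera's frame
    from one pixel observation per camera.

    uv0, uv1 : pixel coordinates (u, v) in the first / second image
    K0, K1   : dicts of intrinsics {fx, fy, cx, cy} for each camera
    R10, T10 : rotation (3x3) and translation (3,) from the first eye
               camera's frame to the second eye camera's frame
    """
    # Project each pixel onto the z = 1 plane of its own camera frame.
    p0 = np.array([(uv0[0] - K0['cx']) / K0['fx'],
                   (uv0[1] - K0['cy']) / K0['fy'], 1.0])
    p1 = np.array([(uv1[0] - K1['cx']) / K1['fx'],
                   (uv1[1] - K1['cy']) / K1['fy'], 1.0])
    # Cross-multiplying z1*p1 = z0*R10@p0 + T10 by p1 eliminates z1:
    #   z0 * (p1 x (R10 @ p0)) + (p1 x T10) = 0
    a = np.cross(p1, R10 @ p0)   # coefficient of z0 (3-vector)
    b = np.cross(p1, T10)        # constant term (3-vector)
    # Three scalar equations, one unknown: solve in least squares.
    z0 = -np.dot(a, b) / np.dot(a, a)
    return z0, z0 * p0           # depth and 3D position in frame G3
```

With noise-free observations all three scalar equations agree exactly; with pixel noise the least-squares quotient averages them.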
Based on the above manner, the current three-dimensional coordinate position of each detected second marker in the sixth coordinate system can be calculated. Optionally, in order to ensure that the pose information of the AR glasses can be accurately calculated, the current three-dimensional coordinate positions of at least 4 detected second markers in the sixth coordinate system are calculated.
In step T500, the current pose information of the AR glasses relative to the movable compartment is determined according to the current three-dimensional coordinate position of each detected second marker in the sixth coordinate system.
Since each detected second marker is fixed in the movable compartment, once the current three-dimensional coordinate position of each detected second marker in the sixth coordinate system is determined, the pose of the movable compartment relative to the AR glasses is determined, and thus the current pose information of the AR glasses relative to the movable compartment can be determined.
Alternatively, the orientation and position of the AR glasses may be calculated as the pose information according to the current three-dimensional coordinate position of each detected second marker in the sixth coordinate system.
In one embodiment, after determining the current three-dimensional coordinate position of the detected second marker in the sixth coordinate system according to the seventh position information and the Z-axis coordinate, the method further comprises:
t600: determining the change of the AR glasses from the pose at a specified time to the current pose according to the current three-dimensional coordinate position of the detected second marker in the sixth coordinate system and its three-dimensional coordinate position in the sixth coordinate system at the specified time.
The specified time may be an initial time or a previous time.
Taking the specified time as the initial time as an example, the pose information obtained at the specified time serves as the reference pose information of the AR glasses and the pose information obtained now serves as the current pose information, so that the motion trajectory and attitude transformation of the AR glasses can be determined always with reference to the reference pose information at the initial time.
Preferably, in order to better determine the relative pose change of the AR glasses from the previous time to the current time, in the case where at least 4 of the same second markers are observed at both the previous time and the current time, the current three-dimensional coordinate positions of those second markers and their three-dimensional coordinate positions in the sixth coordinate system at the previous time can be solved respectively, and then, using the matching relationship between the two times, the relative pose change of the AR glasses from the previous time to the current time can be solved with an ICP (iterative closest point) algorithm.
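Because the marker identities give the point correspondences directly, the ICP step reduces here to a single closed-form rigid alignment. A sketch of that alignment (the standard Kabsch/SVD solution, with illustrative names, assuming matched noise-free points):

```python
import numpy as np

def rigid_align(P_prev, P_curr):
    """Closed-form rigid transform (R, t) mapping marker positions at the
    previous time P_prev (N x 3) onto their current positions P_curr
    (N x 3), i.e. the relative pose change observed in the camera frame."""
    c_prev, c_curr = P_prev.mean(axis=0), P_curr.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (P_prev - c_prev).T @ (P_curr - c_curr)
    U, _, Vt = np.linalg.svd(H)
    # Correction term guards against a reflection solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c_curr - R @ c_prev
    return R, t
```

With distinguishable markers one pass suffices; the iterative part of ICP would only be needed if correspondences were uncertain.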
In one embodiment, the AR glasses may be provided with an inertial measurement unit (IMU). The IMU may predict the pose of the AR glasses, for example by integral prediction of the next pose based on the current pose information. Since the prediction frequency of the IMU may be higher than the image capture frequency of the binocular camera, predicted pose information can be determined at intermediate times between two captures, filling the gaps in which pose information cannot be determined from images. In this way, the frequency at which the AR glasses output pose information can be made consistent with the prediction frequency of the IMU.
In the above embodiment, a plurality of second markers are arranged in the movable compartment, and images are captured with the binocular camera of the AR glasses. When the number of the same second markers present in both the first image captured by the first eye camera and the second image captured by the second eye camera meets the required number, the Z-axis coordinate of each such second marker in the sixth coordinate system applied by the first eye camera can be determined from its seventh and eighth position information in the first and second images respectively; the Z-axis coordinate represents the distance between the second marker and the first eye camera. The current three-dimensional coordinate position of the second marker in the sixth coordinate system can then be determined from the seventh position information and the Z-axis coordinate, which characterizes the positional relationship between the second marker and the AR glasses. Since the second markers are fixed to the movable compartment, the current pose information of the AR glasses relative to the movable compartment can be determined from the current three-dimensional coordinate positions of the second markers in the sixth coordinate system. The whole process is essentially unaffected by the movement speed of the movable compartment, high-precision positioning of the AR glasses can be realized, the determined pose information is more accurate, and accurate pose information is provided for stable and reliable augmented-reality display applications of the AR glasses.
A fourth aspect of the present invention provides a pose determination apparatus for AR glasses, wherein the AR glasses are located inside a movable compartment, and at least one camera is disposed in the movable compartment; N1 first markers are arranged on the AR glasses, and N1 is greater than 1; referring to fig. 10, the pose determination apparatus 100 of the AR glasses includes:
the first marker detection module 101 is configured to detect a first marker from an image currently acquired by a first camera, where the first camera is any one of cameras disposed in the movable compartment;
an observation position information obtaining module 102, configured to obtain observation position information of each detected first marker in the image when the number of detected first markers reaches M1; the M1 is greater than 1 and less than or equal to N1;
a first position information acquiring module 103, configured to acquire first position information of the detected first marker in a first coordinate system, where the first coordinate system is a coordinate system applied to the AR glasses;
a first posture determining module 104, configured to calculate a current first conversion relationship according to the first position information and the observation position information of each detected first marker; the first conversion relation is a conversion relation from the first coordinate system to a second coordinate system applied by the first camera, and the first conversion relation is used for representing a pose relation of the AR glasses relative to the first camera.
In one embodiment of the present invention,
the first transformation relationship comprises a first rotation matrix and a first translation matrix of the AR glasses from the first coordinate system to the second coordinate system;
the first posture determining module is specifically configured to, when calculating the current first conversion relationship according to the first position information and the observation position information of each detected first marker:
acquiring a first initial rotation matrix and a first initial translation matrix which are constructed aiming at the first coordinate system and the second coordinate system, wherein the first initial rotation matrix and the first initial translation matrix have unknowns to be solved;
for each detected first marker, converting first position information of the first marker based on the first initial rotation matrix and the first initial translation matrix to obtain estimated position information of the first marker in the second coordinate system;
and solving the first initial rotation matrix and the first initial translation matrix based on the estimated position information and the observed position information of each detected first marker to obtain the first rotation matrix and the first translation matrix.
In one embodiment, the first pose determination module, when solving the first initial rotation matrix and the first initial translation matrix based on the estimated position information and the observed position information of each detected first marker, is specifically configured to:
converting the observation position information to a first designated plane of the second coordinate system according to the internal reference calibrated by the first camera to obtain second position information;
projecting the estimated position information to the first designated plane to obtain third position information;
and solving the unknowns in the first initial rotation matrix and the first initial translation matrix based on the relationship that the second position information and the third position information of each detected first marker are equal.
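Read as an optimization problem, the module above solves the unknowns of the rotation and translation by driving the projected estimated positions toward the observed positions on the designated plane. A hedged sketch of such a solver follows; the rotation-vector parameterization, the SciPy least-squares solver, and the initial guess are illustrative choices made here, not prescribed by the patent.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def solve_pose(pts_glasses, pts_observed, x0=None):
    """Estimate the rotation matrix and translation vector taking marker
    coordinates in the glasses' frame (N x 3) to the camera frame, given
    their observed positions on the z = 1 designated plane (N x 2)."""
    if x0 is None:
        # Start with no rotation, one unit in front of the camera.
        x0 = np.array([0.0, 0.0, 0.0, 0.0, 0.0, 1.0])

    def residuals(x):
        R = Rotation.from_rotvec(x[:3]).as_matrix()
        est = (R @ pts_glasses.T).T + x[3:]   # estimated positions
        proj = est[:, :2] / est[:, 2:3]       # project onto z = 1 plane
        return (proj - pts_observed).ravel()  # "equal" as zero residuals

    sol = least_squares(residuals, x0)
    return Rotation.from_rotvec(sol.x[:3]).as_matrix(), sol.x[3:]
```

With at least 4 markers the 2N residual equations over-determine the 6 unknowns, matching the marker-count requirement stated earlier.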
In one embodiment, the apparatus further comprises:
the first other pose information determining module is used for calculating a second conversion relation according to the position relation between the first camera and other cameras arranged in the movable carriage and the first conversion relation; the second conversion relationship is a conversion relationship between the first coordinate system applied by the AR glasses and the third coordinate system applied by the other cameras, and the second conversion relationship is used for representing the pose relationship of the AR glasses relative to the other cameras.
In one embodiment, the apparatus further comprises:
and the first pose transformation determining module is used for determining the transformation condition of the AR glasses from the specified time to the current pose according to the current first transformation relation and the first transformation relation obtained at the specified time.
The pose determination apparatus for AR glasses provided by the fourth aspect of the present invention corresponds to the pose determination method for AR glasses provided by the first aspect; the implementation of the functions and effects of each unit in the apparatus is described in detail in the implementation of the corresponding steps of the above method and is not repeated here.
A fifth aspect of the present invention provides a pose determining apparatus for AR glasses, where the AR glasses are located inside a movable carriage, the movable carriage is provided with at least one transmitting device, the transmitting device is configured to transmit an indication signal, an X-axis rotation laser signal, and a Y-axis rotation laser signal, the indication signal is configured to indicate that an X-axis rotation laser signal starts to be transmitted, the X-axis rotation laser signal is a linear laser signal rotating around a Y-axis when the indication signal is transmitted, the Y-axis rotation laser signal is a linear laser signal rotating around an X-axis when the X-axis rotation laser signal is transmitted, and the X-axis and the Y-axis are two different axes in a fourth coordinate system applied by the transmitting device; the AR glasses are provided with N2 detection pieces, wherein N2 is larger than 1; referring to fig. 11, the pose determination apparatus 200 of the AR glasses includes:
the time acquisition module 201 is configured to acquire a first time at which a detecting element detects the indication signal emitted by a first emitting device, a second time at which the X-axis rotation laser signal passes the detecting element, and a third time at which the Y-axis rotation laser signal passes the detecting element; the first emitting device is any emitting device disposed in the movable compartment;
an included angle determining module 202, configured to determine a first included angle corresponding to each detection element based on the first time and the second time, and determine a second included angle corresponding to each detection element based on the first time and the third time; the first included angle is an included angle between a scanning plane when the X-axis rotating laser signal passes through the corresponding detection piece and a second designated plane of the fourth coordinate system, and the second included angle is an included angle between a scanning plane when the Y-axis rotating laser signal passes through the corresponding detection piece and the second designated plane;
a fourth position information determining module 203, configured to determine fourth position information of the detection element projected to the second designated plane based on the first included angle and the second included angle corresponding to the detection element;
a fifth position information obtaining module 204, configured to obtain fifth position information of each detection element in a first coordinate system, where the first coordinate system is a coordinate system applied to the AR glasses;
the second position and orientation determining module 205 is configured to calculate a current third conversion relationship according to the fourth position information and the fifth position information of each detecting element; the third transformation relation is a transformation relation between the first coordinate system and the fourth coordinate system, and the third transformation relation is used for representing a pose relation of the AR glasses relative to the first transmitting device.
In an embodiment, when the included angle determining module determines the first included angle corresponding to each detecting element based on the first time and the second time, the included angle determining module is specifically configured to:
calculating a first time difference between the second time and the first time;
and determining a first included angle corresponding to the detection piece according to the first time difference and an X-axis rotation angular velocity corresponding to the X-axis rotation laser signal.
In an embodiment, when the included angle determining module determines the second included angle corresponding to each detecting element based on the first time and the third time, the included angle determining module is specifically configured to:
calculating a time difference between the third time and the first time, and subtracting a set duration from this time difference to obtain a second time difference, wherein the set duration is the duration for which the first emitting device emits the X-axis rotation laser signal;
and determining a second included angle corresponding to the detection piece according to the second time difference and the Y-axis rotation angular velocity corresponding to the Y-axis rotation laser signal.
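The two timing computations above can be sketched in a few lines. This is a minimal illustration under assumed names: `omega_x`/`omega_y` stand for the X- and Y-axis rotation angular velocities and `x_sweep_duration` for the set duration of the X-axis sweep, none of which are named this way in the patent.

```python
def sweep_angles(t1, t2, t3, omega_x, omega_y, x_sweep_duration):
    """Included angles of the X- and Y-axis sweep planes at the moments
    they pass a detecting element, from the three detection timestamps."""
    first_angle = omega_x * (t2 - t1)                        # from the first time difference
    second_angle = omega_y * ((t3 - t1) - x_sweep_duration)  # from the second time difference
    return first_angle, second_angle
```

The resulting pair of angles is what the fourth position information determining module 203 consumes to locate the detecting element's projection on the second designated plane.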
In one embodiment, the third transformation relationship comprises a second rotation matrix and a second translation matrix of the AR glasses from the first coordinate system to the fourth coordinate system;
the second posture determining module is specifically configured to, when calculating a current third conversion relationship according to the fourth position information and the fifth position information of each detection piece:
acquiring a second initial rotation matrix and a second initial translation matrix which are constructed aiming at the first coordinate system and the fourth coordinate system, wherein the second initial rotation matrix and the second initial translation matrix have unknowns to be solved;
for each detection piece, converting fifth position information of the detection piece based on the second initial rotation matrix and the second initial translation matrix to obtain sixth position information of the detection piece in the fourth coordinate system;
and solving the second initial rotation matrix and the second initial translation matrix based on the equal relationship between the fourth position information and the sixth position information of each detection piece to obtain the second rotation matrix and the second translation matrix.
In one embodiment, the apparatus further comprises:
the second other pose information determining module is used for calculating a fourth conversion relation according to the position relation between the first emitting device and other emitting devices arranged in the movable carriage body and the third conversion relation; the fourth conversion relation is a conversion relation between the first coordinate system applied by the AR glasses and a fifth coordinate system applied by the other transmitting device, and the fourth conversion relation is used for representing the pose relation of the AR glasses relative to the other transmitting device.
In one embodiment, the apparatus further comprises:
and the second pose transformation determining module is used for determining the transformation condition of the AR glasses from the specified time to the current pose according to the current third transformation relation and the third transformation relation obtained at the specified time.
The pose determination apparatus for AR glasses provided by the fifth aspect of the present invention corresponds to the pose determination method for AR glasses provided by the second aspect; the implementation of the functions and effects of each unit in the apparatus is described in detail in the implementation of the corresponding steps of the above method and is not repeated here.
A sixth aspect of the present invention provides a pose determination apparatus for AR glasses, where the AR glasses are located inside a movable compartment, N3 second markers are disposed in the movable compartment, and N3 is greater than 1; the AR glasses have binocular cameras; referring to fig. 12, the pose determination apparatus 300 of the AR glasses includes:
a second marker detection module 301, configured to detect a same second marker from a first image currently acquired by a first eye camera and a second image currently acquired by a second eye camera in the binocular cameras, respectively;
a seventh and eighth position information determining module 302 for determining seventh position information and eighth position information of the detected second marker in the first image and second image, respectively, when the number of detected second markers reaches M3, the M3 being greater than 1 and less than or equal to the N3;
a Z-axis coordinate determination module 303, configured to determine, based on the seventh position information and the eighth position information, the Z-axis coordinate of the detected second marker in a sixth coordinate system applied by the first eye camera, where the Z-axis coordinate represents the distance between the detected second marker and the first eye camera;
a three-dimensional coordinate position determining module 304, configured to determine, according to the seventh position information and the Z-axis coordinate, the current three-dimensional coordinate position of the detected second marker in the sixth coordinate system;
a third pose determination module 305, configured to determine the current pose information of the AR glasses relative to the movable compartment according to the current three-dimensional coordinate position of each detected second marker in the sixth coordinate system.
In one embodiment, the Z-axis coordinate determination module, when determining the Z-axis coordinate of the detected second marker in the sixth coordinate system applied by the first eye camera based on the seventh position information and the eighth position information, is specifically configured to:
projecting the seventh position information onto a third designated plane of the sixth coordinate system to obtain ninth position information;
projecting the eighth position information onto a fourth designated plane of a seventh coordinate system to obtain tenth position information, wherein the seventh coordinate system is the coordinate system applied by the second eye camera;
and determining the Z-axis coordinate of the detected second marker according to the conversion relation between the sixth coordinate system and the seventh coordinate system and the ninth position information and the tenth position information.
In one embodiment, the apparatus further comprises:
and the third pose change determining module is used for determining the change of the AR glasses from the pose at a specified time to the current pose according to the current three-dimensional coordinate position of the detected second marker in the sixth coordinate system and its three-dimensional coordinate position in the sixth coordinate system at the specified time.
The pose determination device for the AR glasses provided by the sixth aspect of the present invention corresponds to the pose determination method for the AR glasses provided by the third aspect of the present invention. The implementation of the functions and effects of each unit in the device is described in detail in the implementation of the corresponding steps of the above method, and is not repeated here.
Since the device embodiments substantially correspond to the method embodiments, reference may be made to the corresponding description of the method embodiments for relevant details. The device embodiments described above are merely illustrative; elements described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units.
The invention also provides an electronic device, which comprises a processor and a memory; the memory stores a program that can be called by the processor; when the processor executes the program, the method for determining the pose of the AR glasses in the foregoing embodiment is implemented.
The embodiment of the pose determination device of the AR glasses can be applied to electronic equipment. Taking a software implementation as an example, the device, as a logical device, is formed by the processor of the electronic device where it is located reading corresponding computer program instructions from the non-volatile memory into memory and running them. In terms of hardware, fig. 13 is a hardware structure diagram of the electronic device in which the pose determination apparatus 100 of the AR glasses according to an exemplary embodiment of the present invention is located. In addition to the processor 510, the memory 530, the network interface 520, and the non-volatile memory 540 shown in fig. 13, the electronic device may also include other hardware according to its actual functions, which is not described here again.
The present invention also provides a non-transitory electronic device-readable storage medium having stored thereon a program that, when executed by a processor, implements the pose determination method of the AR glasses as in the foregoing embodiments.
The present invention may take the form of a computer program product embodied on one or more storage media including, but not limited to, disk storage, CD-ROM, optical storage, and the like, having program code embodied therein. Non-transitory electronic device readable storage media include permanent and non-permanent, removable and non-removable media, and information storage may be implemented by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of non-transitory electronic device readable storage media include, but are not limited to: phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium may be used to store information that may be accessed by a computing device.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (19)

1. A pose determination method of AR glasses, characterized in that the AR glasses are located inside a movable compartment, and at least one camera is arranged in the movable compartment; N1 first markers are arranged on the AR glasses, and N1 is greater than 1; the method comprises the following steps:
detecting a first marker from an image currently acquired by a first camera, wherein the first camera is any one camera arranged in the movable compartment;
acquiring observation position information of each detected first marker in the image when the number of detected first markers reaches M1; M1 is greater than 1 and less than or equal to N1;
acquiring first position information of a detected first marker in a first coordinate system, wherein the first coordinate system is a coordinate system applied to the AR glasses;
calculating a current first conversion relation according to the first position information and the observation position information of each detected first marker; the first conversion relation is a conversion relation from the first coordinate system to a second coordinate system applied by the first camera, and the first conversion relation is used for representing a pose relation of the AR glasses relative to the first camera.
2. The pose determination method of AR glasses according to claim 1,
the first conversion relation comprises a first rotation matrix and a first translation matrix of the AR glasses from the first coordinate system to the second coordinate system;
the calculating a current first conversion relationship according to the first position information and the observation position information of each detected first marker includes:
acquiring a first initial rotation matrix and a first initial translation matrix which are constructed aiming at the first coordinate system and the second coordinate system, wherein the first initial rotation matrix and the first initial translation matrix have unknowns to be solved;
for each detected first marker, converting first position information of the first marker based on the first initial rotation matrix and the first initial translation matrix to obtain estimated position information of the first marker in the second coordinate system;
and solving the first initial rotation matrix and the first initial translation matrix based on the estimated position information and the observed position information of each detected first marker to obtain the first rotation matrix and the first translation matrix.
3. The pose determination method of AR glasses according to claim 2, wherein solving the first initial rotation matrix and the first initial translation matrix based on the estimated position information and the observed position information of each detected first marker comprises:
converting the observation position information to a first designated plane of the second coordinate system according to the internal reference calibrated by the first camera to obtain second position information;
projecting the estimated position information to the first designated plane to obtain third position information;
and solving the unknowns in the first initial rotation matrix and the first initial translation matrix based on the relationship that the second position information and the third position information of each detected first marker are equal.
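Claims 2-3 amount to a perspective-n-point style solve: marker positions in the glasses' first coordinate system are transformed by a candidate rotation/translation, projected onto the camera's normalized image plane, and the unknowns are solved from the equality with the observed positions. A minimal Gauss-Newton sketch under assumed names (the axis-angle parameterization, the initial guess that markers lie in front of the camera, and the numerical Jacobian are all illustrative choices, not the patent's method):

```python
import numpy as np

# Illustrative sketch of a PnP-style solve for the first rotation/translation
# matrices. Names and the solver choice are assumptions, not patent wording.

def rodrigues(rvec):
    """Rotation matrix from an axis-angle vector (Rodrigues' formula)."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def solve_pose(pts_glasses, pts_image, iters=100):
    """pts_glasses: (N,3) marker positions in the first coordinate system;
    pts_image: (N,2) observations on the normalized image plane (N >= 4)."""
    def residual(p):
        R, t = rodrigues(p[:3]), p[3:]
        cam = pts_glasses @ R.T + t        # estimated position information
        proj = cam[:, :2] / cam[:, 2:3]    # projection onto the image plane
        return (proj - pts_image).ravel()  # "equal" relationship as residual
    x = np.array([0.0, 0.0, 0.0, 0.0, 0.0, 1.0])  # assume markers in front
    for _ in range(iters):                 # Gauss-Newton iterations
        r = residual(x)
        J = np.empty((r.size, 6))
        for j in range(6):                 # finite-difference Jacobian
            d = np.zeros(6)
            d[j] = 1e-7
            J[:, j] = (residual(x + d) - r) / 1e-7
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + step
        if np.linalg.norm(step) < 1e-12:
            break
    return rodrigues(x[:3]), x[3:]         # first rotation and translation
```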
4. The pose determination method of AR glasses according to claim 1, wherein after calculating the current first conversion relationship based on the first position information and the observed position information of each of the detected first markers, the method further comprises:
calculating a second conversion relation according to the position relation between the first camera and other cameras arranged in the movable compartment and according to the first conversion relation; the second conversion relation is a conversion relation between the first coordinate system applied by the AR glasses and a third coordinate system applied by the other cameras, and the second conversion relation is used for representing the pose relation of the AR glasses relative to the other cameras.
5. The pose determination method of AR glasses according to claim 1, wherein after calculating the current first conversion relationship based on the first position information and the observed position information of each of the detected first markers, the method further comprises:
and determining the pose change of the AR glasses from a specified time to the current time according to the current first conversion relation and the first conversion relation obtained at the specified time.
6. A pose determination method of AR glasses, characterized in that the AR glasses are located inside a movable compartment, and at least one transmitting device is arranged in the movable compartment; the transmitting device is used for transmitting an indication signal, an X-axis rotation laser signal and a Y-axis rotation laser signal; the indication signal is used for indicating the start of transmission of the X-axis rotation laser signal; the X-axis rotation laser signal, a linear laser signal rotating around the Y axis, is transmitted when the indication signal is transmitted; the Y-axis rotation laser signal, a linear laser signal rotating around the X axis, is transmitted after the transmission of the X-axis rotation laser signal is completed; the X axis and the Y axis are two different axes of a fourth coordinate system applied by the transmitting device; N2 detection pieces are arranged on the AR glasses, and N2 is greater than 1; the method comprises the following steps:
acquiring a first moment when the detection piece detects an indication signal transmitted by a first transmitting device, a second moment when the X-axis rotation laser signal passes through the detection piece, and a third moment when the Y-axis rotation laser signal passes through the detection piece; the first transmitting device is any transmitting device arranged in the movable compartment;
determining a first included angle corresponding to each detection piece based on the first moment and the second moment, and determining a second included angle corresponding to each detection piece based on the first moment and the third moment; the first included angle is an included angle between a scanning plane when the X-axis rotating laser signal passes through the corresponding detection piece and a second designated plane of the fourth coordinate system, and the second included angle is an included angle between a scanning plane when the Y-axis rotating laser signal passes through the corresponding detection piece and the second designated plane;
determining fourth position information of the detection piece projected to the second appointed plane based on the first included angle and the second included angle corresponding to the detection piece;
acquiring fifth position information of each detection piece in a first coordinate system, wherein the first coordinate system is a coordinate system applied to the AR glasses;
calculating a current third conversion relation according to the fourth position information and the fifth position information of each detection piece; the third conversion relation is a conversion relation between the first coordinate system and the fourth coordinate system, and the third conversion relation is used for representing a pose relation of the AR glasses relative to the first transmitting device.
7. The pose determination method of AR glasses according to claim 6, wherein determining the first included angle corresponding to each detection piece based on the first moment and the second moment comprises:
calculating a first time difference between the second moment and the first moment;
and determining a first included angle corresponding to the detection piece according to the first time difference and an X-axis rotation angular velocity corresponding to the X-axis rotation laser signal.
8. The pose determination method of AR glasses according to claim 6, wherein determining the second included angle corresponding to each detection piece based on the first moment and the third moment comprises:
calculating the time difference between the third moment and the first moment, and subtracting a set time length from that time difference to obtain a second time difference, wherein the set time length is the time length for which the first transmitting device transmits the X-axis rotation laser signal;
and determining a second included angle corresponding to the detection piece according to the second time difference and the Y-axis rotation angular velocity corresponding to the Y-axis rotation laser signal.
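Claims 7-8 reduce to scaling time differences by the known angular velocities, with the set X-sweep duration subtracted before scaling the Y-sweep angle. A minimal sketch with assumed parameter names (none of these identifiers come from the patent):

```python
# Illustrative sketch of claims 7-8: each included angle is the rotation
# angular velocity (rad/s) multiplied by the relevant time difference.
# All parameter names are assumptions.

def sweep_angles(t_indicate, t_x_hit, t_y_hit, omega_x, omega_y, x_sweep_duration):
    angle_x = omega_x * (t_x_hit - t_indicate)                       # claim 7
    angle_y = omega_y * ((t_y_hit - t_indicate) - x_sweep_duration)  # claim 8
    return angle_x, angle_y
```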
9. The pose determination method of AR glasses according to claim 6, wherein the third conversion relation includes a second rotation matrix and a second translation matrix of the AR glasses from the first coordinate system to the fourth coordinate system;
the calculating a current third conversion relation according to the fourth position information and the fifth position information of each detection piece comprises:
acquiring a second initial rotation matrix and a second initial translation matrix which are constructed aiming at the first coordinate system and the fourth coordinate system, wherein the second initial rotation matrix and the second initial translation matrix have unknowns to be solved;
for each detection piece, converting fifth position information of the detection piece based on the second initial rotation matrix and the second initial translation matrix to obtain sixth position information of the detection piece in the fourth coordinate system;
and solving the second initial rotation matrix and the second initial translation matrix based on the equal relationship between the fourth position information and the sixth position information of each detection piece to obtain the second rotation matrix and the second translation matrix.
10. The pose determination method of AR glasses according to claim 6, wherein after calculating the current third conversion relationship based on the fourth position information and the fifth position information of each detection piece, the method further comprises:
calculating a fourth conversion relation according to the position relation between the first transmitting device and other transmitting devices arranged in the movable compartment and according to the third conversion relation; the fourth conversion relation is a conversion relation between the first coordinate system applied by the AR glasses and a fifth coordinate system applied by the other transmitting devices, and the fourth conversion relation is used for representing the pose relation of the AR glasses relative to the other transmitting devices.
11. The pose determination method of AR glasses according to claim 6, wherein after calculating the current third conversion relationship based on the fourth position information and the fifth position information of each detection piece, the method further comprises:
and determining the pose change of the AR glasses from a specified time to the current time according to the current third conversion relation and the third conversion relation obtained at the specified time.
12. A pose determination method of AR glasses, characterized in that the AR glasses are located inside a movable compartment, N3 second markers are arranged in the movable compartment, and N3 is greater than 1; the AR glasses have binocular cameras; the method comprises the following steps:
respectively detecting identical second markers from a first image currently acquired by a first eye camera and a second image currently acquired by a second eye camera in the binocular cameras;
determining seventh position information and eighth position information of the detected second marker in the first image and the second image, respectively, when the number of detected second markers reaches M3; M3 is greater than 1 and less than or equal to N3;
determining Z-axis coordinates of the detected second marker on a sixth coordinate system applied by a first target camera based on the seventh and eighth position information, the Z-axis coordinates representing a distance between the detected second marker and the first target camera;
determining the current three-dimensional coordinate position of the detected second marker in the sixth coordinate system according to the seventh position information and the Z-axis coordinate;
and determining the current pose information of the AR glasses relative to the movable compartment according to the current three-dimensional coordinate position of each detected second marker in the sixth coordinate system.
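The final step of claim 12 — recovering the glasses' pose from the markers' current 3-D positions in the camera frame and their known positions in the compartment — can be illustrated with the Kabsch algorithm for rigid alignment. The patent does not name this particular method; the sketch below is only one assumed way such a pose could be computed.

```python
import numpy as np

# Illustrative sketch: rigid alignment of the markers' camera-frame positions
# with their known compartment-frame positions (Kabsch algorithm). The method
# choice and variable names are assumptions, not taken from the patent.

def kabsch(p_camera, p_compartment):
    """Return R, t such that p_compartment ~= p_camera @ R.T + t."""
    mu_a, mu_b = p_camera.mean(axis=0), p_compartment.mean(axis=0)
    H = (p_camera - mu_a).T @ (p_compartment - mu_b)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))             # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, mu_b - R @ mu_a
```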
13. The pose determination method of AR glasses according to claim 12, wherein determining the Z-axis coordinate of the detected second marker on a sixth coordinate system to which the first target camera is applied based on the seventh position information and the eighth position information comprises:
projecting the seventh position information onto a third designated plane of the sixth coordinate system to obtain ninth position information;
projecting the eighth position information onto a fourth designated plane of a seventh coordinate system to obtain tenth position information, wherein the seventh coordinate system is a coordinate system applied by the second camera;
and determining the Z-axis coordinate of the detected second marker according to the conversion relation between the sixth coordinate system and the seventh coordinate system and the ninth position information and the tenth position information.
14. The pose determination method of AR glasses according to claim 12, wherein after determining the current three-dimensional coordinate position of the detected second marker in the sixth coordinate system based on the seventh position information and the Z-axis coordinate, the method further comprises:
and determining the pose change of the AR glasses from a specified time to the current time according to the current three-dimensional coordinate position of the detected second marker and its three-dimensional coordinate position in the sixth coordinate system at the specified time.
15. A pose determination device of AR glasses, characterized in that the AR glasses are located inside a movable compartment, and at least one camera is arranged in the movable compartment; N1 first markers are arranged on the AR glasses, and N1 is greater than 1; the device comprises:
the first marker detection module is used for detecting a first marker from an image currently acquired by a first camera, and the first camera is any one camera arranged in the movable compartment body;
an observation position information acquiring module for acquiring observation position information of each of the detected first markers in the image when the number of detected first markers reaches M1; the M1 is greater than 1 and less than or equal to N1;
the first position information acquisition module is used for acquiring first position information of the detected first marker in a first coordinate system, wherein the first coordinate system is a coordinate system applied to the AR glasses;
the first position and posture determining module is used for calculating a current first conversion relation according to the first position information and the observation position information of each detected first marker; the first conversion relation is a conversion relation from the first coordinate system to a second coordinate system applied by the first camera, and the first conversion relation is used for representing a pose relation of the AR glasses relative to the first camera.
16. A pose determination device of AR glasses, characterized in that the AR glasses are located inside a movable compartment, and at least one transmitting device is arranged in the movable compartment; the transmitting device is used for transmitting an indication signal, an X-axis rotation laser signal and a Y-axis rotation laser signal; the indication signal is used for indicating the start of transmission of the X-axis rotation laser signal; the X-axis rotation laser signal, a linear laser signal rotating around the Y axis, is transmitted when the indication signal is transmitted; the Y-axis rotation laser signal, a linear laser signal rotating around the X axis, is transmitted after the transmission of the X-axis rotation laser signal is completed; the X axis and the Y axis are two different axes of a fourth coordinate system applied by the transmitting device; N2 detection pieces are arranged on the AR glasses, and N2 is greater than 1; the device comprises:
the time acquisition module is used for acquiring a first time when the detection piece detects the indication signal emitted by the first emitting device, a second time when the X-axis rotation laser signal passes through the detection piece and a third time when the Y-axis rotation laser signal passes through the detection piece; the first launching device is any launching device arranged in the movable carriage body;
the included angle determining module is used for determining a first included angle corresponding to each detection piece based on the first moment and the second moment and determining a second included angle corresponding to each detection piece based on the first moment and the third moment; the first included angle is an included angle between a scanning plane when the X-axis rotating laser signal passes through the corresponding detection piece and a second designated plane of the fourth coordinate system, and the second included angle is an included angle between a scanning plane when the Y-axis rotating laser signal passes through the corresponding detection piece and the second designated plane;
the fourth position information determining module is used for determining fourth position information of the detection piece projected to the second designated plane based on the first included angle and the second included angle corresponding to the detection piece;
a fifth position information obtaining module, configured to obtain fifth position information of each detection element in a first coordinate system, where the first coordinate system is a coordinate system applied to the AR glasses;
and a second pose determination module, used for calculating a current third conversion relation according to the fourth position information and the fifth position information of each detection piece; the third conversion relation is a conversion relation between the first coordinate system and the fourth coordinate system, and the third conversion relation is used for representing a pose relation of the AR glasses relative to the first transmitting device.
17. A pose determination device of AR glasses, characterized in that the AR glasses are located inside a movable compartment, N3 second markers are arranged in the movable compartment, and N3 is greater than 1; the AR glasses have binocular cameras; the device comprises:
the second marker detection module is used for respectively detecting the same second markers from a first image currently acquired by a first eye camera and a second image currently acquired by a second eye camera in the binocular cameras;
a seventh and eighth position information determination module, used for determining seventh position information and eighth position information of the detected second marker in the first image and the second image, respectively, when the number of detected second markers reaches M3; M3 is greater than 1 and less than or equal to N3;
a Z-axis coordinate determination module for determining a Z-axis coordinate of the detected second marker on a sixth coordinate system applied by the first target camera based on the seventh position information and the eighth position information, the Z-axis coordinate representing a distance between the detected second marker and the first target camera;
a three-dimensional coordinate position determining module, configured to determine, according to the seventh position information and the Z-axis coordinate, a current three-dimensional coordinate position of the detected second marker in the sixth coordinate system;
and a third pose determination module, used for determining the current pose information of the AR glasses relative to the movable compartment according to the current three-dimensional coordinate position of each detected second marker in the sixth coordinate system.
18. An electronic device comprising a processor and a memory; the memory stores a program that can be called by the processor; wherein the processor, when executing the program, implements the pose determination method for AR glasses according to any one of claims 1 to 14.
19. A non-transitory electronic device-readable storage medium, characterized in that a program is stored thereon, which when executed by a processor, implements a pose determination method for AR glasses according to any one of claims 1 to 14.
CN202110004365.1A 2021-01-04 2021-01-04 Method, device and equipment for determining pose of AR (augmented reality) glasses and storage medium Active CN112631431B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110004365.1A CN112631431B (en) 2021-01-04 2021-01-04 Method, device and equipment for determining pose of AR (augmented reality) glasses and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110004365.1A CN112631431B (en) 2021-01-04 2021-01-04 Method, device and equipment for determining pose of AR (augmented reality) glasses and storage medium

Publications (2)

Publication Number Publication Date
CN112631431A true CN112631431A (en) 2021-04-09
CN112631431B CN112631431B (en) 2023-06-16

Family

ID=75291305

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110004365.1A Active CN112631431B (en) 2021-01-04 2021-01-04 Method, device and equipment for determining pose of AR (augmented reality) glasses and storage medium

Country Status (1)

Country Link
CN (1) CN112631431B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114327066A (en) * 2021-12-30 2022-04-12 上海曼恒数字技术股份有限公司 Three-dimensional display method, device and equipment of virtual reality screen and storage medium
CN114545629A (en) * 2022-01-21 2022-05-27 广东虚拟现实科技有限公司 Augmented reality device, information display method and device
CN115342806A (en) * 2022-07-14 2022-11-15 歌尔股份有限公司 Positioning method and device of head-mounted display equipment, head-mounted display equipment and medium

Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3711872A1 (en) * 1987-04-08 1987-10-15 Christian Tammel Moveable device for determining positional data for electronic data processing systems (digitiser), and method of determining its absolute position
CN101299270A (en) * 2008-05-27 2008-11-05 东南大学 Multiple video cameras synchronous quick calibration method in three-dimensional scanning system
JP2012248206A (en) * 2012-07-26 2012-12-13 Casio Comput Co Ltd Ar processing apparatus, ar processing method and program
WO2015024361A1 (en) * 2013-08-20 2015-02-26 华为技术有限公司 Three-dimensional reconstruction method and device, and mobile terminal
DE102014221190A1 (en) * 2014-09-15 2016-03-17 Bayerische Motoren Werke Aktiengesellschaft Infrared pattern in slices of vehicles
CN106959108A (en) * 2017-03-23 2017-07-18 联想(北京)有限公司 Location determining method, system and electronic equipment
CN206400472U (en) * 2016-08-24 2017-08-11 王忠民 A kind of virtual reality device and its alignment system
CN206788480U (en) * 2017-06-15 2017-12-22 北京虚实视界科技有限公司 A kind of desktop level AR glasses systems based on location technology
CN107977977A (en) * 2017-10-20 2018-05-01 深圳华侨城卡乐技术有限公司 A kind of indoor orientation method, device and the storage medium of VR game
CN108682038A (en) * 2018-04-27 2018-10-19 腾讯科技(深圳)有限公司 Pose determines method, apparatus and storage medium
CN109816719A (en) * 2017-11-22 2019-05-28 冯晶 The key technology of unmanned helicopter visual guidance
CN110031975A (en) * 2017-12-12 2019-07-19 大众汽车有限公司 The method and system and augmented reality glasses of augmented reality glasses are calibrated in the car
JP2019144003A (en) * 2018-02-16 2019-08-29 Kddi株式会社 Device and method for use in moving body and control program and method for same
CN110998409A (en) * 2017-08-30 2020-04-10 大众汽车有限公司 Augmented reality glasses, method of determining the pose of augmented reality glasses, motor vehicle adapted to use the augmented reality glasses or method
US20200184726A1 (en) * 2018-12-05 2020-06-11 Geun Sik Jo Implementing three-dimensional augmented reality in smart glasses based on two-dimensional data
CN111338474A (en) * 2020-02-19 2020-06-26 Oppo广东移动通信有限公司 Virtual object pose calibration method and device, storage medium and electronic equipment
CN111965624A (en) * 2020-08-06 2020-11-20 北京百度网讯科技有限公司 Calibration method, device and equipment for laser radar and camera and readable storage medium
CN112118436A (en) * 2020-09-18 2020-12-22 联想(北京)有限公司 Image presentation method, device and system based on augmented reality device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114327066A (en) * 2021-12-30 2022-04-12 上海曼恒数字技术股份有限公司 Three-dimensional display method, device and equipment of virtual reality screen and storage medium
CN114545629A (en) * 2022-01-21 2022-05-27 广东虚拟现实科技有限公司 Augmented reality device, information display method and device
CN115342806A (en) * 2022-07-14 2022-11-15 歌尔股份有限公司 Positioning method and device of head-mounted display equipment, head-mounted display equipment and medium

Also Published As

Publication number Publication date
CN112631431B (en) 2023-06-16

Similar Documents

Publication Publication Date Title
CN112631431A (en) AR (augmented reality) glasses pose determination method, device and equipment and storage medium
CN109084732B (en) Positioning and navigation method, device and processing equipment
CN110377015B (en) Robot positioning method and robot positioning device
US9529945B2 (en) Robot simulation system which simulates takeout process of workpieces
Panahandeh et al. Vision-aided inertial navigation based on ground plane feature detection
JP3735344B2 (en) Calibration apparatus, calibration method, and calibration program
US8350897B2 (en) Image processing method and image processing apparatus
CN109816704A (en) The 3 D information obtaining method and device of object
KR102016636B1 (en) Calibration apparatus and method of camera and rader
JP5803367B2 (en) Self-position estimation apparatus, self-position estimation method and program
EP1596332A2 (en) Information processing method and apparatus for finding position and orientation of targeted object
EP1437645A2 (en) Position/orientation measurement method, and position/orientation measurement apparatus
JP5030953B2 (en) Method and system for determining the relative position of a first object with respect to a second object, a corresponding computer program and a corresponding computer-readable recording medium
CN109313417A (en) Help robot localization
Huang et al. A novel multi-planar LIDAR and computer vision calibration procedure using 2D patterns for automated navigation
CN109690622A (en) Camera registration in multicamera system
KR20150119337A (en) Generation of 3d models of an environment
JP2004198212A (en) Apparatus for monitoring vicinity of mobile object
JP2018190402A (en) Camera parameter set calculation device, camera parameter set calculation method, and program
JP2006317223A (en) Position attitude measuring method and apparatus
CN115371665B (en) Mobile robot positioning method based on depth camera and inertial fusion
Kinnell et al. Autonomous metrology for robot mounted 3D vision systems
Bazargani et al. Camera calibration and pose estimation from planes
Alves et al. Camera-inertial sensor modelling and alignment for visual navigation
US20090226094A1 (en) Image correcting device and method, and computer program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant