CN116883510A - Calibration and calibration system and method for augmented reality virtual-real alignment - Google Patents


Info

Publication number
CN116883510A
Authority
CN
China
Prior art keywords
environment detection
detection imaging
augmented reality
information
real
Prior art date
Legal status
Pending
Application number
CN202310863716.3A
Other languages
Chinese (zh)
Inventor
龙知洲
马天
李伟萍
唐荣富
Current Assignee
Weihai Zhonghe Electromechanical Technology Co ltd
Institute of Systems Engineering of PLA Academy of Military Sciences
Original Assignee
Weihai Zhonghe Electromechanical Technology Co ltd
Institute of Systems Engineering of PLA Academy of Military Sciences
Priority date
Filing date
Publication date
Application filed by Weihai Zhonghe Electromechanical Technology Co ltd, Institute of Systems Engineering of PLA Academy of Military Sciences filed Critical Weihai Zhonghe Electromechanical Technology Co ltd
Priority to CN202310863716.3A
Publication of CN116883510A
Legal status: Pending


Classifications

    • G PHYSICS
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 7/00 Image analysis
                    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
                    • G06T 7/10 Segmentation; Edge detection
                        • G06T 7/13 Edge detection
                • G06T 2207/00 Indexing scheme for image analysis or image enhancement
                    • G06T 2207/20 Special algorithmic details
                        • G06T 2207/20081 Training; Learning
            • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
                • G06N 3/00 Computing arrangements based on biological models
                    • G06N 3/02 Neural networks
                        • G06N 3/04 Architecture, e.g. interconnection topology
                            • G06N 3/0464 Convolutional networks [CNN, ConvNet]
                            • G06N 3/048 Activation functions
            • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
                • G06V 10/00 Arrangements for image or video recognition or understanding
                    • G06V 10/40 Extraction of image or video features
                        • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
                    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
                        • G06V 10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a calibration system and method for augmented reality virtual-real alignment. The system comprises an AR (augmented reality) optical machine, a fixed connection member and an environment detection imaging device; the AR optical machine is connected with the environment detection imaging device through the fixed connection member; the imaging optical axis of the AR optical machine and the lens optical axis of the environment detection imaging device are parallel to the z-axis of the space in which the system is located and perpendicular to the xoy plane of that space. The method acquires the relative position information and imaging parameters of the environment detection imaging device and the AR optical machine; the environment detection imaging device captures a real environment image, from which the corresponding augmented reality information is extracted or generated; the relative distance between environment objects and the environment detection imaging device is acquired; and the augmented reality information is scaled, translated and rotated to complete virtual-real alignment. The invention makes the design of the initial relative position of the AR optical machine and the environment detection imaging device more flexible and is therefore more practical.

Description

Calibration and calibration system and method for augmented reality virtual-real alignment
Technical Field
The invention relates to the technical field of augmented reality, in particular to a calibration and calibration system and method for virtual-real alignment of augmented reality.
Background
With the development of augmented reality technology, more and more head-mounted display devices are introducing an AR optical machine for near-eye display. The AR optical machine provides a see-through effect: without blocking the normal view of the environment, it displays a software-rendered virtual image in front of the wearer's eyes, superimposed on and aligned with the actual image of the real environment, thereby achieving the augmented reality effect. This is especially valuable in firefighting, where some fire scenes are complex and visibility is extremely poor. AR technology can restore real environment information of the scene, effectively improving a firefighter's awareness of the fire environment, supporting better on-site judgment, and greatly increasing rescue efficiency and survival probability; it therefore has extremely high practical value.
The current common approach is to use an environment detection imaging device to collect environment information which, after identification and judgment at a background processing end, is shown through the AR optical machine as a near-eye display. The environment detection imaging device and the AR optical machine are connected by a structural member; for virtual-real alignment they must be strictly locked together to complete the physical fixation, after which display alignment is performed by background software with the aid of an external calibration plate. In practice, however, because the structural design connecting the environment detection imaging device and the AR optical machine is complex, the machining precision is ordinary, and the two are fixed together in only a single dimension, the prior art achieves only average virtual-real alignment precision. Moreover, during later use the structural member deforms under stress, producing relative displacement between the environment detection imaging device and the AR optical machine; the completed virtual-real alignment then drifts out of place, the device must be recalibrated in software, and the overall usability of the device is only average.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a calibration system and method for virtual-real alignment of augmented reality. Through a structural member of specific rigidity and structure, the environment detection imaging device and the AR optical machine are directly fixed and calibrated in at least two dimensions, and the member is connected to the other structures of the product only through small local assembly surfaces, so that the spatial coordinates of the AR optical machine and the environment detection imaging device remain well fixed relative to each other. The three-dimensional relative coordinates of the fixedly connected environment detection imaging device and AR optical machine are then mapped and converted by background software and algorithms, and the virtual and real images are aligned according to the field of view, the size of the display area, and so on; the alignment can be further refined with the aid of an external calibration plate.
To solve the above technical problem, a first aspect of the present invention discloses a calibration system for augmented reality virtual-real alignment, the system comprising:
the AR optical machine (11), connected with the environment detection imaging device through the fixed connection member (12) to form stable relative spatial position coordinates;
the imaging optical axis of the AR optical machine (11) and the lens optical axis of the environment detection imaging device (13) are parallel to the z-axis of the space in which the calibration system is located and perpendicular to the xoy plane of that space.
As an alternative embodiment, the fixed connection member (12) includes an AR optical machine locking structure (201), a fixed connection structure (202), and an environment detection imaging device locking structure (203);
the AR optical machine locking structure (201) and the environment detection imaging device locking structure (203) are located above the fixed connection structure (202).
The AR optical machine is a near-eye display device;
the fixed connection member is used to connect, and physically calibrate the relative spatial positions of, the AR optical machine and the environment detection imaging device;
as an alternative embodiment, the AR light engine locking structure (201) is connected to the AR light engine (11);
the AR ray machine locking structure (201) is used for realizing translational locking constraint and rotation constraint around a z axis on the AR ray machine (11) in the x axis direction and the y axis direction of a space where the calibration system is located.
As an alternative embodiment, the environment detection imaging device locking structure (203) is connected to the environment detection imaging device (13);
the environment detection imaging device locking structure (203) imposes translational locking constraints on the environment detection imaging device (13) in the x-axis and y-axis directions of the space in which the calibration system is located, together with a rotational constraint about the z-axis.
As an alternative embodiment, the connection between the AR optical machine locking structure (201) and the AR optical machine (11), and between the environment detection imaging device locking structure (203) and the environment detection imaging device (13), includes but is not limited to screw connection, adhesive connection and clamping.
As an alternative implementation, the AR optical machine locking structure (201), the fixed connection structure (202) and the environment detection imaging device locking structure (203) are preferably made of 7-series aluminum alloy.
The fixed connection member (12) should be machined from a sufficiently rigid metal material to ensure that no significant deformation occurs under normal external forces. The AR optical machine locking structure (201) mainly serves to connect the AR optical machine (11) and the environment detection imaging device (13). The fixed connection structure (202) and the environment detection imaging device locking structure (203) are designed mainly to limit and lock the AR optical machine (11) and the environment detection imaging device (13), respectively, in at least two dimensional directions (x and y), so that the projected coordinates of the AR optical machine (11) and the environment detection imaging device (13) on the xoy plane are fixed; at the same time, because of the physical locking in the x and y directions, the relative rotation angle of the AR optical machine (11) and the environment detection imaging device (13) about the z-axis is also locked.
The locking means of the fixed connection structure (202) and the environment detection imaging device locking structure (203) include, but are not limited to, screws, adhesives and clamping. The fixed connection structure (202) may be designed as a right angle, an arc, or an irregular shape; whatever the design, it ultimately produces a locking effect at least in the x and y directions and, at the same time, an indirect locking effect on the relative rotation angle about the z-axis.
A second aspect of the embodiments of the invention discloses a calibration method for augmented reality virtual-real alignment, comprising the following steps:
S1, assembling and connecting the AR optical machine, the fixed connection member and the environment detection imaging device, and locking the relative positions of the AR optical machine and the environment detection imaging device;
s2, acquiring relative position information and imaging parameter information of the environment detection imaging equipment and the AR optical machine;
the method comprises the steps of obtaining relative position information of the environment detection imaging device and the AR optical machine and imaging parameters: according to the device for completing the fixation in the S1, the relative position information of the environment detection imaging device and the AR optical machine in a space xyz coordinate system is determined, and in addition, the respective imaging parameters of the environment detection imaging device and the AR optical machine are determined, such as information of the field angle, the aperture size, the focal length, the resolution and the like of the environment detection imaging device, and information of the field angle, the focusing distance, the display resolution and the like of the AR optical machine are determined;
s3, shooting by using the environment detection imaging equipment to obtain a real environment image;
comprising the following steps: the environment detection imaging device shoots a real environment image and extracts or generates augmented reality information corresponding to the real environment image: the environment detection imaging equipment shoots environment images singly or continuously, and extracts a target or performs augmented reality rendering on a target object according to service requirements to generate corresponding information such as augmented reality graphics or instructions;
s4, processing the real environment image to obtain augmented reality information of the real environment image;
comprising the following steps: acquiring environment depth information: extracting relative distance information between an environmental object and an environmental detection imaging device through the shot environmental image, and simultaneously completing conversion from real world physical space position information of the object to position information in a shot image space;
s5, processing the real environment image, extracting relative distance information between an environment object and the environment detection imaging device, and converting the position information of the environment object in a real world physical space into the position information in a shot image space to obtain image space position information;
comprising the following steps: scaling, translating and rotating the display image: the environment detection imaging device determines the corresponding display proportion and display position of the augmented reality information to be displayed in the virtual display picture of the AR optical machine according to the relative position information and the imaging parameters acquired in the step S2, and performs scaling, translation and rotation of the display image according to the display proportion and the display position;
s6, processing the augmented reality information of the real environment image by using the environment detection imaging equipment according to the relative position information, the imaging parameter information and the image space position information to obtain display proportion information and display position information of the augmented reality information in a virtual display picture of the AR optical machine;
s7, scaling, translating and rotating the augmented reality information according to the display proportion information and the display position information of the augmented reality information in the virtual display picture of the AR optical machine to obtain virtual augmented reality information;
and S8, carrying out virtual-real alignment on the real environment image and the virtual augmented reality information to obtain an augmented reality virtual-real alignment result.
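Under the geometry established in S1 (optical axes parallel to z, rotation about z locked by the fixed connection member), steps S5 through S7 reduce for each overlay point to a depth-dependent translation plus a field-of-view-dependent scaling. The sketch below is one possible reading of those steps, not the patent's implementation; the `Display` type, function names, and all numeric parameters are invented for illustration:

```python
import math
from dataclasses import dataclass

@dataclass
class Display:
    fov_deg: float   # horizontal field of view
    width_px: int    # horizontal resolution
    height_px: int   # vertical resolution

    @property
    def f_px(self) -> float:
        # pinhole focal length in pixels derived from FOV and resolution
        return (self.width_px / 2.0) / math.tan(math.radians(self.fov_deg) / 2.0)

def camera_to_ar(u: float, v: float, depth_m: float,
                 cam: Display, ar: Display,
                 offset_x_m: float, offset_y_m: float) -> tuple[float, float]:
    """Map a pixel (u, v) in the camera image of an object at depth
    `depth_m` onto the AR display. With both optical axes parallel to z,
    the mapping is a scale plus a depth-dependent translation."""
    # S5: camera pixel -> normalized ray direction in the camera frame
    x = (u - cam.width_px / 2.0) / cam.f_px
    y = (v - cam.height_px / 2.0) / cam.f_px
    # shift the ray by the fixed x/y offset between the two devices,
    # re-projected at the object's depth (parallax correction)
    x_ar = x - offset_x_m / depth_m
    y_ar = y - offset_y_m / depth_m
    # S6/S7: scale into AR display pixel coordinates
    return (ar.width_px / 2.0 + x_ar * ar.f_px,
            ar.height_px / 2.0 + y_ar * ar.f_px)
```

With zero offset and identical parameters the mapping is the identity, a convenient sanity check; as depth grows, the parallax term vanishes and only the field-of-view scaling remains.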
As an alternative embodiment, the imaging parameter information includes, but is not limited to, the imaging parameter information of the environment detection imaging device and the imaging parameter information of the AR optical machine;
the imaging parameter information of the environment detection imaging device includes the field of view, aperture size, focal length and resolution of the environment detection imaging device;
the imaging parameter information of the AR optical machine includes, but is not limited to, the field of view, focusing distance and display resolution of the AR optical machine.
Compared with the prior art, the embodiments of the invention have the following beneficial effects:
(1) The calibration system for augmented reality virtual-real alignment directly and rigidly connects the AR optical machine and the environment detection imaging device through the fixed connection member, so that their relative spatial positions are more stable; fixing the connection in at least two dimensional directions better reduces the errors introduced during assembly; connecting the structure through small local assembly surfaces better reduces the deforming influence of external stress on the fixed connection member during use, and avoids repeated recalibration of the device; this design creates good physical conditions for virtual-real alignment and can greatly improve the precision of virtual-real alignment and the alignment consistency between devices;
(2) The calibration method for virtual-real alignment of augmented reality converts the relative coordinates of three-dimensional space, so that the design of the initial relative positions of the AR optical machine and the environment detection imaging device is more flexible, and the practicability is higher.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of a calibration system for augmented reality virtual-real alignment according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of the connection relationship of a calibration system for augmented reality virtual-real alignment according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of the connection of a fixed connection member to an environment detection imaging device according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of the connection between a fixed connection member and an AR optical machine according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of the composition of the fixed connection member according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of the connection of a fixed connection member to an environment detection imaging device according to an embodiment of the present invention;
FIG. 7 is a flow chart of a calibration method for augmented reality virtual-real alignment disclosed in an embodiment of the present invention;
FIG. 8 is a schematic diagram of a calibration lens capturing a virtual image and a real image of a calibration object according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of a connection relationship of another calibration system for augmented reality virtual-real alignment according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of the connection of another fixed connection member to an environment detection imaging device according to an embodiment of the present invention;
fig. 11 is a schematic diagram of an industrial camera capturing a virtual image and a real image of a calibration object according to an embodiment of the present invention.
Detailed Description
In order to make the present invention better understood by those skilled in the art, the following description will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings, and it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The terms first, second and the like in the description and in the claims and in the above-described figures are used for distinguishing between different objects and not necessarily for describing a sequential or chronological order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, apparatus, article, or device that comprises a list of steps or elements is not limited to the list of steps or elements but may, in the alternative, include other steps or elements not expressly listed or inherent to such process, method, article, or device.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the invention. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
The invention discloses a calibration system and method for augmented reality virtual-real alignment. The system comprises an AR (augmented reality) optical machine, a fixed connection member and an environment detection imaging device; the AR optical machine is connected with the environment detection imaging device through the fixed connection member; the imaging optical axis of the AR optical machine and the lens optical axis of the environment detection imaging device are parallel to the z-axis of the space in which the system is located and perpendicular to the xoy plane of that space. The method acquires the relative position information and imaging parameters of the environment detection imaging device and the AR optical machine; the environment detection imaging device captures a real environment image, from which the corresponding augmented reality information is extracted or generated; the relative distance between environment objects and the environment detection imaging device is acquired; and the augmented reality information is scaled, translated and rotated to complete virtual-real alignment. The invention makes the design of the initial relative position of the AR optical machine and the environment detection imaging device more flexible and is therefore more practical. The invention will be described in detail below.
Example 1
Referring to fig. 1 to 6, which are schematic structural diagrams of a calibration system for augmented reality virtual-real alignment according to an embodiment of the present invention. The calibration system for virtual-real alignment described in fig. 1 to 6 is applied to a data processing system, such as a local server or a cloud server for augmented reality display calibration, which the embodiment of the invention does not limit. As shown in fig. 1, the calibration system for augmented reality virtual-real alignment comprises an AR optical machine (11), a fixed connection member (12) and an environment detection imaging device (13);
the AR optical machine (11) is connected with the environment detection imaging device through the fixed connection member (12), forming stable relative spatial position coordinates;
in this embodiment the connection is realized by screws;
the AR optical machine is a near-eye display device;
the imaging optical axis of the AR optical machine (11) and the lens optical axis of the environment detection imaging device (13) are parallel to the z-axis of the space in which the calibration system is located and perpendicular to the xoy plane of that space, i.e. there is no difference in their z-axis coordinate values.
The imaging optical axes of the AR optical machine (11) and the environment detection imaging device (13) are parallel to the z-axis shown in the figure and perpendicular to the paper plane. The two devices are not required to be flush in the x or y direction; their relative position is fixed solely by the fixed connection member (12).
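Because the two optical axes are parallel and share no z-axis coordinate difference, the only geometric disparity between the camera image and the AR display is a parallax shift proportional to the fixed x/y baseline and inversely proportional to object depth. A rough sketch of the magnitudes involved, using a made-up 30 mm baseline and focal length; neither value comes from the patent:

```python
def parallax_px(baseline_m: float, depth_m: float, f_px: float) -> float:
    """Pixel shift needed to compensate the fixed x (or y) offset
    between the AR optical machine and the environment detection
    imaging device, for an object at the given depth."""
    return f_px * baseline_m / depth_m

# Hypothetical numbers: 30 mm baseline, focal length of 1100 px.
near = parallax_px(0.030, 1.0, 1100.0)    # object at 1 m: 33 px shift
far = parallax_px(0.030, 10.0, 1100.0)    # object at 10 m: 3.3 px shift
```

This depth dependence is why the method's later steps re-scale and translate the overlay per object distance: a correction tuned for near objects misaligns distant ones.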
Optionally, the fixed connection member (12) comprises an AR optical machine locking structure (201), a fixed connection structure (202) and an environment detection imaging device locking structure (203);
the fixed connection structure (202) fixedly connects (201) and (203) together, i.e. (201) is connected with (202), and (202) is connected with (203);
the AR optical machine locking structure (201) and the environment detection imaging device locking structure (203) are located above the fixed connection structure (202).
Optionally, the AR optical machine locking structure (201) and the environment detection imaging device locking structure (203) may instead be located at the side of or below the fixed connection structure (202), but the effect is poorer; placement above gives the better effect;
optionally, the AR optical machine locking structure (201) is connected with the AR optical machine (11);
the connection is realized by screw connection or adhesive connection; combining screw and adhesive connection gives a stable result.
The AR optical machine locking structure (201) imposes translational locking constraints on the AR optical machine (11) in the x-axis and y-axis directions of the space in which the calibration system is located, together with a rotational constraint about the z-axis.
Optionally, the AR optical machine locking structure (201), designed to fully conform to the overall appearance of the product, is connected with the AR optical machine (11) in an arc-shaped manner through 4 screws, and imposes on the AR optical machine (11) translational locking constraints in the x-axis and y-axis directions of the space in which the system is located, together with a rotational constraint about the z-axis; the environment detection imaging device locking structure (203) is connected with the environment detection imaging device (13) through 4 screws in a right-angle assembly, and imposes on the environment detection imaging device (13) translational locking constraints in the x-axis and y-axis directions of the space in which the device is located, together with a rotational constraint about the z-axis.
Optionally, the environment detection imaging device locking structure (203) is connected with the environment detection imaging device (13);
the connection is realized through screw connection or adhesive connection;
the environment detection imaging device locking structure (203) realizes translational locking constraint and rotation constraint around a z-axis on the environment detection imaging device (13) in the x-axis direction and the y-axis direction of a space where the calibration system is located.
Optionally, a connection mode between the AR light machine locking structure (201) and the AR light machine (11), and a connection mode between the environment detection imaging device locking structure (203) and the environment detection imaging device (13) include, but are not limited to, screw connection, adhesive connection and clamping connection.
Optionally, materials of the AR optical machine locking structure (201), the fixing structure (202) and the environment detection imaging device locking structure (203) are preferably 7-series aluminum alloy.
Preferably, the AR ray machine locking structure, the fixing structure and the environment detection imaging device locking structure are made of 7-series aluminum alloy, and are high in hardness and good in welding performance. The good rigidity and the multi-screw locking in at least two dimensions enable the spatial physical position between the AR (11) and the environment detection imaging device (13) to be highly locked. The AR optical machine locking structure (201) and the AR optical machine (11) are locked through a plurality of screws, and meanwhile, the assembly with the fixed connection structure (202) is synchronously realized; the environment detection imaging device locking structure (203) and the environment detection imaging device (13) are locked through multiple screws, and meanwhile assembly with the fixed connection structure (202) is synchronously realized.
Optionally, the fixed connection member (12) should be machined from a sufficiently rigid metal material to ensure that no significant deformation occurs under normal external forces. The fixed connection structure (202) mainly serves to connect the AR optical machine (11) and the environment detection imaging device (13). The AR optical machine locking structure (201) and the environment detection imaging device locking structure (203) are mainly used to limit and lock the AR optical machine (11) and the environment detection imaging device (13), respectively, in at least two dimensional directions (x and y), so that the projected coordinates of the AR optical machine (11) and the environment detection imaging device (13) on the xoy plane are fixed; at the same time, because of the physical locking in the x and y directions, the relative rotation angle of the AR optical machine (11) and the environment detection imaging device (13) about the z-axis is also locked.
Locking means for the fixed connection structure (202) and the environment detection imaging device locking structure (203) include but are not limited to screws, adhesive, clamping and the like. The fixed connection structure (202) may be designed in a right-angle form, or in an arc-shaped or free-form manner; whatever the design, it must ultimately produce a locking effect at least in the x and y directions, and thereby an indirect locking effect on the relative rotation angle about the z-axis.
Example two
Referring to fig. 7, fig. 7 is a flowchart of a calibration method for augmented reality virtual-real alignment according to an embodiment of the present invention. The calibration method described in fig. 7 is applied to a data processing system, such as a local server or cloud server for augmented reality display calibration, and can be used in emergency rescue, augmented reality (virtual reality), infrared thermal imaging, intelligent detection, wearable equipment and other fields, which the embodiment of the invention does not limit. As shown in fig. 7, the calibration method for augmented reality virtual-real alignment includes:
S1, assembling and connecting the AR optical machine, the fixed connection member and the environment detection imaging device, and locking the relative positions of the AR optical machine and the environment detection imaging device;
S2, acquiring relative position information and imaging parameter information of the environment detection imaging device and the AR optical machine;
Optionally, the relative position information and imaging parameters of the environment detection imaging device and the AR optical machine are acquired as follows: for the device fixed in S1, determine the relative position of the environment detection imaging device and the AR optical machine in a spatial xyz coordinate system, and determine the imaging parameters of each, such as the field angle, aperture size, focal length and resolution of the environment detection imaging device, and the field angle, focusing distance and display resolution of the AR optical machine;
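The role of these field-angle and resolution parameters can be illustrated with a short sketch: the ratio of the two devices' angular resolutions gives the scale factor that maps a pixel span in the imager to a pixel span on the AR display so both subtend the same visual angle. The numeric values below are hypothetical, not taken from the disclosure.

```python
def pixels_per_degree(fov_deg: float, resolution_px: int) -> float:
    """Angular resolution of a display or imager, in pixels per degree."""
    return resolution_px / fov_deg

def display_scale(imager_fov_deg: float, imager_res_px: int,
                  ar_fov_deg: float, ar_res_px: int) -> float:
    """Scale factor mapping an imager pixel span to an AR-display pixel
    span of equal visual angle (simplified linear-FOV model)."""
    return (pixels_per_degree(ar_fov_deg, ar_res_px)
            / pixels_per_degree(imager_fov_deg, imager_res_px))

# Hypothetical parameters: 50-degree imager FOV at 640 px wide,
# 40-degree AR display FOV at 1920 px wide.
scale = display_scale(50.0, 640, 40.0, 1920)   # 3.75
```

A linear FOV-to-pixel model is a first-order approximation; a real system would use the full intrinsic matrices of both devices.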
S3, shooting with the environment detection imaging device to obtain a real environment image;
Optionally, the processing includes preprocessing of the image produced by the environment detection imaging device, including Gaussian filtering, image denoising and image enhancement. Texture in the formed image is then outlined by an edge detection algorithm, and filled regions are removed: only useful information such as environmental texture is retained, while non-texture fill information is discarded, suiting the see-through display characteristics of augmented reality.
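A minimal sketch of this preprocessing chain, Gaussian smoothing followed by gradient-based edge extraction that zeroes non-texture fill regions, might look as follows; the kernel size, sigma and threshold are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def gaussian_kernel(size: int = 5, sigma: float = 1.0) -> np.ndarray:
    """Normalized 1-D Gaussian kernel for separable smoothing."""
    ax = np.arange(size) - size // 2
    g = np.exp(-ax**2 / (2.0 * sigma**2))
    return g / g.sum()

def smooth(img: np.ndarray, sigma: float = 1.0) -> np.ndarray:
    """Separable Gaussian filtering: rows first, then columns."""
    k = gaussian_kernel(5, sigma)
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)

def edge_only(img: np.ndarray, thresh: float = 0.1) -> np.ndarray:
    """Keep only edge texture; zero out filled (non-edge) regions so the
    result suits see-through AR display."""
    s = smooth(img.astype(float))
    gy, gx = np.gradient(s)              # image gradients after smoothing
    mag = np.hypot(gx, gy)               # gradient magnitude
    return np.where(mag > thresh * mag.max(), img, 0)
```

A production system would more likely use a tuned Canny detector or the learned edge network described below; this sketch only shows the keep-edges/drop-fill idea.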
Optionally, the edge detection algorithm is:
1. Randomly extract a training set from the collected valid image data, ensuring that the training set and the verification set share no repeated data; annotate the data to form label files, completing the training set;
2. Take the prepared data set as input to a deep convolutional network and construct a BlendMask network; improve the model trained by the BlendMask network to obtain an improved BlendMask model; test the improved BlendMask model on the verification-set images to obtain edge detection results; evaluate the model's detection results with an accuracy index, and adjust the model parameters accordingly until the training-set index is close to the verification-set index;
3. Improve the algorithm on the basis of the original trained model: replace the backbone network, introduce deformable convolution and optimize the loss function, then continue training the improved algorithm;
4. Test and analyze the improved models, and compare them to obtain the model with the best edge detection effect.
The environment detection imaging device shoots a real environment image, and augmented reality information corresponding to the real environment image is extracted or generated: the environment detection imaging device shoots environment images singly or continuously and, according to service requirements, extracts a target or performs augmented reality rendering on a target object to generate corresponding augmented reality graphics, instructions or other information;
S4, processing the real environment image to obtain augmented reality information of the real environment image;
Optionally, distance information between the target object and the environment detection imaging device is extracted by a monocular depth estimation algorithm;
S5, processing the real environment image, extracting relative distance information between an environment object and the environment detection imaging device, and converting the position information of the environment object in real-world physical space into position information in the shot image space to obtain image space position information;
Optionally, the display image is scaled, translated and rotated: the environment detection imaging device determines, from the relative position information and imaging parameters acquired in step S2, the display proportion and display position of the augmented reality information within the virtual display picture of the AR optical machine, and scales, translates and rotates the display image accordingly;
Optionally, the image scaling method comprises the following steps:
1. The image is input to a depthwise separable convolution layer for processing.
2. Multi-scale edge information is enriched by the multi-scale feature extraction module CDCM.
3. Background noise in the feature image is eliminated by the spatial attention mechanism module CSAM.
4. The image features are further reduced to a single-channel map by a 1×1 convolution layer, and the image is restored to its original size by interpolation.
5. Edge maps of each stage are created with a Sigmoid function; the single-channel feature maps generated by the four stages are cascaded and fused by single-point convolution to extract the final image edges.
6. After edge extraction, the region containing each pixel is determined by traversal: non-edge regions are processed with a bilinear interpolation algorithm, edge-point regions are interpolated adaptively using covariance, and points along edge-line directions are finally processed with a linear interpolation algorithm.
7. The interpolation results are merged to complete image interpolation.
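As an illustration of the bilinear interpolation applied to non-edge regions in step 6, the following is a self-contained NumPy sketch of plain bilinear resizing; it is a generic textbook routine, not the patented edge-adaptive pipeline itself.

```python
import numpy as np

def bilinear_resize(img: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Resize a 2-D image with bilinear interpolation."""
    in_h, in_w = img.shape
    # Map each output pixel back to a (fractional) source coordinate.
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int); x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, in_h - 1); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    # Gather the four neighbours of every output pixel.
    a = img[np.ix_(y0, x0)]; b = img[np.ix_(y0, x1)]
    c = img[np.ix_(y1, x0)]; d = img[np.ix_(y1, x1)]
    top = a * (1 - wx) + b * wx      # interpolate along x, upper row
    bot = c * (1 - wx) + d * wx      # interpolate along x, lower row
    return top * (1 - wy) + bot * wy  # then along y
```

The edge-adaptive variant in the text replaces this with covariance-driven interpolation near edges; bilinear handling is reserved for smooth regions where it introduces no visible blurring of contours.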
S6, processing the augmented reality information of the real environment image by using the environment detection imaging equipment according to the relative position information, the imaging parameter information and the image space position information to obtain display proportion information and display position information of the augmented reality information in a virtual display picture of the AR optical machine;
S7, scaling, translating and rotating the augmented reality information according to the display proportion information and the display position information of the augmented reality information in the virtual display picture of the AR optical machine to obtain virtual augmented reality information;
and S8, carrying out virtual-real alignment on the real environment image and the virtual augmented reality information to obtain an augmented reality virtual-real alignment result.
Optionally, a calibration lens shoots a virtual image and a real image of a calibration object: using a calibration camera, the virtual image and the real image of the calibration object are shot through the AR optical machine from the viewing angle of the human eye, and alignment information of the virtual and real images is obtained. In this way, any pixel-level offset between the virtual and real images can be obtained from image analysis. As shown in fig. 8 below, the calibration object need not have a special shape, but must have a clear edge profile, and calibration should be performed with at least two calibration objects so as to obtain the difference information between the current virtual and real images more accurately;
the AR light engine serves as a near-to-eye display device.
The environment detection imaging device is preferably a thermal infrared imager, which can penetrate smoke to shoot an environment image of a fire scene; it may also be a visible light lens, a low-light lens, an infrared lens or the like.
Optionally, the environment detection imaging device shoots a real environment image, and augmented reality information corresponding to the real environment image is extracted or generated: environment images are shot singly or continuously by the environment detection imaging device, a target is extracted or augmented reality rendering is performed on a target object, and corresponding augmented reality graphics, indications or other information are generated.
Optionally, the imaging parameter information includes but is not limited to imaging parameter information of the environment detection imaging device and imaging parameter information of the AR optical machine;
the imaging parameter information of the environment detection imaging device comprises the field angle, aperture size, focal length and resolution of the environment detection imaging device;
the imaging parameter information of the AR optical machine includes but is not limited to the field angle, focusing distance and display resolution of the AR optical machine.
Optionally, further virtual image scaling, translation, rotation and distortion correction are performed according to the degree of overlap of the virtual and real image contours: the calibration camera feeds the photographed picture, in which the virtual and real images overlap, back to software and algorithms for image analysis and identification; through multi-point positioning and difference analysis, the software and algorithms can accurately calculate the coordinate position difference, the rotation angle difference and the distortion between the virtual and real images, and adjust the display of the virtual image by scaling, translation and rotation. The shooting, analysis and calibration actions are repeated until the virtual and real images reach pixel-level alignment;
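One conventional way to recover the coordinate position difference and rotation angle difference from multi-point correspondences between the virtual and real images is a least-squares similarity fit (2-D Umeyama/Procrustes). The sketch below is an illustrative assumption about how such a fit could be computed, not the exact algorithm of the disclosure.

```python
import numpy as np

def estimate_similarity(real_pts: np.ndarray, virt_pts: np.ndarray):
    """Least-squares scale s, rotation R and translation t such that
    real ~= s * R @ virt + t, from matched 2-D point sets (n x 2)."""
    mr, mv = real_pts.mean(0), virt_pts.mean(0)
    R0, V0 = real_pts - mr, virt_pts - mv        # centred point sets
    H = V0.T @ R0                                # 2x2 cross-covariance
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflection
    D = np.diag([1.0, d])
    R = Vt.T @ D @ U.T                           # optimal rotation
    s = np.trace(D @ np.diag(S)) / (V0**2).sum() # optimal scale
    t = mr - s * R @ mv                          # optimal translation
    return s, R, t
```

Given at least two (in practice many) calibration-object corner correspondences, the recovered s, R and t directly drive the scaling, translation and rotation adjustments described above; residual error after the fit indicates distortion to be corrected separately.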
Virtual-real alignment is then completed and the calibration parameters are stored: based on the calibration result obtained in step S7, the equipment stores the corresponding calibration parameters; in subsequent use of the equipment, the stored parameters are called directly for normal display without re-calibration.
Example three
FIG. 9 is a schematic diagram of the connection relationship of another calibration system for augmented reality virtual-real alignment according to an embodiment of the present invention. The system mainly comprises the following three modules:
Module 41, AR optical machine: a near-eye display device;
module 42, fixed connection member: for rigid direct connection modules 41 and 43;
module 43 thermal infrared imager: the device is used for penetrating smoke to shoot an environmental image of a fire scene.
The imaging optical axis of the AR optical machine and the lens optical axis of the thermal infrared imager are parallel to the z-axis and perpendicular to the paper plane; the two lie in the same xoy plane in real space, i.e. there is no difference in their z-axis coordinate values. Once fixedly connected, the AR optical machine and the thermal infrared imager form stable relative spatial position coordinates.
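Under this geometry (parallel optical axes, offset only within the xoy plane), the AR-display pixel that overlays a scene point seen by the imager can be computed with a simple pinhole model: the ray angle observed by the imager is shifted by the baseline divided by the point's depth, then re-projected with the display's focal length. The following sketch and all its parameter values are illustrative assumptions, not figures from the disclosure.

```python
import math

def imager_to_display(u_imager: float, depth_m: float, baseline_x_m: float,
                      imager_fov_deg: float, imager_res: int,
                      ar_fov_deg: float, ar_res: int) -> float:
    """Map a horizontal imager pixel to the AR-display pixel that overlays
    the same scene point, for parallel optical axes offset by baseline_x_m
    in the xoy plane (pinhole model, principal points at image centres)."""
    f_imager = (imager_res / 2) / math.tan(math.radians(imager_fov_deg) / 2)
    f_ar = (ar_res / 2) / math.tan(math.radians(ar_fov_deg) / 2)
    # Tangent of the ray angle of the scene point in the imager frame.
    x_over_z = (u_imager - imager_res / 2) / f_imager
    # Shift the ray origin by the baseline; parallax shrinks with depth.
    x_ar_over_z = x_over_z - baseline_x_m / depth_m
    return ar_res / 2 + f_ar * x_ar_over_z
```

Note that the parallax term `baseline_x_m / depth_m` vanishes as depth grows, which is why the depth information of step (4) is needed for nearby objects but matters little for distant ones.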
As shown in fig. 10, the fixed connection member mainly consists of three structures:
structure 51: AR ray apparatus locking structure
Structure 52: fixed connection structure of AR optical machine and thermal infrared imager
Structure 53: locking structure of thermal infrared imager
The design of the structure 51 fully considers the overall appearance of the product: it imposes translational locking constraints on the AR optical machine in the x and y directions in an arc-shaped manner through 4 screws, and at the same time constrains rotation of the AR optical machine about the z-axis. The structure 53 imposes translational locking constraints on the thermal infrared imager in the x and y directions by means of 4 screws and right-angle assembly, and likewise constrains its rotation about the z-axis. The structure 52 is profiled with full consideration of the functional layout of the product. The structures 51, 52 and 53 are made of 7-series aluminum alloy of relatively high rigidity; the structure 51 is locked to the AR optical machine by multiple screws while being assembled synchronously with the structure 52, and the structure 53 is locked to the thermal infrared imager by multiple screws while being assembled synchronously with the structure 52.
The high rigidity, together with multi-screw locking in at least two dimensional directions, allows the physical spatial position between the AR optical machine and the thermal infrared imager to be locked to a high degree.
(1) The device is fixedly connected by the structural member: the fixed connection member is used to complete the assembly and connection of the environment detection imaging device and the AR optical machine, locking the relative position between the two in space;
(2) Acquire the relative position information and imaging parameters of the thermal infrared imager and the AR optical machine: for the device fixed in step (1), determine the relative position of the thermal infrared imager and the AR optical machine in a spatial xyz coordinate system, and determine their respective imaging parameters, such as the field angle, aperture size, focal length and resolution of the thermal infrared imager, and the field angle, focusing distance and display resolution of the AR optical machine;
(3) The thermal infrared imager shoots a real environment image, and augmented reality information corresponding to the real environment image is extracted or generated: environment images are shot singly or continuously by the thermal infrared imager and, according to service requirements, a target is extracted or augmented reality rendering is performed on a target object, generating corresponding augmented reality graphics, indications or other information;
(4) Acquire environment depth information: extract relative distance information between an environment object and the thermal infrared imager from the shot environment image, and at the same time convert the real-world physical space position information of the object into position information in the shot image space;
(5) Scale, translate and rotate the display image: the thermal infrared imager determines, from the relative position information and imaging parameters acquired in step (2), the display proportion and display position of the augmented reality information within the virtual display picture of the AR optical machine, and scales, translates and rotates the display image accordingly;
(6) The industrial camera shoots virtual and real images of a calibration object: using an industrial camera, the virtual image and the real image of the calibration object are shot through the AR optical machine from the viewing angle of the human eye, and alignment information of the virtual and real images is obtained. The calibration object is chosen to be a black-and-white checkerboard pattern so that, while analyzing the position difference and rotation angle difference, the distortion of each part of the image can be analyzed and judged synchronously, as shown in fig. 11.
(7) Perform further virtual image scaling, translation, rotation and distortion correction according to the degree of overlap of the virtual and real image contours: the industrial camera feeds the photographed picture, in which the virtual and real images overlap, back to software and algorithms for image analysis and identification; through multi-point positioning and difference analysis, the software and algorithms can accurately calculate the coordinate position difference, the rotation angle difference and the distortion between the virtual and real images, and adjust the display of the virtual image by scaling, translation and rotation. The shooting, analysis and calibration actions are repeated until the virtual and real images reach pixel-level alignment;
(8) Complete virtual-real alignment and store the calibration parameters: based on the calibration result obtained in step (7), the equipment stores the corresponding calibration parameters; in subsequent use of the equipment, the stored parameters are called directly for normal display without re-calibration.
Thus, direct rigid fixation by the structural member makes the relative spatial positions of the AR optical machine and the environment detection imaging device more stable; fixed connection in at least two dimensional directions reduces errors introduced during assembly; connecting the structures through small local assembly surfaces reduces deformation of the fixedly connected member under external stress during use, avoiding repeated calibration of the equipment. This design creates good physical conditions for virtual-real alignment and can greatly improve both alignment precision and alignment consistency between devices. Converting relative three-dimensional spatial coordinates in software and algorithms makes the design of the initial relative positions of the AR optical machine and the environment detection imaging device more flexible and more practical. Adjustment and optimization against an external calibration plate further increase the alignment precision and offset the negative influence of machining and assembly errors of the structural member on virtual-real alignment.
The apparatus embodiments described above are merely illustrative, in which the modules illustrated as separate components may or may not be physically separate, and the components shown as modules may or may not be physical, i.e., may be located in one place, or may be distributed over multiple network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art will understand and implement the present invention without undue burden.
From the above detailed description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus necessary general hardware platforms, or of course by means of hardware. Based on such understanding, the foregoing technical solutions may be embodied essentially or in part in the form of a software product that may be stored in a computer-readable storage medium including Read-Only Memory (ROM), random-access Memory (Random Access Memory, RAM), programmable Read-Only Memory (Programmable Read-Only Memory, PROM), erasable programmable Read-Only Memory (Erasable Programmable Read Only Memory, EPROM), one-time programmable Read-Only Memory (OTPROM), electrically erasable programmable Read-Only Memory (EEPROM), compact disc Read-Only Memory (Compact Disc Read-Only Memory, CD-ROM) or other optical disc Memory, magnetic disc Memory, tape Memory, or any other medium that can be used for computer-readable carrying or storing data.
Finally, it should be noted that the calibration system and method for augmented reality virtual-real alignment disclosed in the embodiments of the invention are described only to illustrate the technical scheme of the invention, not to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical schemes recorded in the various embodiments can still be modified, or some of their technical features replaced equivalently, and such modifications and substitutions do not depart from the spirit and scope of the corresponding technical schemes.

Claims (8)

1. A calibration system for augmented reality virtual-real alignment, characterized in that the system comprises an AR optical machine (11), a fixed connection member (12) and an environment detection imaging device (13);
the AR optical machine (11) is connected with the environment detection imaging device (13) through the fixed connection member (12) to form stable relative spatial position coordinates;
the imaging optical axis of the AR optical machine (11) and the lens optical axis of the environment detection imaging device (13) are parallel to the z-axis of the space where the calibration system is located and perpendicular to the xoy plane of the space where the calibration system is located.
2. The calibration system for augmented reality virtual-real alignment according to claim 1, wherein the fixed connection member (12) comprises an AR optical machine locking structure (201), a fixed connection structure (202) and an environment detection imaging device locking structure (203);
the AR optical machine locking structure (201) and the environment detection imaging device locking structure (203) are located above the fixed connection structure (202).
3. The calibration system for augmented reality virtual-real alignment according to claim 1, characterized in that the AR optical machine locking structure (201) is connected with the AR optical machine (11);
the AR optical machine locking structure (201) is used for imposing translational locking constraints on the AR optical machine (11) in the x-axis and y-axis directions of the space in which the calibration system is located, and a rotational constraint about the z-axis.
4. The calibration system for augmented reality virtual-real alignment according to claim 1, characterized in that the environment detection imaging device locking structure (203) is connected with the environment detection imaging device (13);
the environment detection imaging device locking structure (203) imposes translational locking constraints on the environment detection imaging device (13) in the x-axis and y-axis directions of the space in which the calibration system is located, and a rotational constraint about the z-axis.
5. The calibration system for augmented reality virtual-real alignment according to claim 1, wherein the connection mode between the AR optical machine locking structure (201) and the AR optical machine (11) and the connection mode between the environment detection imaging device locking structure (203) and the environment detection imaging device (13) comprise screw connection, adhesive connection and clamping connection.
6. The calibration system for augmented reality virtual-real alignment according to claim 1, wherein the AR optical machine locking structure (201), the fixed connection structure (202) and the environment detection imaging device locking structure (203) are made of 7-series aluminum alloy.
7. A calibration method for augmented reality virtual-real alignment, characterized in that it employs the calibration system for augmented reality virtual-real alignment according to any one of claims 1 to 6, the method comprising:
S1, assembling and connecting the AR optical machine, the fixed connection member and the environment detection imaging device, and locking the relative positions of the AR optical machine and the environment detection imaging device;
S2, acquiring relative position information and imaging parameter information of the environment detection imaging device and the AR optical machine;
S3, shooting with the environment detection imaging device to obtain a real environment image;
S4, processing the real environment image to obtain augmented reality information of the real environment image;
S5, processing the real environment image, extracting relative distance information between an environment object and the environment detection imaging device, and converting the position information of the environment object in real-world physical space into position information in the shot image space to obtain image space position information;
S6, processing the augmented reality information of the real environment image with the environment detection imaging device according to the relative position information, the imaging parameter information and the image space position information to obtain display proportion information and display position information of the augmented reality information in the virtual display picture of the AR optical machine;
S7, scaling, translating and rotating the augmented reality information according to the display proportion information and the display position information of the augmented reality information in the virtual display picture of the AR optical machine to obtain virtual augmented reality information;
and S8, carrying out virtual-real alignment on the real environment image and the virtual augmented reality information to obtain an augmented reality virtual-real alignment result.
8. The calibration method for augmented reality virtual-real alignment according to claim 7, wherein the imaging parameter information includes but is not limited to imaging parameter information of the environment detection imaging device and imaging parameter information of the AR optical machine;
the imaging parameter information of the environment detection imaging device comprises the field angle, aperture size, focal length and resolution of the environment detection imaging device;
the imaging parameter information of the AR optical machine includes but is not limited to the field angle, focusing distance and display resolution of the AR optical machine.
CN202310863716.3A 2023-07-13 2023-07-13 Calibration and calibration system and method for augmented reality virtual-real alignment Pending CN116883510A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310863716.3A CN116883510A (en) 2023-07-13 2023-07-13 Calibration and calibration system and method for augmented reality virtual-real alignment


Publications (1)

Publication Number Publication Date
CN116883510A true CN116883510A (en) 2023-10-13

Family

ID=88256374

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310863716.3A Pending CN116883510A (en) 2023-07-13 2023-07-13 Calibration and calibration system and method for augmented reality virtual-real alignment

Country Status (1)

Country Link
CN (1) CN116883510A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106308946A (en) * 2016-08-17 2017-01-11 清华大学 Augmented reality device applied to stereotactic surgical robot and method of augmented reality device
CN107016704A (en) * 2017-03-09 2017-08-04 杭州电子科技大学 A kind of virtual reality implementation method based on augmented reality
CN108333774A (en) * 2018-04-16 2018-07-27 亮风台(上海)信息科技有限公司 A kind of optical display subass embly and AR smart machines
CN109541807A (en) * 2018-12-25 2019-03-29 北京谷东网科技有限公司 Augmented reality 3 d display device and display methods
CN110677634A (en) * 2019-11-27 2020-01-10 成都极米科技股份有限公司 Trapezoidal correction method, device and system for projector and readable storage medium
US20200184726A1 (en) * 2018-12-05 2020-06-11 Geun Sik Jo Implementing three-dimensional augmented reality in smart glasses based on two-dimensional data
CN114002898A (en) * 2020-07-28 2022-02-01 宁波舜宇光电信息有限公司 Projection module, assembling method thereof and near-to-eye display equipment comprising projection module
CN114545629A (en) * 2022-01-21 2022-05-27 广东虚拟现实科技有限公司 Augmented reality device, information display method and device


Similar Documents

Publication Publication Date Title
US11126016B2 (en) Method and device for determining parameters for spectacle fitting
US9317973B2 (en) Augmented reality method applied to the integration of a pair of spectacles into an image of a face
US6778207B1 (en) Fast digital pan tilt zoom video
CN109003311B (en) Calibration method of fisheye lens
CN107277495B (en) Intelligent glasses system based on video see-through and see-through method thereof
CN110363116B (en) Irregular human face correction method, system and medium based on GLD-GAN
JP4284664B2 (en) Three-dimensional shape estimation system and image generation system
CN110782394A (en) Panoramic video rapid splicing method and system
CN109685913B (en) Augmented reality implementation method based on computer vision positioning
JP2016018213A (en) Hmd calibration with direct geometric modeling
CN111292364A (en) Method for rapidly matching images in three-dimensional model construction process
WO2019140945A1 (en) Mixed reality method applied to flight simulator
TW201342304A (en) Method and system for adaptive perspective correction of ultra wide-angle lens images
CN106296825B (en) Bionic three-dimensional information generating system and method
GB2464453A (en) Determining Surface Normals from Three Images
CN112085659A (en) Panorama splicing and fusing method and system based on dome camera and storage medium
Ziegler et al. Acquisition system for dense lightfield of large scenes
CN112595496B (en) Method, device, equipment and storage medium for detecting faults of near-eye display equipment
JP2014010783A (en) Image processing apparatus, image processing method, and program
CN111951339A (en) Image processing method for performing parallax calculation by using heterogeneous binocular cameras
CN105488780A (en) Monocular vision ranging tracking device used for industrial production line, and tracking method thereof
CN111340959A (en) Three-dimensional model seamless texture mapping method based on histogram matching
JP7489253B2 (en) Depth map generating device and program thereof, and depth map generating system
KR101841750B1 (en) Apparatus and Method for correcting 3D contents by using matching information among images
CN116883510A (en) Calibration and calibration system and method for augmented reality virtual-real alignment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination