CN117490598A - Object space information measuring method, device, computer equipment and storage medium


Info

Publication number: CN117490598A
Application number: CN202311305478.0A
Authority: CN (China)
Prior art keywords: speckle, coordinate information, dimensional coordinate, target, laser
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 李一超
Current Assignee: Solid High Tech Co ltd
Original Assignee: Solid High Tech Co ltd
Application filed by Solid High Tech Co ltd
Priority to CN202311305478.0A
Publication of CN117490598A

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 - Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G01B11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/254 - Projection of a pattern, viewing through a pattern, e.g. moiré

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application relates to an object space information measurement method, an object space information measurement device, a computer device and a storage medium. The method comprises the following steps: determining a current speckle laser from a speckle laser group, and acquiring the speckle images, captured by each camera in a target camera, of the speckle projected by the current speckle laser on a target object; determining the target pixel points matched with each speckle based on a preset stereo matching algorithm; determining the space linear equation of each target pixel point based on the corresponding relation between pixel points and space linear equations; fusing the space linear equations of the target pixel points matched with the same speckle to obtain the three-dimensional coordinate information of each speckle, and forming the three-dimensional coordinate information of all speckles into a three-dimensional coordinate information set corresponding to the current speckle laser; and returning to the step of determining the current speckle laser until each speckle laser in the speckle laser group has a corresponding three-dimensional coordinate information set, and combining the three-dimensional coordinate information sets into a target three-dimensional coordinate information set of the target object. The method can improve the accuracy of measuring object space information.

Description

Object space information measuring method, device, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method and apparatus for measuring spatial information of an object, a computer device, and a storage medium.
Background
With the development of technology, camera-based three-dimensional measurement is widely applied in industry and other fields. However, existing cameras widely adopt the pinhole camera model, and in practical applications cameras do not completely conform to the pinhole camera model, so the accuracy of measuring the three-dimensional spatial information of an object is low.
Disclosure of Invention
In view of the foregoing, it is desirable to provide an object space information measurement method, apparatus, computer device and storage medium capable of improving the accuracy of measuring object space information.
In a first aspect, the present application provides an object space information measurement method. The method comprises the following steps:
determining a current speckle laser from a speckle laser group, and acquiring the speckle images, captured by each camera in a target camera, of the speckle projected by the current speckle laser on a target object, wherein the target camera comprises at least two cameras;
based on a preset stereo matching algorithm, respectively determining target pixel points matched with each speckle projected by the current speckle laser in each speckle image;
determining a space linear equation corresponding to each target pixel point in each speckle image based on the corresponding relation between the pixel points and the space linear equation;
fusing the space linear equations corresponding to the matched target pixel points of the same speckle in each speckle image to obtain three-dimensional coordinate information corresponding to each speckle, and forming the three-dimensional coordinate information corresponding to each speckle into a three-dimensional coordinate information set corresponding to the current speckle laser;
and the step of determining the current speckle laser from the speckle laser group is entered until each speckle laser in the speckle laser group has a corresponding three-dimensional coordinate information set, and the three-dimensional coordinate information sets corresponding to each speckle laser are combined to form a target three-dimensional coordinate information set corresponding to the target object.
In one embodiment, the angles at which each of the group of speckle lasers projects onto the target object are different.
In one embodiment, fusing the spatial linear equation corresponding to the target pixel point matched with the same speckle in each speckle image to obtain the three-dimensional coordinate information corresponding to each speckle comprises:
calculating, for the same speckle, the intersection point of the space linear equations corresponding to the target pixel points matched with it in each speckle image, to obtain three-dimensional coordinate information of each intersection point;
and based on the corresponding relation among the intersection points, the target pixel points and the speckles, taking the three-dimensional coordinate information corresponding to each intersection point as the three-dimensional coordinate information of the corresponding speckle, to obtain the three-dimensional coordinate information corresponding to each speckle.
In one embodiment, based on a preset stereo matching algorithm, determining target pixels in each speckle image that are matched with each speckle projected by the current speckle laser includes:
acquiring speckle characteristics corresponding to each speckle projected by the current speckle laser, and acquiring pixel point characteristics corresponding to each pixel point in each speckle image;
and based on the preset stereo matching algorithm, the speckle characteristics corresponding to each speckle and the pixel characteristics corresponding to each pixel in each speckle image, taking the pixel with the most similar pixel characteristics and speckle characteristics as a target pixel matched with the corresponding speckle, and obtaining a target pixel matched with each speckle in each speckle image.
In one embodiment, before determining the spatial linear equation corresponding to each target pixel point in each speckle image based on the correspondence between the pixel point and the spatial linear equation, the method further includes:
acquiring three-dimensional coordinate information and pixel coordinate information corresponding to characteristic points of each calibration plate in calibration plate images corresponding to a plurality of different positions shot by each camera;
calculating three-dimensional coordinate information corresponding to each pixel point in the calibration plate images corresponding to a plurality of different positions based on a preset interpolation algorithm and the three-dimensional coordinate information and the pixel coordinate information corresponding to each calibration plate feature point;
and calculating a space linear equation corresponding to each pixel point based on three-dimensional coordinate information corresponding to each pixel point in the calibration plate image corresponding to the plurality of different positions, and obtaining the corresponding relation between each pixel point and the space linear equation.
In a second aspect, the present application also provides an object space information apparatus. The device comprises:
the acquisition module is used for determining a current speckle laser from a speckle laser group, and acquiring the speckle images, captured by each camera in a target camera, of the speckle projected by the current speckle laser on a target object, wherein the target camera comprises at least two cameras;
the matching module is used for respectively determining target pixel points matched with each speckle projected by the current speckle laser in each speckle image based on a preset stereo matching algorithm;
the determining module is used for determining a space linear equation corresponding to each target pixel point in each speckle image based on the corresponding relation between the pixel points and the space linear equation;
the fusion module is used for fusing the space linear equation corresponding to the target pixel point matched with the same speckle in each speckle image to obtain three-dimensional coordinate information corresponding to each speckle, and the three-dimensional coordinate information corresponding to each speckle is formed into a three-dimensional coordinate information set corresponding to the current speckle laser;
and the combination module is used for entering the step of determining the current speckle laser from the speckle laser group until each speckle laser in the speckle laser group has a corresponding three-dimensional coordinate information set, and combining the three-dimensional coordinate information sets corresponding to each speckle laser to form a target three-dimensional coordinate information set corresponding to the target object.
In one embodiment, the matching module is further configured to obtain speckle characteristics corresponding to each speckle projected by the current speckle laser, and obtain pixel point characteristics corresponding to each pixel point in each speckle image; and based on the preset stereo matching algorithm, the speckle characteristics corresponding to each speckle and the pixel characteristics corresponding to each pixel in each speckle image, taking the pixel with the most similar pixel characteristics and speckle characteristics as a target pixel matched with the corresponding speckle, and obtaining a target pixel matched with each speckle in each speckle image.
In one embodiment, the fusion module is further configured to calculate, for the same speckle, the intersection point between the space linear equations corresponding to the target pixel points matched with it in each speckle image, so as to obtain the three-dimensional coordinate information of each intersection point; and based on the corresponding relation among the intersection points, the target pixel points and the speckles, take the three-dimensional coordinate information corresponding to each intersection point as the three-dimensional coordinate information of the corresponding speckle, so as to obtain the three-dimensional coordinate information corresponding to each speckle.
A computer device comprising a memory storing a computer program and a processor which when executing the computer program performs the steps of:
determining a current speckle laser from a speckle laser group, and acquiring the speckle images, captured by each camera in a target camera, of the speckle projected by the current speckle laser on a target object, wherein the target camera comprises at least two cameras;
based on a preset stereo matching algorithm, respectively determining target pixel points matched with each speckle projected by the current speckle laser in each speckle image;
determining a space linear equation corresponding to each target pixel point in each speckle image based on the corresponding relation between the pixel points and the space linear equation;
fusing the space linear equations corresponding to the matched target pixel points of the same speckle in each speckle image to obtain three-dimensional coordinate information corresponding to each speckle, and forming the three-dimensional coordinate information corresponding to each speckle into a three-dimensional coordinate information set corresponding to the current speckle laser;
and the step of determining the current speckle laser from the speckle laser group is entered until each speckle laser in the speckle laser group has a corresponding three-dimensional coordinate information set, and the three-dimensional coordinate information sets corresponding to each speckle laser are combined to form a target three-dimensional coordinate information set corresponding to the target object.
A computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of:
determining a current speckle laser from a speckle laser group, and acquiring the speckle images, captured by each camera in a target camera, of the speckle projected by the current speckle laser on a target object, wherein the target camera comprises at least two cameras;
based on a preset stereo matching algorithm, respectively determining target pixel points matched with each speckle projected by the current speckle laser in each speckle image;
determining a space linear equation corresponding to each target pixel point in each speckle image based on the corresponding relation between the pixel points and the space linear equation;
fusing the space linear equations corresponding to the matched target pixel points of the same speckle in each speckle image to obtain three-dimensional coordinate information corresponding to each speckle, and forming the three-dimensional coordinate information corresponding to each speckle into a three-dimensional coordinate information set corresponding to the current speckle laser;
and the step of determining the current speckle laser from the speckle laser group is entered until each speckle laser in the speckle laser group has a corresponding three-dimensional coordinate information set, and the three-dimensional coordinate information sets corresponding to each speckle laser are combined to form a target three-dimensional coordinate information set corresponding to the target object.
According to the object space information measuring method, device, computer equipment and storage medium, a current speckle laser is determined from the speckle laser group, and the speckle images of the speckle projected by the current speckle laser on a target object are captured by each camera in the target camera, the target camera comprising at least two cameras; based on a preset stereo matching algorithm, the target pixel points matched with each speckle projected by the current speckle laser are respectively determined in each speckle image; the space linear equation corresponding to each target pixel point in each speckle image is determined based on the corresponding relation between the pixel points and the space linear equation; the space linear equations corresponding to the target pixel points matched with the same speckle in each speckle image are fused to obtain the three-dimensional coordinate information corresponding to each speckle, and the three-dimensional coordinate information corresponding to each speckle forms the three-dimensional coordinate information set corresponding to the current speckle laser; the step of determining the current speckle laser from the speckle laser group is then repeated until each speckle laser in the speckle laser group has a corresponding three-dimensional coordinate information set, and the three-dimensional coordinate information sets corresponding to the speckle lasers are combined to form a target three-dimensional coordinate information set corresponding to the target object. In this way, the three-dimensional space information of the target object is measured quickly and efficiently by using the speckle lasers, the problem in the prior art that the accuracy of three-dimensional measurement is limited because the camera intrinsic parameters are highly coupled with the lens distortion coefficients when a pinhole camera model is adopted is avoided, and the accuracy of measuring object space information is improved.
Drawings
FIG. 1 is an application environment diagram of an object space information measurement method in one embodiment;
FIG. 2 is a flow chart of a method for measuring object space information in one embodiment;
FIG. 3 is a schematic diagram of spatial information of a camera measurement object in one embodiment;
FIG. 4 is a block diagram showing the structure of an object space information measuring apparatus in one embodiment;
FIG. 5 is an internal block diagram of a computer device in one embodiment;
FIG. 6 is an internal structure diagram of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
The object space information measuring method provided by the embodiment of the application can be applied to an application environment shown in fig. 1. Wherein the terminal 102 communicates with the server 104 via a network. The data storage system may store data that the server 104 needs to process. The data storage system may be integrated on the server 104 or may be located on a cloud or other network server. The server 104 is configured to determine a current speckle laser from the speckle laser group, obtain speckle images projected on a target object by the current speckle laser captured by each camera in the target camera, and the target camera includes at least two cameras; based on a preset stereo matching algorithm, respectively determining target pixel points matched with each speckle projected by the current speckle laser in each speckle image; determining a space linear equation corresponding to each target pixel point in each speckle image based on the corresponding relation between the pixel points and the space linear equation; fusing the space linear equations corresponding to the matched target pixel points of the same speckle in each speckle image to obtain three-dimensional coordinate information corresponding to each speckle, and forming the three-dimensional coordinate information corresponding to each speckle into a three-dimensional coordinate information set corresponding to the current speckle laser; and a step of determining the current speckle laser from the speckle laser group is carried out until each speckle laser in the speckle laser group has a corresponding three-dimensional coordinate information set, and the three-dimensional coordinate information sets corresponding to the speckle lasers are combined to form a target three-dimensional coordinate information set corresponding to the target object. The terminal 102 may be, but not limited to, various personal computers, notebook computers, smart phones, tablet computers, internet of things devices, and portable wearable devices, where the internet of things devices may be smart speakers, smart televisions, smart air conditioners, smart vehicle devices, and the like. The portable wearable device may be a smart watch, smart bracelet, headset, or the like. The server 104 may be implemented as a stand-alone server or as a server cluster of multiple servers.
In one embodiment, the object space information measurement method provided by the embodiment of the application may be further applied to a multi-view camera device, where the multi-view camera device is configured to determine a current speckle laser from a speckle laser group, obtain speckle images projected on a target object by the current speckle laser captured by each camera in a target camera, and the target camera includes at least two cameras; based on a preset stereo matching algorithm, respectively determining target pixel points matched with each speckle projected by the current speckle laser in each speckle image; determining a space linear equation corresponding to each target pixel point in each speckle image based on the corresponding relation between the pixel points and the space linear equation; fusing the space linear equations corresponding to the matched target pixel points of the same speckle in each speckle image to obtain three-dimensional coordinate information corresponding to each speckle, and forming the three-dimensional coordinate information corresponding to each speckle into a three-dimensional coordinate information set corresponding to the current speckle laser; and a step of determining the current speckle laser from the speckle laser group is carried out until each speckle laser in the speckle laser group has a corresponding three-dimensional coordinate information set, and the three-dimensional coordinate information sets corresponding to the speckle lasers are combined to form a target three-dimensional coordinate information set corresponding to the target object. A multi-view camera device refers to a device comprising a plurality of cameras, which may be target cameras.
In one embodiment, as shown in fig. 2, there is provided an object space information measuring method, which is exemplified as the application of the method to a multi-camera device or a server in fig. 1, including the steps of:
step S200, determining a current speckle laser from the speckle laser group, and acquiring speckle images projected on a target object by the current speckle laser from cameras in a target camera, wherein the target camera comprises at least two cameras.
The speckle laser is a random-lattice laser emitter, where the random lattice means that tens of thousands of randomly distributed laser points are generated on the surface of the target object. The current speckle laser refers to the speckle laser that is used at the current time to project randomly distributed laser points onto the surface of the target object. The target camera refers to the camera used for photographing the laser speckle projected onto the surface of the target object; the target camera is a multi-view camera that can carry at least two cameras, and the cameras can be of the same or different types, which can be adjusted according to actual requirements. The target object refers to the object whose spatial information is to be measured, and may be any entity. The speckle image refers to an image of the speckle projected by the current speckle laser on the target object.
Specifically, the speckle laser can project a preset number of randomly distributed laser speckles onto the target object, and each projected speckle together with its surrounding speckles forms a pattern carrying characteristic information, which provides a data basis for the subsequent determination of target pixel points by the preset stereo matching algorithm. In addition, the speckles obtained by projecting a single speckle laser onto the target object are sparse. To obtain more speckles that can be used for measuring the three-dimensional space information of the target object, the speckle lasers in a speckle laser group comprising at least two speckle lasers can project speckles onto the target object in turn; the speckle laser currently projecting is taken as the current speckle laser, the speckles projected by the current speckle laser onto the target object are photographed by each camera in the target camera, and the speckle images shot by each camera in the target camera are thereby obtained. It should be noted that the object space information measurement method can also be implemented when there is only one speckle laser in the speckle laser group.
Step S202, based on a preset stereo matching algorithm, respectively determining target pixel points matched with each speckle projected by the current speckle laser in each speckle image.
The preset stereo matching algorithm refers to an algorithm for stereo vision tasks in computer vision, which determines the correspondence between the most similar pixel points by comparing the pixel points in two or more images, and is commonly used for tasks such as reconstructing a three-dimensional scene, measuring depth and generating stereo images; it may be, for example, a graph cut algorithm (GC), a belief propagation algorithm (BP) or a dynamic programming algorithm (DP). A target pixel point refers to the pixel point whose characteristics are most similar to those of a speckle and which therefore has a correspondence with that speckle.
Specifically, the preset stereo matching algorithm can be used for calculating the pixel point matched with each speckle in each speckle image. Stereo matching can be classified into passive light source methods (using ambient light) and active light source methods (actively projecting light carrying information, such as random points, Gray codes or sinusoidal light); stereo matching based on a passive light source performs point matching entirely depending on the color characteristics of the measured object, whereas an active light source method projects an image carrying information to enhance the characteristics of the measured object, so its stereo matching accuracy is higher. Further, since the speckles form a random lattice projected by the speckle laser, when the speckles are matched with the pixel points in the speckle images based on the preset stereo matching algorithm, the accuracy of the matched target pixel points is higher.
Step S204, based on the corresponding relation between the pixel points and the space linear equation, determining the space linear equation corresponding to each target pixel point in each speckle image.
The space linear equation refers to an equation representing the straight line in three-dimensional space on which both a pixel point and its corresponding speckle lie, and can be used for calculating spatial coordinate information, namely three-dimensional space coordinate information.
Specifically, since the assembled target camera has been calibrated, each pixel point on the image shot by each camera in the target camera has a corresponding space linear equation, and the speckle matched with a target pixel point lies on the spatial straight line represented by the space linear equation corresponding to that target pixel point. Therefore, the space linear equation corresponding to each target pixel point in each speckle image can be determined according to the corresponding relation between the pixel points and the space linear equations. Since the same speckle has a matched target pixel point in each speckle image, the intersection point of the spatial straight lines represented by the space linear equations of the matched target pixel points in the speckle images is the speckle itself; determining the space linear equations thus provides a basis for the subsequent calculation of the three-dimensional space coordinates of the speckles.
Step S206, fusing the space linear equations corresponding to the target pixel points matched by the same speckle in each speckle image to obtain three-dimensional coordinate information corresponding to each speckle, and forming the three-dimensional coordinate information corresponding to each speckle into a three-dimensional coordinate information set corresponding to the current speckle laser.
Fusion refers to the operation of calculating the three-dimensional coordinate information of the point at which the spatial straight lines characterized by the space linear equations intersect. Three-dimensional coordinate information refers to coordinate information in three-dimensional space. The three-dimensional coordinate information set refers to the set of three-dimensional space coordinate information of all the speckles; it can be used to reflect the three-dimensional state of the speckles projected onto the target object by the current speckle laser.
Specifically, the intersection point of the spatial straight lines represented by the space linear equations corresponding to the target pixel points matched with the same speckle in each speckle image is that speckle. The coordinate information of this intersection point can be calculated from the space linear equations corresponding to the target pixel points matched with the same speckle, and it is the three-dimensional coordinate information of the speckle. The three-dimensional coordinate information of all the speckles forms the three-dimensional coordinate information set of the speckles projected onto the target object by the current speckle laser; this set reflects part of the three-dimensional space information of the target object measured with the current speckle laser, and three-dimensional space information of the target object with higher precision can be obtained when more speckle lasers participate.
For example, suppose the target camera is a binocular camera (i.e. two cameras), camera A shoots speckle image A and camera B shoots speckle image B, and speckle Q is matched with pixel point a in speckle image A and with pixel point b in speckle image B. The space linear equation corresponding to pixel point a is shown in formula (1), and the space linear equation corresponding to pixel point b is shown in formula (2); the three-dimensional coordinate information of the intersection point calculated from formula (1) and formula (2) is the three-dimensional coordinate information of speckle Q.

(x, y, z) = (x1, y1, z1) + t·(m1, n1, p1)    (1)

(x, y, z) = (x2, y2, z2) + t·(m2, n2, p2)    (2)

Wherein, in formula (1) and formula (2), (x1, y1, z1) and (x2, y2, z2) are the coordinates of a three-dimensional point on the respective spatial straight line, (m1, n1, p1) and (m2, n2, p2) are the direction vectors of the spatial straight lines, t ∈ (−∞, +∞), and x, y and z are the coordinates on the x-axis, y-axis and z-axis, respectively.
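As a concrete illustration of this fusion step, the following Python sketch computes the point determined by two such parametric lines; since back-projected rays from real cameras rarely intersect exactly, the midpoint of their common perpendicular is taken as the fused three-dimensional coordinate. The function name and the least-squares formulation are illustrative assumptions rather than an implementation prescribed by the patent.

```python
import numpy as np

def line_midpoint(p1, d1, p2, d2):
    """Return the point midway between two 3-D lines given in the
    parametric form (x, y, z) = p + t * d of formulas (1) and (2).

    For noisy data the two lines seldom intersect exactly, so the midpoint
    of their common perpendicular is used as the fused speckle coordinate.
    """
    p1, d1 = np.asarray(p1, float), np.asarray(d1, float)
    p2, d2 = np.asarray(p2, float), np.asarray(d2, float)
    # Solve for parameters t1, t2 minimising |(p1 + t1*d1) - (p2 + t2*d2)|
    a = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(p2 - p1) @ d1, (p2 - p1) @ d2])
    t1, t2 = np.linalg.solve(a, b)
    q1 = p1 + t1 * d1          # closest point on line (1)
    q2 = p2 + t2 * d2          # closest point on line (2)
    return (q1 + q2) / 2.0     # fused 3-D coordinate of the speckle

# Example: two rays that meet at (0, 0, 100)
q = line_midpoint([0, 0, 0], [0, 0, 1], [100, 0, 0], [-1, 0, 1])
```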
Step S208, a step of determining the current speckle laser from the speckle laser group is carried out until each speckle laser in the speckle laser group has a corresponding three-dimensional coordinate information set, and the three-dimensional coordinate information sets corresponding to each speckle laser are combined to form a target three-dimensional coordinate information set corresponding to the target object.
The target three-dimensional coordinate information set refers to the set of the three-dimensional space coordinate information of the speckles projected onto the target object by every speckle laser in the speckle laser group; it can be used to reflect the state information of the target object in three-dimensional space, and because it is the spatial information obtained by projecting speckles onto the same target object with a plurality of speckle lasers, the three-dimensional space information of the target object becomes denser and more accurate.
Specifically, because the laser speckles projected by a single speckle laser are sparse, it is difficult to meet the requirement of dense measurement in practice. The number of speckle lasers in the speckle laser group can therefore be set according to actual requirements: each speckle laser in the speckle laser group is measured to obtain its corresponding three-dimensional coordinate information set, and the three-dimensional coordinate information sets corresponding to the speckle lasers in the speckle laser group are then fused to obtain the target three-dimensional coordinate information set corresponding to the target object. The obtained target three-dimensional coordinate information set can reflect the three-dimensional space information of the target object with higher precision.
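A minimal procedural sketch of how steps S200 to S208 could chain together is given below. The callables project_and_capture, match_speckles and fuse_lines, as well as the pixel_to_line lookup, are hypothetical placeholders standing in for the operations described above; they are not names defined by the patent.

```python
def measure_target(speckle_lasers, cameras, pixel_to_line,
                   project_and_capture, match_speckles, fuse_lines):
    """Accumulate fused 3-D points over every speckle laser in the group.

    Assumed placeholder callables for the steps described above:
      project_and_capture(laser, cameras) -> one speckle image per camera
      match_speckles(images)              -> {speckle_id: [pixel per camera]}
      fuse_lines(lines)                   -> (x, y, z) intersection estimate
    pixel_to_line maps a (camera, pixel) pair to its space linear equation,
    obtained beforehand from the calibration described later in the text.
    """
    target_set = []                                     # target 3-D coordinate information set
    for laser in speckle_lasers:                        # loop of step S208
        images = project_and_capture(laser, cameras)    # step S200
        matches = match_speckles(images)                # step S202
        laser_set = []                                  # set for this speckle laser
        for speckle_id, pixels in matches.items():
            lines = [pixel_to_line[(cam, px)]           # step S204
                     for cam, px in zip(cameras, pixels)]
            laser_set.append(fuse_lines(lines))         # step S206
        target_set.extend(laser_set)                    # step S208
    return target_set
```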
According to the object space information measuring method, the current speckle laser is determined from the speckle laser group, speckle images projected on a target object by the current speckle laser are shot by all cameras in the target camera, and the target camera comprises at least two cameras; based on a preset stereo matching algorithm, respectively determining target pixel points matched with each speckle projected by the current speckle laser in each speckle image; determining a space linear equation corresponding to each target pixel point in each speckle image based on the corresponding relation between the pixel points and the space linear equation; fusing the space linear equations corresponding to the matched target pixel points of the same speckle in each speckle image to obtain three-dimensional coordinate information corresponding to each speckle, and forming the three-dimensional coordinate information corresponding to each speckle into a three-dimensional coordinate information set corresponding to the current speckle laser; the method comprises the steps of determining the current speckle laser from the speckle laser group until each speckle laser in the speckle laser group has a corresponding three-dimensional coordinate information set, and combining the three-dimensional coordinate information sets corresponding to each speckle laser to form a target three-dimensional coordinate information set corresponding to a target object, so that the three-dimensional space information of the target object is measured quickly and efficiently by using the speckle laser, the problem that in the prior art, due to the fact that the internal parameters of a camera are highly coupled with distortion coefficients of a lens when a pinhole camera model is adopted, the accuracy of information measured when the object is measured in a three-dimensional mode is limited is avoided, and the accuracy of measuring the object space information is improved.
In one embodiment, the angles at which the speckle lasers in the speckle laser group project onto the target object are different from one another. Specifically, if the plurality of speckle lasers in the speckle laser group all project onto the target object at the same angle, many of the projected speckles would overlap, and it would be difficult to ensure that the speckles projected by the speckle lasers in the group cover the target object as uniformly and completely as possible. Different projection angles can therefore be set for different speckle lasers, so that the plurality of speckle lasers acting together produce denser speckles on the target object, and the finally obtained three-dimensional coordinate information of the target object is more accurate.
In one embodiment, step S202 includes:
step S300, speckle characteristics corresponding to each speckle projected by the current speckle laser are obtained, and pixel point characteristics corresponding to each pixel point in each speckle image are obtained.
Step S302, based on a preset stereo matching algorithm, the speckle characteristics corresponding to each speckle and the pixel characteristics corresponding to each pixel in each speckle image, taking the pixel with the most similar pixel characteristics and speckle characteristics as a target pixel matched with the corresponding speckle, and obtaining a target pixel matched with each speckle in each speckle image.
Speckle characteristics refer to the characteristic information used to characterize a speckle, and pixel point characteristics refer to the characteristic information used to characterize a pixel point.
Specifically, the speckle images are images of the speckles photographed by the cameras. Based on the preset stereo matching algorithm, the pixel point most similar to each speckle in each speckle image can be found according to the speckle characteristics of the speckle and the pixel point characteristics of the pixel points in the speckle image; the most similar pixel point in each speckle image is taken as the target pixel point matched in stereo space, and the corresponding relation between each speckle and its matched target pixel point in each speckle image is thereby determined, providing a basis for the further calculation of the three-dimensional coordinate information of the speckles.
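A simplified illustration of this matching idea, assuming the speckle and pixel characteristics have already been extracted as fixed-length descriptor vectors; the normalized cross-correlation similarity used here is one common choice and is an assumption, since the patent does not prescribe a particular feature or similarity measure.

```python
import numpy as np

def match_speckles_to_pixels(speckle_feats, pixel_feats):
    """For each speckle descriptor, pick the most similar pixel descriptor.

    speckle_feats: dict {speckle_id: 1-D feature vector}
    pixel_feats:   dict {(u, v) pixel coordinate: 1-D feature vector}
    Returns {speckle_id: (u, v)} -- the matched target pixel per speckle.
    """
    def ncc(a, b):
        # normalized cross-correlation between two descriptor vectors
        a = (a - a.mean()) / (a.std() + 1e-12)
        b = (b - b.mean()) / (b.std() + 1e-12)
        return float(np.mean(a * b))

    matches = {}
    for sid, sf in speckle_feats.items():
        # target pixel point = pixel whose characteristics are most similar
        matches[sid] = max(pixel_feats, key=lambda px: ncc(sf, pixel_feats[px]))
    return matches
```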
In the above embodiment, the target pixel points matched with each speckle in each speckle image are determined by the preset stereo matching algorithm, which provides a basis for the subsequent calculation of the three-dimensional coordinate information of each speckle and, to a certain extent, ensures the accuracy of that calculation.
In one embodiment, before step S204, the method further includes:
step S400, three-dimensional coordinate information and pixel coordinate information corresponding to characteristic points of each calibration plate in the calibration plate images corresponding to a plurality of different positions shot by each camera are obtained.
Step S402, calculating three-dimensional coordinate information corresponding to each pixel point in the calibration plate image corresponding to a plurality of different positions based on a preset interpolation algorithm and the three-dimensional coordinate information and the pixel coordinate information corresponding to each calibration plate feature point.
Step S404, calculating a space linear equation corresponding to each pixel point based on the three-dimensional coordinate information corresponding to each pixel point in the calibration plate images corresponding to the plurality of different positions, and obtaining the corresponding relation between each pixel point and the space linear equation.
The calibration plate images refer to images of the calibration plate photographed at different positions. The calibration plate refers to a flat plate used for calibrating the target camera. The calibration plate feature points refer to the fixed-interval pattern array on the calibration plate. The three-dimensional coordinate information refers to the three-dimensional coordinates of the calibration plate feature points in the world coordinate system, namely the three-dimensional space coordinate information of each calibration plate feature point in the three-dimensional space coordinate system. The preset interpolation algorithm refers to an interpolation algorithm for calculating the three-dimensional space coordinate information of each pixel point, and may be a bilinear interpolation algorithm.
Specifically, based on a preset interpolation algorithm such as bilinear interpolation, the three-dimensional coordinate information of each pixel point in each calibration plate image can be calculated from the three-dimensional coordinate information and pixel coordinate information of the calibration plate feature points; the space linear equation corresponding to each pixel point can then be calculated from the three-dimensional coordinate information that the same pixel point obtains in the calibration plate images corresponding to the plurality of different positions.
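To make this step concrete, the sketch below first interpolates the three-dimensional board coordinate seen at an arbitrary pixel from the four surrounding calibration plate feature points, and then fits a spatial line through the three-dimensional points that one pixel accumulates across the different board positions. The function names and the SVD-based line fit are illustrative assumptions; the patent only requires a preset interpolation algorithm (for example bilinear interpolation) and a per-pixel space linear equation.

```python
import numpy as np

def bilinear_3d(u, v, u0, v0, u1, v1, q00, q10, q01, q11):
    """3-D coordinate seen at pixel (u, v), interpolated bilinearly from the
    four surrounding calibration-plate feature points:
    q00 at (u0, v0), q10 at (u1, v0), q01 at (u0, v1), q11 at (u1, v1)."""
    s = (u - u0) / (u1 - u0)
    t = (v - v0) / (v1 - v0)
    return ((1 - s) * (1 - t) * np.asarray(q00) + s * (1 - t) * np.asarray(q10)
            + (1 - s) * t * np.asarray(q01) + s * t * np.asarray(q11))

def fit_spatial_line(points):
    """Least-squares line through the 3-D points one pixel collects over the
    different calibration plate positions.  Returns (point_on_line, direction),
    i.e. the (x1, y1, z1) and (m1, n1, p1) of the parametric form in (1)."""
    pts = np.asarray(points, float)
    centroid = pts.mean(axis=0)
    # principal direction of the centred points = best-fit line direction
    _, _, vh = np.linalg.svd(pts - centroid)
    return centroid, vh[0]
```

Repeating fit_spatial_line for every pixel would yield the corresponding relation between pixel points and space linear equations used in step S204.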
In the above embodiment, the three-dimensional coordinate information and pixel coordinate information corresponding to the calibration plate feature points in the calibration plate images shot by each camera at a plurality of different positions are obtained; the three-dimensional coordinate information corresponding to each pixel point in the calibration plate images corresponding to the different positions is calculated based on the preset interpolation algorithm and the three-dimensional coordinate information and pixel coordinate information corresponding to the calibration plate feature points; and the space linear equation corresponding to each pixel point is calculated based on the three-dimensional coordinate information corresponding to each pixel point in the calibration plate images corresponding to the different positions, so as to obtain the corresponding relation between each pixel point and the space linear equation. This realizes the determination of the space linear equation corresponding to each pixel point; because information of the calibration plate at a plurality of different positions participates in the calculation, the accuracy of the space linear equations corresponding to the pixel points is ensured, which to a certain extent improves the measurement accuracy of the three-dimensional space information of the target object.
In one embodiment, step S206 includes:
Step S500, calculating, for the same speckle, the intersection point of the space linear equations corresponding to the target pixel points matched with it in each speckle image, to obtain the three-dimensional coordinate information of each intersection point.
Step S502, based on the corresponding relation among the intersection points, the target pixel points and the speckles, taking the three-dimensional coordinate information corresponding to each intersection point as the three-dimensional coordinate information of the corresponding speckle, and obtaining the three-dimensional coordinate information corresponding to each speckle.
Specifically, the intersection point of the spatial straight lines represented by the space linear equations corresponding to the target pixel points matched in each speckle image is the speckle itself. For each speckle, the three-dimensional coordinate information of the intersection point can be obtained by calculating the intersection of the space linear equations of its matched target pixel points in the speckle images; this three-dimensional coordinate information of the intersection point is the three-dimensional coordinate information of the corresponding speckle, and the speckle corresponding to each intersection point can be determined according to the corresponding relation among the intersection points, the target pixel points and the speckles.
In the above embodiment, the three-dimensional coordinate information of the intersection point between the space linear equations corresponding to the target pixel points matched with the same speckle in each speckle image is calculated, and the three-dimensional coordinate information of each speckle is determined based on the corresponding relation among the intersection points, the target pixel points and the speckles. This realizes the calculation of the three-dimensional coordinate information of each speckle; three-dimensional coordinate information of the target object with higher precision can be calculated without requiring the camera intrinsic parameters and lens distortion coefficients used in the prior art, which helps to improve the accuracy of object space information measurement.
In one embodiment, the measurement of the three-dimensional space information of the target object can also be realized with a single speckle laser. That is, the speckle images of the speckle projected by the speckle laser on the target object are acquired from each camera in the target camera, the target camera comprising at least two cameras; based on a preset stereo matching algorithm, the target pixel points matched with each speckle projected by the speckle laser are respectively determined in each speckle image; the space linear equation corresponding to each target pixel point in each speckle image is determined based on the corresponding relation between the pixel points and the space linear equation; and the space linear equations corresponding to the target pixel points matched with the same speckle in each speckle image are fused to obtain the three-dimensional coordinate information corresponding to each speckle, and the three-dimensional coordinate information corresponding to each speckle forms the three-dimensional coordinate information set corresponding to the target object, thereby realizing the measurement of the three-dimensional space information of the target object by a single speckle laser.
In one embodiment, the description takes the target camera being a binocular camera and the speckle laser group containing four speckle lasers as an example; see fig. 3. In fig. 3, the left camera and the right camera form the binocular camera, that is, the target camera comprises two cameras, and the projection angles corresponding to the speckle lasers in the speckle laser group are different from one another. A projection sequence can be preset, the speckle lasers project speckle onto the target object in turn at their respective projection angles, and the left camera and the right camera shoot the speckle images of the speckle projected onto the target object by the speckle laser that is current in the projection sequence. In fig. 3, the four speckle lasers have different projection angles, p_l denotes a pixel point on the speckle image shot by the left camera, p_r denotes a pixel point on the speckle image shot by the right camera, and Q denotes a speckle projected by a speckle laser onto the target object; p_l and p_r are the target pixel points matched with Q, L_l is the spatial straight line corresponding to the target pixel point p_l, and L_r is the spatial straight line corresponding to the target pixel point p_r. By calculating the intersection point of the space linear equation corresponding to p_l and the space linear equation corresponding to p_r, the three-dimensional coordinate information of the intersection point, which is also the three-dimensional coordinate information of the speckle Q, can be obtained. By cyclically calculating the three-dimensional coordinate information set of the speckle projected by each speckle laser with a different projection angle in the speckle laser group, the target three-dimensional coordinate information set corresponding to the target object is obtained. The three-dimensional space information of the target object is thereby measured efficiently and with higher precision, three-dimensional measurement of an object by a camera adopting the pinhole camera model as in the prior art is no longer needed, the problem that the high coupling between the camera intrinsic parameters and the lens distortion coefficients under the pinhole camera model limits the accuracy of the measured information is effectively avoided, and the accuracy of measuring the spatial information of the object is better improved.
It should be understood that, although the steps in the flowcharts related to the embodiments described above are shown sequentially as indicated by the arrows, these steps are not necessarily performed in the order indicated by the arrows. Unless explicitly stated herein, the steps are not strictly limited to this order of execution and may be executed in other orders. Moreover, at least some of the steps in the flowcharts described in the above embodiments may include a plurality of sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and these sub-steps or stages are not necessarily performed sequentially but may be performed in turn or alternately with at least some of the other steps or the sub-steps or stages of other steps.
Based on the same inventive concept, the embodiment of the application also provides an object space information device for realizing the above-mentioned related object space information method. The implementation of the solution provided by the device is similar to the implementation described in the above method, so the specific limitation in the embodiments of one or more object space information devices provided below may be referred to the limitation of the object space information method hereinabove, and will not be described herein.
In one embodiment, as shown in fig. 4, there is provided an object space information apparatus including: an acquisition module 400, a matching module 402, a determination module 404, a fusion module 406, and a combination module 408, wherein:
the obtaining module 400 determines a current speckle laser from the speckle laser group, obtains speckle images projected by the current speckle laser on a target object by each camera in a target camera, and the target camera comprises at least two cameras.
The matching module 402 is configured to determine, based on a preset stereo matching algorithm, target pixel points in each speckle image that are matched with each speckle projected by the current speckle laser.
The determining module 404 determines a spatial linear equation corresponding to each target pixel point in each speckle image based on the correspondence between the pixel points and the spatial linear equation.
The fusion module 406 fuses the spatial linear equations corresponding to the target pixels matched by the same speckle in each speckle image to obtain three-dimensional coordinate information corresponding to each speckle, and composes the three-dimensional coordinate information corresponding to each speckle into the three-dimensional coordinate information set corresponding to the current speckle laser.
The combining module 408 is configured to enter the step of determining the current speckle laser from the speckle laser group until each speckle laser in the speckle laser group has a corresponding three-dimensional coordinate information set, and combine the three-dimensional coordinate information sets corresponding to each speckle laser to form a target three-dimensional coordinate information set corresponding to the target object.
In one embodiment, the object space information device further includes a calculation module 410, configured to obtain three-dimensional coordinate information and pixel coordinate information corresponding to feature points of each calibration plate in calibration plate images corresponding to a plurality of different positions captured by each camera; calculating three-dimensional coordinate information corresponding to each pixel point in the calibration plate images corresponding to a plurality of different positions based on a preset interpolation algorithm and the three-dimensional coordinate information and the pixel coordinate information corresponding to each calibration plate feature point; and calculating a space linear equation corresponding to each pixel point based on three-dimensional coordinate information corresponding to each pixel point in the calibration plate image corresponding to the plurality of different positions, and obtaining the corresponding relation between each pixel point and the space linear equation.
In one embodiment, the matching module 402 is further configured to obtain speckle characteristics corresponding to each speckle projected by the current speckle laser, and obtain pixel point characteristics corresponding to each pixel point in each speckle image; and based on the preset stereo matching algorithm, the speckle characteristics corresponding to each speckle and the pixel characteristics corresponding to each pixel in each speckle image, taking the pixel with the most similar pixel characteristics and speckle characteristics as a target pixel matched with the corresponding speckle, and obtaining a target pixel matched with each speckle in each speckle image.
In one embodiment, the fusion module 406 is further configured to calculate, for the same speckle, the intersection point between the space linear equations corresponding to the target pixel points matched with it in each speckle image, so as to obtain the three-dimensional coordinate information of each intersection point; and based on the corresponding relation among the intersection points, the target pixel points and the speckles, take the three-dimensional coordinate information corresponding to each intersection point as the three-dimensional coordinate information of the corresponding speckle, so as to obtain the three-dimensional coordinate information corresponding to each speckle.
The respective modules in the above object space information apparatus may be implemented in whole or in part by software, hardware, and a combination thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a server, the internal structure of which may be as shown in fig. 5. The computer device includes a processor, a memory, an Input/Output interface (I/O) and a communication interface. The processor, the memory and the input/output interface are connected through a system bus, and the communication interface is connected to the system bus through the input/output interface. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The database of the computer device is used to store data related to the execution process. The input/output interface of the computer device is used to exchange information between the processor and the external device. The communication interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement an object space information method.
In one embodiment, a computer device is provided, which may be a terminal, and the internal structure of which may be as shown in fig. 6. The computer device includes a processor, a memory, an input/output interface, a communication interface, a display unit, and an input means. The processor, the memory and the input/output interface are connected through a system bus, and the communication interface, the display unit and the input device are connected to the system bus through the input/output interface. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The input/output interface of the computer device is used to exchange information between the processor and the external device. The communication interface of the computer device is used for carrying out wired or wireless communication with an external terminal, and the wireless mode can be realized through WIFI, a mobile cellular network, NFC (near field communication) or other technologies. The computer program is executed by a processor to implement an object space information method. The display unit of the computer device is used for forming a visual picture, and can be a display screen, a projection device or a virtual reality imaging device. The display screen can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, can also be a key, a track ball or a touch pad arranged on the shell of the computer equipment, and can also be an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the structures shown in fig. 5 and 6 are block diagrams of only portions of structures that are relevant to the present application and are not intended to limit the computer device on which the present application may be implemented, and that a particular computer device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
In an embodiment, there is also provided a computer device comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of the method embodiments described above when the computer program is executed.
In one embodiment, a computer-readable storage medium is provided, storing a computer program which, when executed by a processor, implements the steps of the method embodiments described above.
In one embodiment, a computer program product or computer program is provided, which includes computer instructions stored in a computer-readable storage medium. The processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, so that the computer device performs the steps of the above-described method embodiments.
It should be noted that the user information (including, but not limited to, user equipment information, user personal information, etc.) and the data (including, but not limited to, data for analysis, stored data, presented data, etc.) referred to in the present application are information and data authorized by the user or fully authorized by all parties, and the collection, use, and processing of the related data must comply with the relevant laws, regulations, and standards of the relevant countries and regions.
Those skilled in the art will appreciate that all or part of the processes of the methods described above may be implemented by a computer program stored on a non-transitory computer-readable storage medium; when executed, the computer program may include the processes of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, resistive random access memory (ReRAM), magnetoresistive random access memory (MRAM), ferroelectric random access memory (FRAM), phase change memory (PCM), graphene memory, and the like. The volatile memory may include random access memory (RAM), external cache memory, and the like. By way of illustration and not limitation, the RAM may take various forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM). The databases referred to in the embodiments provided herein may include at least one of relational databases and non-relational databases. The non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be, but are not limited to, general-purpose processors, central processing units, graphics processing units, digital signal processors, programmable logic devices, and quantum-computing-based data processing logic devices.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features of the above embodiments are described; nevertheless, as long as a combination of these technical features contains no contradiction, it should be considered within the scope of this specification.
The above embodiments represent only a few implementations of the present application; although they are described specifically and in detail, they are not to be construed as limiting the scope of the present application. It should be noted that those of ordinary skill in the art may make various modifications and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Accordingly, the protection scope of the present application shall be subject to the appended claims.
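As an informal aid to reading the method embodiments described above (and the claims that follow), the sketch below, written in Python with NumPy, shows one possible organization of the per-laser measurement loop. It is not part of the disclosure, and every function, parameter, and variable name in it is a hypothetical placeholder introduced only for illustration.

# Illustrative sketch only, not part of the disclosure: one possible organization of the
# per-laser measurement loop. All helper routines are assumed to exist and are hypothetical.
import numpy as np

def measure_object(speckle_lasers, cameras, pixel_to_line, match_speckles, intersect_lines):
    """Collect a three-dimensional coordinate information set per speckle laser and merge them.

    speckle_lasers : iterable of speckle laser handles (projecting at different angles)
    cameras        : at least two calibrated cameras
    pixel_to_line  : mapping (camera_id, pixel) -> space line (point, direction)
    match_speckles : stereo-matching routine returning {speckle_id: {camera_id: pixel}}
    intersect_lines: routine fusing several space lines into one 3D point
    """
    target_set = []
    for laser in speckle_lasers:                        # the "current speckle laser"
        laser.project()
        images = [cam.capture() for cam in cameras]     # one speckle image per camera
        matches = match_speckles(images)                # target pixel points per speckle
        laser_points = []
        for speckle_id, pixels in matches.items():
            lines = [pixel_to_line[(cam_id, px)] for cam_id, px in pixels.items()]
            laser_points.append(intersect_lines(lines)) # fuse lines into one 3D coordinate
        target_set.append(np.asarray(laser_points))     # set for this speckle laser
    return np.vstack(target_set)                        # target 3D coordinate information set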

Claims (10)

1. A method of measuring spatial information of an object, the method comprising:
determining a current speckle laser from a speckle laser group, and acquiring speckle images of speckles projected onto a target object by the current speckle laser, the speckle images being captured by each camera of a target camera, wherein the target camera comprises at least two cameras;
based on a preset stereo matching algorithm, respectively determining target pixel points matched with each speckle projected by the current speckle laser in each speckle image;
determining a space linear equation corresponding to each target pixel point in each speckle image based on the corresponding relation between the pixel points and the space linear equation;
fusing the space linear equations corresponding to the matched target pixel points of the same speckle in each speckle image to obtain three-dimensional coordinate information corresponding to each speckle, and forming the three-dimensional coordinate information corresponding to each speckle into a three-dimensional coordinate information set corresponding to the current speckle laser;
and returning to the step of determining the current speckle laser from the speckle laser group until each speckle laser in the speckle laser group has a corresponding three-dimensional coordinate information set, and combining the three-dimensional coordinate information sets corresponding to the speckle lasers into a target three-dimensional coordinate information set corresponding to the target object.
2. The method of claim 1, wherein the angles at which the speckle lasers in the speckle laser group project onto the target object are different from one another.
3. The method of claim 1, wherein the fusing of the space linear equations corresponding to the target pixel points matched with the same speckle in each speckle image to obtain the three-dimensional coordinate information corresponding to each speckle comprises:
calculating, for the same speckle, the intersection point of the space linear equations corresponding to the target pixel points matched in each speckle image, to obtain three-dimensional coordinate information of each intersection point;
and based on the corresponding relation among the intersection points, the target pixel points, and the speckles, taking the three-dimensional coordinate information corresponding to each intersection point as the three-dimensional coordinate information of the corresponding speckle, to obtain the three-dimensional coordinate information corresponding to each speckle.
4. The method of claim 1, wherein the determining, based on the preset stereo matching algorithm, the target pixel points matched with each speckle projected by the current speckle laser in each speckle image comprises:
acquiring speckle characteristics corresponding to each speckle projected by the current speckle laser, and acquiring pixel point characteristics corresponding to each pixel point in each speckle image;
and based on the preset stereo matching algorithm, the speckle characteristics corresponding to each speckle, and the pixel point characteristics corresponding to each pixel point in each speckle image, taking the pixel point whose pixel point characteristics are most similar to the speckle characteristics as the target pixel point matched with the corresponding speckle, to obtain the target pixel point matched with each speckle in each speckle image.
5. The method according to claim 1, wherein before the determining of the space linear equation corresponding to each target pixel point in each speckle image based on the corresponding relation between the pixel points and the space linear equation, the method further comprises:
acquiring three-dimensional coordinate information and pixel coordinate information corresponding to each calibration plate feature point in calibration plate images captured by each camera at a plurality of different positions;
calculating three-dimensional coordinate information corresponding to each pixel point in the calibration plate images corresponding to a plurality of different positions based on a preset interpolation algorithm and the three-dimensional coordinate information and the pixel coordinate information corresponding to each calibration plate feature point;
and calculating a space linear equation corresponding to each pixel point based on the three-dimensional coordinate information corresponding to each pixel point in the calibration plate images corresponding to the plurality of different positions, to obtain the corresponding relation between each pixel point and the space linear equation.
6. An object space information apparatus, characterized in that the apparatus comprises:
the acquisition module is used for determining a current speckle laser from a speckle laser group and acquiring speckle images of speckles projected onto a target object by the current speckle laser, captured by each camera of a target camera, wherein the target camera comprises at least two cameras;
the matching module is used for respectively determining target pixel points matched with each speckle projected by the current speckle laser in each speckle image based on a preset stereo matching algorithm;
the determining module is used for determining a space linear equation corresponding to each target pixel point in each speckle image based on the corresponding relation between the pixel points and the space linear equation;
the fusion module is used for fusing the space linear equation corresponding to the target pixel point matched with the same speckle in each speckle image to obtain three-dimensional coordinate information corresponding to each speckle, and the three-dimensional coordinate information corresponding to each speckle is formed into a three-dimensional coordinate information set corresponding to the current speckle laser;
and the combination module is used for returning to the step of determining the current speckle laser from the speckle laser group until each speckle laser in the speckle laser group has a corresponding three-dimensional coordinate information set, and combining the three-dimensional coordinate information sets corresponding to the speckle lasers into a target three-dimensional coordinate information set corresponding to the target object.
7. The apparatus of claim 6, wherein the matching module is further configured to acquire speckle characteristics corresponding to each speckle projected by the current speckle laser, and acquire pixel point characteristics corresponding to each pixel point in each speckle image; and, based on the preset stereo matching algorithm, the speckle characteristics corresponding to each speckle, and the pixel point characteristics corresponding to each pixel point in each speckle image, take the pixel point whose pixel point characteristics are most similar to the speckle characteristics as the target pixel point matched with the corresponding speckle, to obtain the target pixel point matched with each speckle in each speckle image.
8. The apparatus of claim 6, wherein the fusion module is further configured to calculate, for the same speckle, the intersection point of the space linear equations corresponding to the target pixel points matched in each speckle image, to obtain three-dimensional coordinate information of each intersection point; and, based on the corresponding relation among the intersection points, the target pixel points, and the speckles, take the three-dimensional coordinate information corresponding to each intersection point as the three-dimensional coordinate information of the corresponding speckle, to obtain the three-dimensional coordinate information corresponding to each speckle.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any one of claims 1 to 5 when the computer program is executed.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 5.
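As an informal illustration of the feature-similarity matching recited in claims 4 and 7 above, the sketch below assumes that each speckle and each candidate pixel point has already been reduced to a fixed-length feature vector; the feature extraction itself is not shown, Euclidean distance stands in for whatever similarity measure is actually used, and all names are hypothetical.

# Illustrative sketch only, not part of the disclosure: nearest-feature matching in the
# spirit of claims 4 and 7. Euclidean distance is an assumed similarity measure.
import numpy as np

def match_speckles_to_pixels(speckle_features, pixel_features):
    """Return, for each speckle, the index of the pixel point with the most similar features.

    speckle_features : (S, F) array, one feature vector per projected speckle
    pixel_features   : (P, F) array, one feature vector per candidate pixel point
    """
    # (S, P) matrix of feature distances between every speckle and every pixel point.
    diff = speckle_features[:, None, :] - pixel_features[None, :, :]
    dist = np.linalg.norm(diff, axis=2)
    return np.argmin(dist, axis=1)  # most similar pixel point per speckle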
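Claim 5 above associates each pixel point with a space linear equation derived from calibration plate images at several positions. A minimal sketch of that association, assuming the interpolated 3D point for one pixel at each plate position is already available, follows; the principal-direction (SVD) line fit is one common choice introduced here for illustration and is not the patent's stated formula.

# Illustrative sketch only, not part of the disclosure: fitting the space line of one
# pixel point from the 3D points it maps to at each calibration plate position
# (those points would come from the interpolation step of claim 5).
import numpy as np

def fit_pixel_line(points_3d):
    """Fit a 3D line (point p0, unit direction d) through the pixel's calibration points.

    points_3d : (N, 3) array, one 3D point per calibration plate position, N >= 2.
    """
    pts = np.asarray(points_3d, dtype=float)
    p0 = pts.mean(axis=0)                 # a point on the line (centroid of the samples)
    _, _, vt = np.linalg.svd(pts - p0)    # principal direction of the centred points
    d = vt[0] / np.linalg.norm(vt[0])
    return p0, d                          # the line: x = p0 + t * d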
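In the intersection step of claims 3 and 8 above, noise in calibration and matching means the matched pixel points' space lines rarely meet exactly; one common reading, assumed here purely for illustration, takes the least-squares point closest to all of the lines as the speckle's three-dimensional coordinate information.

# Illustrative sketch only, not part of the disclosure: least-squares "intersection" of
# the space lines matched to one speckle (claims 3 and 8). Each line is (p0, d) with d a
# direction vector; at least two non-parallel lines are required for A to be invertible.
import numpy as np

def intersect_lines(lines):
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p0, d in lines:
        d = np.asarray(d, dtype=float)
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)    # projector onto the plane orthogonal to the line
        A += P
        b += P @ np.asarray(p0, dtype=float)
    return np.linalg.solve(A, b)          # 3D point minimizing summed squared distance to all lines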
CN202311305478.0A 2023-10-10 2023-10-10 Object space information measuring method, device, computer equipment and storage medium Pending CN117490598A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311305478.0A CN117490598A (en) 2023-10-10 2023-10-10 Object space information measuring method, device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311305478.0A CN117490598A (en) 2023-10-10 2023-10-10 Object space information measuring method, device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117490598A true CN117490598A (en) 2024-02-02

Family

ID=89673410

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311305478.0A Pending CN117490598A (en) 2023-10-10 2023-10-10 Object space information measuring method, device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117490598A (en)

Similar Documents

Publication Publication Date Title
CN107223269B (en) Three-dimensional scene positioning method and device
US10726580B2 (en) Method and device for calibration
CN111127422A (en) Image annotation method, device, system and host
CN108769462B (en) Free visual angle scene roaming method and device
CN110930463B (en) Method and device for calibrating internal reference of monitoring camera and electronic equipment
WO2020237492A1 (en) Three-dimensional reconstruction method, device, apparatus, and storage medium
US11380016B2 (en) Fisheye camera calibration system, method and electronic device
TWI738196B (en) Method and electronic device for image depth estimation and storage medium thereof
CN113643414B (en) Three-dimensional image generation method and device, electronic equipment and storage medium
CN112862897B (en) Phase-shift encoding circle-based rapid calibration method for camera in out-of-focus state
CN113256742B (en) Interface display method and device, electronic equipment and computer readable medium
Wilm et al. Accurate and simple calibration of DLP projector systems
CN115035235A (en) Three-dimensional reconstruction method and device
CN116109765A (en) Three-dimensional rendering method and device for labeling objects, computer equipment and storage medium
CN111383264B (en) Positioning method, positioning device, terminal and computer storage medium
CN117579753A (en) Three-dimensional scanning method, three-dimensional scanning device, computer equipment and storage medium
CN116758206A (en) Vector data fusion rendering method and device, computer equipment and storage medium
CN112241984A (en) Binocular vision sensor calibration method and device, computer equipment and storage medium
WO2020146965A1 (en) Image refocusing control method and system
CN117490598A (en) Object space information measuring method, device, computer equipment and storage medium
CN113487685A (en) Calibration method, device and equipment of line laser scanning camera and storage medium
CN111292414A (en) Method and device for generating three-dimensional image of object, storage medium and electronic equipment
CN116295031B (en) Sag measurement method, sag measurement device, computer equipment and storage medium
De Boi et al. How to turn your camera into a perfect pinhole model
CN115861520B (en) Highlight detection method, highlight detection device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination