CN113763466A - Loop detection method and device, electronic equipment and storage medium - Google Patents

Loop detection method and device, electronic equipment and storage medium

Info

Publication number
CN113763466A
Authority
CN
China
Prior art keywords
image frame
historical
target
loop
dimensional code
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011079676.6A
Other languages
Chinese (zh)
Other versions
CN113763466B (en)
Inventor
张鹏 (Zhang Peng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingdong Qianshi Technology Co Ltd
Original Assignee
Beijing Jingdong Qianshi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingdong Qianshi Technology Co., Ltd.
Priority to CN202011079676.6A
Publication of CN113763466A
Application granted
Publication of CN113763466B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/70 - Determining position or orientation of objects or cameras
    • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/74 - Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G06K - GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00 - Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10 - Methods or arrangements for sensing record carriers by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/14 - Methods or arrangements for sensing record carriers using light without selection of wavelength, e.g. sensing reflected white light
    • G06K 7/1404 - Methods for optical code recognition
    • G06K 7/1408 - Methods for optical code recognition, the method being specifically adapted for the type of code
    • G06K 7/1417 - 2D bar codes
    • G06K 7/1439 - Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K 7/1447 - Methods for optical code recognition including retrieval of the optical code by extracting optical codes from image or text carrying said optical code
    • G06K 7/1452 - Methods for optical code recognition including retrieval of the optical code by detecting bar code edges
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10016 - Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Electromagnetism (AREA)
  • General Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Artificial Intelligence (AREA)
  • Image Analysis (AREA)

Abstract

The embodiments of the invention disclose a loop detection method and apparatus, an electronic device, and a storage medium. The method includes: acquiring a current image frame and historical image frames captured by a camera, where a plurality of different two-dimensional codes are placed in advance in the camera's shooting environment; detecting whether a two-dimensional code is present in the captured current image frame; and, if a target two-dimensional code is present in the current image frame, determining from the historical image frames a historical loop image frame in which the target two-dimensional code also appears. The technical solution of the embodiments addresses the inaccurate detection caused by poor ambient lighting in existing approaches, improving the robustness and accuracy of loop detection and thereby effectively eliminating accumulated error.

Description

Loop detection method and device, electronic equipment and storage medium
Technical Field
The present invention relates to image processing technologies, and in particular, to a loop detection method and apparatus, an electronic device, and a storage medium.
Background
In computer vision research, the camera pose is typically estimated from the sequence of image frames captured by the camera. SLAM (Simultaneous Localization and Mapping) is a common technique that tracks the camera pose to reconstruct the camera's 3D trajectory while building a map of the shooting environment. SLAM has a wide range of applications, such as robot navigation, autonomous driving, and augmented reality.
In SLAM-based localization and mapping, camera pose estimation is usually a recursive process: the pose of the current image frame is solved from the pose of the previous image frame, so errors propagate frame by frame and the accumulated error grows. As a result, the estimated pose can deviate greatly from the actual pose, and the constructed environment map can exhibit misalignment and ghosting. To address this, loop detection and global optimization can be used to eliminate the accumulated error in localization and mapping.
At present, loop detection is generally performed with a bag-of-words model: feature information is extracted from each image frame, and a historical loop image frame is found by comparing this feature information across historical image frames, so that global optimization can subsequently be performed based on the historical loop image frame.
However, in the process of implementing the present invention, the inventor found that the prior art has at least the following problem: in bag-of-words-based loop detection, when the lighting conditions in the environment are poor, the extracted texture feature information is sparse, which reduces the accuracy of loop detection and prevents the accumulated error from being effectively eliminated.
Disclosure of Invention
The embodiments of the invention provide a loop detection method and apparatus, an electronic device, and a storage medium to solve the inaccurate detection caused by poor ambient lighting in existing approaches, thereby improving the robustness and accuracy of loop detection and effectively eliminating accumulated error.
In a first aspect, an embodiment of the present invention provides a loop detection method, including:
acquiring a current image frame and historical image frames captured by a camera, where a plurality of different two-dimensional codes are placed in advance in the camera's shooting environment;
detecting whether a two-dimensional code is present in the captured current image frame;
and if a target two-dimensional code is present in the current image frame, determining from the historical image frames a historical loop image frame in which the target two-dimensional code is present.
In a second aspect, an embodiment of the present invention further provides a loop detection apparatus, including:
an image frame acquisition module, configured to acquire a current image frame and historical image frames captured by a camera, where a plurality of different two-dimensional codes are placed in advance in the camera's shooting environment;
a current image frame detection module, configured to detect whether a two-dimensional code is present in the captured current image frame;
and a historical loop image frame determination module, configured to determine, if a target two-dimensional code is present in the current image frame, a historical loop image frame in which the target two-dimensional code is present from the historical image frames.
In a third aspect, an embodiment of the present invention further provides an electronic device, where the electronic device includes:
one or more processors;
a memory for storing one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors implement the loop detection method provided by any embodiment of the invention.
In a fourth aspect, the embodiments of the present invention further provide a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, it implements the loop detection method provided by any embodiment of the present invention.
The embodiment of the invention has the following advantages or beneficial effects:
A plurality of different two-dimensional codes are placed in the camera's shooting environment in advance, so that two-dimensional codes can appear in the image frames captured by the camera; the different codes distinguish image frames captured at different camera poses, and loop detection can therefore be performed using the two-dimensional codes. If a target two-dimensional code is detected in the captured current image frame, the historical loop image frame containing that target two-dimensional code is determined from the captured historical image frames, so that a historical image frame sharing the same target two-dimensional code with the current image frame serves as the historical loop image frame for loop closure. Because a two-dimensional code is black and white and has high contrast, it can still be detected accurately under poor ambient lighting; loop detection is therefore less affected by illumination changes, its robustness and accuracy are improved, and the accumulated error is effectively eliminated.
Drawings
Fig. 1 is a flowchart of a loop detection method according to an embodiment of the present invention;
fig. 2 is a flowchart of a loop detection method according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of a loop detection apparatus according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of a loop detection method according to an embodiment of the present invention. The embodiment is applicable to performing loop detection on a current image frame, and in particular to scenarios in which loop detection is performed during visual positioning so that global pose optimization can subsequently be carried out based on the detected historical loop image frame. The method may be performed by a loop detection apparatus, which may be implemented in software and/or hardware and integrated in an electronic device. As shown in fig. 1, the method specifically includes the following steps:
s110, acquiring a current image frame and a historical image frame shot by a camera, wherein a plurality of different two-dimensional codes are placed in the shooting environment of the camera in advance.
The current image frame refers to the image frame captured by the camera at the current moment, and a historical image frame refers to an image frame captured by the camera before the current moment. The camera's shooting environment is the environment for which a map is to be constructed. A two-dimensional code is a black-and-white pattern in which specific geometric figures are distributed on a plane according to a certain rule to encode digital symbol information. Because the two-dimensional code is black and white, its contrast is high and it can be detected accurately even under poor ambient lighting, so it copes robustly with illumination changes and improves the robustness and accuracy of loop detection. Each two-dimensional code carries its own identifier so that different codes can be distinguished. Each code may be placed at a different position that the camera can capture, so that two-dimensional codes can appear in the image frames shot by the camera. The spacing between adjacent two-dimensional codes can be set according to how fast the camera pose changes, so as to avoid the same two-dimensional code being captured in two adjacent image frames. For example, a plurality of different two-dimensional codes may be placed on the floor of the shooting environment at intervals of several tens of meters.
It should be noted that because the camera is moving, the camera pose at each shooting moment, i.e., the pose corresponding to each image frame, changes in real time. A historical loop image frame whose camera pose is close to that of the current image frame may therefore exist among the historical image frames. Loop detection is performed over the historical image frames to establish a pose constraint between the current image frame and the historical loop image frame, so that the pose of the current image frame can be computed from the pose of the historical loop image frame; compared with computing it only from the pose of the previous image frame, this effectively eliminates the accumulated error.
Specifically, the current image frame and the historical image frames captured by the camera are acquired in a shooting environment in which a plurality of different two-dimensional codes have been placed. Any given image frame captured by the camera may or may not contain one of the placed two-dimensional codes. In this embodiment, loop detection may be performed for every image frame captured by the camera, i.e., the operations of steps S110 to S130 are performed on the current image frame captured at the current moment, so that loop detection runs in real time and the camera pose can be optimized in real time.
And S120, detecting whether the two-dimensional code exists in the shot current image frame.
Specifically, whether a two-dimensional code is present in the current image frame can be detected with an AprilTag-style detection algorithm. For example, edges in the current image frame are detected from image gradients, non-straight edges are discarded, adjacent straight edges are linked to find closed quadrilateral regions, and each quadrilateral region is encoded and decoded to determine the corresponding two-dimensional code identifier. If an identifier can be determined for a quadrilateral region, a two-dimensional code is present in the current image frame. If no quadrilateral region exists, or no identifier can be decoded from it, no two-dimensional code is present in the current image frame, i.e., no code was captured when the current image frame was shot.
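As an illustrative sketch only (not the patent's own implementation), the detection step can be prototyped with OpenCV's ArUco module, which also ships AprilTag dictionaries; the dictionary choice, the BGR frame input, and the availability of OpenCV 4.7 or later are assumptions here.

```python
import cv2

# Sketch: detect two-dimensional codes (AprilTag 36h11 family) in the current image frame.
# Assumes OpenCV >= 4.7; older versions expose cv2.aruco.detectMarkers() as a free function
# and cv2.aruco.DetectorParameters_create() instead.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_36h11)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

def detect_codes(frame_bgr):
    """Return a list of (tag_id, 4x2 pixel corner array) found in the frame; may be empty."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = detector.detectMarkers(gray)
    if ids is None:
        return []  # no two-dimensional code captured in this frame
    return [(int(i), c.reshape(4, 2)) for i, c in zip(ids.flatten(), corners)]
```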
It should be noted that if no two-dimensional code is present in the current image frame, there is no historical loop image frame whose pose is similar to that of the current image frame, and loop detection for the current image frame is unnecessary; in this case, the process may return to step S110 to perform loop detection on the next image frame.
And S130, if a target two-dimensional code is present in the current image frame, determining from the historical image frames a historical loop image frame in which the target two-dimensional code is present.
The target two-dimensional code is a two-dimensional code that is present in the current image frame and can be used to characterize the camera pose corresponding to the current image frame. The historical loop image frame is a historical image frame whose camera pose is similar to the camera pose corresponding to the current image frame, so that a pose constraint between the current image frame and the historical loop image frame can be established. If two image frames contain the same two-dimensional code, their camera poses are similar, i.e., the camera pose change between them is small, so the pose of the later frame can be computed from the pose of the earlier frame. If the two image frames contain different two-dimensional codes, their camera poses differ greatly, i.e., the camera pose change is large; in that case the pose of the later frame cannot reliably be computed from the pose of the earlier frame, and no pose constraint between the two frames can be established.
Specifically, if a two-dimensional code, i.e., the target two-dimensional code, is detected in the current image frame, whether the target two-dimensional code is present in each historical image frame can be checked, and a historical image frame containing the target two-dimensional code is taken as the historical loop image frame. In this way loop detection is performed more accurately by means of the two-dimensional code, and a historical loop image frame whose camera pose is similar to that of the current image frame is obtained, so that a pose constraint between the current image frame and the historical loop image frame can subsequently be established for global pose optimization, effectively eliminating the accumulated error.
In the technical solution of this embodiment, two-dimensional codes are placed in the camera's shooting environment in advance, so that two-dimensional codes can appear in the image frames captured by the camera; the different codes distinguish image frames captured at different camera poses, and loop detection can therefore be performed using the two-dimensional codes. If a target two-dimensional code is detected in the captured current image frame, the historical loop image frame containing that target two-dimensional code is determined from the captured historical image frames, so that a historical image frame sharing the same target two-dimensional code with the current image frame serves as the historical loop image frame for loop closure. Because a two-dimensional code is black and white and has high contrast, it can still be detected accurately under poor ambient lighting; loop detection is therefore less affected by illumination changes, its robustness and accuracy are improved, and the accumulated error is effectively eliminated.
On the basis of the above technical solution, after S120 the method may further include: if at least two two-dimensional codes are detected in the current image frame, acquiring the image area occupied by each two-dimensional code in the current image frame; and determining the two-dimensional code with the largest image area as the target two-dimensional code.
Specifically, the current image frame captured by the camera may contain no two-dimensional code, one two-dimensional code, or several two-dimensional codes. When at least two two-dimensional codes are detected in the current image frame, the target two-dimensional code to be used as the loop detection reference must be chosen among them. In that case, the image area occupied by each two-dimensional code in the current image frame is obtained and the areas are compared, and the two-dimensional code with the largest image area is taken as the target two-dimensional code. Since the largest code in the image is generally the one closest to the camera, this choice further improves the accuracy of loop detection and the subsequent pose optimization, as illustrated by the sketch below.
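A small sketch of the largest-area selection rule; it assumes detections in the (tag_id, corners) format produced by the hypothetical detect_codes() helper above and measures each code with the shoelace formula.

```python
import numpy as np

def quad_area(corners):
    """Shoelace area (in pixels squared) of a 4x2 array of quadrilateral corners."""
    x, y = corners[:, 0], corners[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

def pick_target_code(detections):
    """From [(tag_id, corners), ...] return the detection with the largest image area,
    or None when the current image frame contains no two-dimensional code."""
    if not detections:
        return None
    return max(detections, key=lambda d: quad_area(d[1]))
```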
On the basis of the above technical solution, "determining the historical loop image frame in which the target two-dimensional code is present from the historical image frames" in step S130 may include: determining candidate historical image frames containing the target two-dimensional code based on the target two-dimensional code and the correspondence between two-dimensional codes and historical image frames; if there is only one candidate historical image frame, taking that candidate as the historical loop image frame; and if there are at least two candidate historical image frames, determining the historical loop image frame among them based on the image area of the target two-dimensional code in each candidate.
The correspondence between two-dimensional codes and historical image frames characterizes which two-dimensional code is present in each historical image frame, and may be built from the two-dimensional code identifiers and the historical image frame identifiers.
Specifically, during loop detection for each image frame, the target two-dimensional code detected in that frame can be stored in real time, thereby building the correspondence between two-dimensional codes and historical image frames. Based on this correspondence and the target two-dimensional code present in the current image frame, every historical image frame associated with the target two-dimensional code, i.e., every candidate historical image frame containing the target two-dimensional code, can be found among all historical image frames. If there is only one candidate, it is used directly as the historical loop image frame. If there are at least two candidates, the image area of the target two-dimensional code in each candidate is obtained and compared, and the candidate with the largest image area is taken as the historical loop image frame. Because the candidate with the largest area was captured when the camera was closest to the target two-dimensional code, this yields the historical loop image frame with the best loop constraint effect, further improving pose optimization and effectively eliminating the accumulated error.
For example, determining the candidate historical image frames containing the target two-dimensional code based on the target two-dimensional code and the correspondence between two-dimensional codes and historical image frames may include: determining the image frame loop range corresponding to the current image frame, and selecting from the historical image frames those that fall within this loop range; and then determining, among these selected historical image frames, the candidate historical image frames containing the target two-dimensional code based on the correspondence and the target two-dimensional code.
The image frame loop range corresponding to the current image frame is the range of historical image frames that may form a loop constraint with the current image frame, i.e., the loop detection range over the historical image frames. The range may be set by a time constraint or a frame-count constraint: for example, historical image frames captured more than 20 seconds before the current image frame, or more than 20 frames earlier than the current image frame, may be used as the range.
Specifically, the image frame loop range corresponding to the current image frame can be determined from the shooting time or frame index of the current image frame, and every historical image frame within this range is taken as a historical image frame to be considered, from which the candidate historical image frames containing the target two-dimensional code are then determined. This avoids performing loop detection on historical image frames captured immediately before the current image frame, so the detected historical loop image frame is an earlier frame, which improves the loop constraint effect and eliminates the accumulated error more effectively. A bookkeeping sketch follows.
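A bookkeeping sketch of the code-to-historical-frame correspondence and the candidate selection described above; the 20-frame loop range and the per-detection record of frame index and image area are illustrative assumptions, not values from the patent.

```python
from collections import defaultdict

MIN_LOOP_GAP = 20  # assumed frame-count constraint defining the image frame loop range

class LoopIndex:
    """Correspondence between two-dimensional code identifiers and historical image frames."""

    def __init__(self):
        self.frames_by_tag = defaultdict(list)  # tag_id -> [(frame_idx, image_area), ...]

    def add(self, frame_idx, tag_id, image_area):
        """Record that the code tag_id was seen in frame frame_idx with the given area."""
        self.frames_by_tag[tag_id].append((frame_idx, image_area))

    def find_loop_frame(self, current_idx, tag_id):
        """Among candidate historical frames containing tag_id and lying inside the loop
        range, return the index of the one with the largest code area, or None."""
        candidates = [(idx, area) for idx, area in self.frames_by_tag[tag_id]
                      if current_idx - idx > MIN_LOOP_GAP]
        if not candidates:
            return None
        return max(candidates, key=lambda c: c[1])[0]
```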
Example two
Fig. 2 is a flowchart of a loop detection method according to a second embodiment of the present invention. On the basis of the foregoing embodiments, this embodiment describes in detail the process of performing global pose optimization using the historical loop image frame once the historical loop image frame corresponding to the current image frame has been determined. Explanations of terms that are the same as or correspond to those of the above embodiments are not repeated here.
Referring to fig. 2, the loop detection method provided in this embodiment specifically includes the following steps:
s210, obtaining a current image frame and a historical image frame shot by a camera, wherein a plurality of different two-dimensional codes are placed in the shooting environment of the camera in advance.
And S220, detecting whether the two-dimensional code exists in the shot current image frame.
And S230, if a target two-dimensional code is present in the current image frame, determining from the historical image frames a historical loop image frame in which the target two-dimensional code is present.
And S240, determining a loop relative pose between the current image frame and the historical loop image frame.
The loop relative pose refers to the relative pose between the current image frame and the historical loop image frame. In this embodiment it can be determined from the positional relationship of the target two-dimensional code to the current image frame and to the historical loop image frame, respectively.
Exemplarily, S240 may include: determining, based on the actual corner spacing distance of the target two-dimensional code, a first relative pose between the target two-dimensional code and the current image frame and a second relative pose between the target two-dimensional code and the target historical image frame (i.e., the historical loop image frame); and determining the loop relative pose between the current image frame and the historical loop image frame from the first relative pose and the second relative pose.
The actual corner spacing distance is the physical distance between two adjacent corner points of the target two-dimensional code. In this embodiment it is known and fixed, and therefore provides the scale information needed for pose estimation.
Specifically, based on the actual corner spacing distance of the target two-dimensional code, an AprilTag-style algorithm can determine the first relative pose T_ct between the target two-dimensional code and the current image frame (the transform from the code coordinate system to the current camera coordinate system) and the second relative pose T_th between the target two-dimensional code and the target historical image frame (the transform from the historical camera coordinate system to the code coordinate system). The loop relative pose T_ch between the current image frame and the historical loop image frame is then obtained by multiplying the first relative pose and the second relative pose, i.e.

T_ch = T_ct · T_th

thereby establishing the loop constraint relationship.
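A minimal sketch of the composition above, assuming the two relative poses are available as 4x4 homogeneous transforms; the names T_c_tag and T_h_tag (the code pose as observed from the current and the historical camera frame) are illustrative and not the patent's notation.

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation matrix and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.asarray(t).ravel()
    return T

def loop_relative_pose(T_c_tag, T_h_tag):
    """T_c_tag: code pose in the current camera frame (T_ct above).
    T_h_tag: code pose in the historical camera frame; its inverse plays the role of T_th.
    Returns T_ch, the transform from the historical camera frame to the current one."""
    return T_c_tag @ np.linalg.inv(T_h_tag)
```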
Exemplarily, determining the first relative pose between the target two-dimensional code and the current image frame based on the actual corner spacing distance includes: extracting the four corner points of the target two-dimensional code in the current image frame; constructing a homography matrix from the four corner points and the actual corner spacing distance of the target two-dimensional code; and determining the first relative pose between the target two-dimensional code and the current image frame by solving the homography matrix.
Specifically, because the target two-dimensional code lies on a plane, a homography matrix can be constructed from its four corner points and the actual corner spacing distance, and the rotation matrix and translation vector, i.e., the first relative pose between the target two-dimensional code and the current image frame, are recovered by solving this homography. The second relative pose between the target two-dimensional code and the target historical image frame can be determined in the same way. Since the pose estimation is based on the actual corner spacing distance, true scale information is introduced, which further improves the accuracy and precision of the pose estimation.
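For illustration, the planar pose of a square code with known corner spacing can also be recovered with OpenCV's solvePnP and its IPPE_SQUARE planar solver, as an alternative to explicitly decomposing the homography described above; the intrinsic matrix K, the distortion coefficients, and the corner ordering matching the detector's output are assumed inputs.

```python
import cv2
import numpy as np

def tag_pose_in_camera(corners_px, side_len_m, K, dist=None):
    """Estimate the pose of a square two-dimensional code in the camera frame.
    corners_px: 4x2 pixel corners; side_len_m: actual corner spacing distance (metres).
    Returns a 4x4 transform from code coordinates to camera coordinates, or None."""
    half = side_len_m / 2.0
    # Code-centred 3D corners on the z = 0 plane, in the order expected by IPPE_SQUARE.
    obj_pts = np.array([[-half,  half, 0.0],
                        [ half,  half, 0.0],
                        [ half, -half, 0.0],
                        [-half, -half, 0.0]])
    img_pts = np.asarray(corners_px, dtype=np.float64).reshape(4, 1, 2)
    dist = np.zeros(5) if dist is None else dist
    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, dist,
                                  flags=cv2.SOLVEPNP_IPPE_SQUARE)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, tvec.ravel()
    return T
```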
And S250, acquiring the initial relative pose corresponding to each pair of adjacent image frames between the current image frame and the historical loop image frame.
A SLAM system generally consists of a front-end visual odometry (VO) module and a back-end optimization module. The visual odometry estimates the camera motion between adjacent image frames and maintains a local map; the back-end optimization refines the estimated motion together with the loop detection information to obtain a globally consistent trajectory and map. The initial relative pose in this embodiment is the relative pose between two adjacent image frames obtained with the visual odometry. The initial relative poses corresponding to adjacent image frames between the current image frame and the historical loop image frame include: the initial relative pose between the previous image frame and the current image frame, the initial relative poses between adjacent intermediate image frames, and the initial relative pose between the historical loop image frame and the image frame following it. An intermediate image frame is an image frame located between the current image frame and the historical loop image frame.
Specifically, this embodiment may determine, for every captured image frame, the initial relative pose between that frame and the previous one using a preset camera pose estimation method. For example, when the current image frame captured by the camera is obtained, the loop detection for the current image frame and the estimation of the initial relative pose between the current image frame and the previous image frame with the visual odometry may be performed simultaneously. During global optimization, the initial relative poses corresponding to the adjacent image frame pairs between the current image frame and the historical loop image frame are then selected from the initial relative poses already obtained for every pair of adjacent frames.
The preset camera pose estimation method may be a direct method or a feature point method. The direct method proceeds as follows: high-gradient pixels are first extracted from one image frame; an initial camera pose is then set, a loss function of the photometric error of these pixels between the two adjacent image frames is constructed from it, and the optimal pose is solved by nonlinear optimization. The direct method saves the time needed to compute feature points and descriptors and can be applied in situations where features are scarce. The feature point method extracts feature points and computes descriptors in the two adjacent image frames, and then solves the initial relative pose between them through feature matching. Concretely, the initial relative pose between the current image frame and the previous image frame may be determined as follows: N feature points are extracted from each of the two frames and a descriptor is computed for each feature point; the feature points of the current image frame are matched against those of the previous image frame using the descriptors, yielding N feature matching pairs; from the N matching pairs, N epipolar constraint equations are constructed; and these equations are solved by singular value decomposition or least squares to obtain the rotation matrix and translation vector from the current image frame to the previous image frame, i.e., the initial relative pose between them. The epipolar constraint equation corresponding to each feature matching pair may be:
p1 = K(R·K⁻¹·p2 + t)
where p1 is the pixel coordinate of the feature point in the previous image frame; p2 is the pixel coordinate of the feature point in the current image frame; K is the known camera intrinsic matrix; R is the rotation matrix from the current image frame to the previous image frame; and t is the translation vector from the current image frame to the previous image frame.
The feature point method does not rely on the gray-level invariance assumption, so it is more tolerant of overexposure and fast camera motion, is less prone to tracking loss and failure, and is therefore more robust. This embodiment may determine the initial relative pose between two adjacent image frames using the feature point method.
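A hedged sketch of the feature point method using OpenCV; it assumes grayscale input frames and a known intrinsic matrix K, uses ORB features with brute-force Hamming matching, and recovers the relative pose from the essential matrix, so the translation is known only up to scale.

```python
import cv2
import numpy as np

def initial_relative_pose(prev_gray, curr_gray, K):
    """Estimate the initial relative pose (R, t) from the current frame to the previous
    frame with the feature point method; t is recovered only up to scale."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)

    p1 = np.float64([kp1[m.queryIdx].pt for m in matches])  # previous-frame pixels
    p2 = np.float64([kp2[m.trainIdx].pt for m in matches])  # current-frame pixels

    # Epipolar geometry: essential matrix with RANSAC, then decomposition with a
    # cheirality check to pick the physically valid rotation and translation.
    E, mask = cv2.findEssentialMat(p2, p1, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, p2, p1, K, mask=mask)
    return R, t  # transform taking current-frame points into the previous frame
```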
And S260, optimizing the camera pose corresponding to each image frame between the current image frame and the historical loop image frame according to the loop relative pose and each initial relative pose, and determining the target camera pose corresponding to each image frame.
The camera pose corresponding to an image frame is the camera pose at the shooting moment of that frame, and can be characterized by the camera's rotation matrix and translation vector. The target camera pose is the final camera pose obtained after optimization; in this embodiment it is expressed in the world coordinate system. For example, the target camera pose of each image frame may be its camera pose relative to the very first captured image frame.
Specifically, from the loop relative pose between the current image frame and the historical loop image frame, together with the initial relative pose of every pair of adjacent image frames between them, the camera poses of all image frames between the current image frame and the historical loop image frame can be globally optimized, yielding the optimized target camera pose of each frame. This effectively eliminates the accumulated error, improves the pose optimization effect, and produces more accurate camera poses.
Illustratively, S260 may include: establishing, from the loop relative pose and each initial relative pose, an objective function containing the camera pose corresponding to each image frame between the current image frame and the historical loop image frame; and minimizing the objective function in a least-squares manner to obtain the target camera pose corresponding to each image frame.
The objective function is the loss function to be optimized; minimizing it yields the optimized target camera poses. The least-squares optimization method may be, but is not limited to, the Gauss-Newton method or the Levenberg-Marquardt method. Illustratively, the objective function may be:
F = Σ_{i=h}^{c-1} ‖ ΔT_{i+1,i} − (T^w_{i+1})⁻¹ · T^w_i ‖² + ‖ ΔT_{c,h} − (T^w_c)⁻¹ · T^w_h ‖²
where c denotes the current image frame; h denotes the historical loop image frame; i denotes any image frame between the historical loop image frame and the current image frame; ΔT_{i+1,i} denotes the initial relative pose between the (i+1)th image frame and the ith image frame; T^w_i denotes the camera pose of the ith image frame in the world coordinate system; T^w_{i+1} denotes the camera pose of the (i+1)th image frame in the world coordinate system; ΔT_{c,h} denotes the loop relative pose between the current image frame and the historical loop image frame; T^w_h denotes the camera pose of the historical loop image frame in the world coordinate system; and T^w_c denotes the camera pose of the current image frame in the world coordinate system. The camera poses T^w_i, T^w_{i+1}, T^w_h, and T^w_c appearing in the objective function are the parameters to be optimized.
Specifically, performing least-squares optimization on this objective function yields the optimal estimate of the target camera pose corresponding to each image frame.
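A deliberately simplified, self-contained sketch of the optimization step: it uses 2D poses (x, y, yaw) and scipy.optimize.least_squares instead of full SE(3) poses and a dedicated pose-graph solver (e.g., g2o or Ceres); the residual layout and the toy measurements are illustrative assumptions only.

```python
import numpy as np
from scipy.optimize import least_squares

def relative_2d(pose_a, pose_b):
    """Pose of b expressed in the frame of a; poses are (x, y, yaw)."""
    dx, dy = pose_b[:2] - pose_a[:2]
    c, s = np.cos(pose_a[2]), np.sin(pose_a[2])
    return np.array([c * dx + s * dy, -s * dx + c * dy, pose_b[2] - pose_a[2]])

def residuals(flat_poses, odom, loop):
    """odom: list of (i, j, measured relative pose); loop: one (h, c, measured) tuple."""
    poses = flat_poses.reshape(-1, 3)
    res = []
    for i, j, meas in odom + [loop]:
        err = relative_2d(poses[i], poses[j]) - meas
        err[2] = np.arctan2(np.sin(err[2]), np.cos(err[2]))  # wrap the angle term
        res.append(err)
    res.append(poses[0])  # gauge constraint: pin the first pose near the origin
    return np.concatenate(res)

# Illustrative usage: 4 poses, odometry between consecutive frames, one loop constraint.
odom = [(0, 1, np.array([1.0, 0.0, 0.0])),
        (1, 2, np.array([1.0, 0.0, 0.0])),
        (2, 3, np.array([1.0, 0.1, 0.05]))]   # accumulated drift in the last step
loop = (0, 3, np.array([3.0, 0.0, 0.0]))      # loop relative pose from the code
sol = least_squares(residuals, np.zeros(4 * 3), args=(odom, loop))
optimized_poses = sol.x.reshape(-1, 3)        # target camera poses after optimization
```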
In the technical solution of this embodiment, global pose optimization is performed on the camera poses of all image frames between the current image frame and the historical loop image frame, using the loop relative pose between the two frames and the initial relative pose of every pair of adjacent image frames between them. This effectively eliminates the accumulated error, improves the pose optimization effect, and produces more accurate camera poses.
The following is an embodiment of a loop detection apparatus provided by an embodiment of the present invention. The apparatus and the loop detection method of the foregoing embodiments belong to the same inventive concept; for details not described in this apparatus embodiment, reference may be made to the loop detection method embodiments above.
EXAMPLE III
Fig. 3 is a schematic structural diagram of a loop detection apparatus according to a third embodiment of the present invention, which is applicable to a situation of loop detection on a current image frame, and the apparatus specifically includes: an image frame acquisition module 310, a current image frame detection module 320 and a historical loop image frame determination module 330;
the image frame acquiring module 310 is configured to acquire a current image frame and a historical image frame captured by a camera, where a plurality of different two-dimensional codes are placed in advance in a camera capturing environment; a current image frame detection module 320, configured to detect whether a two-dimensional code exists in a captured current image frame; the history loop image frame determining module 330 is configured to determine, if the target two-dimensional code exists in the current image frame, a history loop image frame in which the target two-dimensional code exists from the history image frame.
In the technical solution of this embodiment, two-dimensional codes are placed in the camera's shooting environment in advance, so that two-dimensional codes can appear in the image frames captured by the camera; the different codes distinguish image frames captured at different camera poses, and loop detection can therefore be performed using the two-dimensional codes. If a target two-dimensional code is detected in the captured current image frame, the historical loop image frame containing that target two-dimensional code is determined from the captured historical image frames, so that a historical image frame sharing the same target two-dimensional code with the current image frame serves as the historical loop image frame for loop closure. Because a two-dimensional code is black and white and has high contrast, it can still be detected accurately under poor ambient lighting; loop detection is therefore less affected by illumination changes, its robustness and accuracy are improved, and the accumulated error is effectively eliminated.
Optionally, the historical loop image frame determination module 330 includes:
the candidate historical image frame determining submodule is used for determining a candidate historical image frame with a target two-dimensional code based on the corresponding relation between the two-dimensional code and the historical image frame and the target two-dimensional code;
the history loop image frame determining submodule is used for determining the candidate history image frame as the history loop image frame if only one candidate history image frame exists; and if at least two candidate historical image frames exist, determining a historical loop image frame from the candidate historical image frames based on the image area of the target two-dimensional code in each candidate historical image frame.
Optionally, the candidate historical image frame determining sub-module is specifically configured to: determining an image frame loop range corresponding to a current image frame, and determining a to-be-selected historical image frame in the image frame loop range from the historical image frames; and determining candidate historical image frames with the target two-dimensional codes from the to-be-selected historical image frames based on the corresponding relation between the two-dimensional codes and the historical image frames and the target two-dimensional codes.
Optionally, the apparatus further comprises:
the target two-dimensional code determining module is used for acquiring an image area of each two-dimensional code in the current image frame if at least two-dimensional codes exist in the current image frame after detecting whether the two-dimensional codes exist in the shot current image frame; and determining the two-dimensional code with the largest image area as a target two-dimensional code.
Optionally, the apparatus further comprises:
the loop relative pose determining module is used for determining a loop relative pose between a current image frame and a historical loop image frame after determining the historical loop image frame with the target two-dimensional code from the historical image frame;
the initial relative pose acquisition module is used for acquiring initial relative poses corresponding to two adjacent image frames between the current image frame and the historical loop image frame;
and the camera pose optimization module is used for optimizing the camera pose corresponding to each image frame between the current image frame and the historical loop image frame according to the loop relative pose and each initial relative pose, and determining the target camera pose corresponding to each image frame.
Optionally, the loop relative pose determination module includes:
the relative pose determining submodule is used for determining a first relative pose between the target two-dimensional code and the current image frame and a second relative pose between the target two-dimensional code and the target historical image frame based on the actual angular point interval distance corresponding to the target two-dimensional code;
and the loop relative pose determining submodule is used for determining a loop relative pose between the current image frame and the historical loop image frame according to the first relative pose and the second relative pose.
Optionally, the relative pose determination submodule is specifically configured to:
extracting four corner information of a target two-dimensional code in a current image frame; constructing a homography matrix based on the four corner point information and the actual corner point spacing distance corresponding to the target two-dimensional code; and determining a first relative pose between the target two-dimensional code and the current image frame by solving the homography matrix.
Optionally, the camera pose optimization module is specifically configured to:
establishing an objective function containing the camera pose corresponding to each image frame between the current image frame and the historical loop image frame according to the loop relative pose and each initial relative pose; and minimizing the target function based on a least square mode to obtain the target camera pose corresponding to each image frame.
Optionally, the objective function is:
F = Σ_{i=h}^{c-1} ‖ ΔT_{i+1,i} − (T^w_{i+1})⁻¹ · T^w_i ‖² + ‖ ΔT_{c,h} − (T^w_c)⁻¹ · T^w_h ‖²
where c denotes the current image frame; h denotes the historical loop image frame; i denotes any image frame between the historical loop image frame and the current image frame; ΔT_{i+1,i} denotes the initial relative pose between the (i+1)th image frame and the ith image frame; T^w_i denotes the camera pose of the ith image frame in the world coordinate system; T^w_{i+1} denotes the camera pose of the (i+1)th image frame in the world coordinate system; ΔT_{c,h} denotes the loop relative pose between the current image frame and the historical loop image frame; T^w_h denotes the camera pose of the historical loop image frame in the world coordinate system; and T^w_c denotes the camera pose of the current image frame in the world coordinate system.
The loop detection device provided by the embodiment of the invention can execute the loop detection method provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects for executing the loop detection method.
It should be noted that, in the embodiment of the loop detection apparatus, each included unit and module are only divided according to functional logic, but are not limited to the above division as long as the corresponding function can be implemented; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention.
Example four
Fig. 4 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention. FIG. 4 illustrates a block diagram of an exemplary electronic device 12 suitable for use in implementing embodiments of the present invention. The electronic device 12 shown in fig. 4 is only an example and should not bring any limitation to the function and the scope of use of the embodiment of the present invention.
As shown in FIG. 4, electronic device 12 is embodied in the form of a general purpose computing device. The components of electronic device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus.
Electronic device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by electronic device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM)30 and/or cache memory 32. The electronic device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 4, and commonly referred to as a "hard drive"). Although not shown in FIG. 4, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. System memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in system memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
Electronic device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with electronic device 12, and/or with any devices (e.g., network card, modem, etc.) that enable electronic device 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, the electronic device 12 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) via the network adapter 20. As shown, the network adapter 20 communicates with other modules of the electronic device 12 via the bus 18. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with electronic device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 16 executes various functional applications and data processing by executing programs stored in the system memory 28, for example, implementing a loop detection method provided by the embodiment of the present invention, the method includes:
acquiring a current image frame and a historical image frame shot by a camera, wherein a plurality of different two-dimensional codes are placed in a shooting environment of the camera in advance;
detecting whether a two-dimensional code exists in a shot current image frame;
and if the target two-dimensional code exists in the current image frame, determining a historical loopback image frame with the target two-dimensional code from the historical image frame.
Of course, those skilled in the art can understand that the processor can also implement the technical solution of the loop detection method provided in any embodiment of the present invention.
EXAMPLE five
The present embodiment provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the loop detection method steps as provided by any of the embodiments of the present invention, the method comprising:
acquiring a current image frame and a historical image frame shot by a camera, wherein a plurality of different two-dimensional codes are placed in a shooting environment of the camera in advance;
detecting whether a two-dimensional code exists in a shot current image frame;
and if the target two-dimensional code exists in the current image frame, determining a historical loopback image frame with the target two-dimensional code from the historical image frame.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer-readable storage medium may be, for example but not limited to: an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It will be understood by those skilled in the art that the modules or steps of the invention described above may be implemented by a general-purpose computing device; they may be centralized on a single computing device or distributed across a network of computing devices, and optionally they may be implemented by program code executable by a computing device, so that the code may be stored in a storage device and executed by a computing device, or the modules or steps may be separately fabricated as individual integrated circuit modules, or a plurality of them may be fabricated into a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (12)

1. A loop detection method, comprising:
acquiring a current image frame and a historical image frame shot by a camera, wherein a plurality of different two-dimensional codes are placed in a shooting environment of the camera in advance;
detecting whether a two-dimensional code exists in the shot current image frame;
and if the target two-dimensional code exists in the current image frame, determining, from the historical image frames, a historical loop image frame in which the target two-dimensional code exists.
2. The method of claim 1, wherein determining a historical loop image frame in which the target two-dimensional code exists from the historical image frames comprises:
determining candidate historical image frames in which the target two-dimensional code exists, based on the target two-dimensional code and the correspondence between the two-dimensional codes and the historical image frames;
if only one candidate historical image frame exists, determining the candidate historical image frame as a historical loop image frame;
if at least two candidate historical image frames exist, determining a historical loop image frame from the candidate historical image frames based on the image area of the target two-dimensional code in each candidate historical image frame.
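Claim 2's selection logic can be read as the following sketch; the shoelace-area helper and the dictionary layouts are assumptions made purely for illustration.

```python
import numpy as np

def quad_area(corners):
    """Image area of a code's 4x2 corner quadrilateral (shoelace formula)."""
    x, y = corners[:, 0], corners[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

def determine_loop_frame(target_id, code_to_frames, corners_of):
    """code_to_frames: {code_id: [frame_idx, ...]} correspondence;
    corners_of: {(frame_idx, code_id): 4x2 corner array}."""
    candidates = code_to_frames.get(target_id, [])
    if not candidates:
        return None
    if len(candidates) == 1:          # only one candidate: it is the loop frame
        return candidates[0]
    # several candidates: keep the frame in which the target code appears largest
    return max(candidates, key=lambda f: quad_area(corners_of[(f, target_id)]))
```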
3. The method of claim 2, wherein determining the candidate historical image frames in which the target two-dimensional code exists based on the target two-dimensional code and the correspondence between the two-dimensional codes and the historical image frames comprises:
determining an image frame looping range corresponding to the current image frame, and determining a to-be-selected historical image frame in the image frame looping range from the historical image frames;
and determining, from the to-be-selected historical image frames, candidate historical image frames in which the target two-dimensional code exists, based on the target two-dimensional code and the correspondence between the two-dimensional codes and the historical image frames.
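Claim 3 restricts the search to an image frame loop range before applying the correspondence. One simple interpretation is to exclude frames that are too recent, as in the sketch below; the gap threshold is a hypothetical parameter, not taken from the claim.

```python
def frames_in_loop_range(current_idx, history_indices, min_gap=50):
    # only frames at least `min_gap` frames older than the current frame are
    # eligible, so that immediately preceding frames are not reported as loops
    return [i for i in history_indices if current_idx - i >= min_gap]
```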
4. The method according to claim 1, further comprising, after detecting whether the two-dimensional code exists in the captured current image frame:
if it is detected that at least two two-dimensional codes exist in the current image frame, acquiring an image area of each two-dimensional code in the current image frame;
and determining the two-dimensional code with the largest image area as a target two-dimensional code.
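When several codes are visible, claim 4 keeps the one occupying the largest image area. A sketch using OpenCV's contourArea follows; the detection dictionary layout is an assumption, and this mirrors the area test used for claim 2 above.

```python
import cv2
import numpy as np

def select_target_code(detections):
    """detections: {code_id: 4x2 array of image corners}.
    Returns the id of the code with the largest image area in the current frame."""
    return max(detections,
               key=lambda cid: cv2.contourArea(detections[cid].astype(np.float32)))
```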
5. The method according to any one of claims 1 to 4, wherein after determining the historical loop image frame in which the target two-dimensional code exists from the historical image frames, the method further comprises:
determining a loop relative pose between the current image frame and the historical loop image frame;
acquiring initial relative poses corresponding to two adjacent image frames between the current image frame and the historical loop image frame;
and optimizing, according to the loop relative pose and each initial relative pose, the camera pose corresponding to each image frame between the current image frame and the historical loop image frame, so as to determine a target camera pose corresponding to each image frame.
6. The method of claim 5, wherein determining a loop relative pose between the current image frame and the historical loop image frame comprises:
determining a first relative pose between the target two-dimensional code and the current image frame and a second relative pose between the target two-dimensional code and the historical loop image frame based on an actual corner point spacing distance corresponding to the target two-dimensional code;
determining a loop relative pose between the current image frame and the historical loop image frame according to the first relative pose and the second relative pose.
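Claim 6 composes the loop relative pose from the two code-to-frame poses. Under one common convention (4x4 homogeneous transforms of the code expressed in each camera frame), the composition reduces to a single matrix product, as in this sketch; the convention itself is an assumption.

```python
import numpy as np

def loop_relative_pose(T_cur_tag, T_hist_tag):
    """T_cur_tag / T_hist_tag: 4x4 pose of the target two-dimensional code in the
    current and historical camera frames (the first and second relative poses).
    Returns the current camera's pose expressed in the historical camera frame."""
    return T_hist_tag @ np.linalg.inv(T_cur_tag)
```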
7. The method of claim 6, wherein determining the first relative pose between the target two-dimensional code and the current image frame based on the actual corner point spacing distance corresponding to the target two-dimensional code comprises:
extracting information of four corner points of the target two-dimensional code in the current image frame;
constructing a homography matrix based on the four corner point information and the actual corner point spacing distance corresponding to the target two-dimensional code;
and determining a first relative pose between the target two-dimensional code and the current image frame by solving the homography matrix.
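For claim 7, one way to recover the code-to-camera pose from the four image corners and the known corner spacing is a homography followed by the standard planar decomposition H ≈ K[r1 r2 t]. The sketch below assumes a particular corner ordering and known camera intrinsics K, neither of which is specified in the claim.

```python
import cv2
import numpy as np

def tag_pose_from_corners(img_corners, corner_spacing, K):
    """img_corners: 4x2 pixel corners of the target code (ordered to match the
    planar model below); corner_spacing: actual corner point spacing distance;
    K: 3x3 camera intrinsic matrix. Returns (R, t) of the code plane in the camera frame."""
    s = corner_spacing / 2.0
    plane_pts = np.array([[-s, -s], [s, -s], [s, s], [-s, s]], dtype=np.float64)
    H, _ = cv2.findHomography(plane_pts, img_corners.astype(np.float64))
    A = np.linalg.inv(K) @ H                        # A ~ [r1 r2 t] up to scale
    scale = 2.0 / (np.linalg.norm(A[:, 0]) + np.linalg.norm(A[:, 1]))
    r1, r2, t = scale * A[:, 0], scale * A[:, 1], scale * A[:, 2]
    if t[2] < 0:                                    # keep the code in front of the camera
        r1, r2, t = -r1, -r2, -t
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)                     # re-orthogonalise to a proper rotation
    return U @ Vt, t
```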
8. The method of claim 5, wherein optimizing a camera pose corresponding to each image frame between the current image frame and the historical loop image frames based on the loop relative pose and each of the initial relative poses to determine a target camera pose corresponding to each image frame comprises:
establishing an objective function containing a camera pose corresponding to each image frame between the current image frame and the historical loop image frame according to the loop relative pose and each initial relative pose;
and minimizing the objective function by least squares to obtain the target camera pose corresponding to each image frame.
9. The method of claim 8, wherein the objective function is:

E = \sum_{i=h}^{c-1} \left\| \left(T_{i+1}^{w}\right)^{-1} T_{i}^{w} - T_{i+1,i} \right\|^{2} + \left\| \left(T_{h}^{w}\right)^{-1} T_{c}^{w} - T_{h,c} \right\|^{2}

wherein c represents the current image frame; h represents the historical loop image frame; i represents any image frame between the current image frame and the historical loop image frame; T_{i+1,i} represents the initial relative pose between the (i+1)th image frame and the ith image frame; T_{i}^{w} represents the camera pose corresponding to the ith image frame in the world coordinate system; T_{i+1}^{w} represents the camera pose corresponding to the (i+1)th image frame in the world coordinate system; T_{h,c} represents the loop relative pose between the current image frame and the historical loop image frame; T_{h}^{w} represents the camera pose corresponding to the historical loop image frame in the world coordinate system; and T_{c}^{w} represents the camera pose corresponding to the current image frame in the world coordinate system.
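Claims 8 and 9 describe a least-squares minimization over the camera poses between the historical loop image frame and the current image frame. The sketch below shows that residual structure (one odometry term per pair of adjacent frames plus a single loop term) on simplified planar (x, y, yaw) poses using SciPy; the actual method presumably optimizes full camera poses, and all helper names are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

def relative_pose(a, b):
    """Pose of b expressed in a's frame; poses are (x, y, yaw) for simplicity."""
    c, s = np.cos(a[2]), np.sin(a[2])
    dx, dy = b[0] - a[0], b[1] - a[1]
    dyaw = (b[2] - a[2] + np.pi) % (2 * np.pi) - np.pi   # wrap to [-pi, pi)
    return np.array([c * dx + s * dy, -s * dx + c * dy, dyaw])

def residuals(flat_poses, odometry, loop_measurement):
    """odometry[i]: initial relative pose between frame i and i+1;
    loop_measurement: loop relative pose between the first (historical loop)
    and last (current) pose in the chain."""
    poses = flat_poses.reshape(-1, 3)
    res = [relative_pose(poses[i], poses[i + 1]) - odometry[i]
           for i in range(len(odometry))]
    res.append(relative_pose(poses[0], poses[-1]) - loop_measurement)
    return np.concatenate(res)

# initial_poses would be the dead-reckoned camera poses between the two frames;
# in practice the first pose is held fixed to anchor the trajectory.
# result = least_squares(residuals, initial_poses.ravel(),
#                        args=(odometry, loop_measurement))
# optimized = result.x.reshape(-1, 3)   # target camera pose per image frame
```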
10. A loop detection apparatus, comprising:
the image frame acquisition module is used for acquiring a current image frame and a historical image frame shot by a camera, wherein a plurality of different two-dimensional codes are placed in the shooting environment of the camera in advance;
the current image frame detection module is used for detecting whether the two-dimensional code exists in the shot current image frame;
and the historical loop image frame determining module is used for determining, from the historical image frames, a historical loop image frame in which the target two-dimensional code exists if the target two-dimensional code exists in the current image frame.
11. An electronic device, characterized in that the electronic device comprises:
one or more processors;
a memory for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the loop detection method according to any one of claims 1-9.
12. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out a loop detection method according to any one of claims 1 to 9.
CN202011079676.6A 2020-10-10 2020-10-10 Loop detection method and device, electronic equipment and storage medium Active CN113763466B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011079676.6A CN113763466B (en) 2020-10-10 2020-10-10 Loop detection method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113763466A true CN113763466A (en) 2021-12-07
CN113763466B CN113763466B (en) 2024-06-14

Family

ID=78785813

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011079676.6A Active CN113763466B (en) 2020-10-10 2020-10-10 Loop detection method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113763466B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007142227A1 (en) * 2006-06-07 2007-12-13 Nec Corporation Image direction judging device, image direction judging method and image direction judging program
WO2014114118A1 (en) * 2013-01-28 2014-07-31 Tencent Technology (Shenzhen) Company Limited Realization method and device for two-dimensional code augmented reality
WO2019169540A1 (en) * 2018-03-06 2019-09-12 斯坦德机器人(深圳)有限公司 Method for tightly-coupling visual slam, terminal and computer readable storage medium
US20200116498A1 (en) * 2018-10-16 2020-04-16 Ubtech Robotics Corp Visual assisted distance-based slam method and mobile robot using the same
CN109556596A (en) * 2018-10-19 2019-04-02 北京极智嘉科技有限公司 Air navigation aid, device, equipment and storage medium based on ground texture image
CN110689562A (en) * 2019-09-26 2020-01-14 深圳市唯特视科技有限公司 Trajectory loop detection optimization method based on generation of countermeasure network
CN111598149A (en) * 2020-05-09 2020-08-28 鹏城实验室 Loop detection method based on attention mechanism
CN111696118A (en) * 2020-05-25 2020-09-22 东南大学 Visual loopback detection method based on semantic segmentation and image restoration in dynamic scene

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115982399A (en) * 2023-03-16 2023-04-18 北京集度科技有限公司 Image searching method, mobile device, electronic device and computer program product
CN116358533A (en) * 2023-05-31 2023-06-30 小米汽车科技有限公司 Loop detection data processing method and device, storage medium and vehicle
CN116358533B (en) * 2023-05-31 2023-08-29 小米汽车科技有限公司 Loop detection data processing method and device, storage medium and vehicle

Also Published As

Publication number Publication date
CN113763466B (en) 2024-06-14

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant