CN114998624A - Image searching method and device - Google Patents

Image searching method and device Download PDF

Info

Publication number
CN114998624A
Authority
CN
China
Prior art keywords
image
similarity
calibration
saturation
brightness
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210489045.4A
Other languages
Chinese (zh)
Inventor
冉祥
陈小川
邓志伟
刘欣冉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Micro Chain Daoi Technology Co ltd
Original Assignee
Beijing Micro Chain Daoi Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Micro Chain Daoi Technology Co ltd filed Critical Beijing Micro Chain Daoi Technology Co ltd
Priority to CN202210489045.4A priority Critical patent/CN114998624A/en
Publication of CN114998624A publication Critical patent/CN114998624A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/53Querying
    • G06F16/532Query formulation, e.g. graphical querying
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G06F16/5838Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using colour
    • G06T5/70
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics

Abstract

The invention discloses an image searching method and device with high identification accuracy, free use of various calibration board patterns, labor savings and good adaptability to the lighting environment. The method comprises: reading an image example to be searched and performing a consistency optimization process that reduces the adverse effects of strong exposure and/or shaking, to obtain an enhanced image Q; calculating the image features of the enhanced image Q; calculating the hue similarity, saturation similarity and brightness similarity between the enhanced image Q and the reference image feature information stored in an image feature database; multiplying the hue similarity, saturation similarity and brightness similarity by their corresponding weights and summing them to generate a similarity result; and finally judging, from the similarity result, whether the two images are similar. If the judgment is yes, the information corresponding to the reference image is output as the search result.

Description

Image searching method and device
Technical Field
The invention relates to the technical field of machine vision, in particular to a camera automatic calibration algorithm based on two-dimensional feature points.
Background
Due to machining errors, assembly errors and other factors, the actual arm length of a compliant assembly robot deviates from its theoretical arm length, and it is difficult to guarantee that the first link and the second link of the robot lie strictly on the same straight line, which causes zero-point offset; these factors affect the absolute positioning accuracy of the robot. Parameter calibration is an important method for improving the absolute positioning accuracy of the robot. Calibrating the camera is the first, and a necessary, step in its use. Existing line-camera calibration requires the manufacture of a specific calibration plate, such as a checkerboard pattern. In the prior art, for example, the Chinese patent application with publication number CN113459084A discloses a method, apparatus, device and storage medium for calibrating robot parameters. That method comprises: acquiring at least three calibration points on a calibration plate and the theoretical joint parameters of a robot arm; acquiring the height of the robot arm end at which the calibration board image acquired by the arm end is the target image; controlling the arm end to move to each calibration point in turn at that height, and sequentially acquiring the first coordinate when the arm end reaches each calibration point together with the actual joint parameters of the robot arm; and determining a joint-parameter compensation value of the robot arm from the actual joint parameters, the first coordinates and the theoretical joint parameters, and calibrating the theoretical joint parameters of the robot arm accordingly.
Disclosure of Invention
In view of the prior art, the invention aims to provide an image searching method that has high identification accuracy, can freely adopt various calibration plate patterns, saves labor and adapts well to the lighting environment. The method comprises the following steps:
reading an image example to be searched, and performing a consistency optimization process that reduces the adverse effects of strong exposure and/or shaking, to obtain an enhanced image Q; calculating the image features of the enhanced image Q; calculating the hue similarity, saturation similarity and brightness similarity between the enhanced image Q and the reference image feature information stored in an image feature database; multiplying the hue similarity, saturation similarity and brightness similarity by their corresponding weights and summing them to generate a similarity result; finally judging, from the similarity result, whether the two images are similar; and, if the judgment is yes, outputting the information corresponding to the reference image as the search result. These steps are executed repeatedly, selecting different reference images from the image database and comparing them with the input image, until a correct search result is produced. The consistency optimization reduces the adverse effects of exposure and jitter, improves calibration accuracy and speed, and enables the device to identify and compare the image example to be searched against the reference images in the database.
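As an illustration only, the weighted-sum decision described above can be sketched in Python as follows; the similarity functions, the weight values and the acceptance threshold are assumptions chosen for the sketch and are not specified numerically in this disclosure.

```python
# Minimal sketch of the weighted similarity decision (illustrative values only).
# hue_sim / sat_sim / val_sim are assumed to be similarity scores in [0, 1]
# computed from the enhanced image Q and one reference image's stored features.

def combined_similarity(hue_sim, sat_sim, val_sim,
                        w_hue=0.5, w_sat=0.3, w_val=0.2):
    """Multiply each channel similarity by its weight and sum (weights are assumed)."""
    return w_hue * hue_sim + w_sat * sat_sim + w_val * val_sim

def search_database(query_features, database, similarity_fn, threshold=0.85):
    """Compare the query against reference images one by one until a match is accepted."""
    for ref_id, ref_features in database.items():
        hue_sim, sat_sim, val_sim = similarity_fn(query_features, ref_features)
        if combined_similarity(hue_sim, sat_sim, val_sim) >= threshold:
            return ref_id          # output the matching reference as the search result
    return None                    # no reference image judged similar enough
```

In a real device the weights and threshold would be tuned to the application, and the loop would terminate according to whatever criterion defines a "correct" result.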
To further optimize the technical solution, the following measures are also adopted:
the image characteristics include: hue invariant moment, saturation invariant moment, brightness invariant moment, and key hue level, key saturation level, and key brightness level. The central moment of an image can be used to extract invariant moments that are invariant to movement, scaling and rotation, and feature discrimination of the image is performed by extracting these features.
The consistency optimization algorithm described above is as follows:
1) decompose the RGB channels of the image I(x, y) to obtain the subband coefficients of each scale and direction for each channel image;
2) obtain the final estimated illumination map from the low-frequency subband coefficients according to formulas (1) and (2);
3) shrink the high-frequency directional subband coefficients of each color channel according to formulas (3) to (6) to obtain the processed (shrunk) coefficients;
4) perform the inverse transform according to formula (7), using the final estimated illumination map and the shrunk high-frequency coefficients, to obtain the enhanced image Q.
In the NSCT domain, the final estimated illumination map is obtained from the estimated illumination map of the low-frequency subband, while the shrinkage (threshold-setting) function separates noise coefficients from detail coefficients in the high-frequency subbands and eliminates the noise. Through the consistency optimization, the distinguishing characteristics of hue, saturation and brightness are amplified, the significance of the similarity result is improved, and misjudgments in image searching are prevented.
In the image searching method described above, the image example to be searched may be a calibration plate pattern, and the calibration board may have any of various patterns. With this efficient image searching method, the calibration pattern can be found quickly for comparison, calibration can be performed automatically, and calibration can be carried out with various calibration patterns; even if a particular calibration plate is lost, calibration is not affected.
The invention also discloses a computer program for realizing the image searching method.
The invention also discloses a storage medium storing the computer program.
The invention also discloses a robot arm calibration device, characterized in that it comprises:
a camera for taking a calibration image,
and a storage medium storing the computer program described above. The calibration image of the device may be any of a variety of calibration patterns. Supporting various calibration-board patterns improves convenience and makes the device compatible with the calibration boards of different machines. The image enhancement algorithm is better suited to the requirement that high-contrast patterns, such as calibration plate patterns, be calibrated normally under complex ambient light.
The invention reads the image example to be searched and performs a consistency optimization process that reduces the adverse effects of strong exposure and/or shaking to obtain an enhanced image Q; it then calculates the image features of the enhanced image Q and, from the reference image feature information in the database, calculates the hue similarity, saturation similarity and brightness similarity between the enhanced image Q and that reference information. These steps are executed repeatedly, selecting different reference images from the image database and comparing them with the input image, until a correct search result is produced. Because the consistency optimization reduces the adverse effects of exposure and jitter, calibration accuracy and speed are improved, and the device can identify and compare the image example to be searched against the reference images in the database. The invention therefore has the advantages of high identification efficiency, few errors and high speed.
Drawings
FIG. 1 is a schematic sequence of steps according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an application scenario of an apparatus according to an embodiment of the present invention;
fig. 3 is an example of a calibration plate according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the following examples.
Example 1:
referring to fig. 1 to 3, an image searching method comprises: reading an image example to be searched and performing a consistency optimization process that reduces the adverse effects of strong exposure and/or jitter, to obtain an enhanced image Q; calculating the image features of the enhanced image Q; calculating the hue similarity, saturation similarity and brightness similarity between the enhanced image Q and the reference image feature information stored in an image feature database; multiplying the hue similarity, saturation similarity and brightness similarity by their corresponding weights and summing them to generate a similarity result; finally judging, from the similarity result, whether the two images are similar; and, if the judgment is yes, outputting the information corresponding to the reference image as the search result. These steps are executed repeatedly, selecting different reference images from the image database and comparing them with the input image, until a correct search result is produced. The consistency optimization reduces the adverse effects of exposure and jitter, improves calibration accuracy and speed, and enables the device to identify and compare the image example to be searched against the reference images in the database.
The image features include: the hue invariant moments, saturation invariant moments and brightness invariant moments, together with the key hue levels, key saturation levels and key brightness levels. The central moments of an image can be used to extract invariant moments that are unaffected by translation, scaling and rotation, and the image is discriminated by extracting these features.
The consistency optimization algorithm described above is as follows:
1) decompose the RGB channels of the image I(x, y) to obtain the subband coefficients of each scale and direction for each channel image;
2) obtain the final estimated illumination map from the low-frequency subband coefficients according to formulas (1) and (2);
3) shrink the high-frequency directional subband coefficients of each color channel according to formulas (3) to (6) to obtain the processed (shrunk) coefficients;
4) perform the inverse transform according to formula (7), using the final estimated illumination map and the shrunk high-frequency coefficients, to obtain the enhanced image Q.
In the NSCT domain, the final estimated illumination map is obtained from the estimated illumination map of the low-frequency subband, while the shrinkage (threshold-setting) function separates noise coefficients from detail coefficients in the high-frequency subbands and eliminates the noise. Through the consistency optimization, the distinguishing characteristics of hue, saturation and brightness are amplified, the significance of the similarity result is improved, and misjudgments in image searching are prevented.
In the image searching method described above, the image example to be searched may be a calibration plate pattern, and the calibration board may have any of various patterns. With this efficient image searching method, the calibration pattern can be found quickly for comparison, calibration can be performed automatically, and calibration can be carried out with various calibration patterns; even if a particular calibration plate is lost, calibration is not affected.
Example 2:
on the basis of the above embodiment, this embodiment further illustrates the technical solution:
calculating the feature data of the image feature database: an RGB image to be searched is read and the consistency optimization that reduces the adverse effects of strong exposure and/or shaking is performed; the image to be searched is preprocessed and converted into the Hue-Saturation-Value (HSV) model; a cumulative histogram is generated; key hue levels, key saturation levels and key brightness levels are selected; the invariant moments of each key level are calculated; and the resulting hue invariant moments, saturation invariant moments and brightness invariant moments, together with the key hue levels, key saturation levels and key brightness levels, are stored as image features in the image feature database. Hu, in "Visual Pattern Recognition by Moment Invariants" (1962), showed that the central moments of an image can be used to extract invariant moments that are unaffected by translation, scaling and rotation. Later researchers introduced key hue, saturation and brightness levels into the computation of the invariant moments and extracted image features from them. Further research has found, however, that this invariant-moment approach can suffer large extraction errors when the reference image is affected by jitter or strong exposure.
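A rough Python sketch of this feature-database step is given below. The disclosure does not state how the key levels are selected or exactly how an invariant moment is taken over a key level, so choosing the most frequent histogram bins as key levels and computing Hu moments on the corresponding binary masks are assumptions made purely for illustration.

```python
import cv2
import numpy as np

def key_levels(channel, bins, num_key=3):
    """Pick the `num_key` most frequent levels of one HSV channel (selection rule assumed)."""
    hist, _ = np.histogram(channel, bins=bins, range=(0, bins))
    cumulative = np.cumsum(hist)            # cumulative histogram mentioned in the text;
                                            # its exact role is not specified here
    top = np.argsort(hist)[::-1][:num_key]  # assumed: the dominant levels act as "key" levels
    return top, cumulative

def build_feature_entry(bgr_image):
    """Assemble one image-feature record: key levels plus Hu moments per key level."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    entry = {}
    for name, channel, bins in zip(("hue", "saturation", "value"),
                                   cv2.split(hsv), (180, 256, 256)):
        levels, _ = key_levels(channel, bins)
        # Invariant moments of the binary mask of each key level (assumed interpretation).
        masks = [(channel == lvl).astype(np.uint8) * 255 for lvl in levels]
        moments = [cv2.HuMoments(cv2.moments(m)).flatten() for m in masks]
        entry[name] = {"key_levels": levels.tolist(), "hu_moments": moments}
    return entry
```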
The image is converted into the Hue-Saturation-Value (HSV) model, in which each pixel comprises a hue level (H), a saturation level (S) and a brightness level (V); the formulas referenced below accomplish the conversion of the image from RGB to HSV space. In this conversion the hue value ranges from 0 to 360 degrees.
(The RGB-to-HSV conversion formulas appear only as equation images in the original document and are not reproduced here.)
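For reference, the standard RGB-to-HSV conversion, which the description appears to follow, can be written as (with R, G, B normalized to [0, 1]); this is the textbook formulation and not necessarily the exact form of the original equations:

\[
V = \max(R, G, B), \qquad
S = \begin{cases} \dfrac{V - \min(R, G, B)}{V}, & V \neq 0 \\[4pt] 0, & V = 0 \end{cases}
\]

\[
H = \begin{cases}
60^\circ \cdot \dfrac{G - B}{V - \min(R, G, B)}, & V = R \\[6pt]
60^\circ \cdot \left(2 + \dfrac{B - R}{V - \min(R, G, B)}\right), & V = G \\[6pt]
60^\circ \cdot \left(4 + \dfrac{R - G}{V - \min(R, G, B)}\right), & V = B
\end{cases}
\]

with H taken modulo 360° so that it lies in [0°, 360°), matching the 0 to 360 degree hue range mentioned above.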
The consistency optimization algorithm is as follows,
1) decomposing RGB channels of the image I (x, y) to obtain sub-band coefficients of each scale and direction of each channel image;
2) Obtain the final estimated illumination map from the low-frequency subband coefficients according to formulas (1) and (2). The basic estimated illumination map is calculated first: in formula (1), max denotes taking the maximum over the three color channels c ∈ {r, g, b}, and e is a small positive number that keeps the result from being equal to 0; e is limited to the range [0.03, 0.09]. The low-frequency subband coefficients enter formula (1), while the high-frequency directional subband coefficients contain the edges, the texture details and almost all of the noise information of the image. Formula (2) applies gamma correction to the basic estimated illumination map and then mean filtering to obtain the final estimated illumination map; mean() denotes the mean-filtering operation, and the gamma parameter takes a value in [0.5, 0.55].
3) According to formulas (3) to (6), the high-frequency directional subband coefficients of each color channel are shrunk by the shrinkage (threshold-setting) function, which separates noise coefficients from detail coefficients in the high-frequency subbands and eliminates the noise. In these formulas, k_j denotes the threshold scaling factor corresponding to decomposition scale j, T_{j,s} denotes the root mean square of each directional subband of the NSCT in the frequency domain, and the remaining quantity is the noise variance of each color channel of the image. These parameters are calculated as follows. M_Z denotes a matrix with the same M × N dimensions as I_c(x, y); decomposing it with an NSCT that uses the same numbers of scales and directions gives the subband coefficients at each scale and direction (j, s), to which the two-dimensional Fourier transform and the absolute-value operation |·| are applied. The noise variance of each color channel of the observed image is obtained from an empirical wavelet noise-estimation formula, in which mid() denotes taking the median, |·| denotes the absolute value, and the high-frequency subband coefficients used in the formula are obtained by a single-scale 'sym8' wavelet decomposition of I_c(x, y).
4) According to formula (7), perform the inverse transform using the final estimated illumination map and the shrunk high-frequency subband coefficients to obtain the enhanced image Q. Once the final estimated illumination map and the threshold-shrunk high-frequency subband coefficients have been obtained, and if the high-frequency details of the highlighted part are not considered, the low-frequency subband and the shrunk high-frequency subbands are combined and the inverse NSCT is applied; the target enhanced image is then calculated according to formula (7). In formula (7), max() denotes taking the maximum value, and I_l(x, y) and I_h(x, y) are respectively the low-frequency component and the high-frequency component of the image to be enhanced.
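Because formulas (1) to (7) are only described in words above, the following Python sketch shows one possible reading of the whole consistency-optimization pipeline. A single-level 'sym8' wavelet transform stands in for the NSCT purely so that the sketch runs (no NSCT implementation exists in the standard scientific-Python stack), a soft-threshold shrinkage stands in for formulas (3) to (6), and a Retinex-style division stands in for formula (7); eps and gamma are taken from the quoted ranges [0.03, 0.09] and [0.5, 0.55], while k and the combination step are assumptions.

```python
import numpy as np
import pywt
from scipy.ndimage import uniform_filter

def _decompose(channel):
    """Stand-in for the NSCT decomposition: one low-frequency band plus detail bands.
    (A single-level 'sym8' wavelet transform is used only so the sketch runs.)"""
    low, highs = pywt.dwt2(channel, "sym8")
    return low, list(highs)

def _reconstruct(low, highs, shape):
    """Stand-in for the inverse NSCT; cropped back to the original shape."""
    rec = pywt.idwt2((low, tuple(highs)), "sym8")
    return rec[:shape[0], :shape[1]]

def estimate_noise_sigma(channel):
    """Median-based noise estimate from the diagonal detail of a single-level 'sym8'
    decomposition, following the wavelet empirical formula described in the text."""
    _, (_, _, hh) = pywt.dwt2(channel, "sym8")
    return np.median(np.abs(hh)) / 0.6745

def consistency_enhance(image_rgb, eps=0.05, gamma=0.52, mean_size=5, k=3.0):
    """One possible reading of steps 1)-4); eps and gamma come from the quoted ranges,
    k and the shrinkage rule are assumptions."""
    img = image_rgb.astype(np.float64) / 255.0
    # Step 2: basic estimated illumination map = channel-wise maximum plus eps,
    # then gamma correction and mean filtering give the final illumination map.
    base_light = img.max(axis=2) + eps
    light = uniform_filter(base_light ** gamma, size=mean_size)

    channels = []
    for c in range(3):
        chan = img[:, :, c]
        low, highs = _decompose(chan)                    # step 1, per color channel
        sigma = estimate_noise_sigma(chan)               # per-channel noise level
        # Step 3: shrink the high-frequency coefficients; a soft threshold proportional
        # to the noise estimate stands in for formulas (3)-(6).
        shrunk = [np.sign(h) * np.maximum(np.abs(h) - k * sigma, 0.0) for h in highs]
        channels.append(_reconstruct(low, shrunk, chan.shape))  # step 4: inverse transform

    denoised = np.stack(channels, axis=2)
    # Formula (7) is not reproduced above; a Retinex-style division by the estimated
    # illumination map is assumed here as the final combination.
    enhanced = np.clip(denoised / np.maximum(light, 1e-6)[..., None], 0.0, 1.0)
    return (enhanced * 255).astype(np.uint8)
```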
A camera at the end of the robot arm collects a calibration plate image from above the calibration plate. The target image may be the calibration board image expected to be acquired during calibration of the robot parameters, for example a focused calibration board image or the calibration board image with the highest definition. The technical solution of the invention concerns only image processing, identification and searching; the mechanical part is not described in detail and reference is made to the prior art.
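Since the target image "may be the calibration board image with the highest definition", a common proxy for definition is the variance of the Laplacian; the short sketch below uses that heuristic, which is an assumption of this example rather than a requirement of the disclosure.

```python
import cv2

def sharpest_image(image_paths):
    """Return the path of the calibration-board image with the highest Laplacian
    variance, a common proxy for 'highest definition' (heuristic assumed here)."""
    def sharpness(path):
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        if gray is None:                 # unreadable file: rank it last
            return -1.0
        return cv2.Laplacian(gray, cv2.CV_64F).var()
    return max(image_paths, key=sharpness)
```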
Example 3:
on the basis of the above embodiment, this embodiment further illustrates the technical solution:
a robot arm calibration device comprises a calibration device,
a camera for taking calibration images, and a storage medium storing a computer program enabling the implementation of the method of the invention; the robot arm calibration device can calibrate images into various calibration patterns. The calibration method basically adopts the prior art, and at least three calibration points on a calibration plate and theoretical joint parameters of a robot arm are obtained; in the prior art, a calibration plate image acquired at the tail end of a robot arm is determined to be a target image and the height of the tail end of the robot arm is tested, the invention adopts any pattern calibration plate with known area diameter or side length, a reference image of the image is also stored on the storage medium, the consistency is optimized by adopting the method of the invention, then the searching and comparison are carried out, and the matching between the shot calibration plate image and the reference image is confirmed; controlling the tail end of the robot arm to move to each calibration point in sequence under the height, and sequentially acquiring a first coordinate and an actual joint parameter of the robot arm when the tail end of the robot arm moves to each calibration point; the joint parameter compensation value of the robot arm is determined according to the actual joint parameter, the first coordinate and the theoretical joint parameter, the theoretical joint parameter of the robot arm is calibrated according to the joint parameter compensation value, the problems that the robot parameter calibration needs to depend on external measuring equipment, operation is complex, full automation of a parameter calibration process cannot be achieved, and precision is low are solved, full automation of the robot parameter calibration process is achieved, the calibration operation process is simplified, and the efficiency and precision of the robot parameter calibration process are improved. The method comprises the steps of selecting a preset calibration board for calibration, and carrying out calibration on the preset calibration board.
When the robot carries the camera to each different position, the A matrix can be obtained from the calibration plate picture taken by the camera, for example using Zhang's calibration method. The B matrix can be read from the robot system, so what remains to be solved in the whole calibration process is the exact values of X and Z.
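As one way of obtaining the A matrix described here (the pose of the calibration plate in the camera frame at each robot position), OpenCV's standard routines can be used once the camera intrinsics are known; the checkerboard layout and square size in the sketch below are assumptions for illustration.

```python
import cv2
import numpy as np

def board_to_camera_pose(gray, camera_matrix, dist_coeffs,
                         pattern_size=(9, 6), square_size=0.02):
    """Estimate the calibration-board pose (the 'A' transform) from one image.
    pattern_size and square_size describe an assumed checkerboard."""
    found, corners = cv2.findChessboardCorners(gray, pattern_size)
    if not found:
        return None
    # 3-D board points in the board coordinate system (Z = 0 plane).
    objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2) * square_size
    _, rvec, tvec = cv2.solvePnP(objp, corners, camera_matrix, dist_coeffs)
    A = np.eye(4)
    A[:3, :3], _ = cv2.Rodrigues(rvec)     # rotation part
    A[:3, 3] = tvec.ravel()                # translation part
    return A
```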
To solve for these values, we use a two-step calculation; the first step produces preliminary matrices for X and Z.
In the first step, calculation mode one assumes that the following two transformation chains are equal:
robot base -> world -> camera
robot base -> gripper (fixture) -> camera
We then arrive at an equation whose residual is denoted E1, where i denotes each position to which the robot moves.
Calculation mode two assumes that the following two transformation chains are equal:
world -> camera
world -> robot base -> gripper -> camera
This yields a second equation whose residual is denoted E2, where i again denotes each position to which the robot moves.
It can be seen that, in the two calculation modes above, we consider only the interconversion relationships between the respective coordinate systems; in theory, therefore, E1 and E2 should both be 0. A nonlinear optimization algorithm can thus be used to obtain the matrices X and Z. Other calibration methods, such as Tsai's robot hand-eye calibration algorithm, can also yield the values of X and Z.
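The nonlinear optimization mentioned above can be sketched with SciPy's least-squares solver; the residual below assumes the usual robot-world hand-eye formulation A_i·X = Z·B_i built from the two equalities, and the zero initial guess is a simplification (in practice a closed-form method such as Tsai's would seed the optimizer).

```python
import numpy as np
import cv2
from scipy.optimize import least_squares

def _to_matrix(p):
    """6 parameters (rotation vector + translation) -> 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3], _ = cv2.Rodrigues(np.asarray(p[:3], dtype=np.float64).reshape(3, 1))
    T[:3, 3] = p[3:]
    return T

def solve_X_Z(A_list, B_list):
    """Solve A_i X = Z B_i for X and Z with nonlinear least squares.
    A_i: board-to-camera poses from the images; B_i: base-to-gripper poses from the robot.
    The residual form is an assumption consistent with the usual robot-world
    hand-eye formulation."""
    def residual(p):
        X, Z = _to_matrix(p[:6]), _to_matrix(p[6:])
        return np.concatenate([(A @ X - Z @ B).ravel() for A, B in zip(A_list, B_list)])
    result = least_squares(residual, x0=np.zeros(12))
    return _to_matrix(result.x[:6]), _to_matrix(result.x[6:])
```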
After the calibration matrices X and Z are obtained preliminarily, other useful information can be used to obtain a more accurate estimate. Up to this point we have assumed that the A and B matrices are known data, but in practice they contain errors. In particular, A is computed from the image, so it suffers large errors when the image data are blurred, and the predicted depth values are often not accurate enough when the calibration plate is small. To address these problems, we propose the following optimization.
In the optimization objective, whose residual is denoted E3, i denotes each position to which the robot moves and j denotes each point on the calibration plate; X_ij is the three-dimensional coordinate of the j-th calibration-plate point at the i-th position, and X_j is the position of the j-th point in the world coordinate system, i.e. the calibration plate coordinate system.
Theoretically, the value of E3 should be 0, so a nonlinear optimization algorithm can be used to refine Z, the positional relationship of the gripper to the camera, and obtain a more accurate Z matrix.
While the invention has been described in connection with a preferred embodiment, it is not intended to limit the invention, and it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the spirit and scope of the invention.

Claims (7)

1. An image searching method, characterized in that it comprises: reading an image example to be searched, and performing a consistency optimization process that reduces the adverse effects of strong exposure and/or shaking, to obtain an enhanced image Q; calculating the image features of the enhanced image Q; calculating the hue similarity, saturation similarity and brightness similarity between the enhanced image Q and reference image feature information stored in an image feature database; multiplying the hue similarity, saturation similarity and brightness similarity by their corresponding weights and summing them to generate a similarity result; finally judging, from the similarity result, whether the two images are similar; and, if the judgment is yes, outputting the information corresponding to the reference image as the search result.
2. The image searching method as claimed in claim 1, characterized in that the image features include: hue invariant moments, saturation invariant moments, brightness invariant moments, and key hue levels, key saturation levels and key brightness levels.
3. The image searching method as claimed in claim 1, characterized in that the consistency optimization algorithm is as follows:
1) decomposing RGB channels of the image I (x, y) to obtain sub-band coefficients of each scale and direction of each channel image;
2) obtaining the final estimated illumination map from the low-frequency subband coefficients according to formulas (1) and (2);
3) shrinking the high-frequency directional subband coefficients of each color channel according to formulas (3) to (6) to obtain the processed coefficients;
4) performing the inverse transform according to formula (7), using the final estimated illumination map and the shrunk coefficients, to obtain the enhanced image Q.
4. The image searching method as claimed in claim 1, characterized in that the image example to be searched is a calibration plate pattern, and the calibration board has various patterns.
5. A computer program implementing the image searching method as claimed in claim 1.
6. A storage medium storing the computer program of claim 5.
7. A robot arm calibration device, characterized in that it comprises:
a camera for taking a calibration image,
and a storage medium storing the computer program of claim 5;
the calibration image of the device is a plurality of calibration patterns.
CN202210489045.4A 2022-05-07 2022-05-07 Image searching method and device Pending CN114998624A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210489045.4A CN114998624A (en) 2022-05-07 2022-05-07 Image searching method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210489045.4A CN114998624A (en) 2022-05-07 2022-05-07 Image searching method and device

Publications (1)

Publication Number Publication Date
CN114998624A true CN114998624A (en) 2022-09-02

Family

ID=83025640

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210489045.4A Pending CN114998624A (en) 2022-05-07 2022-05-07 Image searching method and device

Country Status (1)

Country Link
CN (1) CN114998624A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115890654A (en) * 2022-10-09 2023-04-04 北京微链道爱科技有限公司 Depth camera automatic calibration algorithm based on three-dimensional feature points

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110158519A1 (en) * 2009-12-31 2011-06-30 Via Technologies, Inc. Methods for Image Characterization and Image Search
JP2012083873A (en) * 2010-10-08 2012-04-26 Utsunomiya Univ Image retrieval device, image retrieval method, image retrieval program, recording medium and image similarity calculation program
CN106651817A (en) * 2016-11-03 2017-05-10 电子科技大学成都研究院 Non-sampling contourlet-based image enhancement method
JP2017219984A (en) * 2016-06-07 2017-12-14 大日本印刷株式会社 Image retrieval system, image dictionary creation system, image processing system and program
WO2022088039A1 (en) * 2020-10-30 2022-05-05 Harman International Industries, Incorporated Unified calibration between dvs and camera

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110158519A1 (en) * 2009-12-31 2011-06-30 Via Technologies, Inc. Methods for Image Characterization and Image Search
JP2012083873A (en) * 2010-10-08 2012-04-26 Utsunomiya Univ Image retrieval device, image retrieval method, image retrieval program, recording medium and image similarity calculation program
JP2017219984A (en) * 2016-06-07 2017-12-14 大日本印刷株式会社 Image retrieval system, image dictionary creation system, image processing system and program
CN106651817A (en) * 2016-11-03 2017-05-10 电子科技大学成都研究院 Non-sampling contourlet-based image enhancement method
WO2022088039A1 (en) * 2020-10-30 2022-05-05 Harman International Industries, Incorporated Unified calibration between dvs and camera

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
AMY TABB et al.: "Solving the robot-world hand-eye(s) calibration problem with iterative methods", Machine Vision and Applications *
朱诺: "Pedestrian Traffic Safety: Research on Crowd Evacuation Mechanisms Based on Video Monitoring and Cellular Automata", Harbin: Northeast Forestry University Press *
王满利 et al.: "Mine image enhancement algorithm based on the non-subsampled contourlet transform", Journal of China Coal Society *
田鹏飞 et al.: "An optimized robot hand-eye calibration method combined with accuracy compensation", Journal of Xi'an Jiaotong University *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115890654A (en) * 2022-10-09 2023-04-04 北京微链道爱科技有限公司 Depth camera automatic calibration algorithm based on three-dimensional feature points
CN115890654B (en) * 2022-10-09 2023-08-11 北京微链道爱科技有限公司 Depth camera automatic calibration algorithm based on three-dimensional feature points

Similar Documents

Publication Publication Date Title
CN110689579B (en) Rapid monocular vision pose measurement method and measurement system based on cooperative target
Ahmed Comparative study among Sobel, Prewitt and Canny edge detection operators used in image processing
DE112012005350B4 (en) Method of estimating the pose of an object
CN111340797A (en) Laser radar and binocular camera data fusion detection method and system
CN108381549B (en) Binocular vision guide robot rapid grabbing method and device and storage medium
CN110147162B (en) Fingertip characteristic-based enhanced assembly teaching system and control method thereof
CN111553949A (en) Positioning and grabbing method for irregular workpiece based on single-frame RGB-D image deep learning
CN110110618B (en) SAR target detection method based on PCA and global contrast
CN112734844B (en) Monocular 6D pose estimation method based on octahedron
CN110598795A (en) Image difference detection method and device, storage medium and terminal
CN108320310B (en) Image sequence-based space target three-dimensional attitude estimation method
CN111739071A (en) Rapid iterative registration method, medium, terminal and device based on initial value
CN114998624A (en) Image searching method and device
CN110334727B (en) Intelligent matching detection method for tunnel cracks
CN111583342A (en) Target rapid positioning method and device based on binocular vision
CN108992033B (en) Grading device, equipment and storage medium for vision test
CN112184785B (en) Multi-mode remote sensing image registration method based on MCD measurement and VTM
CN111681271B (en) Multichannel multispectral camera registration method, system and medium
CN112819935A (en) Method for realizing three-dimensional reconstruction of workpiece based on binocular stereo vision
CN116486092A (en) Electromagnetic probe calibration piece identification method based on improved Hu invariant moment
CN115034577A (en) Electromechanical product neglected loading detection method based on virtual-real edge matching
CN112233176A (en) Target posture measurement method based on calibration object
CN111583317B (en) Image alignment method and device and terminal equipment
CN113706620B (en) Positioning method, positioning device and movable platform based on reference object
CN112668585B (en) Object identification and positioning method in dynamic environment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination