CN109766896B - Similarity measurement method, device, equipment and storage medium - Google Patents

Similarity measurement method, device, equipment and storage medium

Info

Publication number
CN109766896B
Authority
CN
China
Prior art keywords
grid
virtual image
image
determining
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811418275.1A
Other languages
Chinese (zh)
Other versions
CN109766896A (en)
Inventor
胡志恒
宋翔
杨小平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SF Technology Co Ltd
Original Assignee
SF Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SF Technology Co Ltd filed Critical SF Technology Co Ltd
Priority to CN201811418275.1A priority Critical patent/CN109766896B/en
Publication of CN109766896A publication Critical patent/CN109766896A/en
Application granted
Publication of CN109766896B publication Critical patent/CN109766896B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The application discloses a similarity measurement method, a device, equipment and a storage medium. The method comprises the following steps: determining a virtual image of a target according to a three-dimensional model of the target and camera parameters; determining a feature weighting matrix for emphasizing the outer contour of the target in the virtual image; dividing the virtual image and the real image of the target into a plurality of grids with fixed scales respectively; determining a feature vector of each grid in the virtual image and the real image based on the feature weighting matrix; and measuring the similarity of the virtual image and the real image based on the feature vector of each grid of the virtual image and the real image. The technical scheme solves the problem that conventional similarity measurement methods cannot be used to measure the similarity between a virtual image and a real image.

Description

Similarity measurement method, device, equipment and storage medium
Technical Field
The present disclosure relates generally to the field of image processing technologies, and in particular, to a similarity measurement method, apparatus, device, and storage medium.
Background
In the construction of intelligent airports, a basic requirement is that the attitude and position of an aircraft in the airport be determined from surveillance video; only once this information is obtained can other intelligent applications be built on top of it. One feasible scheme for realizing this function is to use the internal and external parameters of the camera to back-project a 3D model of the aircraft into the image to form a virtual aircraft, and then compare the virtual aircraft with the actual aircraft in the photograph; the projection with the highest similarity carries the projection parameters that give the required pose and position of the aircraft. Whether this scheme is feasible therefore depends entirely on whether the similarity measure is valid and stable.
Common similarity evaluation methods in the field of image processing include a histogram-based method, a feature point-based method, and a structural similarity-based method.
The principle of the histogram-based method is to compute a histogram of the gray values in an image and then measure similarity according to a chosen distance metric.
The feature point-based method extracts a number of feature points from an image, so that similarity measurement is converted into feature point matching. However, this method is relatively slow to compute; when the target image is blurred, feature points may fail to be detected, and mismatches may also occur.
The structural-similarity-based method evaluates an image in terms of brightness, contrast and structure to obtain a similarity index, but performs poorly when the image undergoes translation, scaling or rotation.
In the application scenario of an airport, slight deviation of the camera position is unavoidable during calibration and use, owing to uncontrollable factors such as manual installation or long-term operation. As a result, when the aircraft model is back-projected into the image using the parameters calibrated at installation time, errors arise between the virtual imaging and the actual imaging. This requires the similarity measure to be invariant to translation, scaling and rotation, otherwise mismatches may occur. Histogram-based methods have translation, scaling and rotation invariance, but lose spatial position information and therefore cannot meet the application requirements. Feature point-based methods also satisfy these three properties, but require clear image detail, which is not available because the projection model can only reflect the outer contour and contains no interior detail. Structural-similarity-based methods lack translation, scaling and rotation invariance and likewise cannot meet the requirements. In addition, a significant factor is that the texture of the virtual image projected from the model differs greatly from that of the actual image, so that conventional similarity measurement methods cannot meet the requirement.
Disclosure of Invention
In view of the above-described drawbacks or shortcomings of the prior art, it is desirable to provide a scheme that can measure the similarity between a virtual image and a real image.
In a first aspect, an embodiment of the present application provides a similarity measurement method, where the method includes:
determining a virtual image of a target according to a three-dimensional model of the target and camera parameters;
generating a feature weighting matrix for strengthening the external contour of the target in the virtual image according to the virtual image;
dividing the virtual image and the real image of the target into a plurality of grids with fixed scales respectively;
determining a feature vector of each grid in the virtual image and the real image based on the feature weighting matrix;
and measuring the similarity of the virtual image and the real image based on the feature vector of each grid of the virtual image and the real image.
In a second aspect, an embodiment of the present application provides a similarity measurement apparatus, including:
the virtual image determining module is used for determining a virtual image of the target according to the three-dimensional model of the target and the camera parameters;
the characteristic weighting matrix generation module is used for generating a characteristic weighting matrix for strengthening the external contour of the target in the virtual image according to the virtual image;
the grid dividing module is used for dividing the virtual image and the real image of the target into a plurality of grids with fixed scales respectively;
a feature vector determining module, configured to determine a feature vector of each grid in the virtual image and the real image based on the feature weighting matrix;
and the similarity measurement module is used for measuring the similarity of the virtual image and the real image based on the characteristic vector of each grid of the virtual image and the real image.
In a third aspect, an embodiment of the present application provides an apparatus, including: at least one processor, at least one memory, and computer program instructions stored in the memory that when executed by the processor implement the similarity measurement method as described above.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement a similarity measure method as described above.
According to the similarity measurement scheme provided by the embodiments of the application, the outer contour of the target is enhanced using the feature weighting matrix, which reduces the adverse effects caused by the inconsistent textures and backgrounds of the virtual image and the real image; the feature vector of each grid in the virtual image and the real image is then determined based on the feature weighting matrix, and the similarity of the virtual image and the real image is calculated from these feature vectors. This technical scheme solves the problem that conventional similarity measurement methods cannot be used to measure the similarity between a virtual image and a real image.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the accompanying drawings in which:
fig. 1 is a schematic flow chart of a similarity measurement method according to an embodiment of the present application;
fig. 2 is a block diagram of a similarity measurement device according to an embodiment of the present application;
FIG. 3 illustrates a schematic diagram of a computer system 300 suitable for use in implementing a server of an embodiment of the present application.
Detailed Description
The application is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be noted that, for convenience of description, only the portions related to the application are shown in the drawings.
It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other. The application will be described in detail below with reference to the drawings in connection with embodiments.
Referring to fig. 1, a flow chart of a similarity measurement method according to an embodiment of the present application is shown.
As shown in fig. 1, the similarity measurement method includes the following steps:
step 110, determining a virtual image of the target according to the three-dimensional model of the target and the camera parameters.
In practical applications, the similarity measurement method provided by the embodiments of the application can be applied to the process of determining the attitude and position of an aircraft in an airport. The target may be, but is not limited to, an aircraft, and the three-dimensional model of the aircraft may be constructed with three-dimensional modeling software according to the aircraft drawings. The camera may be a camera in the airport that monitors aircraft, and the camera parameters may include internal parameters and external parameters.
The virtual image of the target is determined from the three-dimensional model of the target and the camera parameters; it is in fact the image obtained by back-projecting the three-dimensional model of the target. The implementation is a known technique and is not described further here.
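For illustration only, a minimal sketch of such a back-projection is shown below using OpenCV's projectPoints; the sparse point rendering, the matrix layout and the function name are assumptions that stand in for whatever rendering pipeline is actually used, not the patent's implementation.

```python
# Hypothetical sketch: back-project 3D model vertices into the image plane.
# model_points: Nx3 float array; rvec/tvec, camera_matrix, dist_coeffs follow
# the OpenCV convention; image_size is (height, width).
import cv2
import numpy as np

def render_virtual_image(model_points, rvec, tvec, camera_matrix, dist_coeffs, image_size):
    """Project 3D model points into a blank image to form a virtual image."""
    projected, _ = cv2.projectPoints(model_points, rvec, tvec, camera_matrix, dist_coeffs)
    virtual = np.zeros(image_size, dtype=np.uint8)
    for (u, v) in projected.reshape(-1, 2):
        u, v = int(round(u)), int(round(v))
        if 0 <= v < image_size[0] and 0 <= u < image_size[1]:
            virtual[v, u] = 255   # mark projected model points
    return virtual
```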
Step 120, determining a feature weighting matrix for enhancing the outer contour of the object in the virtual image.
In the embodiment of the application, the external contour of the target can be extracted from the virtual image to obtain the binary edge image containing the external contour, and then the Gaussian blur calculation is performed on the binary edge image to generate the feature weighting matrix.
The feature weighting matrix generated according to this method may be denoted W(x, y). It acts on the virtual image and the real image simultaneously, so that the influence of edge position deviation is reduced and the adverse influence of the background and the image texture on similarity evaluation is removed.
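A minimal Python/OpenCV sketch of this step follows: the outer contour is extracted from the virtual image into a binary edge image, which is then Gaussian-blurred to give W(x, y). The thresholding of the virtual image, the kernel size, the sigma and the final scaling to [0, 1] are assumed values, not parameters taken from the patent.

```python
# Sketch of W(x, y), assuming the virtual image is a grayscale image that is
# zero outside the target.
import cv2
import numpy as np

def feature_weighting_matrix(virtual_image, ksize=15, sigma=5.0):
    _, mask = cv2.threshold(virtual_image, 0, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    edge = np.zeros_like(mask)
    cv2.drawContours(edge, contours, -1, 255, thickness=1)    # binary edge image
    W = cv2.GaussianBlur(edge.astype(np.float32), (ksize, ksize), sigma)
    return W / W.max() if W.max() > 0 else W                  # weights in [0, 1]
```

Blurring the edge image spreads the contour weight over a small neighbourhood, which is what makes the measure tolerant of small edge position deviations between the virtual and real images.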
Step 130, dividing the virtual image and the real image of the target into a plurality of fixed-scale grids, respectively.
The real image of the target is obtained by shooting the target by using the camera.
When dividing the virtual image and the real image of the target into grids, the remaining region that cannot be covered by whole grids can be padded according to the edge pixels of the virtual image and the real image and then covered.
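A sketch of the fixed-scale grid division is given below, assuming a grid size of 32 pixels; replicating edge pixels with cv2.BORDER_REPLICATE is one way to read "padded according to the edge pixels" above, not necessarily the patent's exact padding rule.

```python
# Sketch: pad the image so its size is a multiple of the grid size, then cut it
# into cell x cell grids in row-major order.
import cv2

def split_into_grids(image, cell=32):
    h, w = image.shape[:2]
    pad_b = (cell - h % cell) % cell
    pad_r = (cell - w % cell) % cell
    padded = cv2.copyMakeBorder(image, 0, pad_b, 0, pad_r, cv2.BORDER_REPLICATE)
    H, W = padded.shape[:2]
    return [padded[y:y + cell, x:x + cell]
            for y in range(0, H, cell)
            for x in range(0, W, cell)]
```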
Step 140, determining a feature vector for each grid in the virtual image and the real image based on the feature weighting matrix.
Specifically, for each grid, performing:
firstly, calculating a gradient edge value of each pixel point in a grid;
specifically, gradient edge values of each pixel point in the X direction and the Y direction are calculated through the formula (1) and the formula (2) respectively. Wherein, the X direction is: in the horizontal direction in the coordinate system with the upper left corner fixed point of the grid as the origin, the Y direction is: in a vertical direction in a coordinate system with the upper left corner of the grid pointing as the origin.
$G_x(x,y) = P_i(x+1,y) - P_i(x-1,y)$ (1)
$G_y(x,y) = P_i(x,y+1) - P_i(x,y-1)$ (2)
where $P_i$ denotes the pixel values of the pixel points in the ith grid of the image; for example, $P_i(x+1, y)$ is the pixel value of the pixel point with coordinates $(x+1, y)$ in the ith grid.
Then, calculating the gradient direction of each pixel point in the grid based on the gradient edge value of each pixel point in the grid;
specifically, the gradient direction of each pixel point in the grid is calculated through the formula (3).
where the total direction interval of $A_i(x, y)$ is $[0°, 180°]$. This is because the brightness contrast across the two sides of an edge in the virtual image is unrelated to that in the real image, so the gradient directions of the real image and the virtual image at the edge may be opposite; the total direction range is therefore mapped from $[0°, 360°]$ to $[0°, 180°]$.
Finally, determining the feature vector of the grid based on the feature weighting matrix and the gradient direction of each pixel point in the grid.
Specifically, determining the feature vector of the grid may be implemented according to the following steps:
1. According to the feature weighting matrix and the gradient direction of each pixel point in the grid, histogram statistics are performed on the gradient directions of all the pixel points in the grid over each direction interval, obtaining an n-dimensional vector corresponding to the grid;
wherein the total direction interval of the gradient direction is $[0°, 180°]$, and the gradient direction is divided evenly into n direction intervals;
each component in the n-dimensional vector may be determined according to equation (4).
where $V_i^j$ is the jth component of the n-dimensional vector corresponding to the ith grid, $P_i$ is the image in the ith grid, and $w_i(x, y)$ is the feature weighting matrix value corresponding to the pixel point with coordinates $(x, y)$ in the ith grid; the gradient direction of the pixel point with coordinates $(x, y)$ in the ith grid within the jth direction interval also enters formula (4); j ranges from 0 to n, and n is an integer greater than 0;
n in the embodiments of the present application may be, but is not limited to, 6.
2. Each component in the n-dimensional vector is normalized to obtain the processed feature vector.
Specifically, each component in the n-dimensional vector may be normalized according to the formula (5), to obtain a processed feature vector.
where $V_i^{j'}$ is the jth component of the processed feature vector.
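A consolidated Python sketch of the per-grid computations of formulas (1) through (5) is given below. Since formulas (3) and (5) are not reproduced above, the arctan2 form of the gradient direction and the L1 normalization are assumptions, as are the edge padding at the grid border and the absence of a gradient-magnitude factor in the histogram vote; this is a sketch, not the patent's implementation.

```python
# Hypothetical per-grid feature vector. cell: grayscale pixels of one grid;
# W_cell: the matching slice of the feature weighting matrix W(x, y).
import numpy as np

def grid_feature_vector(cell, W_cell, n=6, eps=1e-12):
    P = cell.astype(np.float32)
    padded = np.pad(P, 1, mode="edge")                 # assumed border handling
    Gx = padded[1:-1, 2:] - padded[1:-1, :-2]          # formula (1): P(x+1,y) - P(x-1,y)
    Gy = padded[2:, 1:-1] - padded[:-2, 1:-1]          # formula (2): P(x,y+1) - P(x,y-1)

    angle = np.degrees(np.arctan2(Gy, Gx))             # assumed form of formula (3)
    angle = np.mod(angle, 180.0)                       # fold [0, 360) into [0, 180)

    # Weighted direction histogram: each pixel votes into one of n bins with
    # its weight from W(x, y).
    bins = np.minimum((angle / (180.0 / n)).astype(int), n - 1)
    V = np.array([W_cell[bins == j].sum() for j in range(n)], dtype=np.float32)

    return V / (V.sum() + eps)                         # assumed L1 normalization
```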
And 150, measuring the similarity of the virtual image and the real image based on the feature vector of each grid of the virtual image and the real image.
The process may be implemented as follows:
firstly, determining all first grids containing the target from the grids of the virtual image, and determining, from the grids of the real image, all second grids corresponding one-to-one in position to the first grids;
and then, carrying out similarity measurement on the virtual image and the real image according to the feature vector of the first grid and the feature vector of the second grid.
Specifically, according to the feature vector of the first grid and the feature vector of the second grid, determining cosine similarity between each pair of the first grid and the second grid;
wherein, cosine similarity formula
Wherein V1 i V2 as the feature vector of the ith first grid i And n is the dimension of the feature vector, which is the feature vector of the ith second grid.
Then determining the similarity between the virtual image and the real image according to a formula (6);
wherein N is the total number of grids in the virtual image, and M is the number of grids in the virtual image which do not contain the target; cos i θ is cosine similarity between the ith grid in the virtual image and the ith grid in the real image, and Total is similarity between the virtual image and the real image.
According to the similarity measurement scheme provided by the embodiments of the application, the outer contour of the target is enhanced using the feature weighting matrix, which reduces the adverse effects caused by the inconsistent textures and backgrounds of the virtual image and the real image; the feature vector of each grid in the virtual image and the real image is then determined based on the feature weighting matrix, and the similarity of the virtual image and the real image is calculated from these feature vectors. This technical scheme solves the problem that conventional similarity measurement methods cannot be used to measure the similarity between a virtual image and a real image.
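To show how the steps fit together, the sketch below chains the hypothetical helpers from the earlier sketches (feature_weighting_matrix, split_into_grids, grid_feature_vector, total_similarity), assuming the virtual image is the grayscale back-projection from the first sketch and the real image is a BGR photograph; all names and the target test are illustrative assumptions.

```python
# End-to-end sketch: the same W(x, y) is applied to the position-aligned grids
# of both the virtual and the real image, and only target-containing grids vote.
import cv2

def measure_similarity(virtual_image, real_image, cell=32, n=6):
    W = feature_weighting_matrix(virtual_image)
    W_cells = split_into_grids(W, cell)
    v_cells = split_into_grids(virtual_image, cell)
    r_cells = split_into_grids(cv2.cvtColor(real_image, cv2.COLOR_BGR2GRAY), cell)

    target_mask = [vc.max() > 0 for vc in v_cells]    # grid contains the target
    v_vecs = [grid_feature_vector(vc, wc, n) for vc, wc in zip(v_cells, W_cells)]
    r_vecs = [grid_feature_vector(rc, wc, n) for rc, wc in zip(r_cells, W_cells)]
    return total_similarity(v_vecs, r_vecs, target_mask)
```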
It should be noted that although the operations of the method of the present application are depicted in the drawings in a particular order, this does not require or imply that the operations must be performed in that particular order or that all of the illustrated operations be performed in order to achieve desirable results. Rather, the steps depicted in the flowcharts may change the order of execution. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step to perform, and/or one step decomposed into multiple steps to perform.
With further reference to fig. 2, a block diagram of a similarity measurement device according to an embodiment of the present application is shown.
As shown in fig. 2, the similarity measurement apparatus includes:
a virtual image determining module 21, configured to determine a virtual image of a target according to a three-dimensional model of the target and camera parameters;
a feature weighting matrix generation module 22, configured to determine a feature weighting matrix for emphasizing the outer contour of the target in the virtual image;
a grid dividing module 23, configured to divide the virtual image and the real image of the target into a plurality of grids with fixed scales, respectively;
a feature vector determining module 24, configured to determine a feature vector of each grid in the virtual image and the real image based on the feature weighting matrix;
a similarity measurement module 25, configured to measure similarity between the virtual image and the real image based on the feature vector of each grid of the virtual image and the real image.
It should be understood that the units or modules described in the above apparatus correspond to the individual steps in the method described with reference to fig. 1. Thus, the operations and features described above with respect to the method are equally applicable to the apparatus described above and the units contained therein, and are not described in detail herein.
Referring now to FIG. 3, there is shown a schematic diagram of a computer system suitable for use in implementing a server of an embodiment of the present application.
As shown in fig. 3, the computer system includes a Central Processing Unit (CPU) 301 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 302 or a program loaded from a storage section 308 into a Random Access Memory (RAM) 303. In the RAM 303, various programs and data required for the operation of the system 300 are also stored. The CPU 301, ROM 302, and RAM 303 are connected to each other through a bus 304. An input/output (I/O) interface 305 is also connected to bus 304.
The following components are connected to the I/O interface 305: an input section 306 including a keyboard, a mouse, and the like; an output section 307 including a cathode ray tube (CRT) or liquid crystal display (LCD), a speaker, and the like; a storage section 308 including a hard disk or the like; and a communication section 309 including a network interface card such as a LAN card or a modem. The communication section 309 performs communication processing via a network such as the Internet. A drive 310 is also connected to the I/O interface 305 as needed. A removable medium 311, such as a magnetic disk, an optical disk, a magneto-optical disk or a semiconductor memory, is mounted on the drive 310 as needed, so that a computer program read therefrom is installed into the storage section 308 as needed.
In particular, according to embodiments of the present disclosure, the process described above with reference to fig. 1 may be implemented as a computer software program. For example, embodiments of the present disclosure include a computer program product comprising a computer program tangibly embodied on a machine-readable medium, the computer program comprising program code for performing the method of fig. 1. In such an embodiment, the computer program may be downloaded and installed from a network via the communication portion 309, and/or installed from the removable medium 311.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units or modules involved in the embodiments of the present application may be implemented in software or in hardware. The described units or modules may also be provided in a processor, for example described as: a processor includes a virtual image determining module, a feature weighting matrix generation module, a grid dividing module, a feature vector determining module, and a similarity measurement module. The names of these units or modules do not in any way limit the units or modules themselves; for example, the similarity measurement module may also be described as "a unit for measuring similarity".
As another aspect, the present application also provides a computer-readable storage medium, which may be a computer-readable storage medium contained in the apparatus described in the above embodiment; or may be a computer-readable storage medium, alone, that is not assembled into a device. The computer-readable storage medium stores one or more programs that, when executed by one of the electronic devices, cause the electronic devices to implement the similarity measurement method as described in the above embodiments.
The above description is only illustrative of the preferred embodiments of the present application and of the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the application is not limited to the specific combinations of the technical features described above, but also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept, for example, solutions in which the above features are replaced with technical features having similar functions disclosed in (but not limited to) the present application.

Claims (8)

1. A method of similarity measurement, the method comprising:
determining a virtual image of a target according to a three-dimensional model of the target and camera parameters;
determining a feature weighting matrix for emphasizing an outer contour of the object in the virtual image;
dividing the virtual image and the real image of the target into a plurality of grids with fixed scales respectively;
determining a feature vector of each grid in the virtual image and the real image based on the feature weighting matrix;
based on the feature vector of each grid of the virtual image and the real image, measuring the similarity of the virtual image and the real image;
wherein determining feature vectors for images within each grid in the virtual image and the real image based on the feature weighting matrix comprises:
for each grid, performing:
calculating a gradient edge value of each pixel point in the grid;
calculating the gradient direction of each pixel point in the grid based on the gradient edge value of each pixel point in the grid;
determining a feature vector of the grid based on the feature weighting matrix and the gradient direction of each pixel point in the grid;
determining the feature vector of the grid based on the feature weighting matrix and the gradient direction of each pixel point in the grid, including:
according to the characteristic weighting matrix and the gradient direction of each pixel point in the grid, carrying out histogram statistics on the gradient directions of all the pixel points in the grid in each direction interval to obtain n-dimensional vectors corresponding to the grid;
wherein the total direction interval of the gradient direction is [0°, 180°], and the gradient direction is divided evenly into n direction intervals; each component in the n-dimensional vector is represented by a formula in which $V_i^j$ is the jth component of the n-dimensional vector corresponding to the ith grid, $P_i$ is the image in the ith grid, and $w_i(x, y)$ is the feature weighting matrix value corresponding to the pixel point with coordinates $(x, y)$ in the ith grid; the gradient direction of the pixel point with coordinates $(x, y)$ in the ith grid within the jth direction interval also enters the formula; the value range of j is 0 to n, and n is an integer greater than 0;
and carrying out normalization processing on each component in the n-dimensional vector to obtain a processed feature vector.
2. The method of claim 1, wherein determining a feature weighting matrix for emphasizing an outer contour of the object in the virtual image comprises:
extracting the external contour of the target from the virtual image to obtain a binary edge image containing the external contour;
and carrying out Gaussian blur calculation on the binary edge image to generate the characteristic weighting matrix.
3. The method of claim 1, wherein normalizing each component in the n-dimensional vector results in a processed feature vector, comprising:
according to the formulaCarrying out normalization processing on each component in the n-dimensional vector to obtain a processed feature vector; wherein V is i j ' is the j-th component in the processed feature vector.
4. The method of claim 1, wherein performing a similarity metric on the virtual image and the real image based on the feature vector of each grid of the virtual image and the real image comprises:
determining all first grids containing the target from grids of the virtual image, and determining all second grids corresponding to the first grids in position one by one from grids of the real image;
and carrying out similarity measurement on the virtual image and the real image according to the feature vector of the first grid and the feature vector of the second grid.
5. The method of claim 4, wherein performing a similarity measure on the virtual image and the real image based on the feature vector of the first grid and the feature vector of the second grid comprises:
according to the feature vector of the first grid and the feature vector of the second grid, determining cosine similarity between each pair of the first grid and the second grid;
according to the formulaDetermining a similarity between the virtual image and the real image; wherein N is the total number of meshes in the virtual image, and M is the number of meshes in the virtual image that do not contain the target; the cos i θ is cosine similarity between the ith grid in the virtual image and the ith grid in the real image, and Total is similarity between the virtual image and the real image.
6. A similarity measurement apparatus, the apparatus comprising:
the virtual image determining module is used for determining a virtual image of the target according to the three-dimensional model of the target and the camera parameters;
a feature weight matrix generation module for determining a feature weight matrix for enhancing an external contour of the object in the virtual image;
the grid dividing module is used for dividing the virtual image and the real image of the target into a plurality of grids with fixed scales respectively;
a feature vector determining module, configured to determine a feature vector of each grid in the virtual image and the real image based on the feature weighting matrix;
and the similarity measurement module is used for measuring the similarity of the virtual image and the real image based on the characteristic vector of each grid of the virtual image and the real image.
7. A similarity measurement apparatus, comprising: at least one processor, at least one memory, and computer program instructions stored in the memory, which when executed by the processor, implement the method of any one of claims 1-5.
8. A computer readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the method of any of claims 1-5.
CN201811418275.1A 2018-11-26 2018-11-26 Similarity measurement method, device, equipment and storage medium Active CN109766896B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811418275.1A CN109766896B (en) 2018-11-26 2018-11-26 Similarity measurement method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811418275.1A CN109766896B (en) 2018-11-26 2018-11-26 Similarity measurement method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN109766896A CN109766896A (en) 2019-05-17
CN109766896B true CN109766896B (en) 2023-08-29

Family

ID=66449108

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811418275.1A Active CN109766896B (en) 2018-11-26 2018-11-26 Similarity measurement method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN109766896B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111126254A (en) * 2019-12-23 2020-05-08 Oppo广东移动通信有限公司 Image recognition method, device, equipment and storage medium
CN111292268B (en) * 2020-02-07 2023-07-25 抖音视界有限公司 Image processing method, device, electronic equipment and computer readable storage medium
CN112733895B (en) * 2020-12-30 2024-03-15 杭州海康威视数字技术股份有限公司 Method, device and computer storage medium for determining image similarity
CN113570726B (en) * 2021-08-10 2024-06-11 中海油田服务股份有限公司 Multi-electric button while-drilling electric imaging image generation method and device and computing equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100468465C (en) * 2007-07-13 2009-03-11 中国科学技术大学 Stereo vision three-dimensional human face modelling approach based on dummy image
CN100516776C (en) * 2007-11-06 2009-07-22 北京航空航天大学 Road network model based on virtual nodes
EP3679881A1 (en) * 2012-08-14 2020-07-15 Intuitive Surgical Operations, Inc. Systems and methods for registration of multiple vision systems

Also Published As

Publication number Publication date
CN109766896A (en) 2019-05-17

Similar Documents

Publication Publication Date Title
CN109766896B (en) Similarity measurement method, device, equipment and storage medium
CN107679537B (en) A kind of texture-free spatial target posture algorithm for estimating based on profile point ORB characteristic matching
EP3576017A1 (en) Method, apparatus, and device for determining pose of object in image, and storage medium
CN111192293B (en) Moving target pose tracking method and device
CN112489099B (en) Point cloud registration method and device, storage medium and electronic equipment
CN112946679B (en) Unmanned aerial vehicle mapping jelly effect detection method and system based on artificial intelligence
CN117422884A (en) Three-dimensional target detection method, system, electronic equipment and storage medium
CN112561986A (en) Secondary alignment method, device, equipment and storage medium for inspection robot holder
CN114511661A (en) Image rendering method and device, electronic equipment and storage medium
CN117197388A (en) Live-action three-dimensional virtual reality scene construction method and system based on generation of antagonistic neural network and oblique photography
CN113592706B (en) Method and device for adjusting homography matrix parameters
CN114926316A (en) Distance measuring method, distance measuring device, electronic device, and storage medium
CN117115358B (en) Automatic digital person modeling method and device
CN116912645A (en) Three-dimensional target detection method and device integrating texture and geometric features
CN111126508A (en) Hopc-based improved heterogeneous image matching method
CN115409949A (en) Model training method, visual angle image generation method, device, equipment and medium
CN113670268B (en) Binocular vision-based unmanned aerial vehicle and electric power tower distance measurement method
CN117011324A (en) Image processing method, device, electronic equipment and storage medium
CN117252914A (en) Training method and device of depth estimation network, electronic equipment and storage medium
Wang et al. An airlight estimation method for image dehazing based on gray projection
Zhang et al. Texture feature-based local adaptive Otsu segmentation and Hough transform for sea-sky line detection
CN116416290A (en) Method and device for calculating speckle structure light depth and electronic equipment
CN116434316B (en) Identity recognition method, device, equipment and medium based on X86 industrial control main board
Zhang et al. Edge Detection
CN108364013A (en) Image key points feature descriptor extracting method, system based on the distribution of neighborhood gaussian derivative

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant