CN112070810B - Positioning method, mobile device, and computer-readable storage medium

Positioning method, mobile device, and computer-readable storage medium

Info

Publication number
CN112070810B
CN112070810B (application number CN202010899193.4A)
Authority
CN
China
Prior art keywords
image
frequency domain
ground texture
map point
registration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010899193.4A
Other languages
Chinese (zh)
Other versions
CN112070810A (en)
Inventor
杨冬冬
王磊
于非
缪寅明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Aiguan Vision Technology Co ltd
Original Assignee
Anhui Aiguan Vision Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui Aiguan Vision Technology Co ltd filed Critical Anhui Aiguan Vision Technology Co ltd
Priority to CN202010899193.4A priority Critical patent/CN112070810B/en
Publication of CN112070810A publication Critical patent/CN112070810A/en
Application granted granted Critical
Publication of CN112070810B publication Critical patent/CN112070810B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/40 Analysis of texture
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 Complex mathematical operations
    • G06F17/14 Fourier, Walsh or analogous domain transformations, e.g. Laplace, Hilbert, Karhunen-Loeve, transforms
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/757 Matching configurations of points or features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/759 Region-based matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Databases & Information Systems (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Analysis (AREA)
  • Pure & Applied Mathematics (AREA)
  • Software Systems (AREA)
  • Mathematical Optimization (AREA)
  • Computational Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Computing Systems (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Algebra (AREA)
  • Navigation (AREA)
  • Image Processing (AREA)

Abstract

The embodiment of the application discloses a positioning method comprising the following steps: during the movement of a movable device, N pieces of registration information corresponding to the map point at the current position are acquired, a ground texture image at the current position is acquired, and the N pieces of registration information and the ground texture image are matched and registered in the frequency domain to obtain N groups of registration parameters. Finally, the current position is corrected according to the N groups of registration parameters to obtain the global position corresponding to the ground texture image. By implementing the method and the device, the problem in the prior art that position matching and positioning cannot be achieved under high-speed motion or when the position deviates greatly from the map can be solved.

Description

Positioning method, mobile device, and computer-readable storage medium
Technical Field
The present disclosure relates to the field of terminal technologies, and in particular, to a positioning method, a mobile device, and a computer readable storage medium.
Background
With the rapid development of artificial intelligence technology, mobile robots are increasingly widely used. Positioning, as a basic module of a mobile robot, is a key technology for realizing automatic navigation. Common positioning methods for mobile robots include magnetic-stripe navigation positioning, inertial navigation positioning, laser navigation positioning and visual navigation positioning.
Visual navigation positioning methods further include visual simultaneous localization and mapping (VSLAM), two-dimensional code navigation and ground-texture-based navigation. The VSLAM method has high requirements on the environment, has low positioning accuracy in feature-scarce or dynamic environments, and is prone to losing the position. Two-dimensional code navigation requires codes to be deployed and is not suitable for large-scale scenes; moreover, it is not robust when the codes are worn or occluded.
Existing ground texture navigation positioning methods require a certain overlap ratio between the image to be matched and the reference image. Under this limitation, when a large positioning error occurs or the mobile robot moves at high speed, the acquired image to be matched and the reference image have a low overlap ratio or may not overlap at all, and matching failure easily occurs. Therefore, a method is needed that can correctly perform position matching under high-speed motion or when the deviation from the map points is large.
Disclosure of Invention
The embodiment of the application provides a positioning method, a positioning device, a mobile device and a computer-readable storage medium, which can solve the problem in the prior art that position matching and positioning cannot be achieved under high-speed motion or when the position deviates greatly from the map.
In a first aspect, a positioning method is provided, the method comprising: in the moving process of the movable equipment, N pieces of registration information corresponding to map points at the current position are acquired, wherein the N pieces of registration information are in one-to-one correspondence with N map point images; acquiring a ground texture image of the current position; matching and registering the N registration information and the ground texture image on a frequency domain to obtain N groups of registration parameters; and correcting the current position according to the N groups of registration parameters to obtain a global position corresponding to the ground texture image.
In some embodiments, the matching and registration of the N pieces of registration information and the ground texture image in the frequency domain includes: preprocessing the ground texture image to obtain a first original image f2 and a second frequency-domain image F3 of the ground texture image; and performing texture registration between f2 and F3 of the ground texture image and the N pieces of registration information corresponding to the N map point images to obtain the N groups of registration parameters. The registration information of a map point image comprises the global position of that image, a first frequency-domain image F0 and a second frequency-domain image F3, where F0 and F3 are obtained by preprocessing the corresponding map point image; they are stored in the map in advance and do not need to be computed during registration.
In some embodiments, performing texture registration between f2 and F3 of the ground texture image and the N pieces of registration information corresponding to the N map point images to obtain N groups of registration parameters includes:
performing phase correlation between F3 of the ground texture image and the second frequency-domain images F3 of the N map point images to obtain the peak coordinates (p_x, p_y) of N cross power spectra;
calculating, from the N peak coordinates (p_x, p_y), the rotation angle of the ground texture image relative to each of the N map point images;
rotating f2 of the ground texture image by the rotation angle corresponding to each of the N map point images and applying a Fourier transform to obtain N third frequency-domain images F4;
performing phase correlation between the N third frequency-domain images F4 and the first frequency-domain images F0 of the N map point images to obtain N cross-power-spectrum peak coordinates (r_x, r_y) and N response values, where the first frequency-domain image F0 is obtained by preprocessing the corresponding map point image;
calculating N translation amounts from the N peak coordinates (r_x, r_y) and the preset pixel resolution;
and calculating N image overlap ratios from the rotation angles and translation amounts of the ground texture image relative to the N map point images.
In some embodiments, correcting the current position according to the N groups of registration parameters to obtain the global position corresponding to the ground texture image includes:
selecting at least one target image from the N map point images, where a target image is a map point image whose response value and image overlap ratio in the N groups of registration parameters are both larger than the corresponding thresholds;
assigning a weight to each of the at least one target image according to its response value and computing the weighted average over the at least one target image to obtain the relative position of the ground texture image with respect to the map point;
and correcting the global position of the map point according to the relative position to obtain the global position corresponding to the ground texture image.
In some embodiments, the positioning method further comprises: correcting the global position corresponding to the ground texture image according to the moving speed of the movable device, the acquisition time of the ground texture image and the current time, to obtain the global position of the movable device.
In some embodiments, the positioning method further comprises:
loading a global map comprising at least one map point when the mobile device is started;
acquiring an initial position of the movable equipment, and determining map points corresponding to the initial position from a global map;
and matching and registering N registration information corresponding to N map point images contained in the map points corresponding to the initial position with the ground texture image of the initial position to obtain the initial global position of the movable equipment.
In a second aspect, a positioning device is provided, operable to perform the method of the first aspect or any optional embodiment of the first aspect. The functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more units corresponding to the functions described above. A unit may be software and/or hardware.
In a third aspect, a removable device is provided, comprising: a processor and a memory coupled to the processor, wherein the memory includes computer-readable instructions, and the processor is configured to execute the computer-readable instructions in the memory to cause the removable device to perform the method of the first aspect or any optional embodiment of the first aspect.
In a fourth aspect, there is provided a computer program product which, when run on a computer, causes the computer to perform the method of the first aspect or any of the alternative embodiments of the first aspect.
In a fifth aspect, a chip product is provided, performing the method of the first aspect or any of the alternative embodiments of the first aspect.
In a seventh aspect, a computer readable storage medium is provided, having instructions stored therein, which when run on a computer, cause the computer to perform the method of the first aspect or any of the alternative embodiments of the first aspect.
Drawings
Fig. 1 is a flow chart of a positioning method according to an embodiment of the present application.
Fig. 2 is a schematic diagram of map points corresponding to a plurality of map point images according to an embodiment of the present application.
Fig. 3 is a schematic flow chart of a preprocessing method according to an embodiment of the present application.
Fig. 4 is a schematic flow chart of image matching according to an embodiment of the present application.
Fig. 5 is a schematic flow chart of texture registration according to an embodiment of the present application.
Fig. 6 is a schematic structural diagram of a positioning device according to an embodiment of the present application.
Fig. 7 is a schematic structural diagram of a mobile device according to an embodiment of the present application.
Detailed Description
Specific embodiments of the present application are described in further detail below with reference to the accompanying drawings.
Fig. 1 is a flow chart of a positioning method provided in the present application. The method shown in fig. 1 comprises the following steps:
s101, the mobile equipment acquires N pieces of registration information corresponding to map points of the current position, and the N pieces of registration information correspond to the N map point images one by one.
The movable device can establish, in advance, a global map related to the ground texture in the movable-device coordinate system; the global map contains the global position corresponding to at least one map point and the registration information of each map point, and is loaded into a database of the movable device. When the movable device is started, the information of all map points can be read from the database into memory. Further, the movable device supports initialization near any map point position: for example, the initial position of an initial map point is input, the bottom-facing camera is triggered to take a picture to obtain the ground texture image of the initial position, and the global position corresponding to this ground texture image is obtained by matching and registering it against the corresponding map point. The global position corresponding to the ground texture image is then taken as the initial global position of the movable device, completing the initialization.
During the movement of the movable device, it can obtain its current position, moving speed, acceleration and other information from an odometer thread. It should be understood that an odometer thread runs inside the movable device, based on, but not limited to, a wheel odometer, an inertial odometer, a front-facing camera visual odometer, or a combination thereof. The movable device obtains from this thread information such as its current position and moving speed.
After the current position of the movable device is obtained, it is determined whether an adjacent map point exists in the global map: specifically, the map point closest to the current position is queried from the map point list, and if the distance between that map point and the current position is smaller than a certain threshold, an adjacent map point is deemed to exist; otherwise, there is no adjacent map point. Further, the movable device can directly obtain from memory the N pieces of registration information corresponding to the map point at the current position; the registration information is described in detail below. An illustrative sketch of this adjacency test follows.
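The sketch below illustrates the adjacency test described above. It is a minimal assumption-laden example: the map_points layout, the field names and the distance threshold are hypothetical, not taken from the patent.

```python
# Hypothetical nearest-map-point lookup; data layout and threshold are assumptions.
import math

def find_adjacent_map_point(map_points, cur_pos, dist_threshold=0.5):
    """map_points: list of dicts such as {"pos": (x, y), "registrations": [...]}."""
    nearest = min(map_points,
                  key=lambda m: math.dist(m["pos"], cur_pos),
                  default=None)
    if nearest is None or math.dist(nearest["pos"], cur_pos) >= dist_threshold:
        return None                      # no adjacent map point: skip this frame
    return nearest                       # its N registration records are reused directly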
It should be noted that the global map may be created by manually measuring or automatically acquiring map points at equal intervals, and N map point images are acquired at each map point, for example in a 1×2, 2×1, 2×2 or 3×3 grid. Referring to fig. 2, a schematic diagram of a map point corresponding to a plurality of map point images is shown: the black dots represent the positions of map points, and each map point corresponds to N map point images. This allows a match to one of the map point images of a map point even when the positioning error of the movable device is large, i.e., when the offset from the map point is far. During high-speed movement of the movable device, increasing the number of map point images taken at each map point, i.e., increasing N, still yields a stable, high-precision positioning result. The N map point images of each map point may be acquired with a monocular camera or with a multi-view camera (including a binocular camera); matching a map point against multiple map point images in either manner falls within the protection scope of this patent.
S102, the movable equipment acquires a ground texture image of the current position.
When the movable device determines that an adjacent map point exists at the current position, the bottom-facing camera can be triggered to capture a ground texture image of the current position.
S103, the mobile device performs matching and registration processing on the N map point images and the ground texture image in the frequency domain to obtain N groups of registration parameters.
In one example, the N map point images and the ground texture image may also be preprocessed prior to matching and registration. Referring to fig. 3, a schematic flow chart of the preprocessing method provided in the present application is shown. As shown in fig. 3, the preprocessing includes:
S301, resize the map point image to obtain an original image f1. Changing the size of the map point image increases the computation speed in the frequency domain and, on the other hand, makes the length and width of the map point image suitable for the fast Fourier transform.
S302, apply a windowing operation to f1 to obtain the original image f2 (first original image). The windowing operation can apply a window such as a Hanning window, which helps suppress boundary effects.
S303, apply a fast Fourier transform to f2 to obtain the frequency-domain image F0 (first frequency-domain image).
S304, apply high-pass filtering to F0 to obtain the frequency-domain image F1.
S305, apply a polar-coordinate transformation to F1 to obtain the frequency-domain image F2.
To obtain a higher-precision angle result, when the polar transformation mapping matrix is computed (i.e., when F2 is computed), the matrix width in the angular direction is set to w×coff, where w is the width of F1 and coff is the magnification factor of the angular resolution. In this way, when the polar transformation result F2 is subsequently used in the phase-correlation computation, the angular resolution is approximately 180°/(w×coff), whereas the angular resolution of a conventional polar transformation result is only approximately 180°/w.
S306, apply windowing and a Fourier transform to F2 to obtain the frequency-domain image F3 (second frequency-domain image).
S307, combine F0 and F3 from the preprocessing result with the actual physical position p corresponding to the map point image into the registration information of that map point image (for the i-th map point image, i being a positive integer not greater than N); the registration information of each map point image can be stored in binary form in the database for convenient later use. Similarly, the ground texture image is preprocessed to obtain its original image f2 (first original image) and frequency-domain image F3 (second frequency-domain image). A minimal sketch of this preprocessing pipeline is given below.
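The following is a minimal, hypothetical sketch of the preprocessing pipeline S301-S307 using NumPy/OpenCV. The function name, image size, coff value and high-pass kernel are illustrative assumptions, not taken from the patent; cv2.warpPolar stands in for the patent's angular-magnified polar transformation of the spectrum.

```python
import cv2
import numpy as np

def preprocess(img_gray, size=256, coff=4):
    # S301: resize so that width/height suit the fast Fourier transform
    f1 = cv2.resize(img_gray, (size, size)).astype(np.float32)
    # S302: Hanning window to suppress boundary effects
    win = cv2.createHanningWindow((size, size), cv2.CV_32F)
    f2 = f1 * win
    # S303: FFT of the windowed image (first frequency-domain image F0)
    F0 = np.fft.fft2(f2)
    # S304: high-pass filter the centred magnitude spectrum (illustrative kernel)
    yy, xx = np.meshgrid(np.linspace(-0.5, 0.5, size),
                         np.linspace(-0.5, 0.5, size), indexing="ij")
    r = np.sqrt(xx ** 2 + yy ** 2)
    hp = (1.0 - np.cos(np.pi * r)) * (2.0 - np.cos(np.pi * r))
    F1 = (np.abs(np.fft.fftshift(F0)) * hp).astype(np.float32)
    # S305: polar resampling; the angular axis has size*coff samples, i.e. the
    # angular resolution is magnified by coff as described in the text
    F2 = cv2.warpPolar(F1, (size, size * coff), (size / 2, size / 2),
                       size / 2, cv2.WARP_POLAR_LINEAR)
    # S306: window + FFT of the polar image (second frequency-domain image F3)
    win2 = cv2.createHanningWindow((F2.shape[1], F2.shape[0]), cv2.CV_32F)
    F3 = np.fft.fft2(F2 * win2)
    # S307: (F0, F3) plus the global position form the registration information
    return f2, F0, F3
```

For a map point image, the returned F0 and F3, together with the point's global position, would be stored as its registration information; for the ground texture image, f2 and F3 are the quantities used at matching time.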
After preprocessing, the mobile device may match and register the ground texture image and the N map point images in the frequency domain. Referring specifically to fig. 4, a schematic flow chart of map point image matching is shown. The method shown in fig. 4 includes:
S401, acquire, from the global map, the map point P closest to the current position and the registration information of each of the N map point images corresponding to map point P.
S402, preprocess the acquired ground texture image to obtain its original image f2 (first original image) and frequency-domain image F3 (second frequency-domain image).
S403, perform texture registration, in parallel, between f2 and F3 of the ground texture image and the registration information of map point P to obtain N groups of registration parameters. In the present application the registration parameters include a registration response value res_i, an image overlap ratio, a rotation angle Δθ_i and a translation Δp_i.
In one example, referring to fig. 5, a specific embodiment of texture matching is described, taking the i-th map point image corresponding to the map point as an example. Fig. 5 includes:
S501, perform the phase-correlation computation between F3 of the ground texture image and F3 of the i-th map point image corresponding to the map point.
Specifically, the cross power spectrum of the two F3 images is first computed and transformed back to the spatial domain; the peak point is then searched for, and the peak coordinates (p_x, p_y) are computed to sub-pixel precision in the neighbourhood of the peak.
S502, from the ordinate p_y of the peak coordinates, the corresponding rotation angle can be calculated, approximately as 180° × p_y / (w×coff), where w×coff is the width of F3 along the angular axis.
S503, according to this rotation angle, determine the rotation angle Δθ_i of the ground texture image relative to the i-th map point image corresponding to the map point.
The phase-correlation method has a rotation ambiguity: Δθ_i and Δθ_i + 180° correspond to the same peak point. Denote the rotation angle of the map point image corresponding to the map point as θ0, and compute the differences between θ0 + Δθ_i and θ0 + Δθ_i + 180°, respectively, and the current heading angle obtained from the odometer thread. The candidate with the smallest difference is selected as the rotation angle Δθ_i of the ground texture image relative to the map point image corresponding to the map point.
S504, rotate f2 of the ground texture image by -Δθ_i degrees about the image centre to obtain the image f3, and apply a fast Fourier transform to f3 to obtain the frequency-domain image F4 (third frequency-domain image).
S505, perform phase correlation between F4 and the frequency-domain image F0 of the i-th map point image corresponding to the map point (first frequency-domain image) to obtain the cross-power-spectrum peak coordinates (r_x, r_y) and the peak value res_i; the peak value res_i is the response value.
S506, from the cross-power-spectrum peak coordinates (r_x, r_y) and the pre-calibrated pixel resolution, compute the translation Δp_i.
S507, from the translation Δp_i and the rotation angle Δθ_i, compute the image overlap ratio of the ground texture image relative to the i-th map point image corresponding to the map point. A sketch combining steps S501-S507 is given below.
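The following hypothetical sketch strings steps S501-S507 together for a single map point image, reusing preprocess() from the earlier sketch. The sub-pixel refinement (a 3×3 centroid), pixel_res (metres per pixel), the 360-degree angular span assumed for warpPolar, and the simple overlap estimate are all illustrative assumptions, not the patent's exact formulas.

```python
import cv2
import numpy as np

def phase_correlate(Fa, Fb):
    """Phase correlation of two pre-computed 2-D FFTs; returns sub-pixel shift and peak."""
    cps = Fa * np.conj(Fb)
    cps /= np.abs(cps) + 1e-12                      # normalized cross power spectrum
    corr = np.fft.ifft2(cps).real                   # back to the spatial domain
    h, w = corr.shape
    y, x = np.unravel_index(np.argmax(corr), corr.shape)
    response = float(corr[y, x])
    # crude 3x3 centroid around the peak as a stand-in for the sub-pixel method
    ys = [(y + d) % h for d in (-1, 0, 1)]
    xs = [(x + d) % w for d in (-1, 0, 1)]
    patch = np.maximum(corr[np.ix_(ys, xs)], 0.0)
    offs = np.array([-1.0, 0.0, 1.0])
    dy = float(patch.sum(axis=1) @ offs) / (patch.sum() + 1e-12)
    dx = float(patch.sum(axis=0) @ offs) / (patch.sum() + 1e-12)
    py, px = y + dy, x + dx
    if py > h / 2: py -= h                          # wrap to signed shifts
    if px > w / 2: px -= w
    return (px, py), response

def register(f2_cur, F3_cur, F0_map, F3_map, theta0, odom_theta,
             pixel_res, size=256, coff=4):
    # S501-S502: rotation from phase correlation of the two polar spectra
    (_, py), _ = phase_correlate(F3_cur, F3_map)
    rows = size * coff                              # angular samples in F3
    dtheta = 360.0 * py / rows                      # warpPolar spans 360 deg; a 180-deg polar map would use 180 here
    # S503: resolve the 180-degree ambiguity with the odometry heading
    cands = [dtheta, dtheta + 180.0]
    dtheta = min(cands, key=lambda a: abs(((theta0 + a) - odom_theta + 180.0) % 360.0 - 180.0))
    # S504: rotate the current image by -dtheta about its centre, then FFT
    M = cv2.getRotationMatrix2D((size / 2, size / 2), -dtheta, 1.0)
    f3 = cv2.warpAffine(f2_cur, M, (size, size))
    F4 = np.fft.fft2(f3)
    # S505-S506: translation peak, response value, pixels -> metres
    (rx, ry), res = phase_correlate(F4, F0_map)
    dp = (rx * pixel_res, ry * pixel_res)
    # S507: simple overlap estimate from the pixel shift (the patent also uses the rotation)
    overlap = max(0.0, 1 - abs(rx) / size) * max(0.0, 1 - abs(ry) / size)
    return dtheta, dp, res, overlap
```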
S104, correcting the current position according to the N groups of registration parameters to obtain a global position corresponding to the ground texture image.
After the mobile device obtains the N groups of registration parameters in S403, the current position can be corrected using these N groups of registration parameters to obtain the global position corresponding to the ground texture image. Refer to steps S404-S407 shown in fig. 4, as follows:
S404, compare the N response values res_i in the N groups of registration parameters with a response threshold and the N image overlap ratios with an overlap threshold, and select at least one target image from the N map point images. A target image is one of the N images whose response value res_i and image overlap ratio are both larger than the corresponding thresholds.
S405, assign a weight to each of the at least one target image according to its response value res_i, and compute the weighted average over the at least one target image to obtain the relative position of the ground texture image with respect to the map point.
Assume the number of target images is M (1 ≤ M ≤ N), with response values res_i, rotation angles Δθ_i and translations Δp_i, where i is a positive integer from 1 to M. The weight of each target image is then s_i = res_i / (res_1 + ... + res_M), the relative rotation angle is Δθ = sum(s_i × Δθ_i), and the relative translation is Δp = sum(s_i × Δp_i).
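A short sketch of this selection and weighted fusion (S404-S405) follows; the threshold values and the result layout are illustrative assumptions, not specified by the patent.

```python
import numpy as np

def fuse(results, res_thr=0.1, overlap_thr=0.3):
    """results: list of (dtheta_i, (dx_i, dy_i), res_i, overlap_i) from register()."""
    # S404: keep only target images whose response and overlap exceed the thresholds
    targets = [r for r in results if r[2] > res_thr and r[3] > overlap_thr]
    if not targets:
        return None                              # no reliable match at this map point
    # S405: weights s_i = res_i / sum(res), then weighted average of angle and shift
    res = np.array([t[2] for t in targets])
    s = res / res.sum()
    dtheta = float(np.sum(s * np.array([t[0] for t in targets])))
    dp = np.sum(s[:, None] * np.array([t[1] for t in targets]), axis=0)
    return dtheta, (float(dp[0]), float(dp[1]))
```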
S406, correcting the current position according to the relative position to obtain a global position corresponding to the ground texture image.
Specifically, the mobile device may correct the current position according to the weight of each target image and the relative rotation angle and relative translation of the ground texture image with respect to the map point, obtaining the global position corresponding to the ground texture image. Optionally, the flow of fig. 4 may further include S407:
S407, correct the global position corresponding to the ground texture image according to the moving speed of the movable device, the acquisition time of the ground texture image and the current time, to obtain the global position of the movable device.
Assume the global position corresponding to the ground texture image is p, the moving speed is v, the acquisition time is t1 and the current time is t2; the global position of the movable device is then p' = p + v × (t2 - t1).
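As an illustrative numerical example (values assumed, not from the patent): if p = (2.000 m, 3.000 m), v = (0.5 m/s, 0 m/s) and t2 - t1 = 0.04 s, then p' = (2.020 m, 3.000 m). The correction simply extrapolates the matched position forward over the latency between image capture and the current time.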
By implementing the method and the device, the movable device can still be matched correctly when running at high speed or when the deviation from the map points is large, providing a stable, high-precision positioning result.
Fig. 6 is a schematic structural diagram of a positioning device according to an embodiment of the present application. The positioning apparatus 600 as shown in fig. 6 includes an acquisition unit 601, a registration unit 602, and a correction unit 603. Wherein,
the acquiring unit 601 is configured to acquire N pieces of registration information corresponding to map points at a current position, where the N pieces of registration information correspond to the N map point images one by one;
the acquiring unit 601 is further configured to acquire a ground texture image of the current position;
the registration unit 602 is configured to perform matching and registration processing on the N registration information and the ground texture image on a frequency domain, so as to obtain N sets of registration parameters;
the correcting unit 603 is configured to perform correction processing on the current position according to the N sets of registration parameters, so as to obtain a global position corresponding to the ground texture image.
In some embodiments, the registration unit 602 is specifically configured to:
preprocess the ground texture image to obtain a first original image f2 and a second frequency-domain image F3 of the ground texture image;
and perform texture registration between f2 and F3 of the ground texture image and the N pieces of registration information corresponding to the N map point images to obtain N groups of registration parameters, where the registration information comprises the global position of the map point image, a first frequency-domain image F0 and a second frequency-domain image F3, F0 and F3 being obtained by preprocessing the corresponding map point image.
In some embodiments, the registration unit 602 is specifically configured to:
perform phase correlation between F3 of the ground texture image and the second frequency-domain images F3 of the N map point images to obtain the peak coordinates (p_x, p_y) of N cross power spectra;
calculate, from the N peak coordinates (p_x, p_y), the rotation angle of the ground texture image relative to each of the N map point images;
rotate f2 of the ground texture image by -Δθ_i relative to each of the N map point images and apply a Fourier transform to obtain N third frequency-domain images F4;
perform phase correlation between the N images F4 and the first frequency-domain images F0 of the N map point images to obtain N cross-power-spectrum peak coordinates (r_x, r_y) and N response values res_i, where the first frequency-domain image F0 is obtained by preprocessing the corresponding map point image;
calculate N translations Δp_i from the N peak coordinates (r_x, r_y) and the preset pixel resolution;
and calculate N image overlap ratios from the rotation angles Δθ_i and translations Δp_i of the ground texture image relative to the N map point images.
In some embodiments, the correction unit 603 is specifically configured to:
select at least one target image from the N map point images, where a target image is an image whose response value and image overlap ratio in the N groups of registration parameters are both larger than the corresponding thresholds;
assign a weight to each of the at least one target image according to its response value and compute the weighted average over the at least one target image to obtain the relative position of the ground texture image with respect to the map point;
and correct the global position of the map point according to the relative position to obtain the global position corresponding to the ground texture image.
In some embodiments, the correction unit 603 is specifically configured to:
and correcting the global position corresponding to the ground texture image according to the moving speed of the movable equipment, the acquisition time of the ground texture image and the current time to obtain the global position of the movable equipment.
In some embodiments, the obtaining unit 601 is further configured to load a global map including at least one map point when the mobile device is started; acquiring an initial position of the movable equipment, and determining map points corresponding to the initial position from a global map;
the registration unit 602 is further configured to perform matching and registration processing on N map point images corresponding to the map points corresponding to the initial position and a ground texture image of the initial position, so as to obtain an initial global position of the mobile device.
Fig. 7 is a schematic structural diagram of a mobile device according to an embodiment of the present application. The removable device 700 as shown in fig. 7 includes: at least one input device 701; at least one output device 702; at least one processor 703, such as a CPU; and a memory 704, the input device 701, the output device 702, the processor 703, and the memory 704 being connected by a bus 705.
The input device 701 may specifically be a touch panel of a mobile terminal, including a touch screen and a touch pad, for detecting operation instructions on the touch panel of the terminal.
The output device 702 may be a display screen of a mobile terminal, and is used for outputting and displaying information.
The memory 704 may be a high-speed RAM memory or a non-volatile memory (non-volatile memory), such as a disk memory. The memory 704 is used for storing a set of program codes, and the input device 701, the output device 702 and the processor 703 are used for calling the program codes stored in the memory 704 to perform the following operations:
the processor 703 is configured to obtain N pieces of registration information corresponding to map points at a current position during the moving process of the mobile device, where the N pieces of registration information are in one-to-one correspondence with N map point images;
acquiring a ground texture image of the current position;
matching and registering the N registration information and the ground texture image on a frequency domain to obtain N groups of registration parameters;
and correcting the current position according to the N groups of registration parameters to obtain a global position corresponding to the ground texture image.
In some embodiments, the processor 703 is specifically configured to preprocess the ground texture image to obtain a first original image f2 and a second frequency-domain image F3 of the ground texture image;
and to perform texture registration between f2 and F3 of the ground texture image and the N pieces of registration information corresponding to the N map point images to obtain N groups of registration parameters;
wherein the registration information comprises the global position of the map point image, a first frequency-domain image F0 and a second frequency-domain image F3, the first frequency-domain image F0 and the second frequency-domain image F3 being obtained by preprocessing the corresponding map point image.
In some embodiments, the processor 703 is specifically configured to perform phase correlation between F3 of the ground texture image and the second frequency-domain images F3 of the N map point images to obtain the peak coordinates (p_x, p_y) of N cross power spectra;
calculate, from the N peak coordinates (p_x, p_y), the rotation angle of the ground texture image relative to each of the N map point images;
rotate f2 of the ground texture image by the rotation angle corresponding to each of the N map point images and apply a Fourier transform to obtain N third frequency-domain images F4;
perform phase correlation between the N third frequency-domain images F4 and the first frequency-domain images F0 of the N map point images to obtain N cross-power-spectrum peak coordinates (r_x, r_y) and N response values, where the first frequency-domain image F0 is obtained by preprocessing the corresponding map point image;
calculate N translations from the N peak coordinates (r_x, r_y) and the preset pixel resolution;
and calculate N image overlap ratios from the rotation angles and translations of the ground texture image relative to the N map point images.
In some embodiments, the processor 703 is specifically configured to select at least one target image from the N map point images, where a target image is an image whose response value and image overlap ratio in the N groups of registration parameters are both larger than the corresponding thresholds;
assign a weight to each of the at least one target image according to its response value and compute the weighted average over the at least one target image to obtain the relative position of the ground texture image with respect to the map point;
and correct the global position of the map point according to the relative position to obtain the global position corresponding to the ground texture image.
In some embodiments, the processor 703 is further configured to correct the global position corresponding to the ground texture image according to the moving speed of the mobile device, the acquisition time of the ground texture image, and the current time, so as to obtain the global position of the mobile device.
In some embodiments, the processor 703 is further configured to load a global map comprising at least one map point upon startup of the mobile device;
acquiring an initial position of the movable equipment, and determining map points corresponding to the initial position from a global map;
and matching and registering N map point images corresponding to the map points corresponding to the initial position with the ground texture image of the initial position to obtain the initial global position of the movable equipment.
Based on the same inventive concept, the principle of solving the problem by the terminal provided in the embodiments of the present application is similar to that of solving the problem by the terminal in the embodiments of the method of the present application, so that the implementation of each device may refer to the implementation of the method, and for brevity, a description is not repeated here.
In the foregoing embodiments, the descriptions of the embodiments are focused on, and for those portions of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
The steps in the method of the embodiment of the invention can be sequentially adjusted, combined and deleted according to actual needs.
The modules in the mobile device of the embodiment of the invention can be combined, divided and deleted according to actual needs.
Finally, it should be noted that the above embodiments are only intended to illustrate, not to limit, the technical solutions of the present application. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents, and such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (7)

1. A positioning method, characterized in that it is applied in a mobile device, said positioning method comprising:
in the moving process of the movable equipment, N pieces of registration information corresponding to map points at the current position are acquired, wherein the N pieces of registration information are in one-to-one correspondence with N map point images;
acquiring a ground texture image of the current position;
matching and registering the N registration information and the ground texture image on a frequency domain to obtain N groups of registration parameters;
correcting the current position according to the N groups of registration parameters to obtain a global position corresponding to the ground texture image;
wherein the matching and registering the N registration information and the ground texture image in the frequency domain includes:
preprocessing the ground texture image to obtain a first original image f2 and a second frequency-domain image F3 of the ground texture image; and performing texture registration between the first original image f2 and the second frequency-domain image F3 of the ground texture image and the N pieces of registration information to obtain the N groups of registration parameters; wherein the registration information comprises the global position of the map point image, a first frequency-domain image F0 and a second frequency-domain image F3, the first frequency-domain image F0 and the second frequency-domain image F3 being obtained by preprocessing the corresponding one of the N map point images;
wherein performing texture registration between the first original image f2 and the second frequency-domain image F3 of the ground texture image and the N pieces of registration information to obtain N groups of registration parameters comprises: performing phase correlation between the second frequency-domain image F3 of the ground texture image and the second frequency-domain images F3 of the N map point images to obtain the peak coordinates (p_x, p_y) of N cross power spectra; calculating, from the peak coordinates (p_x, p_y), the rotation angle of the ground texture image relative to each of the N map point images; rotating the first original image f2 of the ground texture image by the rotation angle corresponding to each of the N map point images and applying a Fourier transform to obtain N third frequency-domain images F4; performing phase correlation between the N third frequency-domain images F4 and the first frequency-domain images F0 of the N map point images to obtain N cross-power-spectrum peak coordinates (r_x, r_y) and N response values, the first frequency-domain image F0 being obtained by preprocessing the corresponding one of the N map point images; calculating N translation amounts from the N cross-power-spectrum peak coordinates (r_x, r_y) and the preset pixel resolution; and calculating N image overlap ratios from the rotation angles and translation amounts of the ground texture image relative to the N map point images.
2. The positioning method according to claim 1, wherein the correcting the current position according to the N sets of registration parameters to obtain the global position corresponding to the ground texture image includes:
selecting at least one target image from the N map point images, wherein a target image is a map point image whose response value and image overlap ratio in the N groups of registration parameters are both larger than the corresponding thresholds;
assigning a weight to each of the at least one target image according to its response value and calculating the weighted average over the at least one target image to obtain the relative position of the ground texture image relative to the map point;
and correcting the current position according to the relative position to obtain a global position corresponding to the ground texture image.
3. The positioning method according to any one of claims 1-2, characterized in that the positioning method further comprises:
and correcting the global position corresponding to the ground texture image according to the moving speed of the movable equipment, the acquisition time of the ground texture image and the current time to obtain the global position of the movable equipment.
4. The positioning method according to claim 1, characterized in that the positioning method further comprises:
loading a global map comprising at least one map point when the mobile device is started;
acquiring an initial position of the movable equipment, and determining map points corresponding to the initial position from a global map;
and matching and registering N registration information corresponding to the map points corresponding to the initial position with the ground texture image of the initial position to obtain the initial global position of the movable equipment.
5. A positioning device is characterized by comprising an acquisition unit, a registration unit and a correction unit, wherein,
the acquisition unit is used for acquiring N pieces of registration information corresponding to the map points at the current position, wherein the N pieces of registration information are in one-to-one correspondence with the N map point images;
the acquisition unit is also used for acquiring the ground texture image of the current position;
the registration unit is used for carrying out matching and registration processing on the N registration information and the ground texture image on a frequency domain to obtain N groups of registration parameters;
the correction unit is used for correcting the current position according to the N groups of registration parameters to obtain a global position corresponding to the ground texture image;
wherein the registration unit is specifically configured to: preprocess the ground texture image to obtain a first original image f2 and a second frequency-domain image F3 of the ground texture image; and perform texture registration between the first original image f2 and the second frequency-domain image F3 of the ground texture image and the N pieces of registration information to obtain the N groups of registration parameters; wherein the registration information comprises the global position of the map point image, a first frequency-domain image F0 and a second frequency-domain image F3, the first frequency-domain image F0 and the second frequency-domain image F3 being obtained by preprocessing the corresponding one of the N map point images;
wherein performing texture registration between the first original image f2 and the second frequency-domain image F3 of the ground texture image and the N pieces of registration information to obtain N groups of registration parameters comprises: performing phase correlation between the second frequency-domain image F3 of the ground texture image and the second frequency-domain images F3 of the N map point images to obtain the peak coordinates (p_x, p_y) of N cross power spectra; calculating, from the peak coordinates (p_x, p_y), the rotation angle of the ground texture image relative to each of the N map point images; rotating the first original image f2 of the ground texture image by the rotation angle corresponding to each of the N map point images and applying a Fourier transform to obtain N third frequency-domain images F4; performing phase correlation between the N third frequency-domain images F4 and the first frequency-domain images F0 of the N map point images to obtain N cross-power-spectrum peak coordinates (r_x, r_y) and N response values, the first frequency-domain image F0 being obtained by preprocessing the corresponding one of the N map point images; calculating N translation amounts from the N cross-power-spectrum peak coordinates (r_x, r_y) and the preset pixel resolution; and calculating N image overlap ratios from the rotation angles and translation amounts of the ground texture image relative to the N map point images.
6. A removable device comprising a processor and a memory coupled to the processor, wherein the memory comprises computer readable instructions, and wherein the processor is configured to execute the computer readable instructions in the memory to implement the method of any of claims 1-4.
7. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program comprising program instructions which, when executed by a processor, cause the processor to perform the method according to any of claims 1-4.
CN202010899193.4A 2020-08-31 2020-08-31 Positioning method, mobile device, and computer-readable storage medium Active CN112070810B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010899193.4A CN112070810B (en) 2020-08-31 2020-08-31 Positioning method, mobile device, and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010899193.4A CN112070810B (en) 2020-08-31 2020-08-31 Positioning method, mobile device, and computer-readable storage medium

Publications (2)

Publication Number Publication Date
CN112070810A CN112070810A (en) 2020-12-11
CN112070810B (en) 2024-03-22

Family

ID=73666277

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010899193.4A Active CN112070810B (en) 2020-08-31 2020-08-31 Positioning method, mobile device, and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN112070810B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113469885A (en) * 2021-07-15 2021-10-01 上海明略人工智能(集团)有限公司 Target image determination method and device, storage medium and electronic device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103544710A (en) * 2013-11-08 2014-01-29 河南工业大学 Image registration method
CN103714531A (en) * 2013-12-05 2014-04-09 南京理工大学 FPGA-based phase correlation method image registration system and method
CN106996777A (en) * 2017-04-21 2017-08-01 合肥井松自动化科技有限公司 A kind of vision navigation method based on ground image texture
CN109556596A (en) * 2018-10-19 2019-04-02 北京极智嘉科技有限公司 Air navigation aid, device, equipment and storage medium based on ground texture image
CN109711486A (en) * 2019-01-21 2019-05-03 湖北省国土资源研究院 Based on the relevant high degree of overlapping remote sensing image full scale tie point matching process of phase
CN110189331A (en) * 2018-05-31 2019-08-30 上海快仓智能科技有限公司 Build drawing method, image acquisition and processing system and localization method
CN111415390A (en) * 2020-03-18 2020-07-14 上海懒书智能科技有限公司 Positioning navigation method and device based on ground texture

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2003903511A0 (en) * 2003-07-08 2003-07-24 Canon Kabushiki Kaisha Image registration method improvement
TW201727418A (en) * 2016-01-26 2017-08-01 鴻海精密工業股份有限公司 Analysis of the ground texture combined data recording system and method for analysing

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103544710A (en) * 2013-11-08 2014-01-29 河南工业大学 Image registration method
CN103714531A (en) * 2013-12-05 2014-04-09 南京理工大学 FPGA-based phase correlation method image registration system and method
CN106996777A (en) * 2017-04-21 2017-08-01 合肥井松自动化科技有限公司 A kind of vision navigation method based on ground image texture
CN110189331A (en) * 2018-05-31 2019-08-30 上海快仓智能科技有限公司 Build drawing method, image acquisition and processing system and localization method
CN109556596A (en) * 2018-10-19 2019-04-02 北京极智嘉科技有限公司 Air navigation aid, device, equipment and storage medium based on ground texture image
CN109711486A (en) * 2019-01-21 2019-05-03 湖北省国土资源研究院 Based on the relevant high degree of overlapping remote sensing image full scale tie point matching process of phase
CN111415390A (en) * 2020-03-18 2020-07-14 上海懒书智能科技有限公司 Positioning navigation method and device based on ground texture

Also Published As

Publication number Publication date
CN112070810A (en) 2020-12-11

Similar Documents

Publication Publication Date Title
CN109035320B (en) Monocular vision-based depth extraction method
CN111354042B (en) Feature extraction method and device of robot visual image, robot and medium
CN109146980B (en) Monocular vision based optimized depth extraction and passive distance measurement method
CN106408609B (en) A kind of parallel institution end movement position and posture detection method based on binocular vision
CN111263142B (en) Method, device, equipment and medium for testing optical anti-shake of camera module
US20080050042A1 (en) Hardware-in-the-loop simulation system and method for computer vision
CN104182982A (en) Overall optimizing method of calibration parameter of binocular stereo vision camera
CN105118021A (en) Feature point-based image registering method and system
CN107516322B (en) Image object size and rotation estimation calculation method based on log polar space
CN105066962B (en) A kind of high-precision photogrammetric apparatus of the big angle of visual field of multiresolution
CN106570907B (en) Camera calibration method and device
JP4941565B2 (en) Corresponding point search apparatus and corresponding point searching method
JP2017097402A (en) Surrounding map preparation method, self-location estimation method and self-location estimation device
KR20220054582A (en) Visual positioning method and related apparatus, device and computer readable storage medium
CN112070810B (en) Positioning method, mobile device, and computer-readable storage medium
CN113112553B (en) Parameter calibration method and device for binocular camera, electronic equipment and storage medium
CN117745845A (en) Method, device, equipment and storage medium for determining external parameter information
CN109859313B (en) 3D point cloud data acquisition method and device, and 3D data generation method and system
JP2009146150A (en) Method and device for detecting feature position
JP4747293B2 (en) Image processing apparatus, image processing method, and program used therefor
CN115719387A (en) 3D camera calibration method, point cloud image acquisition method and camera calibration system
CN113012279B (en) Non-contact three-dimensional imaging measurement method and system and computer readable storage medium
CN111583108B (en) Tunnel lining surface linear array image TOF fusion splicing method and device and storage medium
CN117671007B (en) Displacement monitoring method and device, electronic equipment and storage medium
Shan et al. Research on 3D pose measurement algorithm based on binocular vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Country or region after: China

Address after: A14-5, 13th Floor, Building A, Building J1, Phase II, Innovation Industrial Park, No. 2800, Chuangxin Avenue, High-tech Zone, Hefei City, Anhui Province, 230088

Applicant after: Anhui aiguan Vision Technology Co.,Ltd.

Address before: Room 305, Building E, No. 492 Anhua Road, Changning District, Shanghai, 200020

Applicant before: SHANGHAI EYEVOLUTION TECHNOLOGY Co.,Ltd.

Country or region before: China

GR01 Patent grant