CN110006420B - Picture construction method, image acquisition and processing system and positioning method - Google Patents


Info

Publication number
CN110006420B
CN110006420B (application CN201811475564.5A)
Authority
CN
China
Prior art keywords
picture
connection
parameter
guided vehicle
point
Prior art date
Legal status
Active
Application number
CN201811475564.5A
Other languages
Chinese (zh)
Other versions
CN110006420A (en)
Inventor
孙宇
罗磊
周韬宇
肖尚华
Current Assignee
Shanghai Quicktron Intelligent Technology Co Ltd
Original Assignee
Shanghai Quicktron Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Quicktron Intelligent Technology Co Ltd filed Critical Shanghai Quicktron Intelligent Technology Co Ltd
Priority to PCT/CN2019/075741 priority Critical patent/WO2019154435A1/en
Priority to JP2019531677A priority patent/JP6977921B2/en
Priority to JP2019531456A priority patent/JP7083472B2/en
Priority to PCT/CN2019/084185 priority patent/WO2019154444A2/en
Publication of CN110006420A publication Critical patent/CN110006420A/en
Application granted granted Critical
Publication of CN110006420B publication Critical patent/CN110006420B/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/23 Clustering techniques
    • G06F18/232 Non-hierarchical techniques
    • G06F18/2321 Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213 Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/001 Texturing; Colouring; Generation of texture or colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Image Analysis (AREA)
  • Image Generation (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a method for mapping a site, comprising the following steps: establishing or acquiring a coordinate system of the site; scanning the site and acquiring a picture of a calibration point, a picture of a position to be positioned, and the position and attitude parameters corresponding to the pictures; and correcting the position and attitude parameters of the picture of the position to be positioned based on the picture of the calibration point, the picture of the position to be positioned, and the position and attitude parameters.

Description

Picture construction method, image acquisition and processing system and positioning method
Technical Field
The invention relates to the field of intelligent warehousing, and in particular to a mapping method, an image acquisition and processing system, and a positioning method that can be used for intelligent warehousing.
Background
In existing intelligent warehouses, a physical coordinate system usually has to be measured for the positioning site. The physical coordinate system uses common distance units, such as meters, decimeters and centimeters, as its units of measurement, and values may be expressed in integer, decimal or fractional form, for example 1 meter, 1 decimeter, 1 centimeter, 0.55 meter, 0.2 decimeter, 1.4 centimeters or half a meter. The axes of the coordinate system are generally parallel to the building walls or aligned with the north, south, east and west directions.
Automated guided vehicles (AGVs) that transport goods in intelligent warehouses often require precise positioning. However, the accuracy of existing positioning methods often fails to meet operational requirements, especially when the position and attitude parameters of the AGV must be determined accurately. This is very disadvantageous for operation and control by the operator.
Accordingly, there is a strong need in the art for a method and apparatus that enables more accurate mapping and positioning.
The matters in the background section are only those known to the inventors and do not necessarily represent prior art in the field.
Disclosure of Invention
In view of one or more of the problems in the prior art, the invention provides a method for mapping a site, comprising the following steps: establishing or acquiring a coordinate system of the site; scanning the site and acquiring a picture of a calibration point, a picture of a position to be positioned, and the position and attitude parameters corresponding to the pictures; and correcting the position and attitude parameters of the picture of the position to be positioned based on the picture of the calibration point, the picture of the position to be positioned, and the position and attitude parameters.
According to one aspect of the invention, the position parameters include an abscissa and an ordinate, and preferably also a vertical coordinate, and the attitude parameters include a heading angle, and preferably also a pitch angle and a roll angle.
According to one aspect of the invention, the correcting step comprises: constructing a set of connection points, wherein each connection point comprises a picture, the position and attitude parameters corresponding to the picture, and a flag indicating whether the picture corresponds to a calibration point; and correcting the position and attitude parameters of the picture of the position to be positioned based on the set of connection points.
According to one aspect of the invention, the correcting step comprises: taking, from the set of connection points, each pair of connection points whose distance does not exceed a predetermined value as a connection, thereby establishing a set of connections; calculating a connection confidence between the two connection points of each connection in the set, and retaining the connections whose confidence is higher than a preset threshold as the mapping connection set; and correcting the position and attitude parameters of the picture of the position to be positioned based on the mapping connection set.
According to one aspect of the invention, the correcting step further comprises: executing a gradient descent method on the mapping connection set, wherein, in the initialization step of the gradient descent method, the position and attitude parameters of the pictures of the non-calibration connection points are taken as the initial iteration parameters.
According to one aspect of the invention, the correcting step further comprises: executing the gradient descent method until the iterative rate of change falls below a preset threshold.
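As an illustration of the correction claims above, the following Python sketch adjusts the positions of non-calibration connection points by gradient descent, holding the calibration points fixed, until the per-iteration change falls below a threshold. All names, the 2D-position-only state, and the quadratic cost over per-connection measured offsets are assumptions for illustration, not details taken from the patent.

```python
def refine_positions(points, fixed, connections, lr=0.1, tol=1e-6, max_iter=10000):
    """points: {id: (x, y)} initial (dead-reckoned) positions.
    fixed: set of ids of calibration points (held constant).
    connections: list of (i, j, (dx, dy)) measured offsets of j relative to i.
    Minimises sum of squared residuals between parameter-implied and
    measured offsets, starting from the uncorrected parameters."""
    pts = {k: list(v) for k, v in points.items()}
    prev_cost = float("inf")
    for _ in range(max_iter):
        grad = {k: [0.0, 0.0] for k in pts}
        cost = 0.0
        for i, j, (mx, my) in connections:
            rx = (pts[j][0] - pts[i][0]) - mx   # residual in x
            ry = (pts[j][1] - pts[i][1]) - my   # residual in y
            cost += rx * rx + ry * ry
            grad[i][0] -= 2 * rx; grad[i][1] -= 2 * ry
            grad[j][0] += 2 * rx; grad[j][1] += 2 * ry
        for k in pts:
            if k in fixed:
                continue                         # calibration points stay put
            pts[k][0] -= lr * grad[k][0]
            pts[k][1] -= lr * grad[k][1]
        if prev_cost - cost < tol:               # iterative change below threshold
            break
        prev_cost = cost
    return {k: tuple(v) for k, v in pts.items()}
```

With a calibration point A at the origin and a measured offset of (1.0, 0.0) to B, an initial dead-reckoned estimate of B near (0.9, 0.1) is pulled toward (1.0, 0.0) while A does not move.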
According to one aspect of the invention, multiple picture acquisitions are performed at some or all of the calibration points, and the position and attitude parameters corresponding to each acquisition are recorded.
According to one aspect of the invention, the method further comprises: storing the coordinate system, the picture of the calibration point, the picture of the position to be positioned, the position and attitude parameters of the picture of the calibration point, and the corrected position and attitude parameters of the picture of the position to be positioned in a database or file, thereby establishing a map library.
According to one aspect of the invention, the coordinate system is a physical coordinate system.
According to one aspect of the invention, the predetermined value is half the length or width of the picture.
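The pairing rule above (two connection points closer than a predetermined value, e.g. half the picture size, so that their pictures overlap) can be sketched as a simple pairwise filter; the identifiers and the brute-force double loop are illustrative assumptions, not the patent's implementation.

```python
import math

def build_connections(points, max_dist):
    """points: {id: (x, y)} connection-point positions.
    Return the list of (i, j) pairs whose distance does not exceed max_dist;
    each such pair forms one connection in the connection set."""
    ids = sorted(points)
    conns = []
    for a in range(len(ids)):
        for b in range(a + 1, len(ids)):
            i, j = ids[a], ids[b]
            if math.dist(points[i], points[j]) <= max_dist:
                conns.append((i, j))
    return conns
```

For large maps a spatial index (e.g. a grid hash) would replace the quadratic loop, but the filtering criterion stays the same.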
The invention also provides an automated guided vehicle for image acquisition, comprising: a base; a camera mounted on the base and configured to capture pictures of the area below the base; and a measurement assembly mounted on the base and configured to measure or calculate the position and attitude parameters of the automated guided vehicle corresponding to each picture.
According to one aspect of the invention, the automated guided vehicle further comprises a light-emitting device mounted on the base and configured to illuminate the area below the base so that the camera can take pictures.
According to one aspect of the invention, the automated guided vehicle further comprises a control device mounted on the base, with the camera and the measurement assembly coupled to the control device, the control device being configured to control the vehicle to travel to a calibration point and to a position to be positioned in order to take a picture of each.
According to one aspect of the invention, the automated guided vehicle further comprises a processing device coupled to the camera and the measurement assembly, the processing device correcting the position and attitude parameters of the picture of the position to be positioned based on the pictures and the position and attitude parameters.
According to one aspect of the invention, the processing device corrects the position and attitude parameters of the picture of the position to be positioned by: constructing a set of connection points, wherein each connection point comprises a picture, the position and attitude parameters corresponding to the picture, and a flag indicating whether the picture corresponds to a calibration point; taking, from the set of connection points, each pair of connection points whose distance does not exceed a predetermined value as a connection, thereby establishing a set of connections; calculating a connection confidence between the two connection points of each connection in the set, and retaining the connections whose confidence is higher than a preset threshold as the mapping connection set; and executing a gradient descent method on the mapping connection set until the iterative rate of change falls below a preset threshold, wherein, in the initialization step of the gradient descent method, the position and attitude parameters of the pictures of the non-calibration connection points are taken as the initial iteration parameters.
According to one aspect of the invention, the automated guided vehicle further comprises a light shield mounted on the base for softening light emitted by the light emitting device, the light emitting device preferably being mounted around the light shield.
According to one aspect of the invention, wherein the measurement component is an inertial navigation measurement component.
According to one aspect of the invention, the position parameters comprise an abscissa and an ordinate, and preferably also a vertical coordinate, and the attitude parameters comprise a heading angle, and preferably also a pitch angle and a roll angle.
According to one aspect of the invention, wherein the measuring assembly comprises a laser SLAM measuring device and/or a vision SLAM measuring device.
According to one aspect of the invention, the processing device is configured to store the coordinate system, the picture of the calibration point, the picture of the position to be positioned, the position and attitude parameters of the picture of the calibration point, and the corrected position and attitude parameters of the picture of the position to be positioned in a database or file, thereby establishing a map library.
The invention also provides an image acquisition and processing system, comprising: an automated guided vehicle as described above; and a processing device coupled to the camera and the measurement assembly and configured to correct the position and attitude parameters of the pictures based on the pictures and the position and attitude parameters.
According to one aspect of the invention, the processing device is configured to perform the mapping method described above.
The invention also provides a mapping and positioning system for an automated guided vehicle, comprising: a camera configured to capture images below the automated guided vehicle; a light-emitting device configured to illuminate the area below the automated guided vehicle; an inertial navigation measurement assembly configured to measure the position and attitude parameters of the automated guided vehicle; and a processing device, wherein the camera and the inertial navigation measurement assembly are coupled to the processing device, the processing device being configured to correct the position and attitude parameters of the pictures based on the images and the position and attitude parameters.
According to one aspect of the invention, wherein the processing means are configured to perform the mapping method as claimed in any one of claims 1-10.
The invention also provides a device for mapping a site, comprising: means configured to establish or acquire a coordinate system of the site; means configured to scan the site and obtain a picture of a calibration point, a plurality of pictures of positions to be positioned, and the position and attitude parameters corresponding to the pictures; and means configured to correct the position and attitude parameters of the pictures of the positions to be positioned based on the pictures and the position and attitude parameters.
The invention also provides a positioning method, comprising the following steps: loading or obtaining a map obtained by any of the methods described above; acquiring a picture of a position to be positioned and the position and attitude parameters corresponding to the picture; and retrieving from the map the picture closest to the picture of the position to be positioned.
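A minimal sketch of the retrieval step above, assuming each map picture is stored with its corrected position and the query supplies a dead-reckoned position estimate; the function and identifier names are illustrative assumptions.

```python
import math

def nearest_map_picture(map_points, est_xy):
    """map_points: {picture_id: (x, y)} corrected positions stored in the map.
    est_xy: estimated position of the picture to be positioned.
    Return the id of the stored picture whose position is closest."""
    return min(map_points, key=lambda k: math.dist(map_points[k], est_xy))
```

In practice the candidate would then be verified (and its offset measured) by phase correlation, as the following aspects describe.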
According to one aspect of the invention, the positioning method further comprises: calculating, by a phase correlation method, the confidence, the position parameter offset and the attitude parameter offset between the picture of the position to be positioned and the closest picture.
According to one aspect of the invention, when the confidence calculated by the phase correlation method is lower than a preset value, the closest picture is discarded and the next-closest picture whose confidence is higher than the preset value is retrieved instead.
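The phase correlation mentioned in the two aspects above can be sketched with NumPy as follows. This assumes two grayscale pictures of equal size and takes the correlation peak height as the confidence, which is a common convention but not something the patent specifies; it also recovers only the translation, not the attitude offset.

```python
import numpy as np

def phase_correlate(a, b):
    """Estimate the (dy, dx) translation of picture b relative to picture a,
    returned together with a confidence (the correlation peak value)."""
    A = np.fft.fft2(a)
    B = np.fft.fft2(b)
    cross = np.conj(A) * B
    cross /= np.abs(cross) + 1e-12           # normalised cross-power spectrum
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    conf = float(corr[peak])
    # map peak indices to signed offsets (the shift is circular)
    dy = peak[0] if peak[0] <= a.shape[0] // 2 else peak[0] - a.shape[0]
    dx = peak[1] if peak[1] <= a.shape[1] // 2 else peak[1] - a.shape[1]
    return int(dy), int(dx), conf
```

For an exact circular shift the confidence approaches 1.0; for unrelated pictures the peak is low, which is what the preset confidence threshold above filters on.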
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate the invention and together with the embodiments of the invention, serve to explain the invention. In the drawings:
FIG. 1 is a flow chart of a mapping method according to one embodiment of the invention;
FIG. 2 is a schematic diagram of physical coordinates according to one embodiment of the invention;
FIG. 3 is a schematic diagram of logical coordinates according to one embodiment of the invention;
FIG. 4 is a schematic diagram of a connection point according to one embodiment of the invention;
FIG. 5 is a schematic diagram of a calibration point according to one embodiment of the invention;
FIG. 6 is a flow chart of a method of correcting position and orientation parameters of a position picture to be positioned according to one embodiment of the invention;
FIG. 7 is an example of picture coincidence calculated by phase correlation according to one embodiment of the present invention;
FIG. 8 is a schematic diagram of a connection according to one embodiment of the invention;
FIG. 9 shows a map screenshot after mapping of the physical and logical coordinate systems is completed;
FIG. 10 is a schematic illustration of an automated guided vehicle for image acquisition according to one embodiment of the invention;
FIG. 11 is a flow chart of a positioning method according to one embodiment of the invention; and
FIG. 12 is a block diagram of a computer program product according to one embodiment of the invention.
Detailed Description
Hereinafter, only certain exemplary embodiments are briefly described. As will be recognized by those of skill in the pertinent art, the described embodiments may be modified in various different ways without departing from the spirit or scope of the present invention. Accordingly, the drawings and description are to be regarded as illustrative in nature and not as restrictive.
In the description of the present invention, it should be understood that the terms "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", etc. indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings are merely for convenience in describing the present invention and simplifying the description, and do not indicate or imply that the devices or elements referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more of the described features. In the description of the present invention, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
In the description of the present invention, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted," "connected," and "connected" are to be construed broadly, and may be fixedly connected, detachably connected, or integrally connected, and may be mechanically connected, electrically connected, or may communicate with each other, for example; can be directly connected or indirectly connected through an intermediate medium, and can be communicated with the inside of two elements or the interaction relationship of the two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
In the present invention, unless expressly stated or limited otherwise, a first feature "above" or "below" a second feature may include both the first and second features being in direct contact, as well as the first and second features not being in direct contact but contacting through additional features between them. Moreover, a first feature being "above", "over" or "on" a second feature includes the first feature being directly above or obliquely above the second feature, or simply indicates that the first feature is at a higher level than the second feature. A first feature being "under", "below" or "beneath" a second feature includes the first feature being directly below or obliquely below the second feature, or simply indicates that the first feature is at a lower level than the second feature.
The following disclosure provides many different embodiments, or examples, for implementing different features of the invention. In order to simplify the present disclosure, components and arrangements of specific examples are described below. They are, of course, merely examples and are not intended to limit the invention. Furthermore, the present invention may repeat reference numerals and/or letters in the various examples, which are for the purpose of brevity and clarity, and which do not themselves indicate the relationship between the various embodiments and/or arrangements discussed. In addition, the present invention provides examples of various specific processes and materials, but one of ordinary skill in the art will recognize the application of other processes and/or the use of other materials.
The preferred embodiments of the present invention will be described below with reference to the accompanying drawings, it being understood that the preferred embodiments described herein are for illustration and explanation of the present invention only, and are not intended to limit the present invention.
Referring first to fig. 1, a mapping method 100 according to a first embodiment of the present invention is described, which may be used, for example, to map a site.
In step S101, a coordinate system of the field is established or acquired. It is within the scope of the present invention that the coordinate system may be a physical coordinate system or a logical coordinate system. The definition of the coordinate system generally includes the position of the origin, the direction of the XY coordinate axis, and the like.
For example, a physical coordinate system can be established for the site to be positioned. The physical coordinate system uses common distance units, such as meters, decimeters and centimeters, as units of measurement, and values may be expressed in integer, decimal or fractional form, for example 1 meter, 1 decimeter, 1 centimeter, 0.55 meter, 0.2 decimeter, 1.4 centimeters or half a meter. The axes of the coordinate system are generally parallel to the building walls or aligned with the north, south, east and west directions. Coordinates established according to this principle are called a physical coordinate system in this system, as shown in fig. 2.
A coordinate system set according to the actual conditions of the business is called a logical coordinate system in this system. By way of example and not limitation, a logical coordinate system may differ from a physical coordinate system in several ways: logical coordinates are generally described by integers, such as (1, 2) or (5, 10); the orientation of the logical coordinate system does not necessarily coincide with that of the physical coordinate system; and the distance unit of the logical coordinate system is not necessarily a common physical unit, but is instead defined according to actual operational requirements. For example, in fig. 3, the logical coordinates of point A are (3, 7), those of point B are (3, 8), and those of point C are (4, 7), with the point at the lower left corner as the origin. If the spacing between adjacent logical positions is 1.35 meters, the physical coordinates of point A are (4.05, 9.45). The logical position and the physical position may therefore be completely identical, or may be related by a certain conversion. The reason for having logical positions is to facilitate the planning of business logic or to simplify map calculations. Taking shelf placement as an example, shelf positions are all saved as logical coordinates, such as (3, 7); if physical positions were used instead, descriptions such as (4.05, 9.45) would appear, which is very inconvenient for operators to understand and work with. When a physical position is needed, it can be obtained through a conversion relation: the conversion generally multiplies by a coefficient, called the logical position pitch, which may differ between the X and Y directions.
For example, if the shelves in the warehouse are 1.3 m by 1.3 m with a shelf spacing of 0.05 m, the logical position pitch can be defined as 1.35 m; if the shelves are 1.2 m by 1.0 m, the logical position pitch can be defined as 1.25 m in the X-axis direction and 1.05 m in the Y-axis direction, so that equipment that needs to perform physical positioning can find the shelf at the corresponding physical position. The above conversion is only a basic one; more complicated conversions, such as coordinate system rotation or nonlinear conversion, are not developed in detail here. The above description of a logical coordinate system is merely exemplary and not limiting: the logical coordinate system is simply a coordinate system set according to the actual business situation, and under the concept of the present invention a position parameter in the logical coordinate system is not limited to an integer but may also be a decimal. These all fall within the scope of the present invention. If the physical or logical coordinate system of the site has been established in advance, it can be acquired from a corresponding file or database. The following description takes a physical coordinate system as an example.
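The plain pitch-scaling conversion described above can be written down directly; per-axis pitches cover the 1.25 m / 1.05 m example, while rotation and nonlinear conversions are left out here, as in the text. The function names are illustrative.

```python
def logical_to_physical(lx, ly, pitch_x, pitch_y=None):
    """Convert a logical grid position to physical coordinates (metres).
    pitch_y defaults to pitch_x when the pitch is the same on both axes."""
    if pitch_y is None:
        pitch_y = pitch_x
    return lx * pitch_x, ly * pitch_y

def physical_to_logical(px, py, pitch_x, pitch_y=None):
    """Inverse conversion back to (possibly fractional) logical coordinates."""
    if pitch_y is None:
        pitch_y = pitch_x
    return px / pitch_x, py / pitch_y
```

With the 1.35 m pitch from fig. 3, `logical_to_physical(3, 7, 1.35)` gives approximately (4.05, 9.45), matching point A.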
In step S102, the site is scanned, and a picture of a calibration point (for the definition of a calibration point, see below), a picture of a position to be positioned (preferably a plurality of such pictures), and the position and attitude parameters corresponding to the calibration-point picture and the picture of the position to be positioned are acquired.
The field may be scanned, for example, using an automatic guided vehicle carrying the apparatus of the present invention (to be described below), to obtain a picture of the position to be located, a picture of the calibration point, and position and attitude parameters corresponding to the above two pictures. The position to be located here can be determined on the basis of the actual situation, for example, the position to be reached by the automated guided vehicle.
For example, the position parameters are the abscissa and ordinate of the picture in the physical coordinate system at a certain calibration point or position to be positioned (i.e. the horizontal position, such as the coordinates of the center of the picture or of one of its corners), and may be expressed as the horizontal and longitudinal distances relative to a base point. The attitude parameters are, for example, the angle at which the picture was acquired, such as its angle relative to the horizontal or vertical axis (i.e. the heading angle). According to a preferred embodiment of the present invention, the pitch angle, roll angle, vertical height and similar parameters corresponding to the picture (i.e. the pitch angle, roll angle and vertical height of the automated guided vehicle at the moment the picture was taken) may also be obtained. According to a preferred embodiment of the present invention, the inertial navigation measurement device carried by the automated guided vehicle of the present invention may be employed to provide the above data. Inertial navigation measurement devices include, for example, wheel encoders, accelerometers (1-3 axes), gyroscopes (1-3 axes), magnetic flux sensors (1-3 axes), barometric pressure sensors, and other measurement devices capable of feeding back heading angle, pitch angle, roll angle, horizontal position and vertical position.
From the data obtained by the wheel encoder, accelerometer, gyroscope, magnetic flux sensor and barometric pressure sensor, the heading angle (i.e. the angle of the picture relative to the horizontal or vertical axis), pitch angle, roll angle, horizontal position and vertical position are calculated. The resulting data are attached to the picture to form a seven-element data combination: the picture, the heading angle (i.e. the picture angle), the pitch angle, the roll angle, the horizontal position (i.e. the x-axis abscissa and y-axis ordinate), the vertical position, and whether it is a calibration point. This seven-element combination is called a connection point in this system, as shown in fig. 4, and serves as the input for subsequent mapping. Of course, those skilled in the art will appreciate that a connection point need not contain all of these data; for example, a four-element combination (picture, heading angle, horizontal position, whether it is a calibration point) can also achieve the objects of the invention. It should be noted that, according to a preferred embodiment of the present invention, collecting as many calibration-point pictures and corresponding position and attitude parameters as possible helps to establish a more accurate positioning map and to position more accurately; during collection, the same area may also be traversed and captured multiple times, making the positioning map more accurate. Of course, the protection scope of the present invention is not limited to coordinates in the physical coordinate system; coordinates in the logical coordinate system may also be used.
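One way to hold the seven-element data combination described above is a small record type; the field names below are illustrative assumptions, not identifiers from the patent.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ConnectionPoint:
    picture: bytes                   # the collected floor picture
    heading: float                   # heading angle (the picture angle)
    pitch: float                     # pitch angle at capture time
    roll: float                      # roll angle at capture time
    horizontal: Tuple[float, float]  # horizontal position (x abscissa, y ordinate)
    vertical: float                  # vertical position (height)
    is_calibration: bool             # whether it corresponds to a calibration point
```

The four-element variant mentioned in the text would keep only `picture`, `heading`, `horizontal` and `is_calibration`.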
The calibration points are points whose coordinates have been precisely determined. Points A, B and C indicated in fig. 3, whose coordinates have been confirmed, are artificially defined a priori.
An example of a calibration point is shown in fig. 5, where calibration point A has logical coordinates (5, 8) and physical coordinates (3.75, 4.10). Of course, in the present invention, calibration points are not limited to having both logical and physical coordinates. Various means may be employed to identify and confirm the calibration points. For example, one approach is to arrange a cross line on the image and mark the position on the cross line, so that the marked point and its position coordinates can be identified after the image is acquired. Another is to provide encoded information, such as a bar code or a two-dimensional code, at the point; after image acquisition the encoded information can be decoded by a program, and the decoded content is the position coordinates of the calibration point. According to one embodiment of the present invention, since the coordinates of a calibration point are confirmed in advance, in step S102 the a priori position parameters of the calibration point are adopted for the calibration-point picture instead of the position parameters measured by the inertial navigation measurement device.
In step S103, the position parameter and the posture parameter of the picture to be positioned are corrected based on the picture of the calibration point, the picture of the position to be positioned, the position parameter and the posture parameter.
For a picture of a position to be positioned, the position parameters and attitude parameters are obtained through measurement, for example by the inertial navigation measurement device, and under field working conditions measurement errors exist, so further correction is needed to improve their precision. The pictures of the calibration points serve as good references for correcting the position parameters and attitude parameters of the pictures of the positions to be positioned.
One embodiment of step S103 is described below with reference to fig. 6.
In step S1031, a set of connection points is constructed. As described above, each connection point comprises a seven-tuple data combination (picture, heading angle (i.e. picture angle), pitch angle, roll angle, horizontal position, vertical position, whether it is a calibration point) or a four-tuple data combination (picture, heading angle, horizontal position, whether it is a calibration point). These connection points are used to construct the set of connection points. Regarding the parameter "whether it is a calibration point": if a calibration point appears in the picture and the a priori position parameters of the calibration point are successfully obtained, the parameter is "calibration point"; otherwise the parameter is "non-calibration point". It may also be represented by a logic 0 or 1.
In step S1032, a set of connections is established and output based on the set of connection points. The set of connection points is input, and a pairing operation is performed according to the horizontal positions (i.e. the x- and y-axis coordinates) included in the connection points. The pairing principle is as follows: the distance between the marked positions of two pictures does not exceed a predetermined value, for example 50%, 30% or 20% of the length or width of the pictures. For example, if connection point A is at (0, 0) and connection point B is at (5, 0), the distance from A to B is 5; if the picture size is 10 x 10, then A and B meet the criterion of not more than 50% of the picture size and can form a pair. In the present system such a combination is called a connection, and each connection includes two connection points. All the connections that can be formed are output; in the present system this output is called the set of connections.
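A minimal sketch of this pairing rule, assuming Euclidean distance between marked picture positions and the 50% size criterion (all names are illustrative):

```python
# Pair connection points whose marked positions are no more than a fraction of
# the picture size apart; each qualifying pair forms a "connection".
import math
from itertools import combinations

def build_connections(positions, pic_size, max_frac=0.5):
    """positions: list of (x, y) marked positions; pic_size: picture side length."""
    limit = max_frac * pic_size
    return [(i, j) for i, j in combinations(range(len(positions)), 2)
            if math.dist(positions[i], positions[j]) <= limit]

# The A/B example from the text: distance 5 with a 10 x 10 picture qualifies.
pts = [(0, 0), (5, 0), (9, 9)]
conns = build_connections(pts, pic_size=10)
print(conns)  # [(0, 1)]
```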
In step S1033, the generated set of connections is input. For each connection in the set, the two connection points A and B in the connection are extracted; for convenience of description, connection point A is called the reference point and connection point B the neighboring point. With the reference point as the origin and the neighboring point as the offset, the reference-point picture and the neighboring-point picture are taken as inputs and, for example, a phase correlation method is performed to obtain a connection confidence (conf, representing the similarity between the two), an x-direction relative displacement (delta_x), a y-direction relative displacement (delta_y) and a rotation relative angle (theta). The 4-tuple formed by (conf, delta_x, delta_y, theta) is called the cross-correlation result in the present system, and the corresponding connection is put into a corresponding connection store. The connections whose confidence is greater than a certain threshold (for example 10, which can be understood to mean that the probability of the cross-correlation result occurring at random is smaller than the tail probability at the 10-sigma position of a normal distribution) are retained, and after this filtering a new set of connections containing the cross-correlation results is output, called the map-building connection set in the present system. The above connection confidence is output by the phase correlation method and is calculated from the sharpness of the peak, or from the distribution near the peak, of the phase correlation surface; if the distribution is normal, the confidence can be calculated from the peak value and the mean value. The cross-correlation result is obtained by calculating the correlation of the two pictures according to the phase correlation method above.
The execution of the phase correlation method involves a cross-power spectrum calculation. Using the cross-power spectrum function, the cross-correlation levels under different displacements can be obtained; assuming these levels follow a normal distribution, the parameters of that distribution can be estimated by statistical methods, and the connection confidence can be calculated from the maximum cross-correlation value and these parameters.
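As a rough sketch of this step (translation only; rotation estimation is omitted), the following uses the FFT-based normalized cross-power spectrum and a peak-over-mean confidence in the spirit of the normal-distribution reading above. This is an assumption-laden illustration, not the patent's implementation:

```python
# Phase correlation via the normalized cross-power spectrum: recovers the
# translational offset of one picture relative to another, plus a confidence
# measured as the correlation peak in standard deviations above the mean.
import numpy as np

def phase_correlate(ref, nbr):
    cps = np.conj(np.fft.fft2(ref)) * np.fft.fft2(nbr)
    cps /= np.abs(cps) + 1e-12            # keep phase only
    corr = np.real(np.fft.ifft2(cps))     # correlation surface
    py, px = np.unravel_index(np.argmax(corr), corr.shape)
    conf = (corr.max() - corr.mean()) / (corr.std() + 1e-12)
    # convert peak indices to signed displacements
    h, w = ref.shape
    dy = py if py <= h // 2 else py - h
    dx = px if px <= w // 2 else px - w
    return conf, dx, dy

rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = np.roll(a, shift=(3, 5), axis=(0, 1))  # nbr is ref shifted by (3, 5)
conf, dx, dy = phase_correlate(a, b)
print(dx, dy)  # 5 3
```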
According to one embodiment, the map-building connection set does not contain connections in which both points are calibration points.
As shown in fig. 7, the top layer is picture A and the bottom layer is picture B; their overlapping area is illustrated, and the overlap is calculated by phase correlation. For example, the cross-correlation calculated for the two pictures A and B in fig. 7 gives: confidence 131.542, x-direction relative displacement 33.4, y-direction relative displacement 10.7, rotation angle 0.3 degrees.
Fig. 8 shows a schematic diagram of a connection, including a reference point and an adjacent point.
In step S1034, a gradient descent method is executed on the map-building connection set, and the position parameters and attitude parameters of the pictures at the positions to be positioned are corrected. According to one embodiment, the abscissa, ordinate and angle of a calibration-point picture remain unchanged: the gradient adjustment takes only the parameters of the non-calibration-point pictures as variables, and the calibration-point pictures can be regarded as constants. Alternatively, the map-building connection set may be defined not to contain connections in which both points are calibration points, since such connections are meaningless for the adjustment: no adjustment would be made to them, and they are not solved when computing the gradient. The optimization function is, for example, as shown in equation 1:
Equation 1: E = Σ_{i=1}^{N} [ λ1·fθ(Ai,Bi)·vθ(Ri)·(gθ(Ai,Bi) − uθ(Ri))² + λ2·fx(Ai,Bi)·vx(Ri)·(gx(Ai,Bi) − ux(Ri))² + λ3·fy(Ai,Bi)·vy(Ri)·(gy(Ai,Bi) − uy(Ri))² ]

Equation 2: gθ(Ai,Bi) = θ(Ai) − θ(Bi)

Equation 3: gx(Ai,Bi) = x(Ai) − x(Bi)

Equation 4: gy(Ai,Bi) = y(Ai) − y(Bi)

Equation 5: uθ(Ri) = theta_i, the rotation relative angle in the cross-correlation result Ri

Equation 6: ux(Ri) = delta_x_i, the x-direction relative displacement in the cross-correlation result Ri

Equation 7: uy(Ri) = delta_y_i, the y-direction relative displacement in the cross-correlation result Ri
Wherein N represents the total number of connections in the map-building connection set, i denotes the i-th connection, Ai the reference point of the i-th connection, Bi its neighboring point, and Ri the cross-correlation result of the i-th connection. θ(Ai) represents the heading angle of the reference point and θ(Bi) the heading angle of the neighboring point. gθ(Ai,Bi) can be understood as the heading-angle difference between the reference point and the neighboring point under the inertial navigation measurement assembly, and gθ(Ai,Bi) − uθ(Ri) as the difference between that angle difference and the rotation relative angle in the cross-correlation result (the rotation angle in the cross-correlation result is the theta calculated by the phase correlation method, representing by how much the neighboring-point picture must be rotated to become parallel to the reference-point picture). Here fθ is a heading-angle weight function used, during the heading-angle fitting, to give connection points of different properties (for example, calibration points and non-calibration points) different weights in the map iteration (as one example, calibration points typically receive a large weight, e.g. 1000, and non-calibration points a small weight, e.g. 1). vθ is a weight function of the angle difference in the cross-correlation result, used to weight connections of different properties (e.g. a connection between two non-calibration points, or a connection between a calibration point and a non-calibration point) in the cross-correlation angle term: in principle the two points of a connection are equally reliable, but the degrees to which a calibration point and a non-calibration point may be adjusted are not equal, since a non-calibration point should change significantly more than a calibration point, and this is controlled through the weight. According to a preferred embodiment, for a connection between two non-calibration points the weight can be 1, so that both are adjusted on the same level; for a connection between a calibration point and a non-calibration point the weight may also be 1, since the calibration point is constant and does not participate in the gradient calculation, so its gradient may be considered unchanged. If fine tuning of the calibration points is considered, the weight ratio for a connection between a calibration point and a non-calibration point may be as high as 99 to 1.
The remaining formulas are described similarly: they respectively compute the difference between the x-axis coordinate difference under the inertial navigation measurement assembly and the x-direction relative displacement in the cross-correlation result, and the difference between the y-axis coordinate difference under the inertial navigation measurement assembly and the y-direction relative displacement in the cross-correlation result; the weight functions can be adjusted according to the service conditions and the algorithm adaptation. x(Ai) represents the x-axis coordinate of the reference point, x(Bi) the x-axis coordinate of the neighboring point, and ux(Ri) the x-direction relative displacement in the cross-correlation result. gx(Ai,Bi) can be understood as the x-coordinate difference between the reference point and the neighboring point under the inertial navigation measurement assembly, and gx(Ai,Bi) − ux(Ri) as the difference between that coordinate difference and the x-direction relative displacement in the cross-correlation result (the x-direction relative displacement in the cross-correlation result is the delta_x calculated by the phase correlation method, representing by how much the neighboring-point picture must be translated along the x direction to become aligned with the reference-point picture). Here fx is an x-axis weight function used, during the x-coordinate fitting, to give connection points of different properties (for example, calibration points and non-calibration points) different weights in the map iteration (as one example, calibration points typically receive a large weight, e.g. 1000, and non-calibration points a small weight, e.g. 1); vx is the adjustment weight of the x-direction relative displacement in the cross-correlation result, and may take a value of 1, for example.
y(Ai) represents the y-axis coordinate of the reference point, y(Bi) the y-axis coordinate of the neighboring point, and uy(Ri) the y-direction relative displacement in the cross-correlation result. gy(Ai,Bi) can be understood as the y-coordinate difference between the reference point and the neighboring point under the inertial navigation measurement assembly, and gy(Ai,Bi) − uy(Ri) as the difference between that coordinate difference and the y-direction relative displacement in the cross-correlation result (the y-direction relative displacement in the cross-correlation result is the delta_y calculated by the phase correlation method, representing by how much the neighboring-point picture must be translated along the y direction to become aligned with the reference-point picture). Here fy is a y-axis weight function used, during the y-coordinate fitting, to give connection points of different properties (for example, calibration points and non-calibration points) different weights in the map iteration (as one example, calibration points typically receive a large weight, e.g. 1000, and non-calibration points a small weight, e.g. 1); vy is the adjustment weight of the y-direction relative displacement in the cross-correlation result, and may take a value of 1, for example.
λ1, λ2 and λ3 represent the weights of the θ, x and y terms, respectively, in the final fitting result; some scenes are more sensitive to changes in θ, in which case λ1 can be increased. According to a preferred embodiment, λ1, λ2 and λ3 are all 1.
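Under the stated preferences (calibration points weighted 1000, non-calibration points 1, all v and λ weights 1), one connection's contribution to the fitting target can be sketched as a weighted sum of squared residuals. The squared form and all names here are assumptions for illustration:

```python
# Sketch of a per-connection fitting residual: differences between the
# inertial-navigation deltas g(.) and the cross-correlation offsets u(.),
# weighted by the point properties. Names are illustrative assumptions.
CALIB_WEIGHT, FREE_WEIGHT = 1000.0, 1.0

def point_weight(a, b):
    return CALIB_WEIGHT if (a["calib"] or b["calib"]) else FREE_WEIGHT

def residual(conn):
    a, b, r = conn["A"], conn["B"], conn["R"]
    w = point_weight(a, b)
    return (w * ((a["theta"] - b["theta"]) - r["theta"]) ** 2
            + w * ((a["x"] - b["x"]) - r["dx"]) ** 2
            + w * ((a["y"] - b["y"]) - r["dy"]) ** 2)

def total_cost(connections):
    return sum(residual(c) for c in connections)

# When inertial navigation and cross-correlation agree exactly, the cost is 0.
A = {"theta": 10.0, "x": 1.0, "y": 2.0, "calib": False}
B = {"theta": 7.0, "x": 0.0, "y": 0.0, "calib": False}
R = {"theta": 3.0, "dx": 1.0, "dy": 2.0}
print(total_cost([{"A": A, "B": B, "R": R}]))  # 0.0
```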
The independent variables of equation 1 are the position parameters and attitude parameters of the non-calibration-point pictures. By differentiating equation 1 with respect to these independent variables, the gradient descent direction of each independent variable, i.e. a set of gradients, is obtained for use in gradient descent.
An initialization step of the gradient descent method is executed, taking the position parameters and attitude parameters recorded by inertial navigation as the initial positions of the pictures. The inputs of the gradient descent method are the set from the last iteration, the gradients and a step size, wherein the gradients are obtained by differentiating equation 1, the initial iteration set is assigned from, for example, the position parameters and attitude parameters recorded by inertial navigation, and the step size is fixed or variable.
After the gradients and the initial iteration set are determined, a step of the given length is taken in the gradient direction to optimize equation 1. The step-size algorithm can be customized as required; the present system preferably performs gradient descent with a fixed step size. The process is repeated until the iterative rate of change is less than a set threshold; for example, the present system sets the threshold to 0.1%. The rate of change is, for example, the difference between the value obtained by the previous calculation and the value obtained by the current iteration, divided by the value obtained by the previous calculation. Finally, the physical coordinates and attitude parameters of each picture's base point (such as its center point) are obtained as the corrected position parameters and attitude parameters of the positions to be positioned.
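The fixed-step descent with the relative-change stopping rule can be sketched as follows on a stand-in quadratic cost (equation 1 itself is not reproduced; names are illustrative):

```python
# Fixed-step gradient descent that stops when the iterative rate of change of
# the cost drops below a threshold (0.1% in the text), or when the cost no
# longer changes at all.
def gradient_descent(x0, cost, grad, step=0.05, threshold=1e-3):
    x, prev = list(x0), cost(x0)
    while True:
        x = [xi - step * gi for xi, gi in zip(x, grad(x))]
        cur = cost(x)
        if cur == prev or (prev > 0 and abs(prev - cur) / prev < threshold):
            return x
        prev = cur

# Stand-in for equation 1: squared distance to a fixed optimum at (3, 4).
cost = lambda x: (x[0] - 3.0) ** 2 + (x[1] - 4.0) ** 2
grad = lambda x: [2.0 * (x[0] - 3.0), 2.0 * (x[1] - 4.0)]
sol = gradient_descent([0.0, 0.0], cost, grad)
```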
Note that the position parameters and the posture parameters of the calibration point picture do not change during the execution of the gradient descent method on the map-building connection set.
The gradient descent method described above is performed using the x-axis coordinates, y-axis coordinates, and heading angle of the picture. According to a preferred embodiment of the invention, it is also possible to include the vertical coordinates, pitch angle and roll angle corresponding to the pictures, which is very helpful, especially in the case of uneven ground. These are all within the scope of the present invention.
According to a preferred embodiment of the present invention, multiple picture acquisitions are performed at some or all of the calibration points, and the position parameters and attitude parameters corresponding to each acquisition are recorded. Acquiring the calibration-point pictures multiple times increases the number of connections and makes the iteration results more accurate.
According to a preferred embodiment of the present invention, the method further comprises: storing the coordinate system, the pictures of the calibration points, the pictures of the positions to be positioned, the position parameters and attitude parameters of the calibration-point pictures, and the corrected position parameters and attitude parameters of the pictures of the positions to be positioned into a database or a file, thereby establishing a map. According to a preferred embodiment, the set of connections and/or the map-building connection set are stored simultaneously into the database or file as part of the map. Fig. 9 shows a graphical representation of a map built in accordance with the present invention.
Preferably, manual checking and fine adjustment are performed on the iterated map, so that a stable mapping between the physical coordinate system and the logical coordinate system is completed for subsequent positioning.
An automated guided vehicle 10 for image acquisition according to another embodiment of the present invention is described below with reference to fig. 10. As shown in fig. 10, the interior components of the automated guided vehicle 10 are shown, while the housing and the like thereof are omitted for clarity. The automatic guided vehicle 10 includes: a base 6; a light emitting device 5-2 mounted on the base and configured to illuminate an area below the base; a camera 5-3 mounted on the base and configured to capture a picture of an area beneath the base, for example, an area illuminated by the light emitting device; and a measuring assembly 3 mounted on the base and configured to measure or calculate a position parameter and an attitude parameter of the automatic guided vehicle corresponding to the picture.
The driving wheel 1 is mounted on the stand 6 and comprises a motor, a speed reducer and an encoder, wherein the motor provides driving force, the speed reducer amplifies the driving force, and the encoder is used for acquiring the rotation angle of the motor, so that the horizontal position of the automatic guided vehicle or the driving wheel can be acquired. The driving wheel 2 and the driving wheel 1 are matched to complete motion control. The measuring assembly 3 is for example an inertial navigation measuring device, which can provide one or several of instantaneous speed, instantaneous angle, instantaneous position, for example abscissa, ordinate, vertical, heading angle, pitch angle and roll angle. According to one embodiment of the invention, the encoder of the drive wheel may also be part of the measuring assembly 3. The control device 4 is mounted on said base 6 and coupled to the measuring assembly 3 and the camera 5-3. The control device 4 is configured to control the trolley to travel to a marking point and a position to be positioned to acquire a picture of the marking point and a picture of the position to be positioned, and to be able to synchronize the camera 5-3 and the measuring assembly 3 such that the measuring assembly 3 is able to measure the position parameters and the posture parameters of the trolley, i.e. to obtain the position parameters and the posture parameters corresponding to the picture, while the camera is acquiring the picture.
The camera 5-3 is, for example, a down-view camera, and forms an image capturing device 5 together with a light emitting device 5-2 and a light shielding cover 5-1, wherein the camera 5-3 is used for capturing an image under an automatic guided vehicle, and the light emitting device 5-2 is mounted on a base for illuminating a shooting area of the down-view camera. The light shield 5-1 is mounted on the base for softening the light of the light emitting device and preventing the occurrence of the reflection phenomenon. The light emitting device is preferably mounted around the light shield.
According to a preferred embodiment of the invention, the automated guided vehicle 10 further comprises processing means (not shown) coupled to the camera 5-3 and to the measuring assembly 3 for receiving the picture acquired by the camera and the position and orientation parameters measured by the measuring assembly and for modifying the position and orientation parameters of the picture of the position to be located based on the picture and the position and orientation parameters. Those skilled in the art will appreciate that the processing means may be integrated into the automated guided vehicle 10 or may be physically separate from the automated guided vehicle and in communication with other components by wired or wireless means. These are all within the scope of the present invention.
According to a preferred embodiment of the present invention, the processing means corrects the position parameters and the orientation parameters of the picture of the position to be located by:
constructing a set of connection points, wherein each connection point comprises a picture, the position parameter and the posture parameter corresponding to the picture, and whether the picture corresponds to a calibration point;
acquiring two connection points with the distance not exceeding a preset value from the set of connection points as a connection, and establishing a set of connection;
calculating a connection confidence between the two connection points included in each connection in the set of connections, and retaining the connections whose connection confidence is higher than a preset threshold as a map-building connection set;
and executing a gradient descent method on the map-building connection set until the iteration change rate is lower than a preset threshold, wherein the position parameters and attitude parameters of the pictures of the non-calibration-point connection points are used as the initial iteration parameters when the initialization step of the gradient descent method is executed. The specific calculation process is shown in equations 1-7.
According to a preferred embodiment of the invention, the measuring component is an inertial navigation measuring component, the position parameters comprise an abscissa and an ordinate, preferably a vertical, and the attitude parameters comprise a heading angle, preferably a pitch angle and a roll angle.
According to a preferred embodiment of the invention, the measuring assembly comprises a laser SLAM measuring device and/or a vision SLAM measuring device.
According to a preferred embodiment of the invention, the processing means are configured to store the coordinate system, the picture of the setpoint, the picture of the position to be located, the position and orientation parameters of the picture of the setpoint, and the corrected position and orientation parameters of the picture of the position to be located in a database or in a file, creating a map.
The invention also provides an image acquisition and processing system, comprising: an automatic guided vehicle as described above; and a processing device in communication with the camera and the measurement assembly and configured to modify the position and orientation parameters of the picture based on the picture and the position and orientation parameters. Wherein the processing means is not provided on the automated guided vehicle, for example.
Wherein the processing means is for example configured to perform the mapping method 100 as described above.
The invention also provides a map building and positioning system for an automated guided vehicle, comprising: a camera configured to collect images below the automated guided vehicle; a light emitting device configured to illuminate the area below the automated guided vehicle; an inertial navigation measurement component configured to measure the position parameters and attitude parameters of the automated guided vehicle; and a processing device, coupled with the camera and the inertial navigation measurement component, that corrects the position parameters and attitude parameters of the pictures based on the images, the position parameters and the attitude parameters.
Wherein the processing means is for example configured to perform the mapping method 100 as described above.
The invention also provides an apparatus for mapping a field, comprising: means configured to establish or acquire a coordinate system of the field; means configured to scan the field and obtain a picture of a calibration point and a plurality of pictures of positions to be positioned, together with the position parameters and attitude parameters corresponding to the pictures; and means configured to correct the position parameters and attitude parameters of the pictures of the positions to be positioned based on the pictures, the position parameters and the attitude parameters.
The present invention also provides a positioning method 200 based on the map created by the method 100. A positioning method 200 according to the invention is described below with reference to fig. 11.
As shown in fig. 11, in step S201, loading or obtaining the map obtained by the method 100 of the present invention may be performed by, for example, loading or reading a map file or database.
In step S202, a picture of the position to be located and a position parameter and an attitude parameter corresponding to the picture are acquired or obtained. For example, during the operation of the AGV, a picture is acquired and simultaneously, the position parameter and the posture parameter corresponding to the picture are measured.
In step S203, in the map, the picture closest to the picture of the position to be located is retrieved.
According to a preferred embodiment of the present invention, the positioning method 200 further comprises: and calculating the confidence coefficient, the position parameter offset and the attitude parameter offset between the picture at the position to be positioned and the picture closest to the position to be positioned by using a phase correlation method.
According to a preferred embodiment of the present invention, when the confidence calculated using the phase correlation method is lower than a preset value, the closest picture is discarded, and the next closest picture (excluding discarded pictures) whose confidence is higher than the preset value is retrieved. When a picture is found that is closest and whose confidence is higher than the preset value, the position parameters of the picture to be positioned can be obtained by adding the offsets from the phase correlation method to the position of the retrieved picture; the positioning position of the device is then updated, i.e. the positioning is successful. After successful positioning, the next retrieval is based on the positioning position.
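The retrieval-with-fallback loop above can be sketched as follows; the names and the correlate callback are illustrative assumptions:

```python
# Sketch of nearest-picture retrieval with a confidence fallback: discard the
# closest stored picture if its phase-correlation confidence is too low, then
# try the next closest one.
import math

def locate(query_pos, map_entries, correlate, min_conf):
    """map_entries: dicts with a 'pos' (x, y); correlate(entry) -> (conf, dx, dy)."""
    for entry in sorted(map_entries,
                        key=lambda e: math.dist(e["pos"], query_pos)):
        conf, dx, dy = correlate(entry)
        if conf >= min_conf:
            # retrieved picture position plus the phase-correlation offset
            return (entry["pos"][0] + dx, entry["pos"][1] + dy)
    return None  # positioning failed

entries = [{"pos": (0.0, 0.0)}, {"pos": (2.0, 0.0)}]
# Fake correlator: the nearest entry fails the confidence test, the next passes.
fake = lambda e: (5.0, 0.1, 0.2) if e["pos"] == (0.0, 0.0) else (50.0, -0.5, 0.25)
print(locate((0.5, 0.0), entries, fake, min_conf=10.0))  # (1.5, 0.25)
```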
Fig. 12 is a block diagram of a computer program product 900 arranged in accordance with at least some embodiments of the invention. The signal bearing medium 902 may be implemented as or include a computer readable medium 906, a computer recordable medium 908, a computer communication medium 910, or a combination thereof, that stores programming instructions 904 that configure the processing unit to perform all or some of the previously described processes. The instructions may include, for example, one or more executable instructions for causing one or more processors to: establishing or acquiring a coordinate system of the field; scanning the field, and acquiring a picture of a calibration point, a picture of a position to be positioned, and a position parameter and a posture parameter corresponding to the picture; correcting the position parameter and the posture parameter of the picture to be positioned based on the picture of the calibration point, the picture of the position to be positioned, the position parameter and the posture parameter.
While the foregoing detailed description has set forth various examples of the apparatus and/or methods via the use of block diagrams, flowcharts, and/or examples, such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, and it will be appreciated by those skilled in the art that each function and/or operation within such block diagrams, flowcharts, or examples can be implemented, individually and/or collectively, in a wide range of hardware, software, firmware, or virtually any combination thereof. In one example, portions of the subject matter described herein may be implemented via an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), or other integrated format. However, those skilled in the art will recognize that some aspects of the examples disclosed herein, in whole or in part, can be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one of skill in the art in light of this disclosure. For example, if the user determines that speed and accuracy are paramount, the user may select a primary hardware and/or firmware vehicle; if flexibility is paramount, the user may select a primary software implementation; or, again alternatively, the user may select some combination of hardware, software, and/or firmware.
In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein are capable of being distributed as a program product in a variety of forms, and that an illustrative example of the subject matter described herein applies regardless of the particular type of signal bearing media used to actually carry out the distribution. Examples of signal bearing media include, but are not limited to, the following: recordable type media such as a floppy disk, a hard disk drive, a Compact Disk (CD), a Digital Video Disk (DVD), a digital tape, a computer memory, etc.; and transmission type media such as digital and/or analog communication media (e.g., fiber optic cable, waveguide, wired communications link, wireless communication link, etc.).
Those skilled in the art will recognize that it is common in the art to describe devices and/or methods in the manner set forth herein, and thereafter to integrate such described devices and/or methods into a data processing system using engineering practices. That is, at least a portion of the apparatus and/or methods described herein may be integrated into a data processing system via a reasonable amount of experimentation. Those skilled in the art will recognize that a typical data processing system will generally include one or more of the following: a system unit housing, a video display device, a memory such as volatile and non-volatile memory, a processor such as a microprocessor and a digital signal processor, a computing entity such as an operating system, a driver, a graphical user interface, and application programs, one or more interactive devices such as a touch pad or touch screen, and/or a control system including a feedback loop and a control motor (e.g., feedback for sensing position and/or velocity; control motor for moving and/or adjusting components and/or amounts). Typical data processing systems may be implemented using any suitable commercially available components, such as those commonly found in data computing/communication and/or network computing/communication systems.
Finally, it should be noted that the foregoing embodiments are merely preferred embodiments of the present invention and are not intended to limit it. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions described therein or replace some of the technical features with equivalents. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within its protection scope.

Claims (30)

1. A method of mapping a site, comprising:
establishing or acquiring a coordinate system of the site;
scanning the site, and acquiring a picture of a calibration point, a picture of a position to be positioned, and the position parameter and attitude parameter corresponding to each picture; and
correcting the position parameter and the attitude parameter of the picture of the position to be positioned based on the picture of the calibration point, the picture of the position to be positioned, and the position parameters and attitude parameters; wherein the step of correcting the position parameter and the attitude parameter of the picture of the position to be positioned comprises:
constructing a set of connection points, wherein each connection point comprises a picture, the position parameter and the attitude parameter corresponding to the picture, and a flag indicating whether the picture corresponds to a calibration point;
taking, from the set of connection points, any two connection points whose distance does not exceed a preset value as a connection, so as to establish a set of connections;
calculating a connection confidence between the two connection points of each connection in the set of connections, and retaining the connections whose connection confidence is higher than a preset threshold as a mapping connection set; and
correcting the position parameter and the attitude parameter of the picture of the position to be positioned based on the mapping connection set.
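As an illustration only (the patent discloses no code), the connection-building and confidence-filtering steps of the correcting step above might be sketched in Python as follows. `ConnectionPoint`, `build_connections`, and the caller-supplied `confidence` function are hypothetical names; a real confidence measure would come from image matching between the two pictures, such as phase correlation.

```python
import math
from dataclasses import dataclass
from itertools import combinations
from typing import Callable, List, Tuple

@dataclass
class ConnectionPoint:
    picture_id: str       # identifier of the captured picture
    x: float              # abscissa (position parameter)
    y: float              # ordinate (position parameter)
    heading: float        # heading angle (attitude parameter)
    is_calibration: bool  # whether the picture corresponds to a calibration point

Connection = Tuple[ConnectionPoint, ConnectionPoint]

def build_connections(points: List[ConnectionPoint],
                      preset_value: float) -> List[Connection]:
    """Take any two connection points whose distance does not exceed the
    preset value (e.g. half the picture width) as a connection."""
    return [(a, b) for a, b in combinations(points, 2)
            if math.hypot(a.x - b.x, a.y - b.y) <= preset_value]

def mapping_connection_set(connections: List[Connection],
                           confidence: Callable[[ConnectionPoint, ConnectionPoint], float],
                           threshold: float) -> List[Connection]:
    """Retain only the connections whose confidence exceeds the threshold."""
    return [(a, b) for a, b in connections if confidence(a, b) > threshold]
```

For example, three points at distances 1 and 10 from each other with a preset value of 2 yield a single connection, which survives filtering if its confidence exceeds the threshold.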
2. The method of claim 1, wherein the position parameters include an abscissa and an ordinate, and the attitude parameters include a heading angle.
3. The method of claim 1, wherein the position parameters further comprise a vertical coordinate, and the attitude parameters further comprise a pitch angle and a roll angle.
4. The method of claim 1, wherein the correcting step further comprises:
performing a gradient descent method on the mapping connection set, wherein, in the initialization step of the gradient descent method, the position parameters and attitude parameters of the pictures of the connection points that do not correspond to calibration points are used as the initial iteration parameters.
5. The method of claim 4, wherein the correcting step further comprises: performing the gradient descent method until an iteration change rate is lower than a preset threshold.
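A minimal sketch of the termination rule of claims 4 and 5, assuming a caller-supplied gradient of the mapping error; the function name, learning rate, and thresholds are illustrative, not taken from the patent:

```python
from typing import Callable, List

def descend(grad: Callable[[List[float]], List[float]],
            init_params: List[float],
            lr: float = 0.05,
            rate_threshold: float = 1e-6,
            max_iters: int = 10_000) -> List[float]:
    """Gradient descent that stops once the largest per-parameter update
    (the iteration change rate) drops below rate_threshold."""
    params = list(init_params)
    for _ in range(max_iters):
        step = [lr * g for g in grad(params)]
        params = [p - s for p, s in zip(params, step)]
        if max(abs(s) for s in step) < rate_threshold:
            break
    return params
```

Here `init_params` would hold the odometry-derived poses of the non-calibration connection points, per claim 4.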
6. The method of any one of claims 1-5, wherein pictures are acquired a plurality of times for some or all of the calibration points, and a position parameter and an attitude parameter corresponding to each acquisition are obtained.
7. The method of any one of claims 1-5, further comprising: storing the coordinate system, the picture of the calibration point, the picture of the position to be positioned, the position parameter and attitude parameter of the picture of the calibration point, and the corrected position parameter and attitude parameter of the picture of the position to be positioned into a database or a file, so as to establish a map.
8. The method of claim 7, wherein the step of establishing a map further comprises: storing the set of connections and/or the mapping connection set into the database or the file.
9. The method of any one of claims 1-5, wherein the coordinate system is a physical coordinate system.
10. The method of claim 1, wherein the preset value is half of the length or the width of the picture.
11. An automated guided vehicle for image acquisition, comprising:
a base;
a camera mounted on the base and configured to capture pictures of an area below the base;
a measurement assembly mounted on the base and configured to measure or calculate a position parameter and an attitude parameter of the automated guided vehicle corresponding to each picture;
a control device mounted on the base, the camera and the measurement assembly both being coupled to the control device, the control device being configured to control the automated guided vehicle to travel to a calibration point and a position to be positioned so as to capture a picture of the calibration point and a picture of the position to be positioned; and
a processing device coupled to the camera and the measurement assembly and configured to correct the position parameter and the attitude parameter of the picture of the position to be positioned based on the pictures, the position parameters and the attitude parameters; wherein the step of correcting the position parameter and the attitude parameter of the picture of the position to be positioned comprises:
constructing a set of connection points, wherein each connection point comprises a picture, the position parameter and the attitude parameter corresponding to the picture, and a flag indicating whether the picture corresponds to a calibration point;
taking, from the set of connection points, any two connection points whose distance does not exceed a preset value as a connection, so as to establish a set of connections;
calculating a connection confidence between the two connection points of each connection in the set of connections, and retaining the connections whose connection confidence is higher than a preset threshold as a mapping connection set; and
correcting the position parameter and the attitude parameter of the picture of the position to be positioned based on the mapping connection set.
12. The automated guided vehicle of claim 11, further comprising a light emitting device mounted on the base and configured to illuminate the area below the base for the camera to capture pictures.
13. The automated guided vehicle of claim 11, wherein the step of correcting the position parameter and the attitude parameter of the picture of the position to be positioned based on the mapping connection set comprises:
performing a gradient descent method on the mapping connection set until an iteration change rate is lower than a preset threshold, wherein, in the initialization step of the gradient descent method, the position parameters and attitude parameters of the pictures of the connection points that do not correspond to calibration points are used as the initial iteration parameters.
14. The automated guided vehicle of claim 12, further comprising a light shield mounted on the base for softening light emitted by the light emitting device.
15. The automated guided vehicle of claim 14, wherein the light emitting device is mounted around the light shield.
16. The automated guided vehicle of any of claims 11-15, wherein the measurement assembly is an inertial navigation measurement assembly.
17. The automated guided vehicle of any of claims 11-15, wherein the position parameters include an abscissa and an ordinate and the attitude parameters include a heading angle.
18. The automated guided vehicle of claim 17, wherein the position parameters further comprise a vertical coordinate, and the attitude parameters further comprise a pitch angle and a roll angle.
19. The automated guided vehicle of any of claims 11-15, wherein the measurement assembly comprises a laser SLAM measurement device and/or a vision SLAM measurement device.
20. The automated guided vehicle of claim 11, wherein the processing device is configured to store the coordinate system, the picture of the calibration point, the picture of the position to be positioned, the position parameter and attitude parameter of the picture of the calibration point, and the corrected position parameter and attitude parameter of the picture of the position to be positioned into a database or a file, so as to create a map.
21. The automated guided vehicle of claim 20, wherein the processing device is configured to store the set of connections and/or the mapping connection set into the database or file.
22. An image acquisition and processing system, comprising:
an automated guided vehicle, the automated guided vehicle comprising:
a base;
a camera mounted on the base and configured to capture pictures of an area below the base;
a measurement assembly mounted on the base and configured to measure or calculate a position parameter and an attitude parameter of the automated guided vehicle corresponding to each picture; and
a control device mounted on the base, the camera and the measurement assembly both being coupled to the control device, the control device being configured to control the automated guided vehicle to travel to a calibration point and a position to be positioned so as to capture a picture of the calibration point and a picture of the position to be positioned; and
a processing device coupled to the camera and the measurement assembly and configured to correct the position parameter and the attitude parameter of the picture of the position to be positioned based on the pictures, the position parameters and the attitude parameters; wherein the step of correcting the position parameter and the attitude parameter of the picture of the position to be positioned comprises:
constructing a set of connection points, wherein each connection point comprises a picture, the position parameter and the attitude parameter corresponding to the picture, and a flag indicating whether the picture corresponds to a calibration point;
taking, from the set of connection points, any two connection points whose distance does not exceed a preset value as a connection, so as to establish a set of connections;
calculating a connection confidence between the two connection points of each connection in the set of connections, and retaining the connections whose connection confidence is higher than a preset threshold as a mapping connection set; and
correcting the position parameter and the attitude parameter of the picture of the position to be positioned based on the mapping connection set.
23. The image acquisition and processing system of claim 22, wherein the processing device is configured to perform the method of mapping a site of any one of claims 1-10.
24. A mapping and positioning system for an automated guided vehicle, comprising:
a camera configured to capture images of an area below the automated guided vehicle;
a light emitting device configured to illuminate the area below the automated guided vehicle;
an inertial navigation measurement assembly configured to measure a position parameter and an attitude parameter of the automated guided vehicle; and
a processing device coupled to both the camera and the inertial navigation measurement assembly and configured to correct the position parameter and the attitude parameter of a picture of a position to be positioned based on the images, the position parameters and the attitude parameters; wherein the step of correcting the position parameter and the attitude parameter of the picture of the position to be positioned comprises:
constructing a set of connection points, wherein each connection point comprises a picture, the position parameter and the attitude parameter corresponding to the picture, and a flag indicating whether the picture corresponds to a calibration point;
taking, from the set of connection points, any two connection points whose distance does not exceed a preset value as a connection, so as to establish a set of connections;
calculating a connection confidence between the two connection points of each connection in the set of connections, and retaining the connections whose connection confidence is higher than a preset threshold as a mapping connection set; and
correcting the position parameter and the attitude parameter of the picture of the position to be positioned based on the mapping connection set.
25. The mapping and positioning system of claim 24, wherein the processing device is configured to perform the method of mapping a site of any one of claims 1-10.
26. An apparatus for mapping a site, comprising:
means configured to establish or acquire a coordinate system of the site;
means configured to scan the site and acquire a picture of a calibration point, a plurality of pictures of positions to be positioned, and the position parameters and attitude parameters corresponding to the pictures; and
means configured to correct the position parameter and the attitude parameter of each picture of a position to be positioned based on the pictures, the position parameters and the attitude parameters; wherein the step of correcting the position parameter and the attitude parameter of the picture of the position to be positioned comprises:
constructing a set of connection points, wherein each connection point comprises a picture, the position parameter and the attitude parameter corresponding to the picture, and a flag indicating whether the picture corresponds to a calibration point;
taking, from the set of connection points, any two connection points whose distance does not exceed a preset value as a connection, so as to establish a set of connections;
calculating a connection confidence between the two connection points of each connection in the set of connections, and retaining the connections whose connection confidence is higher than a preset threshold as a mapping connection set; and
correcting the position parameter and the attitude parameter of the picture of the position to be positioned based on the mapping connection set.
27. A positioning method, comprising:
loading or obtaining a map created by the method of any one of claims 1-10;
acquiring a picture of a position to be positioned and the position parameter and attitude parameter corresponding to the picture; and
retrieving, from the map, the picture closest to the picture of the position to be positioned.
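Illustratively, the "closest picture" retrieval above can be sketched as a ranking of map entries by distance between the odometry-estimated position and each entry's corrected position; the dictionary layout is an assumption for the sketch, not taken from the patent. Returning the full ranking also supports falling back to the next candidate when a match confidence is too low (cf. claim 29).

```python
import math
from typing import Dict, List

def rank_pictures(map_entries: List[Dict], x: float, y: float) -> List[Dict]:
    """Sort map entries by distance from the estimated position (x, y);
    the first entry is the closest picture, later entries serve as
    fallbacks when a candidate's match confidence turns out too low."""
    return sorted(map_entries,
                  key=lambda e: math.hypot(e["x"] - x, e["y"] - y))
```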
28. The positioning method of claim 27, further comprising: calculating, using a phase correlation method, a confidence, a position parameter offset and an attitude parameter offset between the picture of the position to be positioned and the closest picture.
29. The positioning method of claim 27, wherein, when the confidence calculated using the phase correlation method is lower than a preset value, the closest picture is discarded and the next closest picture whose confidence is higher than the preset value is retrieved.
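Claims 28 and 29 rely on phase correlation to measure both the offset and a confidence between the query picture and a retrieved one. Below is a standard FFT-based sketch for pure translation (the patented method may also recover rotation); the confidence here is simply the height of the correlation peak, which is one common choice but is an assumption of this sketch.

```python
import numpy as np

def phase_correlation(img_a: np.ndarray, img_b: np.ndarray):
    """Estimate the (dy, dx) shift taking img_a to img_b, plus a
    confidence given by the height of the correlation peak."""
    f_a = np.fft.fft2(img_a)
    f_b = np.fft.fft2(img_b)
    cross = np.conj(f_a) * f_b
    cross /= np.abs(cross) + 1e-12        # keep phase information only
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    confidence = float(corr[dy, dx])      # near 1.0 for a clean match
    h, w = img_a.shape
    if dy > h // 2:                       # wrap to signed offsets
        dy -= h
    if dx > w // 2:
        dx -= w
    return (int(dy), int(dx)), confidence
```

A circularly shifted copy of a texture image is recovered exactly, with confidence close to 1; an unrelated image produces a low, diffuse peak, which is the case claim 29 handles by falling back to the next candidate.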
30. A computer readable storage medium having computer executable instructions stored thereon which, when executed by a processor, implement the method of mapping a site of any one of claims 1-10.
CN201811475564.5A 2018-05-31 2018-12-04 Picture construction method, image acquisition and processing system and positioning method Active CN110006420B (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
PCT/CN2019/075741 WO2019154435A1 (en) 2018-05-31 2019-02-21 Mapping method, image acquisition and processing system, and positioning method
JP2019531677A JP6977921B2 (en) 2018-05-31 2019-02-21 Mapping method, image collection processing system and positioning method
JP2019531456A JP7083472B2 (en) 2018-05-31 2019-04-25 Map construction method, image collection processing system and positioning method
PCT/CN2019/084185 WO2019154444A2 (en) 2018-05-31 2019-04-25 Mapping method, image acquisition and processing system, and positioning method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201810551792 2018-05-31
CN201810551792X 2018-05-31
CN201820865300 2018-05-31
CN201820865300X 2018-05-31

Publications (2)

Publication Number Publication Date
CN110006420A CN110006420A (en) 2019-07-12
CN110006420B true CN110006420B (en) 2024-04-23

Family

ID=67174769

Family Applications (3)

Application Number Title Priority Date Filing Date
CN201811475564.5A Active CN110006420B (en) 2018-05-31 2018-12-04 Picture construction method, image acquisition and processing system and positioning method
CN201822023605.9U Active CN211668521U (en) 2018-05-31 2018-12-04 Automatic guide vehicle for image acquisition and processing system
CN201910303482.0A Active CN110189331B (en) 2018-05-31 2019-04-16 Mapping method, image acquisition and processing system and positioning method

Family Applications After (2)

Application Number Title Priority Date Filing Date
CN201822023605.9U Active CN211668521U (en) 2018-05-31 2018-12-04 Automatic guide vehicle for image acquisition and processing system
CN201910303482.0A Active CN110189331B (en) 2018-05-31 2019-04-16 Mapping method, image acquisition and processing system and positioning method

Country Status (2)

Country Link
JP (2) JP6977921B2 (en)
CN (3) CN110006420B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110006420B (en) * 2018-05-31 2024-04-23 上海快仓智能科技有限公司 Picture construction method, image acquisition and processing system and positioning method
CN112150907A (en) * 2019-10-23 2020-12-29 王博 Method for constructing map based on earth texture and application
CN112070810B (en) * 2020-08-31 2024-03-22 安徽爱观视觉科技有限公司 Positioning method, mobile device, and computer-readable storage medium
CN112465912B (en) * 2020-11-18 2024-03-29 新拓三维技术(深圳)有限公司 Stereo camera calibration method and device
CN112612788B (en) * 2020-12-11 2024-03-01 中国北方车辆研究所 Autonomous positioning method under navigation-free satellite signal
CN112835333B (en) * 2020-12-31 2022-03-15 北京工商大学 Multi-AGV obstacle avoidance and path planning method and system based on deep reinforcement learning
CN113029168B (en) * 2021-02-26 2023-04-07 杭州海康机器人股份有限公司 Map construction method and system based on ground texture information and mobile robot
CN115761311B (en) * 2022-11-03 2023-07-07 广东科力新材料有限公司 Performance detection data analysis method and system for PVC calcium zinc stabilizer

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1497244A * 2002-09-27 2004-05-19 佳能株式会社 Information processing method and information processing device
CN103955550A (en) * 2012-11-12 2014-07-30 洛克威尔自动控制技术股份有限公司 Method and device for computer aided design of man-machine interface cartoon graphic element
CN104835173A (en) * 2015-05-21 2015-08-12 东南大学 Positioning method based on machine vision
CN105389819A (en) * 2015-11-13 2016-03-09 武汉工程大学 Robust semi-calibrating down-looking image epipolar rectification method and system
CN105444741A (en) * 2015-12-17 2016-03-30 南京航空航天大学 Double view window based route characteristic identifying, deviation measuring, and accurate positioning method
CN106289285A (en) * 2016-08-20 2017-01-04 南京理工大学 Map and construction method are scouted by a kind of robot associating scene
CN106714110A (en) * 2017-01-19 2017-05-24 深圳大学 Auto building method and system of Wi-Fi position fingerprint map
CN106767854A (en) * 2016-11-07 2017-05-31 纵目科技(上海)股份有限公司 mobile device, garage map forming method and system
CN107146255A (en) * 2017-04-05 2017-09-08 纵目科技(上海)股份有限公司 Panoramic picture error calibration method and device
CN107392848A (en) * 2017-06-14 2017-11-24 江西科技师范大学 Panoramic image display method and device
CN107607110A (en) * 2017-07-29 2018-01-19 刘儿兀 A kind of localization method and system based on image and inertial navigation technique
CN211668521U (en) * 2018-05-31 2020-10-13 上海快仓智能科技有限公司 Automatic guide vehicle for image acquisition and processing system

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7774158B2 (en) * 2002-12-17 2010-08-10 Evolution Robotics, Inc. Systems and methods for landmark generation for visual simultaneous localization and mapping
KR20050103205A (en) * 2003-01-24 2005-10-27 코코 커뮤니케이션즈 코포레이션 Method and apparatus for secure communications and resource sharing between anonymous non-trusting parties with no central administration cross reference to related applications
CN101566471B (en) * 2007-01-18 2011-08-31 上海交通大学 Intelligent vehicular visual global positioning method based on ground texture
CN101354441A (en) * 2008-09-11 2009-01-28 上海交通大学 All-weather operating mobile robot positioning system
CN102201052B * 2010-03-26 2015-08-19 新奥特(北京)视频技术有限公司 A method for detecting the playing field in basketball broadcast video
CN102324102B (en) * 2011-10-08 2014-04-16 北京航空航天大学 Method for automatically filling structure information and texture information of hole area of image scene
CN102692188B (en) * 2012-05-08 2014-11-12 浙江工业大学 Dynamic crack length measurement method for machine vision fatigue crack propagation test
US9558588B2 (en) * 2012-06-26 2017-01-31 Schlumberger Technology Corporation Method for building a 3D model of a rock sample
CN102866397B (en) * 2012-10-12 2014-10-01 中国测绘科学研究院 Combined positioning method for multisource heterogeneous remote sensing image
WO2015084533A1 (en) * 2013-12-05 2015-06-11 Schlumberger Canada Limited Digital core model construction
JP6637980B2 (en) * 2014-12-09 2020-01-29 ビーエーエスエフ ソシエタス・ヨーロピアBasf Se Optical detector
CN104732545B (en) * 2015-04-02 2017-06-13 西安电子科技大学 The texture image segmenting method with quick spectral clustering is propagated with reference to sparse neighbour
CN105043383A (en) * 2015-07-10 2015-11-11 哈尔滨医科大学 Posture correction method and apparatus
WO2017106106A1 (en) * 2015-12-15 2017-06-22 Leica Biosystems Imaging, Inc. Automatic nuclear segmentation
CN105426872B (en) * 2015-12-17 2019-06-21 电子科技大学 A kind of facial age estimation method returned based on correlated Gaussian process
CN107918499B (en) * 2016-10-09 2022-09-06 北京墨土科技有限公司 Optical positioning system and method, optical observation equipment for positioning
CN107103293B (en) * 2017-04-13 2019-01-29 西安交通大学 It is a kind of that the point estimation method is watched attentively based on joint entropy
CN106996777B (en) * 2017-04-21 2019-02-12 合肥井松自动化科技有限公司 A kind of vision navigation method based on ground image texture
CN107492105A (en) * 2017-08-11 2017-12-19 深圳市旭东数字医学影像技术有限公司 A kind of variation dividing method based on more statistical informations
CN107966638B (en) * 2017-12-29 2020-09-11 国网北京市电力公司 Method and apparatus for correcting error, storage medium, and processor

Also Published As

Publication number Publication date
CN211668521U (en) 2020-10-13
CN110006420A (en) 2019-07-12
CN110189331B (en) 2022-08-05
JP6977921B2 (en) 2021-12-08
CN110189331A (en) 2019-08-30
JP2020532775A (en) 2020-11-12
JP2020530598A (en) 2020-10-22
JP7083472B2 (en) 2022-06-13

Similar Documents

Publication Publication Date Title
CN110006420B (en) Picture construction method, image acquisition and processing system and positioning method
CN102538779B (en) Robot system and map updating method
CN111415387B (en) Camera pose determining method and device, electronic equipment and storage medium
US10044996B2 (en) Method for projecting virtual data and device enabling this projection
CN111121754A (en) Mobile robot positioning navigation method and device, mobile robot and storage medium
CN109443392B (en) Navigation error determination method and device, navigation control method, device and equipment
CN112363158B (en) Pose estimation method for robot, robot and computer storage medium
CN112964276B (en) Online calibration method based on laser and vision fusion
CN109863547B (en) Apparatus for constructing map using machine learning and image processing
CN111750804B (en) Object measuring method and device
CN111862215B (en) Computer equipment positioning method and device, computer equipment and storage medium
US20130135463A1 (en) Information processing apparatus, information processing method and computer-readable storage medium
CN115205128A (en) Depth camera temperature drift correction method, system, equipment and medium based on structured light
CN112529957A (en) Method and device for determining pose of camera device, storage medium and electronic device
JP5728564B2 (en) Robot system and map updating method
CN116091724A (en) Building digital twin modeling method
CN117824667B (en) Fusion positioning method and medium based on two-dimensional code and laser
CN117824666B (en) Two-dimensional code pair for fusion positioning, two-dimensional code calibration method and fusion positioning method
JP2018084699A (en) Map preparation device
CN116295508A (en) Road side sensor calibration method, device and system based on high-precision map
WO2019154435A1 (en) Mapping method, image acquisition and processing system, and positioning method
JP2000121354A (en) Range finding method
CN113537351A (en) Remote sensing image coordinate matching method for mobile equipment shooting
CN112651393A (en) Method, device and equipment for processing point of interest data and storage medium
JP2015144013A (en) robot system and map update method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant