WO2023000528A1 - Map positioning method and apparatus, computer-readable storage medium, and terminal device - Google Patents

Map positioning method and apparatus, computer-readable storage medium, and terminal device

Info

Publication number
WO2023000528A1
Authority
WO
WIPO (PCT)
Prior art keywords
map
identification code
image
target moving
camera
Prior art date
Application number
PCT/CN2021/126719
Other languages
English (en)
Chinese (zh)
Inventor
林灿然
程骏
张惊涛
郭渺辰
庞建新
Original Assignee
深圳市优必选科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市优必选科技股份有限公司
Publication of WO2023000528A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/20 Image enhancement or restoration by the use of local operators
    • G06T5/30 Erosion or dilatation, e.g. thinning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30244 Camera pose

Definitions

  • the present application belongs to the technical field of positioning, and in particular relates to a map positioning method, device, computer-readable storage medium and terminal equipment.
  • in the prior art, a moving device such as a small cart can be controlled to travel along black lines on white ground through infrared line-following.
  • while moving, the cart continuously emits infrared light toward the ground.
  • when the infrared light hits the white ground, diffuse reflection occurs and the reflected light is received by a receiving tube on the cart; when it hits a black line, the infrared light is absorbed and the receiving tube receives no reflection.
  • the single-chip microcomputer system on the cart then determines the position of the black line, and hence the cart's route, according to whether reflected infrared light is received.
  • the embodiments of the present application provide a map positioning method, device, computer-readable storage medium, and terminal equipment, to solve the problem that the prior art cannot effectively locate the cart, so that the cart can only travel along a preset fixed route, which is extremely inflexible.
  • the first aspect of the embodiments of the present application provides a map positioning method, which may include:
  • the determining of the corner position of the outline of the map in the global image may include:
  • the determining of the scale of the global image according to the corner position and the preset actual map size may include:
  • the mean value of the width ratio and the height ratio is determined as the scale of the global image.
  • the determining of the actual position of the target mobile device in the map according to the center point of the identification code of the target moving device, the corner position of the outline of the map, the height of the camera device from the map, and the scale may include:
  • An actual position of the target mobile device in the map is determined based on a second position of the target mobile device in the map and the scale.
  • the correcting of the first position according to the height of the camera device from the map and the preset height of the target mobile device, to obtain the second position of the target mobile device in the map, may include:
  • the first position is corrected according to the following formula:
  • (x*, y*) is the coordinate of the first position
  • h_m is the height of the camera device from the map
  • h_c is the preset height of the target moving device
  • (x, y) is the coordinate of the second position.
  • the determining of the actual position of the target mobile device in the map according to the second position of the target mobile device in the map and the scale may include:
  • the actual position of the target mobile device in the map is calculated according to the following formula:
  • (x, y) is the coordinate of the second position
  • s is the scale
  • W is the width of the actual map size
  • H is the height of the actual map size
  • (x_m, y_m) is the coordinate of the actual position.
  • the acquisition of the global image of the target moving device traveling on the map through the preset camera device may include:
  • a map positioning device which may include:
  • the image acquisition module is used to obtain the global image of the target moving device traveling on the map through the preset camera device; the camera device is located at the top of the map; the target moving device and the map are provided with corresponding identification codes;
  • a map identification code detection module configured to detect the identification code of the map in the global image, and determine the height of the camera device from the map according to the detection result of the identification code of the map;
  • a corner position determining module configured to determine the corner position of the outline of the map in the global image
  • a scale determination module configured to determine the scale of the global image according to the corner position and the preset actual map size
  • a device identification code detection module configured to detect the identification code of the target moving device in the global image, and determine the center point of the identification code of the target moving device according to the detection result of the identification code of the target moving device;
  • a position determination module configured to determine the actual position of the target mobile device on the map according to the center point of the identification code of the target mobile device, the corner position of the outline of the map, the height of the camera device from the map, and the scale.
  • the corner position determining module may include:
  • a grayscale unit configured to perform grayscale processing on the global image to obtain a grayscale image
  • a Gaussian blur unit configured to perform Gaussian blur processing on the grayscale image to obtain a Gaussian blurred image
  • an expansion unit configured to perform expansion processing on the Gaussian blurred image to obtain an expanded image
  • an edge detection unit configured to perform edge detection on the dilated image to obtain an edge of the map
  • a contour detection unit configured to perform contour detection on the edge of the map to obtain the contour of the map
  • the corner positioning unit is configured to perform corner positioning on the outline of the map to obtain corner positions of the outline of the map.
  • the scale determination module may include:
  • a pixel size determining unit configured to determine the pixel size of the map in the global image according to the corner position
  • a width ratio calculation unit configured to calculate a width ratio between the actual map size and the pixel size
  • a height ratio calculation unit configured to calculate the height ratio between the actual map size and the pixel size
  • a scale determining unit configured to determine an average value of the width ratio and the height ratio as the scale of the global image.
  • the position determination module may include:
  • a first position determining unit configured to determine a first position of the target mobile device in the map according to the center point of the identification code of the target mobile device and the corner position of the outline of the map;
  • the second position determination unit is configured to correct the first position according to the height of the camera device from the map and the preset height of the target mobile device, to obtain the position of the target mobile device in the map second position;
  • An actual position determining unit configured to determine the actual position of the target mobile device in the map according to the second position of the target mobile device in the map and the scale.
  • the second position determining unit is specifically configured to correct the first position according to the following formula:
  • (x*, y*) is the coordinate of the first position
  • h_m is the height of the camera device from the map
  • h_c is the preset height of the target moving device
  • (x, y) is the coordinate of the second position.
  • the actual position determination unit is specifically configured to calculate the actual position of the target mobile device in the map according to the following formula:
  • (x, y) is the coordinate of the second position
  • s is the scale
  • W is the width of the actual map size
  • H is the height of the actual map size
  • (x_m, y_m) is the coordinate of the actual position.
  • the image acquisition module may include:
  • a camera calibration unit configured to perform camera calibration on the camera device to obtain camera internal parameters and distortion coefficients of the camera device
  • an original image acquisition unit configured to acquire a global original image of the target moving device traveling on the map through the camera device
  • a de-distortion unit configured to perform de-distortion processing on the global original image according to the internal camera parameters and the distortion coefficients, to obtain the global image.
  • a third aspect of the embodiments of the present application provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the steps of any one of the above-mentioned map positioning methods are implemented.
  • the fourth aspect of the embodiments of the present application provides a terminal device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of any one of the above map positioning methods when executing the computer program.
  • a fifth aspect of the embodiments of the present application provides a computer program product, which, when the computer program product is run on a terminal device, causes the terminal device to execute the steps of any one of the above map positioning methods.
  • the beneficial effects of the embodiments of the present application are as follows: a camera device is preset above the map on which the target moving device travels, and corresponding identification codes are set on both the target moving device and the map; the global image of the target moving device traveling on the map is obtained through the camera device; the identification code of the map is detected in the global image, and the height of the camera device from the map is determined according to the detection result; the corner positions of the map's outline are determined in the global image, and the scale of the global image is determined according to the corner positions and the preset actual map size; the identification code of the target moving device is detected in the global image, and the center point of that identification code is determined according to the detection result; finally, the actual position of the target moving device in the map is determined according to the center point of the identification code, the corner positions of the map outline, the height of the camera from the map, and the scale, thereby realizing precise positioning of the target moving device.
  • Fig. 1 is a flow chart of an embodiment of a map positioning method in the embodiment of the present application
  • Fig. 2 is the schematic flowchart of obtaining the global image of the target moving device moving on the map through the preset camera device;
  • Fig. 3 is an example diagram of the identification code of the map
  • Fig. 4 is a schematic flowchart of determining the corner position of the outline of the map in the global image
  • Fig. 5 is a schematic flowchart of determining the scale of the global image according to the corner position and the preset actual map size
  • Fig. 6 is a schematic flowchart of determining the actual position of the target mobile device in the map
  • Fig. 7 is a schematic diagram of the trigonometric relationship used for position correction
  • FIG. 8 is a structural diagram of an embodiment of a map positioning device in the embodiment of the present application.
  • FIG. 9 is a schematic block diagram of a terminal device in an embodiment of the present application.
  • the term “if” may be construed as “when”, “once”, “in response to determining”, or “in response to detecting”, depending on the context.
  • similarly, the phrases “if determined” or “if [the described condition or event] is detected” may be construed, depending on the context, to mean “once determined”, “in response to the determination”, “once [the described condition or event] is detected”, or “in response to detection of [the described condition or event]”.
  • an embodiment of a map positioning method in the embodiment of the present application may include:
  • Step S101 acquiring a global image of the target moving device traveling on the map through a preset camera device.
  • the target motion device may be various motion devices such as a car, a drone, or a robot.
  • the map refers to the ground area where the target mobile device travels.
  • a square sand table can be constructed in advance, and the target mobile device travels on the sand table and performs various specified tasks, then the sand table is a map.
  • the camera device is located on the top of the map.
  • the camera device can be installed on the ceiling of the room where the map is located, so that the camera device can overlook the entire map from top to bottom and collect a complete image of the map.
  • Corresponding identification codes are provided on the target moving device and the map to facilitate its identification and positioning.
  • the identification code can be any code pattern that can be identified and located in the prior art.
  • the embodiments of the present application take a two-dimensional code as an example of the identification code.
  • step S101 may specifically include the process shown in Figure 2:
  • Step S1011 perform camera calibration on the camera device, and obtain camera intrinsic parameters and distortion coefficients of the camera device.
  • the specific camera calibration method used can be set according to the actual situation.
  • Zhang Zhengyou's checkerboard calibration method can preferably be used to obtain the camera intrinsic matrix (denoted as mtx) and distortion coefficients (denoted as dist) of the camera device.
  • Step S1012 acquire the global original image of the target moving device traveling on the map through the camera device.
  • the global original image is the original image directly collected by the camera device, without de-distortion processing.
  • each frame of image in the video stream is a global original image.
  • Step S1013 performing de-distortion processing on the global original image according to the internal camera parameters and distortion coefficients to obtain a global image.
  • the embodiment of the present application may perform de-distortion processing on each frame of the global original image.
  • in OpenCV, the main functions used for distortion correction of distorted images are the UndistortImage function, and the initUndistortRectifyMap function combined with the remap function.
  • UndistortImage is simply a combination of initUndistortRectifyMap and remap, with the same effect.
  • however, the distortion coordinate mapping matrices mapx and mapy only need to be calculated once; repeatedly calling UndistortImage recalculates mapx and mapy every time, which seriously affects processing efficiency.
  • therefore, initUndistortRectifyMap can be called only once to obtain the distortion coordinate mapping matrices mapx and mapy as input to the remap function, and the remap function is then called repeatedly to correct each frame of the video stream.
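  • for illustration, a minimal Python/OpenCV sketch of this calibrate-once, remap-per-frame scheme follows; the checkerboard size, file names, and frame resolution are assumptions for the example, not values from the patent:

    import cv2
    import numpy as np

    # One-off calibration with Zhang Zhengyou's checkerboard method.
    pattern = (9, 6)  # inner-corner grid of the checkerboard (assumed)
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

    obj_pts, img_pts = [], []
    for path in ["calib_00.png", "calib_01.png", "calib_02.png"]:  # hypothetical files
        gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
    _, mtx, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, gray.shape[::-1], None, None)

    # Compute the distortion coordinate mapping matrices mapx/mapy only once...
    w, h = 1920, 1080  # assumed stream resolution
    mapx, mapy = cv2.initUndistortRectifyMap(mtx, dist, None, mtx, (w, h), cv2.CV_32FC1)

    # ...then call remap cheaply for every frame of the video stream.
    def undistort(frame):
        return cv2.remap(frame, mapx, mapy, cv2.INTER_LINEAR)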
  • Step S102 Detect the identification code of the map in the global image, and determine the height of the camera device from the map according to the detection result of the identification code of the map.
  • an identification code may be pasted on the upper left corner of the map in advance.
  • taking the two-dimensional code as an example, its parameters, such as the number of grids, the side length of the two-dimensional code, and the label (denoted as id), can be set according to the actual situation.
  • Figure 3 shows an example: a two-dimensional code with a 4×4 grid, a side length of 100 mm, and an id of 1.
  • the aruco library of OpenCV can be used to detect the QR code of the map in the global image.
  • after detection, the coordinates of the four corners of the two-dimensional code and its id (which identifies the current two-dimensional code) are obtained; the corner coordinates are then input to OpenCV's aruco.estimatePoseSingleMarkers function, which solves a PnP (Perspective-n-Point) problem to obtain the rotation vector (denoted as rvec) and translation vector (denoted as tvec) of the two-dimensional code.
  • the third component of tvec is the translation along the z-axis, which represents the height of the camera from the map (denoted as map_h).
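  • a sketch of this detection step, assuming the aruco module (opencv-contrib) with a DICT_4X4_50 dictionary, a 100 mm marker as in Figure 3, a map-marker id of 1, and the mtx/dist obtained from calibration; the marker length's unit determines the unit of map_h:

    import cv2

    global_image = undistort(cv2.imread("frame.png"))  # hypothetical de-distorted frame
    aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    corners, ids, _ = cv2.aruco.detectMarkers(global_image, aruco_dict)

    marker_len = 0.100  # marker side length: 100 mm expressed in metres
    if ids is not None and 1 in ids.ravel():  # id of the map's marker (assumed 1)
        i = list(ids.ravel()).index(1)
        rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
            [corners[i]], marker_len, mtx, dist)
        map_h = float(tvecs[0][0][2])  # z translation = height of camera above map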
  • Step S103 determine the corner position of the outline of the map in the global image, and determine the scale of the global image according to the corner position and the preset actual map size.
  • the corner position of the outline of the map can be determined through the process shown in Figure 4:
  • Step S1031 performing grayscale processing on the global image to obtain a grayscale image.
  • the cvtColor function of OpenCV can be used to perform color space conversion (between BGR, HSV, GRAY, and other color spaces) on the global image, with the parameter COLOR_BGR2GRAY for conversion to grayscale.
  • Step S1032 performing Gaussian blur processing on the grayscale image to obtain a Gaussian blurred image.
  • Gaussian blur is a low-pass filter for images, which can make the image blurred and smooth, and is usually used to reduce image noise and reduce the level of detail.
  • the GaussianBlur function of OpenCV can be used to perform Gaussian blur processing on the image.
  • Step S1033 performing dilation processing on the Gaussian blurred image to obtain the dilated image.
  • the dilate function of OpenCV can be used to dilate the image. Because the map is square, its edges may lose pixels after de-distortion and become blurred. Dilation therefore slightly enlarges the map edges and fills in small holes along them, preventing edge breaks, which benefits subsequent edge detection.
  • Step S1034 performing edge detection on the dilated image to obtain the edge of the map.
  • the Canny function of OpenCV can be used to perform edge detection on the image, and finally an output binary image is obtained, which contains the edges existing in the image, from which the map edges can be obtained.
  • Step S1035 performing contour detection on the edge of the map to obtain the contour of the map.
  • the findContours function of OpenCV can be used to find the contour
  • the parameter cv2.RETR_EXTERNAL indicates that only the outer contour is detected, that is, the outer circle contour of the map required by the embodiment of the present application.
  • a contour (denoted as contour) is represented by a series of points, but not all points on the contour are stored; only enough points to describe the contour with straight segments are kept. For the map, only 4 points are needed to describe its outline, and these 4 points are the corner points of the outline of the map.
  • among all detected contours, the outline of the map is the largest, so sorting the contours and taking the largest one yields the outline of the map.
  • the outline of the map can be approximated by the arcLength and approxPolyDP functions of OpenCV, and drawn using drawContours.
  • the arcLength function is used to calculate the perimeter of the closed contour or the length of the curve
  • approxPolyDP(cnt, epsilon, True) is used to obtain an approximation of the contour, where cnt is the input contour; epsilon is the approximation threshold T, usually taken as a fraction of the contour's perimeter; and True means the contour is closed.
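  • a sketch of the whole corner-extraction pipeline of steps S1031–S1036 up to the polygon approximation; the kernel sizes, Canny thresholds, and the 0.02 perimeter factor are illustrative assumptions:

    import cv2
    import numpy as np

    gray = cv2.cvtColor(global_image, cv2.COLOR_BGR2GRAY)        # S1031
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)                  # S1032
    dilated = cv2.dilate(blurred, np.ones((3, 3), np.uint8))     # S1033
    edges = cv2.Canny(dilated, 50, 150)                          # S1034

    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,     # S1035
                                   cv2.CHAIN_APPROX_SIMPLE)
    largest = max(contours, key=cv2.contourArea)  # the map is the largest contour

    peri = cv2.arcLength(largest, True)           # perimeter of the closed contour
    approx = cv2.approxPolyDP(largest, 0.02 * peri, True)  # ideally 4 corner points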
  • Step S1036, perform corner location on the outline of the map to obtain the corner position of the outline of the map.
  • corner positioning is to arrange the positions of the four corner points, generally in the clockwise direction of upper left, upper right, lower right and lower left.
  • the sum of the abscissa and ordinate of the four corner points can be calculated separately, with the smallest sum being the upper left corner, and the largest sum being the lower right corner, as shown in the following formula:
  • i represents the index of the corner point
  • x_i is the abscissa of the i-th corner point
  • y_i is the ordinate of the i-th corner point
  • S_i is the sum of the abscissa and ordinate of the i-th corner point, S_i = x_i + y_i
  • argmax returns the index of the maximum value
  • argmin returns the index of the minimum value
  • id is the resulting order of the sorted corner points
  • D_i is the absolute value of the difference between the abscissa and ordinate of the i-th corner point.
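  • the ordering rule can be sketched as below; note one assumption: the text above speaks of the absolute difference D_i, but the signed difference y_i − x_i is used here, since that is what distinguishes the upper-right corner from the lower-left one:

    import numpy as np

    def order_corners(pts):
        # pts: (4, 2) array of corner coordinates in arbitrary order.
        s = pts.sum(axis=1)        # S_i = x_i + y_i
        d = pts[:, 1] - pts[:, 0]  # signed difference y_i - x_i
        tl = pts[np.argmin(s)]     # smallest sum        -> upper left
        br = pts[np.argmax(s)]     # largest sum         -> lower right
        tr = pts[np.argmin(d)]     # smallest difference -> upper right
        bl = pts[np.argmax(d)]     # largest difference  -> lower left
        return np.array([tl, tr, br, bl], dtype=np.float32)

    corners4 = order_corners(approx.reshape(4, 2).astype(np.float32))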
  • determining the scale of the global image according to the corner position and the actual map size may specifically include the following steps:
  • Step S1037 Determine the pixel size of the map in the global image according to the corner position.
  • the length of the top side of the map in the global image (denoted as w1, in pixels) can be obtained from the upper-left and upper-right corner points, and the length of the bottom side in the global image (denoted as w2, in pixels) from the lower-left and lower-right corner points; the mean w of w1 and w2 is taken as the width of the map in the global image.
  • similarly, the length of the left side of the map in the global image (denoted as h1, in pixels) can be obtained from the upper-left and lower-left corner points, and the length of the right side in the global image (denoted as h2, in pixels) from the upper-right and lower-right corner points; the mean h of h1 and h2 is taken as the height of the map in the global image.
  • Step S1038 calculate the width ratio between the actual map size and the pixel size, and calculate the height ratio between the actual map size and the pixel size, as shown in the following formulas: sw = W / w and sh = H / h,
  • where W and H are the width and height of the actual map, respectively, sw is the width ratio of the actual map size to the pixel size, and sh is the height ratio of the actual map size to the pixel size.
  • Step S1039 determining the mean value of the width ratio and the height ratio as the scale of the global image.
  • at this point, the map detection step is complete: the four corner points of the map's outline have been obtained, along with the actual size of the map and the scale of the global image.
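  • a sketch of steps S1037–S1039, with an assumed actual map size of 2000 mm × 2000 mm for illustration:

    import numpy as np

    tl, tr, br, bl = corners4                # ordered corners from the sketch above
    w1 = np.linalg.norm(tr - tl)             # top side, in pixels       (S1037)
    w2 = np.linalg.norm(br - bl)             # bottom side, in pixels
    h1 = np.linalg.norm(bl - tl)             # left side, in pixels
    h2 = np.linalg.norm(br - tr)             # right side, in pixels
    w, h = (w1 + w2) / 2.0, (h1 + h2) / 2.0  # pixel size of the map

    W, H = 2000.0, 2000.0                    # assumed actual map size, in mm
    sw, sh = W / w, H / h                    # width and height ratios   (S1038)
    s = (sw + sh) / 2.0                      # scale of the global image (S1039)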
  • Step S104 Detect the identification code of the target mobile device in the global image, and determine the center point of the identification code of the target mobile device according to the detection result of the identification code of the target mobile device.
  • the identification code of the target mobile device may be pasted on the upper surface of the target mobile device to facilitate the camera device to capture, and the identification code may be used to represent the position of the target mobile device.
  • the process of detecting the identification code of the target moving device is similar to the process of detecting the identification code of the map in step S102.
  • the coordinates and id of the four corner points are again obtained through the aruco.detectMarkers function of OpenCV, but this time there is no need to call OpenCV's aruco.estimatePoseSingleMarkers function to solve PnP for the rotation vector rvec and translation vector tvec; instead, the coordinates of the center point of the identification code of the target moving device are calculated directly from the coordinates of the four corner points. For example, the mean of the coordinates of the upper-left and lower-right corner points is taken, and this mean is the coordinate of the center point of the identification code of the target moving device.
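  • a sketch of this step, assuming the device's marker has id 2 and reusing the corners/ids returned by aruco.detectMarkers in step S102:

    import numpy as np

    idx = list(ids.ravel()).index(2)    # index of the device marker (id 2 assumed)
    quad = corners[idx][0]              # (4, 2): upper-left, upper-right,
                                        #         lower-right, lower-left corners
    center = (quad[0] + quad[2]) / 2.0  # mean of upper-left and lower-right corners
    x_c, y_c = float(center[0]), float(center[1])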
  • Step S105 Determine the actual position of the target mobile device on the map according to the center point of the identification code of the target mobile device, the corner position of the outline of the map, the height of the camera device from the map, and the scale.
  • step S105 may specifically include the process shown in Figure 6:
  • Step S1051. Determine the first position of the target mobile device in the map according to the center point of the identification code of the target mobile device and the corner position of the outline of the map.
  • specifically, the intersection of the top side and the left side of the map can be used as the origin of the map coordinate system.
  • the positive direction of the x-axis runs from left to right along the top side, and the positive direction of the y-axis runs from top to bottom along the left side.
  • T_l = (y_0 - y_3)*x_c + (x_3 - x_0)*y_c + x_0*y_3 - x_3*y_0
  • T_t is the analogous expression for the top side, obtained by replacing the left side's corner points with the top side's corner points
  • (x_0, y_0) and (x_3, y_3) are the coordinates of the two corner points of the left side
  • x_c is the abscissa of the center point of the identification code of the target moving device
  • y_c is the ordinate of the center point of the identification code of the target moving device.
  • in this way, the distances of the target moving device from the two sides can be calculated separately.
  • each distance is calculated from the center point of the identification code of the target moving device and two points on the side: the distance from a point to a straight line can be converted into the height of a triangle, and that height can be calculated from the vector cross product, which gives twice the triangle's area, divided by the length of the base.
  • the specific calculation is shown in the following formula:
  • d = np.abs(np.cross(vec_1, vec_2)) / np.linalg.norm(P_1 - P_2)
  • P_1 represents the first point of the side
  • P_2 represents the second point of the side
  • P_m represents the center point of the identification code of the target moving device
  • vec_1 represents the vector from P_m to P_1
  • vec_2 represents the vector from P_m to P_2
  • np.abs is the absolute-value function
  • np.cross is the vector cross product function
  • np.linalg.norm is the norm function
  • d is the distance from the target moving device to the side in question.
  • the first position of the target mobile device in the map can be determined.
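  • a sketch of the first-position computation, reusing the ordered map corners and the marker center from the previous sketches:

    import numpy as np

    def point_line_distance(p_m, p1, p2):
        # |cross(vec1, vec2)| is twice the area of the triangle (p_m, p1, p2);
        # dividing by the base length |p1 - p2| gives the triangle height,
        # i.e. the perpendicular distance from p_m to the line through p1, p2.
        vec1, vec2 = p1 - p_m, p2 - p_m
        return np.abs(np.cross(vec1, vec2)) / np.linalg.norm(p1 - p2)

    p_m = np.array([x_c, y_c])
    x_star = point_line_distance(p_m, tl, bl)  # distance to the left side
    y_star = point_line_distance(p_m, tl, tr)  # distance to the top side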
  • Step S1052 correcting the first position according to the height of the camera device from the map and the preset height of the target mobile device to obtain a second position of the target mobile device on the map.
  • the position correction can be performed according to the trigonometric function relationship in Figure 7, and the specific calculation formula is as follows:
  • (x*, y*) is the coordinate of the first position
  • h_m is the height of the camera device from the map
  • h_c is the preset height of the target moving device, that is, the height from the center point of the identification code of the target moving device to the map
  • (x, y) is the coordinate of the second position.
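  • the correction formula itself appears only as an image in the source; a plausible reconstruction from the similar-triangles relationship of Figure 7 is sketched below, under the assumption that the coordinates are measured from the point on the map directly beneath the camera — an assumption, not the patent's verbatim formula:

    def correct_height(x_star, y_star, h_m, h_c):
        # The ray from the camera through the elevated marker center hits the
        # map plane beyond the marker's true ground position; similar triangles
        # shrink the observed coordinates by (h_m - h_c) / h_m.
        k = (h_m - h_c) / h_m
        return x_star * k, y_star * k

    x, y = correct_height(x_star, y_star, map_h, 0.12)  # 0.12 m device height (assumed)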
  • Step S1053. Determine the actual position of the target mobile device on the map according to the second position of the target mobile device on the map and the scale.
  • the actual position of the target mobile device in the map can be calculated according to the following formula:
  • (x_m, y_m) are the coordinates of the actual position of the target moving device in the map coordinate system. At this point, map detection, construction of the map coordinate system, and calculation of the actual position of the target moving device in the map coordinate system have all been completed. For each frame of image, the actual position of the target moving device in the map coordinate system can be obtained in real time. On this basis, the target moving device can be controlled to complete various complex tasks, rather than merely performing simple line-following along a preset fixed route, which gives extremely high flexibility.
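  • the conversion formula is likewise an image in the source; given the listed definitions, the natural reconstruction is to multiply the second position by the scale s, and clamping to the map extent [0, W] × [0, H] is an assumption suggested by W and H appearing in the definitions:

    def to_map_coords(x, y, s, W, H):
        x_m = min(max(s * x, 0.0), W)  # scale, then clamp into [0, W]
        y_m = min(max(s * y, 0.0), H)  # scale, then clamp into [0, H]
        return x_m, y_m

    x_m, y_m = to_map_coords(x, y, s, W, H)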
  • to sum up, in the embodiments of the present application, a camera device is set in advance above the map on which the target moving device travels, and corresponding identification codes are set on both the target moving device and the map; the global image of the target moving device traveling on the map is obtained through the camera device; the identification code of the map is detected in the global image, and the height of the camera device from the map is determined according to the detection result; the corner positions of the map's outline are determined in the global image, and the scale of the global image is determined according to the corner positions and the preset actual map size; the identification code of the target moving device is detected in the global image, and the center point of that identification code is determined according to the detection result; and the actual position of the target moving device in the map is determined according to the center point of the identification code, the corner positions of the map outline, the height of the camera device from the map, and the scale, so that precise positioning of the target moving device can be realized.
  • on this basis, the target moving device can be controlled to complete various complex tasks.
  • FIG. 8 shows a structural diagram of an embodiment of a map positioning device provided in the embodiment of the present application.
  • a map positioning device may include:
  • the image acquisition module 801 is used to obtain a global image of the target moving device traveling on the map through a preset camera device; the camera device is located at the top of the map; the target moving device and the map are provided with corresponding identification codes;
  • a map identification code detection module 802 configured to detect the identification code of the map in the global image, and determine the height of the camera device from the map according to the detection result of the identification code of the map;
  • a corner position determination module 803, configured to determine the corner position of the outline of the map in the global image
  • a scale determination module 804 configured to determine the scale of the global image according to the corner position and the preset actual map size
  • a device identification code detection module 805, configured to detect the identification code of the target moving device in the global image, and determine the center point of the identification code of the target moving device according to the detection result of the identification code of the target moving device ;
  • a position determination module 806, configured to determine the actual position of the target mobile device on the map according to the center point of the identification code of the target mobile device, the corner position of the outline of the map, the height of the camera device from the map, and the scale.
  • the corner position determination module may include:
  • a grayscale unit configured to perform grayscale processing on the global image to obtain a grayscale image
  • a Gaussian blur unit configured to perform Gaussian blur processing on the grayscale image to obtain a Gaussian blurred image
  • an expansion unit configured to perform expansion processing on the Gaussian blurred image to obtain an expanded image
  • an edge detection unit configured to perform edge detection on the dilated image to obtain an edge of the map
  • a contour detection unit configured to perform contour detection on the edge of the map to obtain the contour of the map
  • the corner positioning unit is configured to perform corner positioning on the outline of the map to obtain corner positions of the outline of the map.
  • the scale determination module may include:
  • a pixel size determining unit configured to determine the pixel size of the map in the global image according to the corner position
  • a width ratio calculation unit configured to calculate a width ratio between the actual map size and the pixel size
  • a height ratio calculation unit configured to calculate the height ratio between the actual map size and the pixel size
  • a scale determining unit configured to determine an average value of the width ratio and the height ratio as the scale of the global image.
  • the location determination module may include:
  • a first position determining unit configured to determine a first position of the target mobile device in the map according to the center point of the identification code of the target mobile device and the corner position of the outline of the map;
  • the second position determination unit is configured to correct the first position according to the height of the camera device from the map and the preset height of the target mobile device, to obtain the position of the target mobile device in the map second position;
  • An actual position determining unit configured to determine the actual position of the target mobile device in the map according to the second position of the target mobile device in the map and the scale.
  • the second position determining unit is specifically configured to correct the first position according to the following formula:
  • (x*, y*) is the coordinate of the first position
  • h_m is the height of the camera device from the map
  • h_c is the preset height of the target moving device
  • (x, y) is the coordinate of the second position.
  • the actual position determining unit is specifically configured to calculate the actual position of the target moving device in the map according to the following formula:
  • (x, y) is the coordinate of the second position
  • s is the scale
  • W is the width of the actual map size
  • H is the height of the actual map size
  • (x_m, y_m) is the coordinate of the actual position.
  • the image acquisition module may include:
  • a camera calibration unit configured to perform camera calibration on the camera device to obtain camera internal parameters and distortion coefficients of the camera device
  • an original image acquisition unit configured to acquire a global original image of the target moving device traveling on the map through the camera device
  • a de-distortion unit configured to perform de-distortion processing on the global original image according to the internal camera parameters and the distortion coefficients, to obtain the global image.
  • FIG. 9 shows a schematic block diagram of a terminal device provided by an embodiment of the present application. For ease of description, only parts related to the embodiment of the present application are shown.
  • the terminal device 9 of this embodiment includes: a processor 90 , a memory 91 , and a computer program 92 stored in the memory 91 and operable on the processor 90 .
  • when the processor 90 executes the computer program 92, the steps in the above embodiments of the map positioning method are implemented, for example, steps S101 to S105 shown in FIG. 1.
  • alternatively, when the processor 90 executes the computer program 92, the functions of the modules/units in the above-mentioned device embodiments are realized, for example, the functions of the modules 801 to 806 shown in FIG. 8.
  • the computer program 92 can be divided into one or more modules/units, and the one or more modules/units are stored in the memory 91 and executed by the processor 90 to complete this application.
  • the one or more modules/units may be a series of computer program instruction segments capable of accomplishing specific functions, and the instruction segments are used to describe the execution process of the computer program 92 in the terminal device 9 .
  • the terminal device 9 may be a computing device such as a mobile phone, a tablet computer, a desktop computer, a notebook, a palmtop computer, and a robot.
  • FIG. 9 is only an example of the terminal device 9 and does not constitute a limitation on the terminal device 9; it may include more or fewer components than shown, combine certain components, or use different components.
  • the terminal device 9 may also include an input and output device, a network access device, a bus, and the like.
  • the processor 90 can be a central processing unit (Central Processing Unit, CPU), and can also be other general-purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), Field-Programmable Gate Array (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc.
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • the memory 91 may be an internal storage unit of the terminal device 9, such as a hard disk or memory of the terminal device 9.
  • the memory 91 may also be an external storage device of the terminal device 9, such as a plug-in hard disk, a smart memory card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card, or a flash memory card (Flash Card) equipped on the terminal device 9. Further, the memory 91 may also include both an internal storage unit of the terminal device 9 and an external storage device.
  • the memory 91 is used to store the computer program and other programs and data required by the terminal device 9 .
  • the memory 91 can also be used to temporarily store data that has been output or will be output.
  • the disclosed apparatus/terminal device and method may be implemented in other ways.
  • the device/terminal device embodiments described above are only illustrative.
  • the division of the modules or units is only a logical function division.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or may be distributed to multiple network units. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.
  • if the integrated module/unit is realized in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the above embodiments of the present application can also be completed by instructing related hardware through a computer program.
  • the computer program can be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps in the above-mentioned method embodiments can be realized.
  • the computer program includes computer program code, and the computer program code may be in the form of source code, object code, executable file or some intermediate form.
  • the computer-readable storage medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM, Read-Only Memory) ), Random Access Memory (RAM, Random Access Memory), electrical carrier signal, telecommunication signal, and software distribution medium, etc.
  • the content contained in the computer-readable storage medium can be appropriately increased or decreased according to the requirements of legislation and patent practice in the jurisdiction.
  • for example, in some jurisdictions, according to legislation and patent practice, computer-readable storage media do not include electrical carrier signals and telecommunication signals.

Abstract

The present application belongs to the technical field of positioning, and in particular relates to a map positioning method and apparatus, a computer-readable storage medium, and a terminal device. The method comprises: acquiring, by means of a camera apparatus, a global image of a target moving device traveling on a map; detecting an identification code of the map in the global image, and determining the height of the camera apparatus from the map according to the detection result of the identification code of the map; determining the corner positions of the outline of the map in the global image, and determining the scale of the global image according to the corner positions and a preset actual map size; detecting an identification code of the target moving device in the global image, and determining the center point of the identification code of the target moving device according to its detection result; and determining the actual position of the target moving device on the map according to the center point of the identification code of the target moving device, the corner positions of the outline of the map, the height of the camera apparatus from the map, and the scale.
PCT/CN2021/126719 2021-07-23 2021-10-27 Map positioning method and apparatus, computer-readable storage medium and terminal device WO2023000528A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110837305.8 2021-07-23
CN202110837305.8A CN113628273B (zh) 2021-07-23 2021-07-23 地图定位方法、装置、计算机可读存储介质及终端设备

Publications (1)

Publication Number Publication Date
WO2023000528A1 true WO2023000528A1 (fr) 2023-01-26

Family

ID=78380728

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/126719 WO2023000528A1 (fr) 2021-07-23 2021-10-27 Map positioning method and apparatus, computer-readable storage medium and terminal device

Country Status (2)

Country Link
CN (1) CN113628273B (fr)
WO (1) WO2023000528A1 (fr)


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6975425B1 (en) * 1998-02-26 2005-12-13 Canon Kabushiki Kaisha Information processing apparatus and information processing method
CN101726283A (zh) * 2009-12-24 2010-06-09 北京测科空间信息技术有限公司 航空摄影测量大比例尺测图标志方法
CN107766855B (zh) * 2017-10-25 2021-09-07 南京阿凡达机器人科技有限公司 基于机器视觉的棋子定位方法、系统、存储介质及机器人
CN109872372B (zh) * 2019-03-07 2021-04-09 山东大学 一种小型四足机器人全局视觉定位方法和系统
CN110989687B (zh) * 2019-11-08 2021-08-10 上海交通大学 一种基于嵌套正方形视觉信息的无人机降落方法

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140192658A1 (en) * 2013-01-04 2014-07-10 Qualcomm Incorporated Dynamic selection of positioning system and display map
CN104792312A (zh) * 2014-01-20 2015-07-22 广东工业大学 以定距三球为视觉标志物的室内自动运输车定位系统
CN106153050A (zh) * 2016-08-27 2016-11-23 杭州国辰牵星科技有限公司 一种基于信标的室内定位系统和方法
CN107689063A (zh) * 2017-07-27 2018-02-13 南京理工大学北方研究院 一种基于天花板图像的机器人室内定位方法
CN107992793A (zh) * 2017-10-20 2018-05-04 深圳华侨城卡乐技术有限公司 一种室内定位方法、装置以及存储介质
CN109784250A (zh) * 2019-01-04 2019-05-21 广州广电研究院有限公司 自动引导小车的定位方法和装置
CN111426325A (zh) * 2020-06-12 2020-07-17 北京云迹科技有限公司 定位方法及装置、机器人、存储介质及定位系统
CN111968177A (zh) * 2020-07-22 2020-11-20 东南大学 一种基于固定摄像头视觉的移动机器人定位方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JI XU: "Research on Location Method of Mobile Robot Based on Monocular Vision", CHINA MASTER'S' THESES FULL-TEXT DATABASE (ELECTRONIC JOURNAL)-INFORMATION & TECHNOLOGY), TIANJIN POLYTECHNIC UNIVERSITY, CN, no. 3, 15 March 2018 (2018-03-15), CN , XP093026245, ISSN: 1674-0246 *

Also Published As

Publication number Publication date
CN113628273B (zh) 2023-12-15
CN113628273A (zh) 2021-11-09

Similar Documents

Publication Publication Date Title
US9740967B2 (en) Method and apparatus of determining air quality
CN109785291B (zh) 一种车道线自适应检测方法
WO2019242416A1 (fr) Procédé et appareil de traitement d'image vidéo, support d'informations lisible par ordinateur et dispositif électronique
WO2021233266A1 (fr) Procédé et appareil de détection de bord et dispositif électronique et support de stockage
WO2022237811A1 (fr) Procédé et appareil de traitement d'image et dispositif
CN110930411B (zh) 一种基于深度相机的人体分割方法及系统
WO2022062238A1 (fr) Procédé et appareil de détection de ballon de football, support de stockage lisible par ordinateur, et robot
CN111192293A (zh) 一种运动目标位姿跟踪方法及装置
WO2021147113A1 (fr) Procédé d'identification de catégorie sémantique de plan et appareil de traitement de données d'image
CN109685827B (zh) 一种基于dsp的目标检测与跟踪方法
CN105046278B (zh) 基于Haar特征的Adaboost检测算法的优化方法
CN110751620A (zh) 估算体积和重量的方法、电子设备及计算机可读存储介质
WO2022205843A1 (fr) Procédé et appareil de détection de mouvement de lèvre, dispositif terminal et support de stockage lisible par ordinateur
CN113627428A (zh) 文档图像矫正方法、装置、存储介质及智能终端设备
CN111626295A (zh) 车牌检测模型的训练方法和装置
CN110781823A (zh) 录屏检测方法、装置、可读介质及电子设备
CN112364865A (zh) 一种复杂场景中运动小目标的检测方法
CN111127358A (zh) 图像处理方法、装置及存储介质
CN113781523A (zh) 一种足球检测跟踪方法及装置、电子设备、存储介质
WO2023000528A1 (fr) Procédé et appareil de positionnement par cartes, support de stockage lisible par ordinateur et dispositif terminal
CN112257607B (zh) 一种处理流水线上采集的手机图像畸变的矫正方法
CN112446353A (zh) 基于深度卷积神经网络的视频图像道线检测方法
CN111008634B (zh) 一种基于实例分割的字符识别方法及字符识别装置
CN108682021A (zh) 快速手部跟踪方法、装置、终端及存储介质
CN115908774B (zh) 一种基于机器视觉的变形物资的品质检测方法和装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21950766

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE