CN112927298A - Target object positioning method and device, electronic equipment and storage medium

Target object positioning method and device, electronic equipment and storage medium

Info

Publication number
CN112927298A
Authority
CN
China
Prior art keywords
straight line
end point
target object
line segment
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110213647.2A
Other languages
Chinese (zh)
Inventor
苏至钒
潘晶
夏知拓
金宇赢
冯义兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Timi Robot Co ltd
Original Assignee
Shanghai Timi Robot Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Timi Robot Co ltd filed Critical Shanghai Timi Robot Co ltd
Priority to CN202110213647.2A
Publication of CN112927298A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10032Satellite or aerial image; Remote sensing
    • G06T2207/10044Radar image

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application provides a target object positioning method and device, an electronic device and a storage medium. The method comprises the following steps: acquiring first point cloud data of a laser radar and depth information acquired by a depth camera; projecting the depth information into the laser radar coordinate system to obtain second point cloud data; performing straight line detection on the first point cloud data and the second point cloud data respectively to obtain a plurality of corresponding first straight line segments and second straight line segments; and determining the boundary of the target object according to the distances between the end points of the second straight line segments and the end points of parallel lines contained in the plurality of first straight line segments. The method and device can locate the target position without a two-dimensional code or an auxiliary guide mechanism, greatly improving the alignment success rate.

Description

Target object positioning method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of robotics, and in particular, to a method and an apparatus for positioning a target object, an electronic device, and a computer-readable storage medium.
Background
As robots become more popular, they are used in an increasingly wide range of scenarios. In particular, AGV (Automated Guided Vehicle) transport robots are increasingly used in the fields of material handling and garbage transport.
Generally, when a transport robot carries goods or garbage, the container or garbage can is placed directly on the robot for transportation, which requires the AGV to drive underneath the container or garbage can. The AGV therefore needs to be aligned with the container or garbage can. At present, alignment is generally achieved by means of a two-dimensional code or an auxiliary guide mechanism, so the AGV cannot be aligned in the absence of such auxiliary facilities.
Disclosure of Invention
The embodiment of the application provides a target object positioning method, which is used for improving the alignment success rate.
The embodiment of the application provides a method for positioning a target object, which comprises the following steps:
acquiring first point cloud data of the laser radar and depth information acquired by a depth camera;
projecting the depth information to a laser radar coordinate system to obtain second point cloud data;
respectively carrying out straight line detection on the first point cloud data and the second point cloud data to obtain a plurality of corresponding first straight line segments and a plurality of corresponding second straight line segments;
and determining the boundary of the target object according to the distance between the end point of the second straight line segment and the end points of parallel lines contained in the plurality of first straight line segments.
In an embodiment, the step of determining the boundary of the target object according to the distance between the end point of the second straight line segment and the end point of the parallel line included in the plurality of first straight line segments includes:
acquiring image information of a calibration pattern acquired by the depth camera;
converting the image information of the calibration pattern into a robot coordinate system to obtain the coordinate position of the calibration pattern;
selecting a target straight-line segment of the target object from the plurality of second straight-line segments by using the coordinate position of the calibration pattern;
and determining the boundary of the target object according to the distance between the end point of the target straight line segment and the end points of parallel lines contained in the plurality of first straight line segments.
In an embodiment, the step of selecting a target straight-line segment of the target object from the plurality of second straight-line segments by using the coordinate position of the calibration pattern includes:
according to the coordinate position of the calibration pattern, estimating theoretical coordinates of the target edge of the target object under a robot coordinate system;
and comparing the theoretical coordinates with actual coordinates of the second straight-line segments respectively, and selecting the second straight-line segments with the difference between the actual coordinates and the theoretical coordinates smaller than a preset value as target straight-line segments.
In an embodiment, the step of determining the boundary of the target object according to the distances between the end points of the target straight line segment and the end points of parallel lines included in the plurality of first straight line segments includes:
calculating a first distance between the coordinates of the starting point of the target straight-line segment and the first end point coordinates and a second distance between the coordinates of the end point of the target straight-line segment and the second end point coordinates according to the first end point coordinates and the second end point coordinates of any parallel line;
and if the first distance and the second distance are both smaller than a first threshold value, determining that the parallel line and the target straight-line segment belong to the boundary of the target object.
In an embodiment, before the step of determining the boundary of the target object according to the distances between the end points of the second straight line segments and the end points of the parallel lines included in the plurality of first straight line segments, after the step of performing straight line detection on the first point cloud data and the second point cloud data respectively to obtain the plurality of corresponding first straight line segments and the plurality of second straight line segments, the method further includes the steps of:
and calculating the slope of each first straight line segment according to the coordinates of its starting point and end point in the robot coordinate system, and screening out, as a group of parallel lines, two first straight line segments that have the same slope and whose starting point and end point coordinates are within a preset range.
In an embodiment, the step of determining the boundary of the target object according to the distance between the end point of the second straight line segment and the end point of the parallel line included in the plurality of first straight line segments includes:
and respectively calculating the distance between the end point of each group of parallel lines and the end point of each second straight line section according to the parallel lines contained in the plurality of first straight line sections, and taking the parallel lines and the second straight line sections with the distance smaller than a second threshold value as the boundary of the target object.
In an embodiment, the step of calculating, according to parallel lines included in the plurality of first straight line segments, a distance between an end point of each group of the parallel lines and an end point of each second straight line segment, and using the parallel lines and the second straight line segments whose distances are smaller than a second threshold as boundaries of the target object includes:
calculating a first distance between the start point coordinate and the first end point coordinate of each second straight line segment and a second distance between the end point coordinate of each second straight line segment and the second end point coordinate according to the first end point coordinate and the second end point coordinate of any parallel line;
and if the first distance and the second distance are both smaller than a second threshold value, determining that the parallel line and the second straight line segment belong to the boundary of the target object.
In an embodiment, the method further comprises:
and taking the coordinates of the middle points of the parallel lines belonging to the boundary as a robot navigation target point so that the robot can navigate according to the robot navigation target point.
The embodiment of the application provides a positioning device of a target object, the device comprises:
the data acquisition module is used for acquiring first point cloud data of the laser radar and depth information acquired by the depth camera;
the data conversion module is used for projecting the depth information to a laser radar coordinate system to obtain second point cloud data;
the line detection module is used for respectively carrying out line detection on the first point cloud data and the second point cloud data to obtain a plurality of corresponding first line segments and a plurality of corresponding second line segments;
and the boundary determining module is used for determining the boundary of the target object according to the distance between the end point of the second straight line segment and the end points of the parallel lines contained in the plurality of first straight line segments.
An embodiment of the present application further provides an electronic device, where the electronic device includes:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform any one of the above-mentioned target object positioning methods.
An embodiment of the present application further provides a computer-readable storage medium, where the storage medium stores a computer program, and the computer program is executable by a processor to perform any one of the above-mentioned target object positioning methods.
According to the technical scheme provided by the above embodiments of the application, straight line detection is performed on the first point cloud data of the laser radar to obtain first straight line segments, and straight line detection is performed on the second point cloud data corresponding to the depth camera to obtain second straight line segments. By calculating the distances between the end points of the parallel lines contained in the first straight line segments and the end points of the second straight line segments, the parallel lines and the straight line segment that belong to the boundary of the target object can be determined, which facilitates alignment when carrying the target object. The application can thus locate the target position without a two-dimensional code or an auxiliary guide mechanism, greatly improving the alignment success rate.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required to be used in the embodiments of the present application will be briefly described below.
Fig. 1 is a block diagram of a target object positioning system provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 3 is a schematic flowchart of a target object positioning method according to an embodiment of the present application;
fig. 4 is a schematic diagram illustrating the principle of hough line detection according to an embodiment of the present application;
FIG. 5 is a detailed flowchart of step S340 in the corresponding embodiment of FIG. 3;
FIG. 6 is a schematic diagram of a target object shown in an embodiment of the present application;
FIG. 7 is a schematic diagram of a boundary of a target object provided by an embodiment of the present application;
FIG. 8 is a flowchart illustrating the details of step S344 in the corresponding embodiment of FIG. 5;
fig. 9 is a block diagram of a target object locating device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
Like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
Fig. 1 is a block diagram of a target object positioning system according to an embodiment of the present application. As shown in fig. 1, the positioning system 100 can be applied to a transport robot for carrying a target object. The positioning system 100 may include a lidar 101, a depth camera 102, and a central control platform 103. The lidar 101 and the depth camera 102 are connected to the central control platform 103. The point cloud data of the lidar 101 and the depth information collected by the depth camera 102 may be transmitted to the central control platform 103, so that the central control platform 103 may execute the method provided in the following embodiments of the present application to determine the position of the target object.
Fig. 2 is a schematic structural diagram of an electronic device provided in an embodiment of the present application. The electronic device 200 may serve as the central control platform 103, and the electronic device 200 may be configured to execute the target object positioning method provided in the embodiment of the present application. As shown in fig. 2, the electronic device 200 includes: one or more processors 202, and one or more memories 204 storing processor-executable instructions. Wherein the processor 202 is configured to execute a target object positioning method provided by the following embodiments of the present application.
The processor 202 may be a device containing a Central Processing Unit (CPU), a Graphics Processing Unit (GPU) or other form of processing unit having data processing and/or instruction execution capabilities, may process data for other components in the electronic device 200, and may control other components in the electronic device 200 to perform desired functions.
The memory 204 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM) and/or cache memory (cache). The non-volatile memory may include, for example, Read Only Memory (ROM), a hard disk, flash memory, etc. One or more computer program instructions may be stored on the computer-readable storage medium and executed by the processor 202 to implement the target object positioning method described below. Various applications and various data, such as data used and/or generated by the applications, may also be stored in the computer-readable storage medium.
In one embodiment, the electronic device 200 shown in FIG. 2 may also include an input device 206, an output device 208, and a data acquisition device 210, which may be interconnected via a bus system 212 and/or other form of connection mechanism (not shown). It should be noted that the components and configuration of the electronic device 200 shown in FIG. 2 are exemplary only, and not limiting, and the electronic device 200 may have other components and configurations as desired.
The input device 206 may be a device used by a user to input instructions and may include one or more of a keyboard, a mouse, a microphone, a touch screen, and the like. The output device 208 may output various information (e.g., images or sounds) to the outside (e.g., a user), and may include one or more of a display, a speaker, and the like. The data acquisition device 210 may acquire an image of a subject and store the acquired image in the memory 204 for use by other components. Illustratively, the data acquisition device 210 may be a camera.
In an embodiment, the components of the exemplary electronic device 200 for implementing the target object locating method of the embodiment of the present application may be integrally disposed, or may be disposed separately, such as the processor 202, the memory 204, the input device 206, and the output device 208 are integrally disposed, and the data acquisition device 210 is disposed separately.
Fig. 3 is a flowchart illustrating a method for locating a target object according to an embodiment of the present application, and as shown in fig. 3, the method includes the following steps S310 to S340.
Step S310: and acquiring first point cloud data of the laser radar and depth information acquired by the depth camera.
The laser radar is a radar system that detects characteristic quantities such as the azimuth and the distance of a target by emitting a laser beam. The first point cloud data refers to scanning data of the laser radar, and the first point cloud data comprises azimuth angles and distances between reflection points in different directions and an origin of a laser radar coordinate system. The laser radar coordinate system is a polar coordinate system which is constructed by taking the position of the laser radar as an origin.
A depth camera adds a depth measurement function to a traditional camera, that is, it can detect the distance between the camera and a target. The depth camera may be a RealSense camera. The depth information refers to the position coordinates of the target in the depth camera coordinate system.
Step S320: and projecting the depth information to a laser radar coordinate system to obtain second point cloud data.
The second point cloud data refers to the azimuth angle and the distance of the target in the radar coordinate system, which are obtained based on the data collected by the depth camera. In an embodiment, according to a mapping relationship between the depth camera coordinate system and the laser radar coordinate system, the depth information in the depth camera coordinate system may be mapped to obtain an azimuth angle and a distance in the laser radar coordinate system, so as to obtain the second point cloud data.
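For illustration only, the projection in step S320 can be sketched as follows. This is a minimal example, not the claimed implementation: it assumes the depth information has already been converted to XYZ points in the depth camera frame and that a 4 × 4 extrinsic transform between the depth camera and the lidar (here called T_lidar_from_cam, an illustrative name) is known from calibration.

```python
import numpy as np

def depth_points_to_lidar(points_cam: np.ndarray, T_lidar_from_cam: np.ndarray) -> np.ndarray:
    """Project depth-camera points (N, 3) into the lidar frame and return
    (azimuth, distance) pairs, i.e. a sketch of the second point cloud data.

    points_cam       : XYZ coordinates in the depth camera frame.
    T_lidar_from_cam : 4x4 homogeneous extrinsic transform (assumed known from calibration).
    """
    # Homogeneous coordinates, then apply the extrinsic transform.
    ones = np.ones((points_cam.shape[0], 1))
    points_h = np.hstack([points_cam, ones])            # (N, 4)
    points_lidar = (T_lidar_from_cam @ points_h.T).T[:, :3]

    # Express each point in the lidar's polar form: azimuth and distance.
    x, y = points_lidar[:, 0], points_lidar[:, 1]
    azimuth = np.arctan2(y, x)                           # angle in the scan plane
    distance = np.hypot(x, y)                            # planar range from the lidar origin
    return np.stack([azimuth, distance], axis=1)
```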
Step S330: respectively carrying out straight line detection on the first point cloud data and the second point cloud data to obtain a plurality of corresponding first straight line segments and a plurality of corresponding second straight line segments;
and the first straight line segment is detected from the first point cloud data, and the second straight line segment is detected from the second point cloud data. The straight line detection can adopt a Hough straight line detection method. The straight line shown in fig. 4 can be expressed as r ═ xcos θ + ysin θ, where r is the distance from the origin of the coordinate system to the straight line, and θ is the angle between the perpendicular to the straight line and the x-axis. Given a point (x)0,y0) The set of parameters (r, theta) of all the straight lines passing through the point forms a trigonometric function on the (r, theta) plane
Figure BDA0002951969020000081
This transforms the problem of detecting straight lines in image space into finding the points (r, θ) in the polar parameter space through which the largest number of sinusoids pass. In Hough space, the more curves intersect at a point, the more likely it is that the parameters represented by that point correspond to an actual straight line.
Therefore, the pairs (r, θ) that occur most frequently in the first point cloud data can be counted, and the straight lines corresponding to those pairs are taken as the detected first straight line segments. Likewise, the pairs (r, θ) that occur most frequently in the second point cloud data are counted, and the corresponding straight lines are taken as the detected second straight line segments. r can be regarded as a distance and θ as an azimuth. The first straight line segments can therefore be regarded as straight line segments detected from the scan data of the lidar, and the second straight line segments as straight line segments detected from the depth information collected by the depth camera.
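A minimal Hough-voting sketch over 2D points is given below purely as an illustration; the bin sizes, the vote threshold and the function names are assumptions, since the embodiment does not prescribe a particular implementation.

```python
import numpy as np

def hough_lines(points: np.ndarray, r_step=0.05, theta_step=np.deg2rad(1.0), min_votes=20):
    """Vote each 2D point into (r, theta) bins; bins with many votes correspond
    to detected straight lines r = x*cos(theta) + y*sin(theta)."""
    thetas = np.arange(0.0, np.pi, theta_step)
    r_max = np.linalg.norm(points, axis=1).max()
    r_bins = np.arange(-r_max, r_max + r_step, r_step)
    acc = np.zeros((len(r_bins), len(thetas)), dtype=int)

    for x, y in points:
        r = x * np.cos(thetas) + y * np.sin(thetas)      # sinusoid traced by this point
        idx = np.digitize(r, r_bins) - 1
        acc[idx, np.arange(len(thetas))] += 1

    lines = []
    for i, j in zip(*np.where(acc >= min_votes)):        # accumulator peaks = dominant lines
        lines.append((r_bins[i], thetas[j]))
    return lines
```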
In an embodiment, the first straight line segment may be represented by its start and end coordinates in the robot coordinate system, and the second straight line segment may likewise be represented by its start and end coordinates in the robot coordinate system. The robot coordinate system is a coordinate system with the center of the robot as the origin. For example, the x-axis points directly in front of the robot, the y-axis points to the robot's left, and the z-axis points in the head direction. The robot coordinate system and the laser radar coordinate system may differ only by a translation along the x-axis, that is, the origin of the laser radar coordinate system is located at (x, 0, 0) in the robot coordinate system.
Step S340: and determining the boundary of the target object according to the distance between the end point of the second straight line segment and the end points of parallel lines contained in the plurality of first straight line segments.
The parallel line comprises two first straight line segments which are parallel to each other. The two parallel first straight line segments may also be referred to as a set of parallel lines. The plurality of first straight line segments may include one or more sets of parallel lines. If the distance between the end point of a certain second straight line segment and the end point of a certain set of parallel lines is smaller than the threshold value, the second straight line segment and the set of parallel lines can be considered to belong to the boundary of the target object. And taking the coordinates of the middle points of the parallel lines belonging to the boundary as a robot navigation target point so that the robot can navigate according to the robot navigation target point.
In one embodiment, the parallel lines included in the first straight line segments may be obtained as follows: the slope of each first straight line segment is calculated from the coordinates of its starting point and end point in the robot coordinate system, and two first straight line segments that have the same slope and whose starting point and end point coordinates are within a preset range are screened out as a group of parallel lines.
Assume there are two straight lines with starting points [x1, y1] and [x2, y2] and end points [a1, b1] and [a2, b2] respectively. Their slopes can be expressed as

l1 = (b1 − y1) / (a1 − x1)

and

l2 = (b2 − y2) / (a2 − x2).

If l1 = l2, the two straight lines are parallel. Therefore, the slope of each straight line segment can be calculated from its starting point and end point coordinates, and any two straight line segments with the same slope can be found and taken as a group of parallel lines.
The preset range may be, for example, [-0.5, 2] on the X-axis and [-1.0, 1.0] on the Y-axis (unit: m) in the robot coordinate system. According to the starting point and end point coordinates of each first straight line segment, all parallel lines within this range can be found, and their starting point and end point coordinates are recorded. For a straight line segment running from top to bottom, the upper end may be regarded as the starting point and the lower end as the end point, but the invention is not limited thereto.
Because the cargo box is generally in front of the robot, the embodiment of the application limits the coordinate range, which removes unnecessary interference from other regions and also reduces the amount of computation, since parallel lines do not need to be searched for over the whole lidar scan.
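The slope-and-range screening described above might be sketched as follows (illustrative only; slope_tol is an assumed numerical tolerance, whereas the embodiment speaks of equal slopes, and the X/Y window reuses the example range given above).

```python
def find_parallel_pairs(segments, slope_tol=1e-2,
                        x_range=(-0.5, 2.0), y_range=(-1.0, 1.0)):
    """segments: list of ((x1, y1), (a1, b1)) start/end points in the robot frame.
    Returns index pairs of first straight line segments whose slopes match
    (within slope_tol) and whose start and end points lie inside the preset window."""
    def in_window(p):
        return x_range[0] <= p[0] <= x_range[1] and y_range[0] <= p[1] <= y_range[1]

    def slope(seg):
        (x1, y1), (a1, b1) = seg
        return (b1 - y1) / (a1 - x1) if abs(a1 - x1) > 1e-9 else float("inf")

    candidates = [i for i, s in enumerate(segments) if in_window(s[0]) and in_window(s[1])]
    pairs = []
    for u in range(len(candidates)):
        for v in range(u + 1, len(candidates)):
            i, j = candidates[u], candidates[v]
            s_i, s_j = slope(segments[i]), slope(segments[j])
            if s_i == s_j or abs(s_i - s_j) < slope_tol:   # equal slope -> parallel
                pairs.append((i, j))
    return pairs
```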
In an embodiment, the step S340 can be divided into two cases, i.e., a case with a calibration pattern and a case without a calibration pattern.
In the case of a calibration pattern, as shown in fig. 5, the step S340 specifically includes:
step S341: and acquiring the image information of the calibration pattern acquired by the depth camera.
Step S342: and converting the image information of the calibration pattern into a robot coordinate system to obtain the coordinate position of the calibration pattern.
The coordinate position of the calibration pattern refers to the rotation and displacement of the calibration pattern in the robot coordinate system, i.e. T. The rotation and displacement of the calibration pattern relative to the robot coordinate system may be obtained by acquiring the image information of the calibration pattern collected by the depth camera. The image information contains color information (i.e., RGB values) and depth information (i.e., the position coordinates of the calibration pattern relative to the depth camera). The color information contained in the image information is thresholded, the image contour is extracted and filtered; the corner positions are then obtained and an affine transformation is applied to them. According to the mapping relationship between the camera coordinate system of the depth camera and the robot coordinate system, the image information of the calibration pattern (i.e., the depth information after the affine transformation) is converted into the robot coordinate system to obtain the coordinate position of the calibration pattern.
Fig. 6 is a schematic diagram of a target object according to an embodiment of the present application. As shown in fig. 6, the calibration pattern 51 is attached to the target object 50. The target object 50 may be a trash can, a container, or the like. The calibration pattern 51 may be a two-dimensional code pattern (e.g., an aruco code). The calibration pattern 51 may be affixed to the container at a fixed location, such as 35 cm from the bottom edge of the front surface of the container. In one embodiment, the rotation and displacement of the calibration pattern relative to the robot coordinate system can be obtained in real time by using the aruco_ros algorithm in the ROS framework (a distributed processing framework). That is, the pose is represented by a transform T whose rotation part is a 3 × 3 matrix and whose displacement part is the 1 × 3 vector (x, y, z).
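As a sketch of the conversion described above, and assuming the marker pose in the camera frame is already available (e.g. from an aruco detector) and that the camera-to-robot extrinsic T_robot_from_cam is known from calibration, the pose T of the calibration pattern in the robot coordinate system can be composed as follows (all names are illustrative):

```python
import numpy as np

def marker_pose_in_robot_frame(R_cam_marker: np.ndarray, t_cam_marker: np.ndarray,
                               T_robot_from_cam: np.ndarray) -> np.ndarray:
    """Compose the marker pose (camera frame) with the camera->robot extrinsic to
    obtain T: the 3x3 rotation plus (x, y, z) displacement of the calibration
    pattern in the robot coordinate system."""
    T_cam_marker = np.eye(4)
    T_cam_marker[:3, :3] = R_cam_marker       # 3x3 rotation part
    T_cam_marker[:3, 3] = t_cam_marker        # displacement part (x, y, z)
    return T_robot_from_cam @ T_cam_marker    # 4x4 pose of the marker in the robot frame
```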
Step S343: and selecting a target straight-line segment of the target object from the plurality of second straight-line segments by using the coordinate position of the calibration pattern.
Wherein the target straight line segment can be a bottom side straight line segment or a top side straight line segment of the target object.
After the coordinate position of the calibration pattern is calculated, it is known that the calibration pattern is attached to a fixed position on the target object, for example 35 cm above the bottom edge and at the left-right center of the cargo box. Since the calibration pattern is attached to a plane and has no roll or pitch angle, the theoretical coordinate of the target edge (such as the bottom edge of the container) below the calibration pattern relative to the robot coordinate system can be calculated from the coordinate position of the calibration pattern as

L = (x, y, θ),

where θ represents the heading and (x, y) represents the midpoint of the bottom edge of the container. Therefore, the theoretical coordinates are compared with the actual coordinates of the second straight line segments, and a second straight line segment whose actual coordinates differ from the theoretical coordinates by less than a preset value can be selected as the target straight line segment.
The actual coordinates of a second straight line segment may include the coordinates of its midpoint and the angle of its perpendicular. The absolute value of the difference between the theoretical coordinate L and the actual coordinates of a second straight line segment is calculated, and if it is less than the preset value, that second straight line segment is regarded as the target straight line segment. The target straight line segment can be considered to be the straight line segment of the bottom edge of the cargo box. In one embodiment, the target straight line segment may be represented by its starting point and end point coordinates.
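A sketch of this comparison is shown below. It assumes the theoretical coordinate L is expressed as (x, y, heading) of the bottom-edge midpoint and that each candidate second straight line segment is reduced to its midpoint and the angle of its perpendicular; the tolerance value and all names are illustrative.

```python
import numpy as np

def select_target_segment(candidates, L_theory, tol=0.1):
    """candidates: list of ((x1, y1), (x2, y2)) second straight line segments (robot frame).
    L_theory  : (x, y, heading) predicted from the calibration pattern for the bottom edge.
    Returns the segment whose midpoint and perpendicular angle differ from L_theory
    by less than tol in every component (tol is an assumed preset value)."""
    best, best_err = None, None
    for start, end in candidates:
        start, end = np.asarray(start, float), np.asarray(end, float)
        mid = 0.5 * (start + end)
        direction = np.arctan2(end[1] - start[1], end[0] - start[0])
        normal = direction + np.pi / 2.0                    # angle of the segment's perpendicular
        err = np.array([
            abs(mid[0] - L_theory[0]),
            abs(mid[1] - L_theory[1]),
            abs((normal - L_theory[2] + np.pi) % (2 * np.pi) - np.pi),  # wrapped angle difference
        ])
        if np.all(err < tol) and (best_err is None or err.sum() < best_err):
            best, best_err = (tuple(start), tuple(end)), err.sum()
    return best
```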
Step S344: and determining the boundary of the target object according to the distance between the end point of the target straight line segment and the end point of the parallel line contained in the plurality of first straight line segments.
Wherein at least two first straight line segments with the same slope can be regarded as parallel lines. FIG. 7 is a schematic diagram of the boundary of a target object. As can be seen in fig. 7, the end points of the parallel lines (A1, B1) are very close to the end points of the straight line segment of the bottom edge (A2, B2). After the target straight line segment is determined based on the calibration pattern, the distances between its end points and the end points of each group of parallel lines can be calculated; if the distances are smaller than a threshold value, that group of parallel lines and the bottom-edge straight line segment are considered to form the boundary of the target object. Thus, when the target object is carried, alignment can be performed based on the position of the boundary.
In an embodiment, as shown in fig. 8, the step S344 specifically includes:
step S3441: and calculating a first distance between the coordinates of the starting point of the target straight-line segment and the first end point coordinates and a second distance between the coordinates of the end point of the target straight-line segment and the second end point coordinates according to the first end point coordinates and the second end point coordinates of any parallel line.
The first straight line and the second straight line are parallel, and the two straight lines are called parallel lines. The first end point coordinate means an end point coordinate below the first straight line. The second end point coordinate is the coordinate of the end point below the second straight line. As shown in fig. 7, the coordinates of a1 may be regarded as first end point coordinates, and the coordinates of B1 may be regarded as second end point coordinates. The coordinates of a2 may be regarded as the coordinates of the start point of the target straight line segment, and the coordinates of B2 may be regarded as the coordinates of the end point of the target straight line segment.
For example, assume the coordinates of the starting point of the target straight line segment are [x1, y1] and the coordinates of its end point are [x2, y2], and that a group of lidar parallel lines has first starting point coordinates [α1, β1], second starting point coordinates [μ1, γ1], first end point coordinates [α2, β2] and second end point coordinates [μ2, γ2]. The first distance is

d1 = √((x1 − α2)² + (y1 − β2)²)

and the second distance is

d2 = √((x2 − μ2)² + (y2 − γ2)²).
Step S3442: and if the first distance and the second distance are both smaller than a first threshold value, determining that the parallel line and the target straight-line segment belong to the boundary of the target object.
Wherein the first threshold may be 0.1. A group of parallel lines is matched if its first distance satisfies d1 < 0.1 and its second distance satisfies d2 < 0.1, i.e. a uniquely determined group of parallel lines is found which, together with the target straight line segment, forms the boundary of the target object.
After the position of the target straight line segment is determined, all parallel lines are traversed and a group of parallel lines can be uniquely determined according to the starting point and end point coordinates of the target straight line segment; these parallel lines and the target straight line segment form the boundary of the target object. The coordinates of the midpoints of the parallel lines belonging to the boundary can then be used as the robot navigation target point. For example, the mechanical arm of the robot can be aligned with the midpoint coordinates, and the midpoint coordinates can be used as a grabbing point for grabbing the target object.
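The end point check of steps S3441 and S3442 can be sketched as follows. The 0.1 m threshold and the tie-breaking by smallest mean distance follow the embodiments above; everything else (names, data layout) is an illustrative assumption. The same check can be reused for every second straight line segment when no calibration pattern is available, as described in the following paragraphs.

```python
import math

def match_boundary(target_segment, parallel_groups, threshold=0.1):
    """target_segment : ((x1, y1), (x2, y2)) start/end of the target straight line segment.
    parallel_groups   : list of ((A1x, A1y), (B1x, B1y)) lower end points of each group
                        of lidar parallel lines (A1, B1 in Fig. 7).
    Returns the group whose end points both lie within `threshold` of the target
    segment's start/end points; ties are broken by the smallest mean distance."""
    (x1, y1), (x2, y2) = target_segment
    best, best_mean = None, float("inf")
    for (a1, b1), (a2, b2) in parallel_groups:
        d1 = math.hypot(x1 - a1, y1 - b1)        # start point vs first end point
        d2 = math.hypot(x2 - a2, y2 - b2)        # end point vs second end point
        if d1 < threshold and d2 < threshold and (d1 + d2) / 2.0 < best_mean:
            best, best_mean = ((a1, b1), (a2, b2)), (d1 + d2) / 2.0
    return best  # together with target_segment this forms the boundary of the target object
```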
The embodiment of the application uses the scanning data of the lidar for alignment, which can effectively alleviate problems such as the small field of view of the depth camera and the blind areas that exist during alignment. Verifying the parallel lines obtained from the lidar with the target straight line segment determined from the calibration pattern avoids interference from similar features in the surrounding environment and improves the robustness of the system.
In an embodiment, in the case of no calibration pattern, the distance between the end point of each group of parallel lines and the end point of each second straight-line segment may be calculated according to the parallel lines included in the plurality of first straight-line segments, and the parallel lines and the second straight-line segments whose distance is smaller than the second threshold value may be used as the boundary of the target object.
The parallel lines can be extracted as described above, and are not described herein again. Since the target straight-line segment cannot be determined by the aid of the calibration pattern, it is necessary to traverse all the second straight-line segments and all the parallel lines and calculate the distances between the two lower end points (a1, B1) of any one of the parallel lines and the starting point (a2) and the end point (B2) of any one of the second straight-line segments. If the distance between A1 and A2 and the distance between B1 and B2 are both smaller than the threshold value, the current parallel line and the second straight-line segment are considered to belong to the boundary of the target object, and the midpoint coordinates of the parallel line can be used as the robot navigation target point for navigation.
For example, for any group of parallel lines and any second straight line segment, the first end point coordinates [α2, β2] and the second end point coordinates [μ2, γ2] of the parallel lines can be determined, and the first distance between the starting point coordinates [x1, y1] of the second straight line segment and the first end point coordinates [α2, β2] is calculated as

d1 = √((x1 − α2)² + (y1 − β2)²),

while the second distance between the end point coordinates [x2, y2] of the second straight line segment and the second end point coordinates [μ2, γ2] is

d2 = √((x2 − μ2)² + (y2 − γ2)²).

If both distances are less than the threshold, for example d1 < 0.1 and d2 < 0.1, the current parallel lines and the second straight line segment are determined to belong to the boundary of the target object, and the midpoint coordinates of the parallel lines can be used as the navigation target point.
It should be noted that, in any of the above embodiments, when a plurality of groups of parallel lines satisfying that both the first distance and the second distance are smaller than the first threshold value are provided, a group of parallel lines is screened out as the boundary of the target object according to a preset rule. In one example, the predetermined rule is that the average of the first distance and the second distance is the smallest. In yet another example, the preset rule is that the first distance or the second distance is the smallest. When determining the boundary of the target object according to the second threshold, if there are multiple groups of parallel lines satisfying the second threshold, the embodiment of determining the boundary of the target object is the same as or similar to that described above, and is not described herein again.
As can be seen from the above embodiments, the calibration pattern and the point cloud data are loosely coupled: they can verify each other but do not depend on each other absolutely. When the calibration pattern is missing, the boundary of the target object can still be determined based on the parallel lines detected by the lidar and the straight line segments detected by the depth camera, so the algorithm is highly practical. The positioning method provided by the embodiments of the application abandons the traditional approach of using only aruco (two-dimensional code) navigation or only lidar navigation, fuses the two for positioning, and uses the calibration pattern to assist in determining the target straight line segment. This multi-redundancy alignment design prevents alignment failure when a single sensor fails (for example, when lighting conditions affect the camera and cause aruco detection to fail). The technical scheme provided by the embodiments of the application greatly improves the alignment accuracy and success rate.
The following are embodiments of the apparatus of the present application, which can be used to perform embodiments of the method for locating the target object of the present application. For details not disclosed in the embodiments of the apparatus of the present application, please refer to the embodiments of the method for locating a target object of the present application.
Fig. 9 is a block diagram of a target object locating device according to an embodiment of the present application. As shown in fig. 9, the apparatus includes: a data acquisition module 810, a data conversion module 820, a line detection module 830, and a boundary determination module 840.
The data acquisition module 810 is configured to acquire first point cloud data of the laser radar and depth information acquired by the depth camera;
the data conversion module 820 is used for projecting the depth information to a laser radar coordinate system to obtain second point cloud data;
the line detection module 830 is configured to perform line detection on the first point cloud data and the second point cloud data respectively to obtain a plurality of corresponding first line segments and a plurality of corresponding second line segments; the first straight line segment is detected from the first point cloud data, and the second straight line segment is detected from the second point cloud data;
the boundary determining module 840 is configured to determine the boundary of the target object according to the distance between the end point of the second straight line segment and the end point of the parallel line included in the plurality of first straight line segments.
The implementation processes of the functions and actions of the modules in the device are specifically detailed in the implementation processes of the corresponding steps in the target object positioning method, and are not described herein again.
In the embodiments provided in the present application, the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.

Claims (11)

1. A method for locating a target object, the method comprising:
acquiring first point cloud data of the laser radar and depth information acquired by a depth camera;
projecting the depth information to a laser radar coordinate system to obtain second point cloud data;
respectively carrying out straight line detection on the first point cloud data and the second point cloud data to obtain a plurality of corresponding first straight line segments and a plurality of corresponding second straight line segments;
and determining the boundary of the target object according to the distance between the end point of the second straight line segment and the end points of parallel lines contained in the plurality of first straight line segments.
2. The method of claim 1, wherein the step of determining the boundary of the target object according to the distance between the end point of the second straight line segment and the end point of the parallel line included in the plurality of first straight line segments comprises:
acquiring image information of a calibration pattern acquired by the depth camera;
converting the image information of the calibration pattern into a robot coordinate system to obtain the coordinate position of the calibration pattern;
selecting a target straight-line segment of the target object from the plurality of second straight-line segments by using the coordinate position of the calibration pattern;
and determining the boundary of the target object according to the distance between the end point of the target straight line segment and the end points of parallel lines contained in the plurality of first straight line segments.
3. The method of claim 2, wherein said step of using the coordinate positions of the calibration pattern to select a target straight line segment of the target object from the plurality of second straight line segments comprises:
according to the coordinate position of the calibration pattern, estimating theoretical coordinates of the target edge of the target object under a robot coordinate system;
and comparing the theoretical coordinates with actual coordinates of the second straight-line segments respectively, and selecting the second straight-line segments with the difference between the actual coordinates and the theoretical coordinates smaller than a preset value as target straight-line segments.
4. The method according to claim 2, wherein the step of determining the boundary of the target object according to the distance between the end point of the target straight line segment and the end point of the parallel line included in the plurality of first straight line segments comprises:
calculating a first distance between the coordinates of the starting point of the target straight-line segment and the first end point coordinates and a second distance between the coordinates of the end point of the target straight-line segment and the second end point coordinates according to the first end point coordinates and the second end point coordinates of any parallel line;
and if the first distance and the second distance are both smaller than a first threshold value, determining that the parallel line and the target straight-line segment belong to the boundary of the target object.
5. The method according to claim 1, wherein before the step of determining the boundary of the target object according to the distance between the end point of the second straight line segment and the end point of the parallel line included in the plurality of first straight line segments, after the step of performing the straight line detection on the first point cloud data and the second point cloud data respectively to obtain the corresponding plurality of first straight line segments and second straight line segments, the method further comprises the steps of:
and calculating the slope of each first straight line segment according to the coordinates of its starting point and end point in the robot coordinate system, and screening out, as a group of parallel lines, two first straight line segments that have the same slope and whose starting point and end point coordinates are within a preset range.
6. The method of claim 1, wherein the step of determining the boundary of the target object according to the distance between the end point of the second straight line segment and the end point of the parallel line included in the plurality of first straight line segments comprises:
and respectively calculating the distance between the end point of each group of parallel lines and the end point of each second straight line section according to the parallel lines contained in the plurality of first straight line sections, and taking the parallel lines and the second straight line sections with the distance smaller than a second threshold value as the boundary of the target object.
7. The method according to claim 6, wherein the step of calculating, from parallel lines included in the plurality of first straight line segments, distances between end points of each group of the parallel lines and end points of each second straight line segment, respectively, and the step of using the parallel lines and the second straight line segments having the distances smaller than a second threshold as the boundary of the target object comprises:
calculating a first distance between the start point coordinate and the first end point coordinate of each second straight line segment and a second distance between the end point coordinate of each second straight line segment and the second end point coordinate according to the first end point coordinate and the second end point coordinate of any parallel line;
and if the first distance and the second distance are both smaller than a second threshold value, determining that the parallel line and the second straight line segment belong to the boundary of the target object.
8. The method of claim 1 or 6, further comprising:
and taking the coordinates of the middle points of the parallel lines belonging to the boundary as a robot navigation target point so that the robot can navigate according to the robot navigation target point.
9. An apparatus for locating a target object, the apparatus comprising:
the data acquisition module is used for acquiring first point cloud data of the laser radar and depth information acquired by the depth camera;
the data conversion module is used for projecting the depth information to a laser radar coordinate system to obtain second point cloud data;
the line detection module is used for respectively carrying out line detection on the first point cloud data and the second point cloud data to obtain a plurality of corresponding first line segments and a plurality of corresponding second line segments; the first straight line segment is detected from the first point cloud data, and the second straight line segment is detected from the second point cloud data;
and the boundary determining module is used for determining the boundary of the target object according to the distance between the end point of the second straight line segment and the end points of the parallel lines contained in the plurality of first straight line segments.
10. An electronic device, characterized in that the electronic device comprises:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the method of positioning a target object of any one of claims 1-8.
11. A computer-readable storage medium, characterized in that the storage medium stores a computer program executable by a processor to perform the method of locating a target object according to any one of claims 1-8.
CN202110213647.2A 2021-02-25 2021-02-25 Target object positioning method and device, electronic equipment and storage medium Pending CN112927298A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110213647.2A CN112927298A (en) 2021-02-25 2021-02-25 Target object positioning method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110213647.2A CN112927298A (en) 2021-02-25 2021-02-25 Target object positioning method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112927298A 2021-06-08

Family

ID=76172017

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110213647.2A Pending CN112927298A (en) 2021-02-25 2021-02-25 Target object positioning method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112927298A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180253857A1 (en) * 2015-09-25 2018-09-06 Logical Turn Services, Inc. Dimensional acquisition of packages
WO2020103427A1 (en) * 2018-11-23 2020-05-28 华为技术有限公司 Object detection method, related device and computer storage medium
CN110110678A (en) * 2019-05-13 2019-08-09 腾讯科技(深圳)有限公司 Determination method and apparatus, storage medium and the electronic device of road boundary
CN110349221A (en) * 2019-07-16 2019-10-18 北京航空航天大学 A kind of three-dimensional laser radar merges scaling method with binocular visible light sensor
CN111324121A (en) * 2020-02-27 2020-06-23 四川阿泰因机器人智能装备有限公司 Mobile robot automatic charging method based on laser radar
CN112017240A (en) * 2020-08-18 2020-12-01 浙江大学 Tray identification and positioning method for unmanned forklift
CN112258590A (en) * 2020-12-08 2021-01-22 杭州迦智科技有限公司 Laser-based depth camera external parameter calibration method, device and storage medium thereof

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
闫明 et al.: "A New Traversable Area Detection Algorithm for Unstructured Environments" (一种新的非结构化环境可通行区域检测算法), Computer and Digital Engineering (计算机与数字工程), vol. 47, no. 7, pages 1652-1661 *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination