CN115328175A - Logistics robot system - Google Patents

Logistics robot system Download PDF

Info

Publication number
CN115328175A
Authority
CN
China
Prior art keywords
robot terminal
path
information
video stream
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211269595.1A
Other languages
Chinese (zh)
Other versions
CN115328175B (en)
Inventor
王纪武
任浩
陈锁柱
马秋兰
万伟鹏
付俊炜
郑世龙
许钧翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southwest Logistics Center Co ltd
Beijing Shidai Fuchen Intelligent Technology Co ltd
Original Assignee
Southwest Logistics Center Co ltd
Beijing Shidai Fuchen Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southwest Logistics Center Co ltd, Beijing Shidai Fuchen Intelligent Technology Co ltd filed Critical Southwest Logistics Center Co ltd
Publication of CN115328175A publication Critical patent/CN115328175A/en
Application granted granted Critical
Publication of CN115328175B publication Critical patent/CN115328175B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0253Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a logistics robot system, comprising: a robot terminal, which at least comprises a driving device for driving the robot terminal to move; an environment perception module for perceiving the surrounding environment, the perceived surrounding environment comprising at least current environment information; and a data processing module electrically connected to the environment perception module and the driving device. The data processing module processes the current environment information and extracts path information to obtain the relative position relationship between the robot terminal and the current path, and outputs a control signal according to the relative position relationship to control the driving device so that the robot terminal moves along a preset path. The robot terminal acquires the current environment while moving, path information is extracted from the current environment, the relative position relationship between the robot terminal and the current path is obtained from the path information, and the control signal is output according to that relationship, so that the robot terminal moves along the preset path and is prevented from deviating from it.

Description

Logistics robot system
Technical Field
The invention belongs to the technical field of logistics storage, and particularly relates to a logistics robot system.
Background
At present, logistics warehouses are the places where e-commerce and logistics companies store large numbers of packages awaiting dispatch as well as stocked goods. With the rapid growth of the express-delivery business in China, the number of packages handled at logistics transfer stations has increased, and so has the required labor cost. With the development of artificial intelligence, intelligent logistics robots have gradually entered public view and logistics work has begun to be combined with intelligent robots; common logistics robots, such as sorting robots and warehouse AGV logistics robots, are important components of an intelligent logistics network.
Most existing transport logistics robots carry goods directly on the robot body, which limits the number of packages a logistics robot can transport in a single run and therefore reduces efficiency. Logistics robots also use guide-free navigation such as SLAM, but compared with guided navigation this produces larger offsets during transport, and the loading and unloading areas require manual adjustment of the robot before goods can be loaded and unloaded.
Therefore, there is a need to provide an improved solution to the above-mentioned deficiencies of the prior art.
Disclosure of Invention
The present invention is directed to overcoming the above-mentioned shortcomings of the prior art and providing a logistics robot capable of self-correcting its moving path.
In order to achieve the above purpose, the invention provides the following technical scheme:
a logistics robot system, the logistics robot system comprising:
the robot terminal at least comprises a driving device for driving the robot terminal to move;
the environment perception module is used for perceiving the surrounding environment, and the surrounding environment at least comprises current environment information;
the data processing module is electrically connected with the environment sensing module and the driving device, processes the current environment information and extracts path information so as to obtain the relative position relationship between the robot terminal and the current path, and outputs a control signal according to the relative position relationship so as to control the driving device to enable the robot terminal to move according to a preset path;
the data processing module generates a control signal based on a line following mode, so that the robot terminal moves along a central line of a preset path, and the processing of the current environment information comprises the following steps: sequentially carrying out video stream acquisition, gray level and filtering processing, self-adaptive threshold processing, edge extraction, morphology processing, contour detection, centerline extraction and centerline comparison on current environment information, and carrying out path tracking on a centerline according to a comparison result;
the minimum edge surrounding contour is obtained through contour detection, texture noise information of the ground is removed according to screening of the surrounding line contour, a central line is obtained through fitting, and central line comparison is carried out through the position of the central line in the image.
Preferably, when a cross mark is found in an image contained in the video stream and the preset command is a turning command, the robot terminal responds to the turning command and executes a turning action; while the rotation angle of the outer directional driving wheel is smaller than a preset angle threshold, the images acquired from the video stream are not processed; when the rotation angle of the outer directional driving wheel reaches the preset angle threshold, the process jumps back to the step of sequentially performing video stream acquisition, grayscale and filtering processing, adaptive threshold processing, edge extraction, morphological processing, contour detection, center line extraction and center line comparison on the current environment information and performing path tracking of the center line according to the comparison result; if the center line is found to be located at the center of the image contained in the video stream, the turning action is finished and the straight-travel action begins.
Preferably, during the moving process, when the cross mark is found in the image acquired by the video stream, the current positioning information is updated by the position information represented by the cross mark.
Preferably, after the center line extraction and before the center line comparison, the method further includes: selecting a plurality of scanning lines in the horizontal direction in the image contained in the video stream, determining the boundary points of each scanning line with the center line, and determining the width and orientation of the center line from the boundary points.
Preferably, the surrounding environment further includes current positioning information to confirm a movement distance and a position of the robot terminal and guide the robot terminal to move.
Preferably, the surrounding environment further includes obstacle information for identifying an obstacle on a movement path of the robot terminal and allowing the robot terminal to avoid the obstacle.
Preferably, the environment sensing module at least comprises a vision sensor, a full-field positioning module and a laser radar, wherein the vision sensor is used for acquiring current environment information, the full-field positioning module is used for acquiring current positioning information, and the laser radar is used for acquiring obstacle information.
Preferably, when the central line is found to be positioned in the center of the images contained in the video stream through the central line comparison, the control signal is not corrected;
calculating deviation correction when the center line comparison shows that the center line is offset or inclined relative to the center of the image contained in the video stream, and correcting the control signal according to the calculation result;
and when the center line comparison shows that the images contained in the video stream have no center line, driving the robot terminal to retreat along the current path until the images contained in the video stream have center lines, identifying the current path information, calculating and correcting the deviation according to the identification result, and correcting the control signal according to the calculation result.
Preferably, the logistics robot system further comprises an upper computer, wherein the upper computer comprises:
the full-field positioning map is used for displaying full-field coordinate information and a moving path of the robot terminal;
the instruction input module is used for inputting the operation task of the robot terminal;
the processor is used for planning a path based on the operation task and generating a preset path;
and the communication module is in communication connection with the data processing module so as to send a preset path to the data processing module.
Preferably, the upper computer further comprises an operating handle, the operating handle is electrically connected with the communication module so as to send the control signal generated by the operating handle to the data processing module, and the driving device moves according to the control signal generated by the operating handle.
Preferably, the robot terminal comprises a detachably connected tractor and trailer, the driving device is a directional driving wheel, two directional driving wheels are respectively arranged on the two sides of the end of the tractor close to the trailer, and the control signal output by the data processing module can control the two directional driving wheels independently.
Preferably, the tractor stretches out the layer board corresponding to the one end of trailer, be equipped with the level on the trailer and stretch to the drag hook of layer board top, be equipped with on the layer board and can follow vertically to stretch into or withdraw from electric putter in the drag hook, electric putter electric connection is in data processing module.
Beneficial effects: the robot terminal acquires the current environment while it moves, path information is extracted from the current environment, the relative position relationship between the robot terminal and the current path is obtained from the path information, and a control signal is output according to that relationship, so that the robot terminal moves along the preset path and is prevented from deviating from it, which improves working efficiency and reduces the error rate of the robot terminal.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention and not to limit the invention. Wherein:
fig. 1 is a schematic structural diagram of a robot terminal according to an embodiment of the present invention;
FIG. 2 is a schematic structural view of a tractor according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a trailer according to an embodiment of the present invention;
FIG. 4 is a flowchart illustrating the control of the robot terminal according to an embodiment of the present invention;
fig. 5 is a flow chart of the adaptive threshold processing of fig. 4.
In the figure: 1. tractor; 2. laser radar; 3. emergency stop button; 4. touch computer; 5. vision sensor; 6. universal driven wheel; 7. directional driving wheel; 8. directional driven wheel; 9. trailer; 10. electric push rod; 11. drag hook.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all embodiments. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present invention.
In the description of the present invention, the terms "longitudinal", "lateral", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", and the like indicate orientations or positional relationships based on those shown in the drawings; they are used merely for convenience of description and do not require that the invention be constructed or operated in a specific orientation, and thus should not be construed as limiting the invention. The term "connected" used herein should be interpreted broadly and may include, for example, a fixed connection or a detachable connection; elements may be directly connected or indirectly connected through intermediate members, and the specific meanings of the above terms will be understood by those skilled in the art as appropriate.
The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings. It should be noted that the embodiments and features of the embodiments of the present invention may be combined with each other without conflict.
As shown in fig. 1 to 5, the present invention provides a logistics robot system, comprising: a robot terminal, which at least comprises a driving device for driving the robot terminal to move; an environment perception module for perceiving the surrounding environment, the perceived surrounding environment comprising at least current environment information; and a data processing module electrically connected to the environment perception module and the driving device. The data processing module processes the current environment information to extract path information describing the position of the robot terminal in the current environment, obtains the relative position relationship between the robot terminal and the current path, and outputs a control signal according to that relationship to control the driving device so that the robot terminal moves along a preset path. The data processing module corrects the control signal in real time based on the path information so that the robot terminal stays on the preset path; no manual correction of the control signal is needed, which reduces the labor cost of transport during warehousing and enables goods to move into and out of the warehouse automatically. Before the robot terminal starts working, a logistics operator performs multi-path planning and multi-task allocation, and the robot terminal then generates a preset path according to the task requirements.
In another optional embodiment, the surrounding environment further includes current positioning information, used to confirm the moving distance and position of the robot terminal and to guide its movement. The surrounding environment also includes obstacle information, which helps the robot terminal identify obstacles ahead and avoid them. When an object in front is sensed to be closer to the tractor 1 than a certain threshold, the tractor 1 stops and raises an alarm.
Alternatively, the data processing module adjusts the control signal according to the obstacle information so that the robot terminal bypasses the obstacle. When an obstacle is encountered, the moving path is adjusted in time according to the obstacle so as to go around it; after the obstacle has been bypassed, the robot returns to the preset path in time and completes the whole journey along the preset path.
In this embodiment, the environment sensing module at least comprises a vision sensor 5, a full-field positioning module and a laser radar 2. The vision sensor 5 is used to acquire the current environment information, the full-field positioning module is used to acquire the current positioning information, and the laser radar 2 is used to acquire obstacle information. The laser radar 2 is mainly used for obstacle detection along the travel path: it emits laser beams and receives the light signals reflected by targets to obtain the distance, position and other information of the environment around the robot terminal. Specifically, the vision sensor 5 collects the surrounding environment through a camera and transmits it to the data processing module.
In another optional implementation, the data processing module generates the control signal based on a line-following mode, so that the robot terminal moves along the center line of the preset path. The processing of the current environment information and the generation of the control signal proceed as follows: during the movement of the robot terminal, the surrounding environment is collected by the vision sensor 5 and a video stream is generated; the video stream containing the current environment information is sent to the data processing module; the data processing module sequentially performs grayscale and filtering processing, adaptive threshold processing, edge extraction, morphological processing, contour detection, center line extraction and center line comparison on the received video stream, and performs path tracking of the center line according to the comparison result, so that the robot terminal moves along the center line without path deviation.
Specifically, after images are collected from the video stream they are transmitted to the data processing module. The data processing module splits the R, G, B values of the color image and converts it to a grayscale image using the human perceptual weighting of color, namely Grey = 0.299R + 0.587G + 0.114B. Noise from the environment or introduced by the hardware is then removed by low-pass filtering, namely Gaussian low-pass filtering; in addition, contrast adjustment and geometric-mean filtering are used to make the useful environment information in the image more prominent. This image preprocessing and enhancement yields a relatively clean image in which the guide line stands out. The image is binarized by adaptive threshold processing, and edge detection is then performed on the binary image with the Canny operator. The detected edges may include edge information of ground texture at the edge of the guide line, and the edge connections of the extracted guide line may be incomplete; the extracted edges are therefore optimized and closed by dilation and erosion with a rectangular kernel. The minimum edge-enclosing contour is obtained by contour detection, ground-texture noise is removed by screening the enclosing contours, the center line is obtained by fitting, and center line comparison is performed using the position of the center line in the image. When ground-texture noise is screened out of the enclosing contours, the screening factors can include area, aspect ratio and width; in this application an area threshold, an aspect-ratio threshold and a width threshold are usually set, and the specific sizes of the three thresholds are referenced to a center line located at the center of the image.
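For illustration only, the preprocessing chain described above can be sketched with standard image-processing primitives. The following is a minimal sketch assuming an OpenCV-style API; the kernel sizes, block size and threshold constants are illustrative assumptions and not values taken from this patent.

```python
import cv2

def extract_centerline(frame):
    """Illustrative line-extraction chain: grayscale -> Gaussian low-pass filter ->
    adaptive threshold (binarisation) -> Canny edges -> closing with a rectangular
    kernel -> contour detection -> line fitting. Parameter values are assumptions."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)            # Grey = 0.299R + 0.587G + 0.114B
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)                # suppress environment/sensor noise
    binary = cv2.adaptiveThreshold(blurred, 255,
                                   cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                   cv2.THRESH_BINARY_INV, 31, 10)
    edges = cv2.Canny(binary, 50, 150)                         # edge extraction
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
    closed = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)  # dilation + erosion closes gaps
    contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    # keep the largest enclosing contour as the guide-line candidate; screening by
    # area / aspect ratio / width, as described above, would be applied here
    guide = max(contours, key=cv2.contourArea)
    vx, vy, x0, y0 = cv2.fitLine(guide, cv2.DIST_L2, 0, 0.01, 0.01).flatten()
    return (vx, vy, x0, y0)   # direction vector and a point on the fitted center line
```

The fitted line can then be compared with the image center for the center line comparison step.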
In the processing logic of the adaptive threshold processing, an area threshold comparison is first performed on the area of the image contained in the video stream; if the area value is not smaller than the preset area threshold, the loop ends and no further edge extraction processing is performed. When the area value is smaller than the preset area threshold, a perimeter threshold comparison is performed on the perimeter; if the perimeter value is not smaller than the preset perimeter threshold, the loop ends and no further edge extraction processing is performed. When the perimeter value is smaller than the preset perimeter threshold, the contour image is stored in a new container for subsequent edge extraction, morphological processing, contour detection, center line extraction and center line comparison.
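Read as operating on each candidate contour (an interpretation of the translated flow of Fig. 5), the area and perimeter screening could look like the following sketch; the two threshold values are placeholders, not figures from the patent.

```python
import cv2

AREA_THRESHOLD = 5000.0       # assumed preset area threshold (pixels^2)
PERIMETER_THRESHOLD = 1500.0  # assumed preset perimeter threshold (pixels)

def screen_contours(contours):
    """Keep only candidates whose area and perimeter are both below the preset
    thresholds; larger candidates are not processed any further."""
    kept = []  # the "new container" holding contours for later processing
    for c in contours:
        if cv2.contourArea(c) >= AREA_THRESHOLD:
            continue  # area not smaller than the threshold: skip this candidate
        if cv2.arcLength(c, True) >= PERIMETER_THRESHOLD:
            continue  # perimeter not smaller than the threshold: skip as well
        kept.append(c)
    return kept
```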
In this embodiment, a center line is laid along the moving path of the robot terminal, and the traveling path of the robot terminal is defined in a line-following manner. The moving path can be changed as the site requires without damaging the environment, which greatly reduces the initial setup cost and the cost of reworking the environment, and the robot terminal can travel strictly along the set path. The operating path can also be planned at will, which simplifies map construction and reduces the required computing power.
In this embodiment, center lines are laid on the roads of all preset paths of the robot terminal and extend continuously or discontinuously along the roads. Specifically, the center lines can be produced by sticking tape or painting, and the vision sensor 5 identifies and tracks the center line on the road surface as the robot terminal advances. Because the preset-track material is tape or paint rather than the magnetic guidance laid in the traditional technique, the cost is reduced and the integrity of the ground is preserved. The color of the center line is not limited and can be black, yellow, red, white and so on.
In this embodiment, when the center line comparison shows that the center line is located at the center of the images contained in the video stream, the control signal is not corrected and the robot terminal keeps its current driving state.
When the center line comparison shows that the center line is offset or inclined relative to the center of the image contained in the video stream, deviation correction is calculated and the control signal is corrected according to the calculation result.
when the central line comparison shows that the images contained in the video stream have no central line, the robot terminal is driven to retreat along the current path until the images contained in the video stream have the central line, the path information of the current position is identified (namely, the images are collected again, the video stream is subjected to gray scale and filtering processing, adaptive threshold value processing, edge extraction, morphological processing, contour detection, central line extraction and central line comparison are carried out), calculation and deviation correction are carried out according to the identification result, a control signal is corrected according to the calculation result, and if a reasonable path cannot be obtained after repeated retreating, warning information is sent out.
The deviation handled by the deviation-correction calculation consists of a lateral distance deviation d and a heading angle deviation theta. The lateral distance d is the distance between the center point of the navigation-belt center line and the origin of the image coordinate system, and the heading angle theta is given by the slope of the fitted center line. The distance deviation d and the heading deviation theta can be calculated after fitting the center line with the least-squares method; the rotating speeds of the driving wheels are then adjusted according to the distance deviation and the heading deviation. This line-following control keeps the robot terminal moving along the center line of the preset path and prevents the travel line from drifting or tilting.
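As a sketch of this correction step, the lateral deviation d and heading deviation theta can be computed from a least-squares fit of the center-line pixels and fed into a simple differential-speed adjustment of the two driving wheels. The gains and the assumption that the image coordinate origin lies at the image center are illustrative, not specified by the patent.

```python
import numpy as np

def deviation_from_points(points, image_width):
    """points: Nx2 array of (x, y) pixel coordinates sampled on the center line.
    Returns (d, theta): lateral offset in pixels and heading deviation in radians."""
    x, y = points[:, 0], points[:, 1]
    slope, intercept = np.polyfit(y, x, 1)    # least-squares fit x = slope*y + intercept
    mid_y = y.mean()
    center_x = slope * mid_y + intercept       # center point of the fitted line
    d = center_x - image_width / 2.0           # offset from the assumed image-center origin
    theta = np.arctan(slope)                   # heading deviation from the fitted slope
    return d, theta

def corrected_wheel_speeds(v_base, d, theta, k_d=0.002, k_theta=0.8):
    """Proportional correction of the two directional driving wheels (illustrative gains)."""
    u = k_d * d + k_theta * theta
    return v_base - u, v_base + u              # (left wheel speed, right wheel speed)
```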
In another optional embodiment, the logistics robot system further comprises an upper computer, which comprises: a full-field positioning map for displaying full-field coordinate information and the moving path of the robot terminal (the upper computer shows these on a display screen so that an operator can conveniently observe the running state of the robot terminal); a command input module for inputting the work task of the robot terminal, where the specific work task can include the coordinates of the start and end points, working time, task allocation and so on; a processor for planning a path based on the work task and generating a preset path (the operator can screen the preset path or adjust it according to actual needs); and a communication module in communication connection with the data processing module for sending the preset path to the data processing module. After the robot terminal receives the preset path it moves along it; meanwhile, during the movement, the robot terminal acquires images of the surrounding environment (mainly ground images) and uploads them, and the data processing module performs grayscale processing, edge extraction, edge sharpening and other processing on the images to extract path information, performs algorithmic calculation on the path information to output a control signal, and controls the driving device to drive the robot terminal. During the identification and tracking of the path information, edge detection is used to extract the path edge information, and using the edge information for subsequent calculation effectively reduces the amount of computation. Specifically, after the center line (or navigation line) has been extracted, several scanning lines in the horizontal direction are selected, the boundary points of each scanning line with the center line (navigation line) are determined, and the width and orientation of the center line (navigation line) are determined from the boundary points, as sketched below. For example, with two horizontal scanning lines, each scanning line forms two boundary points with the two boundaries of the center line (navigation line), giving four boundary points in total; the two boundary points of the same scanning line can be used to calculate the width of the center line (navigation line), and the two boundary points from the two scanning lines that correspond to the same boundary can be used to calculate the orientation of the center line (navigation line).
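The scan-line measurement in the example above could be implemented roughly as follows on the binarised image; which rows are chosen as scan lines is an assumption for illustration.

```python
import numpy as np

def scanline_boundary_points(binary, rows=(200, 350)):
    """For each horizontal scan line (image row), find the left and right boundary
    columns of the guide line (non-zero pixels in the binarised image)."""
    boundaries = []
    for r in rows:
        cols = np.flatnonzero(binary[r])
        if cols.size:
            boundaries.append((r, int(cols[0]), int(cols[-1])))
    return boundaries

def width_and_orientation(boundaries):
    """Width from the two boundary points of one scan line; orientation from the
    boundary points of the two scan lines that belong to the same (left) boundary."""
    (r1, left1, right1), (r2, left2, right2) = boundaries[:2]
    width = ((right1 - left1) + (right2 - left2)) / 2.0
    orientation = np.degrees(np.arctan2(left2 - left1, r2 - r1))
    return width, orientation
```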
Generally, the paths of the robot terminal (that is, the center lines or navigation lines) are parallel to one another and cross one another. When turning at an intersection, if the odometer is used to control the turning start point, accumulated odometer error means the turn cannot be guaranteed to begin at the same point at the intersection every time. In addition, during a turn the speed cannot be too slow (if it is, friction may prevent rotation), and because of the resulting inertia the robot terminal can hardly rotate through a given angle precisely. The visual line-following algorithm provided by the invention therefore effectively solves the problem of turning at intersections (consistency of the starting point) and is less affected by ambient light.
Specifically, a cross marker is arranged in the crossing area of the center lines (navigation lines); the cross marker indicates that several paths lead out of the crossing area, such as a straight path and a turning path. The cross marker may be produced by placing a material different from that of the center line (navigation line), such as the bare ground material, at the boundary of the crossing area, so that after the image of the video stream has been binarized a breakpoint appears at the crossing area relative to the other areas of the center line (navigation line), and the presence of the cross marker in the image is determined from this breakpoint.
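Under the breakpoint interpretation described above, detecting the cross marker amounts to looking for a gap in the guide line within the binarised image. A minimal sketch follows; the minimum gap size is an assumed tuning value.

```python
import numpy as np

def has_cross_marker(binary, min_gap_rows=8):
    """Return True when the binarised guide line shows a break: a run of rows
    without line pixels between rows that do contain the line."""
    rows_with_line = np.flatnonzero(binary.any(axis=1))
    if rows_with_line.size < 2:
        return False
    return bool(np.max(np.diff(rows_with_line)) > min_gap_rows)
```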
If the cross mark exists in the image contained in the video stream and the received preset instruction is a turning instruction (the turning instruction indicates a turn at the crossing corresponding to the cross mark), the robot terminal executes the turning action.
The turning track is arc-shaped, and the odometer is used to compute the turning angle. While the rotation angle of the outer directional driving wheel is smaller than a preset angle threshold, the images collected from the video stream are not processed. The preset angle threshold is set according to the rotation angle the outer directional driving wheel must make before the post-turn center line appears in the images collected from the video stream; for example, the preset angle threshold is 45 degrees.
When the rotation angle of the outer directional driving wheel equals the preset angle threshold, processing of the video stream is resumed, that is, the images collected from the video stream undergo grayscale and filtering processing, adaptive threshold processing, edge extraction, morphological processing, contour detection, center line (navigation line) extraction and center line (navigation line) comparison, and path tracking of the center line (navigation line) is performed according to the comparison result. When the center line (navigation line) comparison shows that the center line (navigation line) is located at the center of the image contained in the video stream, the turning action is finished and the robot terminal begins to travel straight.
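This turning sequence (ignore the camera until the outer wheel has rotated through the threshold angle, then resume tracking until the line is centred) can be expressed as a small state machine. The 45-degree threshold comes from the example above; the centring tolerance is an assumption.

```python
from enum import Enum

class DriveState(Enum):
    STRAIGHT = 1
    TURN_BLIND = 2   # odometry only, images from the video stream are not processed
    TURN_TRACK = 3   # image processing resumed, waiting for a centred line

ANGLE_THRESHOLD_DEG = 45.0   # example preset angle threshold from the description
CENTER_TOLERANCE_PX = 10     # assumed tolerance for "line at the image centre"

def next_state(state, outer_wheel_angle_deg, line_offset_px):
    """line_offset_px is None when no center line is visible in the current frame."""
    if state is DriveState.TURN_BLIND and outer_wheel_angle_deg >= ANGLE_THRESHOLD_DEG:
        return DriveState.TURN_TRACK
    if (state is DriveState.TURN_TRACK and line_offset_px is not None
            and abs(line_offset_px) < CENTER_TOLERANCE_PX):
        return DriveState.STRAIGHT   # turn finished, resume straight travel
    return state
```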
It should be noted that when the robot terminal turns left, the directional driving wheel on the right side is the outer directional driving wheel; when the robot terminal turns right, the directional driving wheel on the left side is the outer one. The robot terminal is a differential-wheel robot and is affected by ground conditions, so slipping and direction corrections occur while travelling in a straight line, and the full-field positioning module (odometer) inevitably accumulates counting errors; the longer the robot terminal moves, the larger the accumulated error becomes and the worse the positioning accuracy. The data processing module therefore also executes the following navigation method:
when the cross mark is found in the image collected by the video stream, the current positioning information is updated by the position information represented by the cross mark.
For example, a cross mark is set at the end point of the preset path. When the cross mark is found in an image collected from the video stream, the end point has appeared in the field of view of the robot terminal, that is, the robot terminal is about to reach the end point, and it reaches the end point after continuing straight along the current path for a preset distance. When the robot terminal first sees the cross mark, the center of the robot terminal is a certain distance from the center of the crossing area; this distance is called the preset distance and can be determined by measurement in advance. This navigation method does not judge whether the end point has been reached from positioning information computed by the odometer; as long as the cross mark is identified correctly, the positioning is accurate, the influence of accumulated odometer error is avoided, and the accuracy of the straight-line travel distance is guaranteed.
In another optional embodiment, the robot terminal can also move under handle operation, to deal with emergencies and complete specific tasks. Specifically, the upper computer further comprises an operating handle electrically connected with the communication module, so that the control signal generated by the operating handle is sent to the data processing module and the driving device moves according to that control signal. The robot is thus remotely controlled through the operating handle, and the combination of several operating modes makes the robot suitable for a variety of scenarios.
In another alternative embodiment, the robot terminal comprises a detachably connected tractor 1 and trailer 9. The driving device is arranged on the tractor 1, which provides the power source, and the trailer 9 carries at least a placement platform for bearing articles, so the amount of goods transported in a single trip can be greatly increased.
In this embodiment, the vision sensor 5, the full-field positioning module and the laser radar 2 are all arranged on the tractor. Specifically, the vision sensor 5 is arranged at the end of the tractor far from the trailer 9, in the middle of the tractor, and points obliquely downward at the ground so that it can collect road-surface information and acquire and analyze the road-surface center line. The laser radar 2 and the full-field positioning module are preferably located on the center line of the tractor 1, and the laser radar 2 should also be arranged on top of the tractor 1 to ensure obstacle detection around the robot terminal. The line-following mode ensures the moving precision of the robot terminal, and the trailer 9 ensures the quantity of goods transported.
The driving device is a directional driving wheel 7 (the directional driving wheel 7 is driven by a hub motor, which is controlled by the data processing module). The two directional driving wheels 7 are arranged on the two sides of the end of the tractor 1 close to the trailer 9, and the control signal output by the data processing module can control the two directional driving wheels 7 independently, so that the two wheels rotate at different speeds and the travel route can be steered and corrected.
To ensure stability, the tractor 1 and the trailer 9 each have a four-wheel structure, so that on uneven ground at least three points of the body remain in contact with the ground, guaranteeing both running smoothness and flexibility. The universal driven wheels 6 of the trailer 9 are arranged at the rear and the directional wheels at the front; although this somewhat reduces the flexibility of the trailer 9 when turning, it makes the turning radius of the trailer 9 easier to determine. The two wheels at the end of the tractor 1 far from the trailer 9 are universal driven wheels 6, so that steering is achieved by the differential speed of the two directional driving wheels 7 at the other end of the tractor 1; on the trailer 9, all four wheels may be universal driven wheels 6, or two directional driven wheels 8 may be arranged at the end of the trailer 9 close to the tractor 1.
In another optional embodiment, the tractor 1 and the trailer 9 are connected through an electric push rod 10. Specifically, a supporting plate extends from the end of the tractor 1 that faces the trailer 9; the trailer 9 is provided with a drag hook 11 extending horizontally above the supporting plate; the supporting plate carries an electric push rod 10 that can extend longitudinally into or retract from the drag hook 11, so the electric push rod 10 acts as the pin connecting the supporting plate and the drag hook 11; and the electric push rod 10 is electrically connected to the data processing module. After the robot terminal reaches the designated position, the electric push rod 10 is controlled to withdraw from the drag hook 11, the trailer 9 is detached from the tractor 1, and the tractor 1 can then carry out other work tasks, improving the utilization efficiency of the tractor 1.
In this embodiment, logistics management personnel lay out the environment paths on the full-field map in advance. After the layout is complete, the robot terminal performs origin initialization and camera initialization, and the upper computer publishes the work task and the preset path. The robot terminal drives to the loading area, where a trailer with a corresponding center line has been placed. The tractor 1 identifies the road route and adjusts its pose according to the ground road sign (center line) so that the electric push rod 10 at the tail of the tractor 1 is aligned with the drag hook 11 of the trailer 9; once the positions are aligned, the electric push rod 10 rises and connects the trailer 9 to the tractor 1. After the connection succeeds, the tractor moves along the preset path according to the task issued by the upper computer, pulls the trailer 9 along the track path to the unloading area, and releases the trailer 9 at a designated position similar to the loading area. The trailer 9 is automatically connected and disconnected at the designated positions without manual operation, and each tractor 1 can be matched with several trailers 9 loaded with goods, so the transport time of the tractor 1 is never occupied by loading: the tractor 1 moves along the preset path to the designated pick-up point, the electric push rod 10 rises automatically when the position is identified accurately, the pull ring of the trailer 9 is connected, and the corresponding transport work is carried out automatically. It should be noted that identification of the loading area may be determined from the end point of the tractor's moving path: when the tractor moves to the end point of the preset path this indicates the loading area, the reversing operation then begins, a location mark is set at the designated position, and when the location mark appears in the images acquired from the video stream the tractor has reversed to the designated position.
When the track has not been planned and laid in advance, tasks can be performed in the handle-operation mode; the full-field positioning module supports repeated paths and tasks, and fully automatic multi-task operation can be achieved in the line-following mode. With its several working modes (rocker-handle control, full-field positioning control and line-following operation), the robot terminal can be used in different applications.
In another optional embodiment, an emergency stop button 3 and an intelligent touch computer are provided on the upper computer, to guard against special situations and to perform task input for the upper computer. It should be understood that the above description is merely exemplary, and the embodiments of the present application do not limit the invention.
The invention is not to be considered as limited to the particular embodiments shown, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A logistics robot system, characterized in that the logistics robot system comprises:
the robot terminal at least comprises a driving device for driving the robot terminal to move;
the environment perception module is used for perceiving the surrounding environment, and the surrounding environment at least comprises current environment information;
the data processing module is electrically connected with the environment sensing module and the driving device, processes the current environment information and extracts path information so as to obtain the relative position relationship between the robot terminal and the current path, and outputs a control signal according to the relative position relationship so as to control the driving device to enable the robot terminal to move according to a preset path;
the data processing module generates a control signal based on a line-following mode, so that the robot terminal moves along a central line of a preset path, and the processing of the current environment information comprises the following steps: sequentially carrying out video stream acquisition, gray level and filtering processing, self-adaptive threshold processing, edge extraction, morphological processing, contour detection, center line extraction and center line comparison on the current environment information, and carrying out path tracking on the center line according to a comparison result;
the minimum edge surrounding contour is obtained through contour detection, the texture noise information of the ground is removed according to the screening of the surrounding line contour, a central line is obtained through fitting, and central line comparison is carried out through the position of the central line in the image.
2. The logistics robot system of claim 1, wherein the robot terminal comprises a detachably connected tractor and trailer, the driving device is a directional driving wheel, two directional driving wheels are respectively arranged on the two sides of the end of the tractor close to the trailer, and the control signal output by the data processing module can control the two directional driving wheels independently;
the trailer is of a four-wheel structure, the universal driven wheel of the trailer is arranged at the rear part, and the directional wheel is arranged at the front part.
3. The logistics robot system of claim 2, wherein when the images contained in the video stream are found to have cross marks and the preset command is a turning command, the robot terminal is enabled to respond to the turning command to execute a turning action;
when the rotation angle of the directional driving wheel positioned on the outer side is smaller than a preset angle threshold value, processing is not carried out on the image acquired by the video stream;
when the rotation angle of the outer directional driving wheel reaches the preset angle threshold, the process jumps back to the step of sequentially performing video stream acquisition, grayscale and filtering processing, adaptive threshold processing, edge extraction, morphological processing, contour detection, center line extraction and center line comparison on the current environment information and performing path tracking of the center line according to the comparison result; if the center line is found to be located at the center of the image contained in the video stream, the turning action is finished and the straight-travel action begins.
4. The logistics robot system of claim 1, wherein during the moving process, when a cross mark is found in the image captured by the video stream, the current positioning information is updated with the position information represented by the cross mark.
5. The logistics robot system of claim 1, further comprising, after the centerline extraction and before the centerline comparison:
selecting a plurality of scanning lines in the horizontal direction in an image contained in a video stream, determining boundary points of the scanning lines and the central line, and determining the width and the orientation of the central line according to the boundary points.
6. The logistics robot system of claim 1, wherein the surrounding environment further comprises current positioning information to confirm the moving distance and position of the robot terminal, guiding the robot terminal to move; the surrounding environment further comprises obstacle information so as to identify obstacles on a moving path of the robot terminal and enable the robot terminal to avoid the obstacles;
the environment perception module at least comprises a vision sensor, a full-field positioning module and a laser radar, wherein the vision sensor is used for acquiring current environment information, the full-field positioning module is used for acquiring current positioning information, and the laser radar is used for acquiring obstacle information.
7. The logistics robot system of claim 1, wherein when the center line is found to be located at the center of the images included in the video stream through the center line comparison, no correction is performed on the control signal;
calculating deviation correction when the center line comparison shows that the center line is offset or inclined relative to the center of the image contained in the video stream, and correcting the control signal according to the calculation result;
and when the center line comparison shows that the images contained in the video stream have no center line, driving the robot terminal to retreat along the current path until the images contained in the video stream have center lines, identifying the current path information, calculating and correcting the deviation according to the identification result, and correcting the control signal according to the calculation result.
8. The logistics robot system of claim 1, further comprising an upper computer, the upper computer comprising:
the full-field positioning map is used for displaying full-field coordinate information and a moving path of the robot terminal;
the instruction input module is used for inputting the operation task of the robot terminal;
the processor is used for planning a path based on the operation task and generating a preset path;
and the communication module is in communication connection with the data processing module so as to send a preset path to the data processing module.
9. The logistics robot system of claim 8, wherein the upper computer further comprises an operating handle, and the operating handle is electrically connected with the communication module to send a control signal generated by the operating handle to the data processing module, so that the driving device moves according to the control signal generated by the operating handle.
10. The logistics robot system of claim 2, wherein a supporting plate extends from one end of the tractor corresponding to the trailer, a drag hook horizontally extending above the supporting plate is arranged on the trailer, an electric push rod capable of longitudinally extending into or retracting out of the drag hook is arranged on the supporting plate, and the electric push rod is electrically connected with the data processing module.
CN202211269595.1A 2021-11-12 2022-10-18 Logistics robot system Active CN115328175B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111342317X 2021-11-12
CN202111342317.XA CN114200927A (en) 2021-11-12 2021-11-12 Logistics robot system

Publications (2)

Publication Number Publication Date
CN115328175A true CN115328175A (en) 2022-11-11
CN115328175B CN115328175B (en) 2023-02-17

Family

ID=80647639

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202111342317.XA Pending CN114200927A (en) 2021-11-12 2021-11-12 Logistics robot system
CN202211269595.1A Active CN115328175B (en) 2021-11-12 2022-10-18 Logistics robot system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202111342317.XA Pending CN114200927A (en) 2021-11-12 2021-11-12 Logistics robot system

Country Status (1)

Country Link
CN (2) CN114200927A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115619300A (en) * 2022-11-14 2023-01-17 昆船智能技术股份有限公司 Automatic loading system and method for containers

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115195893A (en) * 2022-07-18 2022-10-18 拉扎斯网络科技(上海)有限公司 Distribution equipment and distribution processing method and device
CN115129070B (en) * 2022-08-31 2022-12-30 深圳市欧铠智能机器人股份有限公司 Intelligent obstacle avoidance system and method for storage robot under Internet of things

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102789234A (en) * 2012-08-14 2012-11-21 广东科学中心 Robot navigation method and robot navigation system based on color coding identifiers
CN102789233A (en) * 2012-06-12 2012-11-21 湖北三江航天红峰控制有限公司 Vision-based combined navigation robot and navigation method
CN108146524A (en) * 2017-12-27 2018-06-12 沈阳萝卜科技有限公司 A kind of transportation robot that can replace storage box
US20190064835A1 (en) * 2017-08-30 2019-02-28 Assa Abloy Entrance Systems Ab Vehicle guidance systems and associated methods of use at logistics yards and other locations
CN113110443A (en) * 2021-04-12 2021-07-13 大连理工大学 Robot tracking and positioning method based on camera

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103631221B (en) * 2013-11-20 2016-05-04 华南理工大学广州学院 A kind of distant operate services robot system
CN104723318B (en) * 2015-03-10 2017-03-15 苏州英达瑞机器人科技有限公司 Utonomous working robot system
CN106272438B (en) * 2016-10-17 2018-04-24 宁德师范学院 Applied to the robot express mail dissemination system and control method in express delivery office for incoming and outgoing mail
CN107065853B (en) * 2016-12-21 2020-02-14 深圳若步智能科技有限公司 Logistics robot system and working method thereof
CN107272710B (en) * 2017-08-08 2021-07-13 河海大学常州校区 Medical logistics robot system based on visual positioning and control method thereof
CN108544465A (en) * 2018-05-03 2018-09-18 南京邮电大学 Omni-directional mobile robots and its control method based on Mecanum wheels
CN109844674B (en) * 2018-10-15 2023-02-03 灵动科技(北京)有限公司 Logistics robot with controllable camera and indicator and operation method
CN109910010A (en) * 2019-03-23 2019-06-21 广东石油化工学院 A kind of system and method for efficient control robot
CN110340912A (en) * 2019-07-25 2019-10-18 南通大学 A kind of Intelligent logistics Transport Robot Control System for Punch
CN111474933B (en) * 2020-04-24 2022-03-15 合肥工业大学 Automatic deviation rectification control method of magnetic guidance AGV
CN112025707A (en) * 2020-08-28 2020-12-04 济南浪潮高新科技投资发展有限公司 Robot distribution system based on 5G communication
CN112209302A (en) * 2020-10-09 2021-01-12 青岛广运智能装备有限公司 Intelligent controller for improving manual electric forklift into robot
CN112698653A (en) * 2020-12-23 2021-04-23 南京中朗智能技术有限公司 Robot autonomous navigation control method and system based on deep learning
CN113290561A (en) * 2021-05-28 2021-08-24 深圳辰迈机器人有限公司 Medical self-disinfection logistics robot and control method thereof
CN113359734B (en) * 2021-06-15 2022-02-22 苏州工业园区报关有限公司 Logistics auxiliary robot based on AI
CN113359611A (en) * 2021-06-18 2021-09-07 乐聚(深圳)机器人技术有限公司 Control method, device and equipment of robot handle and storage medium
CN113534810A (en) * 2021-07-22 2021-10-22 乐聚(深圳)机器人技术有限公司 Logistics robot and logistics robot system
CN113618731A (en) * 2021-07-22 2021-11-09 中广核研究院有限公司 Robot control system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102789233A (en) * 2012-06-12 2012-11-21 湖北三江航天红峰控制有限公司 Vision-based combined navigation robot and navigation method
CN102789234A (en) * 2012-08-14 2012-11-21 广东科学中心 Robot navigation method and robot navigation system based on color coding identifiers
US20190064835A1 (en) * 2017-08-30 2019-02-28 Assa Abloy Entrance Systems Ab Vehicle guidance systems and associated methods of use at logistics yards and other locations
CN108146524A (en) * 2017-12-27 2018-06-12 沈阳萝卜科技有限公司 A kind of transportation robot that can replace storage box
CN113110443A (en) * 2021-04-12 2021-07-13 大连理工大学 Robot tracking and positioning method based on camera

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115619300A (en) * 2022-11-14 2023-01-17 昆船智能技术股份有限公司 Automatic loading system and method for containers
CN115619300B (en) * 2022-11-14 2023-03-28 昆船智能技术股份有限公司 Automatic loading system and method for containers

Also Published As

Publication number Publication date
CN115328175B (en) 2023-02-17
CN114200927A (en) 2022-03-18

Similar Documents

Publication Publication Date Title
CN115328175B (en) Logistics robot system
CN105700532B (en) The Intelligent Mobile Robot navigator fix control method of view-based access control model
CN104751151B (en) A kind of identification of multilane in real time and tracking
CN108873904B (en) Unmanned parking method and device for mining vehicle and readable storage medium
US20190064828A1 (en) Autonomous yard vehicle system
CN112477533B (en) Dual-purpose transport robot of facility agriculture rail
CN108931786A (en) Curb detection device and method
CN103204104B (en) Monitored control system and method are driven in a kind of full visual angle of vehicle
CN114325755B (en) Retaining wall detection method and system suitable for automatic driving vehicle
CN113805571B (en) Robot walking control method, system, robot and readable storage medium
CN111930125A (en) Low-cost obstacle detection device and method suitable for AGV
CN103760903A (en) Intelligent tracking conveying management system for warehouses
Chun-Fu et al. Research on visual navigation algorithm of AGV used in the small agile warehouse
CN107578046B (en) Auxiliary vehicle driving method based on image binarization processing
CN114690765A (en) Mine card self-adaptive retaining wall unloading parking method, device, system and computer equipment
Zhang et al. Monocular visual navigation of an autonomous vehicle in natural scene corridor-like environments
CN115568332A (en) Automatic following transportation platform for field environment and control method thereof
De Saxe et al. A visual template-matching method for articulation angle measurement
CN115223039A (en) Robot semi-autonomous control method and system for complex environment
CN116533998B (en) Automatic driving method, device, equipment, storage medium and vehicle of vehicle
Nadav et al. Off-road path and obstacle detection using monocular camera
CN117369460A (en) Intelligent inspection method and system for loosening faults of vehicle bolts
CN206298317U (en) A kind of fork truck type AGV system of positioning function of being moveed backward with high accuracy
CN115438430B (en) Mining area vehicle driving stability prediction method and device
CN111591284A (en) Visual field blind area obstacle avoidance method and device based on monocular recognition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant