CN112445220B - Automatic guided vehicle control method and device, storage medium and electronic equipment - Google Patents

Automatic guided vehicle control method and device, storage medium and electronic equipment

Info

Publication number
CN112445220B
CN112445220B (application CN201910817729.0A)
Authority
CN
China
Prior art keywords
guided vehicle
automatic guided
dimensional code
present disclosure
boundary
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910817729.0A
Other languages
Chinese (zh)
Other versions
CN112445220A (en)
Inventor
侯锡锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingdong Qianshi Technology Co Ltd
Original Assignee
Beijing Jingdong Qianshi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingdong Qianshi Technology Co Ltd filed Critical Beijing Jingdong Qianshi Technology Co Ltd
Priority to CN201910817729.0A priority Critical patent/CN112445220B/en
Publication of CN112445220A publication Critical patent/CN112445220A/en
Application granted granted Critical
Publication of CN112445220B publication Critical patent/CN112445220B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02: Control of position or course in two dimensions

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The disclosure relates to the technical field of intelligent vehicles, and provides a control method of an automatic guided vehicle, a control device of the automatic guided vehicle, a computer storage medium and electronic equipment. The control method of the automatic guided vehicle comprises the following steps: acquiring two-dimensional code labels randomly paved in a working site of an automatic guided vehicle, wherein the two-dimensional code labels carry position information of the working site; planning a task execution path of the automatic guided vehicle based on the two-dimensional code labels; acquiring position offset information of the current position and the next position of the automatic guided vehicle in real time according to the task execution path; determining a safety boundary of the automatic guided vehicle according to the position offset information and a preset safe driving distance of the automatic guided vehicle; and carrying out real-time collision detection on the automatic guided vehicle according to the safety boundary. The control method of the automatic guided vehicle not only resolves the technical problem of high implementation difficulty caused by paving equidistant two-dimensional codes in the prior art, but also reduces the paving cost of the two-dimensional codes.

Description

Automatic guided vehicle control method and device, storage medium and electronic equipment
Technical Field
The disclosure relates to the technical field of intelligent vehicles, and in particular relates to a control method of an automatic guided vehicle, a control device of the automatic guided vehicle, a computer storage medium and electronic equipment.
Background
With the rapid development of computer and internet technologies, the technical field of intelligent vehicles is also developing rapidly. One example is the automated guided vehicle (Automated Guided Vehicle, abbreviated as AGV), a load-carrying vehicle equipped with an electromagnetic or optical automatic guidance device that can travel along a predetermined guide path and that has safety protection functions and various transfer functions.
At present, two-dimensional code image labels arranged in the same direction are generally pasted in a matrix at equal intervals in the passing area of a warehouse, and a walking path is planned through the position information associated with the two-dimensional codes so as to control the operation of the automatic guided vehicle. However, the distances among the operation points in an actual operation scene are random, and equidistant two-dimensional codes sometimes cannot be paved because of site limitations, so that site implementation is difficult.
In view of the foregoing, there is a need in the art for developing a new method and apparatus for controlling an automated guided vehicle.
It should be noted that the information disclosed in the foregoing background section is only for enhancing understanding of the background of the present disclosure.
Disclosure of Invention
The disclosure aims to provide a control method of an automatic guided vehicle, a control device of an automatic guided vehicle, a computer storage medium and electronic equipment, so as to avoid, at least to a certain extent, the defect of high field implementation difficulty caused by paving equidistant two-dimensional codes in prior-art control methods of automatic guided vehicles.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
According to a first aspect of the present disclosure, there is provided a control method of an automatic guided vehicle, including: acquiring two-dimensional code labels randomly paved in a working place of the automatic guided vehicle, wherein the two-dimensional code labels carry position information of the working place; planning a task execution path of the automatic guided vehicle based on the two-dimensional code labels; acquiring the position offset information of the current position and the next position of the automatic guided vehicle in real time according to the task execution path; determining a safety boundary of the automatic guided vehicle according to the position offset information and a preset safe driving distance of the automatic guided vehicle; and carrying out real-time collision detection on the automatic guided vehicle according to the safety boundary.
In an exemplary embodiment of the present disclosure, the method further comprises: layering the automatic guided vehicles in the same working place to obtain layering results; determining a collision detection range corresponding to the automatic guided vehicle according to the layering result; acquiring a target automatic guided vehicle in the collision detection range; and carrying out real-time collision detection on the automatic guided vehicle based on a direction bounding box algorithm according to the safety boundary of the automatic guided vehicle and the target automatic guided vehicle.
In an exemplary embodiment of the disclosure, the planning the task execution path of the automated guided vehicle based on the two-dimensional code tag includes: acquiring the current position of the automatic guided vehicle based on the two-dimensional code tag; and planning a task execution path of the automatic guided vehicle according to the position relation between the current position and the position of the preset operation point.
In an exemplary embodiment of the disclosure, the determining the safety boundary of the automatic guided vehicle according to the position offset information and the preset safety driving distance of the automatic guided vehicle includes: based on the position offset information, acquiring a projection distance of the safe driving distance on a preset coordinate axis; determining boundary point coordinates corresponding to the automatic guided vehicle according to the projection distance and the coordinate information of the current position; and constructing a safety boundary of the automatic guided vehicle according to the boundary point coordinates.
In an exemplary embodiment of the present disclosure, the method further comprises: and layering the automatic guided vehicles in the same working site based on a quadtree algorithm to obtain the layering result.
In an exemplary embodiment of the present disclosure, the method further comprises: acquiring the interval distance between the real-time position of the automatic guided vehicle and the position of the preset operation point; and adjusting the running speed of the automatic guided vehicle based on the interval distance.
In an exemplary embodiment of the present disclosure, the method further comprises: and acquiring the travelling distance of the automatic guided vehicle, and updating the real-time position of the automatic guided vehicle according to the travelling distance.
According to a second aspect of the present disclosure, there is provided a control device of an automatic guided vehicle, including: a label laying module, used for obtaining two-dimensional code labels randomly laid in a working place of the automatic guided vehicle, the two-dimensional code labels carrying position information of the working place; a path planning module, used for planning a task execution path of the automatic guided vehicle based on the two-dimensional code labels; an information acquisition module, used for acquiring the position offset information of the current position and the next position of the automatic guided vehicle in real time according to the task execution path; a boundary determining module, used for determining the safety boundary of the automatic guided vehicle according to the position offset information and the preset safe driving distance of the automatic guided vehicle; and a collision detection module, used for carrying out real-time collision detection on the automatic guided vehicle according to the safety boundary.
According to a third aspect of the present disclosure, there is provided a computer storage medium having stored thereon a computer program which, when executed by a processor, implements the control method of an automated guided vehicle according to the first aspect described above.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to execute the method of controlling an automated guided vehicle according to the first aspect described above via execution of the executable instructions.
As can be seen from the above technical solutions, the control method of the automatic guided vehicle, the control device of the automatic guided vehicle, the computer storage medium and the electronic device in the exemplary embodiments of the present disclosure have at least the following advantages and positive effects:
In the technical scheme provided by some embodiments of the present disclosure, on one hand, two-dimensional code labels randomly paved in the working place of the automatic guided vehicle are obtained, the two-dimensional code labels carry position information of the working place, and the task execution path of the automatic guided vehicle is planned based on the two-dimensional code labels. This resolves the technical problems in the prior art that on-site implementation is difficult, paths are rigid and long, one operation point cannot be reached directly from another, and the operation speed is slow; it thereby improves the operation efficiency of the automatic guided vehicle and reduces the paving cost of the two-dimensional codes. On the other hand, the position offset information of the current position and the next position of the automatic guided vehicle is obtained in real time according to the task execution path, the safety boundary of the automatic guided vehicle is determined according to the position offset information and the preset safe driving distance of the automatic guided vehicle, and the automatic guided vehicle is subjected to real-time collision detection according to the safety boundary. This improves the accuracy of the safety-boundary calculation and of the collision detection, and ensures the running safety of the automatic guided vehicle.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort.
FIG. 1 illustrates a flow diagram of a method of controlling an automated guided vehicle in an exemplary embodiment of the present disclosure;
FIG. 2 illustrates a flow diagram of a method of controlling an automated guided vehicle in another exemplary embodiment of the present disclosure;
FIG. 3 illustrates a schematic diagram of position offset information in an exemplary embodiment of the present disclosure;
FIG. 4 schematically illustrates a schematic diagram of a corresponding safe driving distance of an automated guided vehicle in an exemplary embodiment of the present disclosure;
FIG. 5 illustrates a flow diagram of a method of controlling an automated guided vehicle in yet another exemplary embodiment of the present disclosure;
FIG. 6 illustrates a schematic diagram of a method of controlling an automated guided vehicle in an exemplary embodiment of the present disclosure;
FIG. 7 illustrates a flow diagram of a method of controlling an automated guided vehicle in yet another exemplary embodiment of the disclosure;
FIG. 8 illustrates a schematic diagram of layering an automated guided vehicle in the same worksite in an exemplary embodiment of the present disclosure;
FIG. 9 illustrates a schematic diagram of the safety boundaries of an automated guided vehicle A and a target automated guided vehicle B in an exemplary embodiment of the present disclosure;
FIG. 10 illustrates a schematic diagram of a method of controlling an automated guided vehicle in another exemplary embodiment of the present disclosure;
FIG. 11 illustrates a schematic diagram of a method of controlling an automated guided vehicle in yet another exemplary embodiment of the present disclosure;
FIG. 12 illustrates an overall flow diagram of a method of controlling an automated guided vehicle in an exemplary embodiment of the present disclosure;
FIG. 13 illustrates a schematic structural diagram of a control device of an automated guided vehicle in an exemplary embodiment of the present disclosure;
FIG. 14 illustrates a schematic diagram of a computer storage medium in an exemplary embodiment of the present disclosure;
Fig. 15 shows a schematic structural diagram of an electronic device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present disclosure. One skilled in the relevant art will recognize, however, that the aspects of the disclosure may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
The terms "a," "an," "the," and "said" are used in this specification to denote the presence of one or more elements/components/etc.; the terms "comprising" and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. in addition to the listed elements/components/etc.; the terms "first" and "second" and the like are used merely as labels, and are not intended to limit the number of their objects.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities.
At present, two-dimensional code image labels arranged in the same direction are generally pasted in a matrix at equal intervals in the passing area of a warehouse, and a walking path is planned through the position information associated with the two-dimensional codes so as to control the operation of the automatic guided vehicle. However, the distances between the operation points in an actual operation scene are random, and equidistant two-dimensional codes sometimes cannot be paved because of site limitations, so that site implementation is difficult. In addition, paving two-dimensional codes at equal intervals may require more codes than necessary and waste cost. The difficulty and cost of prior-art automatic guided vehicle control methods therefore need to be reduced.
In the embodiments of the disclosure, a control method of an automatic guided vehicle is first provided, which overcomes, at least to a certain extent, the defect of high field implementation difficulty caused by paving equidistant two-dimensional codes in prior-art control methods of automatic guided vehicles.
Fig. 1 is a flow chart illustrating a control method of an automatic guided vehicle according to an exemplary embodiment of the present disclosure, and an execution subject of the control method of the automatic guided vehicle may be a server for controlling the automatic guided vehicle.
Referring to fig. 1, a control method of an automated guided vehicle according to one embodiment of the present disclosure includes the steps of:
Step S110, acquiring two-dimensional code labels randomly paved in a working place of the automatic guided vehicle, wherein the two-dimensional code labels carry position information of the working place;
step S120, planning a task execution path of the automatic guided vehicle based on the two-dimensional code label;
step S130, acquiring position offset information of the current position and the next position of the automatic guided vehicle in real time according to the task execution path;
Step S140, determining a safety boundary of the automatic guided vehicle according to the position offset information and a preset safety driving distance of the automatic guided vehicle;
and step S150, carrying out real-time collision detection on the automatic guided vehicle according to the safety boundary.
In the technical scheme provided by the embodiment shown in fig. 1, on one hand, two-dimensional code labels randomly paved in the working place of the automatic guided vehicle are obtained, the two-dimensional code labels carry position information of the working place, and the task execution path of the automatic guided vehicle is planned based on the two-dimensional code labels. This resolves the technical problems in the prior art that field implementation is difficult, paths are rigid and long, one operation point cannot be reached directly from another, and the operation speed is slow; it thereby improves the operation efficiency of the automatic guided vehicle and reduces the paving cost of the two-dimensional codes. On the other hand, the position offset information of the current position and the next position of the automatic guided vehicle is obtained in real time according to the task execution path, the safety boundary of the automatic guided vehicle is determined according to the position offset information and the preset safe driving distance of the automatic guided vehicle, and the automatic guided vehicle is subjected to real-time collision detection according to the safety boundary. This improves the accuracy of the safety-boundary calculation and of the collision detection, and ensures the running safety of the automatic guided vehicle.
The specific implementation of each step in fig. 1 is described in detail below:
in step S110, two-dimensional code labels randomly laid in a worksite of the automated guided vehicle are acquired, where the two-dimensional code labels carry position information of the worksite.
In an exemplary embodiment of the present disclosure, an automated guided vehicle (Automated Guided Vehicle, abbreviated as AGV) refers to an unmanned automated vehicle having an automated guidance device such as a magnetic stripe, track, or laser, traveling along a planned path, powered by a battery, and equipped with safety protection and various auxiliary mechanisms (e.g., transfer, assembly mechanisms).
In the exemplary embodiment of the disclosure, the two-dimensional code tags are two-dimensional codes carrying position information of the worksite; that is, each two-dimensional code tag stores different position information (for example, position coordinates) in the worksite, and the position information of the current location can be read by scanning the tag. For example, the two-dimensional code may be a Quick Response (QR) code, which can hold a large amount of information and can be read and written quickly. A QR code can also represent various kinds of character information such as Chinese characters and images, has strong confidentiality, anti-counterfeiting performance and reliability, and is convenient to use. If the automatic guided vehicle misses scanning a two-dimensional code while running, it can scan a nearby code carrying the related information, which meets the system requirements on recognition speed and accuracy.
In the exemplary embodiment of the disclosure, the two-dimensional code labels can be laid randomly according to the actual conditions of the working site. For example, they can be laid according to the actual operation point requirements and the actual conditions of the site, and the spacing between adjacent two-dimensional code labels can be set according to actual requirements: the spacings can be equal or unequal. This solves the technical problem in the prior art that equidistant two-dimensional codes either cannot be laid because of site limitations or incur excessive cost, improves the flexibility of the related technology, and reduces the laying cost and the field implementation difficulty.
In step S120, a task execution path of the automated guided vehicle is planned based on the two-dimensional code tag.
In an exemplary embodiment of the present disclosure, fig. 2 schematically illustrates a flowchart of a control method of an automatic guided vehicle according to another exemplary embodiment of the present disclosure, specifically illustrates a flowchart of planning a task execution path of the automatic guided vehicle based on the two-dimensional code tag, and a specific embodiment is explained below with reference to fig. 2.
In step S201, a current position of the automated guided vehicle is obtained based on the two-dimensional code tag.
In an exemplary embodiment of the present disclosure, the current position of the automated guided vehicle may be obtained based on the two-dimensional code tag.
In an exemplary embodiment of the present disclosure, the current position, that is, the position of the automated guided vehicle at the current time point, may be, for example, a position coordinate of the automated guided vehicle in the working site.
In the exemplary embodiment of the disclosure, a two-dimensional code scanning device may be disposed on the automatic guided vehicle. While the vehicle is running, the scanning device scans the two-dimensional code tags and obtains the position information stored on them, and this information is sent to the traffic scheduling system (a comprehensive transportation and management system built on information, communication, sensing, control and computer technology that works in real time and is accurate and efficient), so that the running route and location of the automatic guided vehicle in the whole working site can be tracked in real time; the current position of the automatic guided vehicle can then be obtained from the feedback of the traffic scheduling system. The current position of the automatic guided vehicle may be obtained as coordinates (x, y). It should be noted that the coordinates (x, y) may refer to the center point of the automatic guided vehicle or to any feature point on it, and may be chosen according to the actual situation; both options fall within the protection scope of the present disclosure.
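The following is a minimal sketch, not part of the claimed method, of how a scanned tag payload could be turned into the current position (x, y); the comma-separated "x,y" payload format and the function names are assumptions for illustration only.

```python
# Minimal sketch (illustrative assumptions): decoding a worksite position from a scanned
# two-dimensional code payload that is assumed to store its coordinates as "x,y".
from typing import Tuple

def parse_tag_payload(payload: str) -> Tuple[float, float]:
    """Assume each tag stores its worksite coordinates as a comma-separated pair."""
    x_str, y_str = payload.split(",")
    return float(x_str), float(y_str)

def report_current_position(payload: str) -> Tuple[float, float]:
    """Read the tag under the vehicle and treat its coordinates as the current position (x, y)."""
    x, y = parse_tag_payload(payload)
    # In the described system this position would be sent to the traffic scheduling system.
    return x, y

# Example: a tag laid at (12.5, 40.0) in the worksite
print(report_current_position("12.5,40.0"))  # (12.5, 40.0)
```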
In step S202, a task execution path of the automated guided vehicle is planned according to the positional relationship between the current position and the preset operation point position.
In an exemplary embodiment of the present disclosure, after the current position is determined, a task execution path of the automatic guided vehicle may be planned according to a positional relationship between the current position and a preset operation point position.
In an exemplary embodiment of the present disclosure, the preset operation point position, that is, a place where the automated guided vehicle needs to take or put down the goods, may be, for example, a placement end point where the automated guided vehicle carries the goods.
In an exemplary embodiment of the present disclosure, the task execution path is the route that the automated guided vehicle passes through when performing a transportation task, i.e., the route from the current position of the automated guided vehicle to the preset operation point positions. For example, the current position may be used as the starting point, the preset operation point positions A and B as passing points, and the preset operation point position C as the end point; the task execution path is then determined from the positional relationship of the three points A, B and C as the shortest path passing through A, B and C, and after the path is determined, obstacle points are removed to obtain the final task execution path.
In an exemplary embodiment of the present disclosure, after the two-dimensional code labels are randomly laid, the task execution path of the automated guided vehicle can be planned based on them. This avoids the technical problems in the prior art that planning the operation path from equidistant two-dimensional codes yields rigid, long paths with many turns, so that the vehicle cannot travel directly from one operation point to another and runs slowly, and it thereby improves the operation efficiency of the automated guided vehicle.
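As an illustration of the path planning described above, the following sketch treats the randomly laid two-dimensional code positions as graph nodes, removes obstacle nodes, and chains shortest-path segments from the current position through the preset operation points A and B to the end point C. The graph layout, edge weights, node names and the Dijkstra helper are assumptions for illustration, not the patented planner.

```python
# Minimal path-planning sketch under assumed node names and edge weights.
import heapq
from typing import Dict, List

Graph = Dict[str, Dict[str, float]]  # node -> {neighbour: travel distance}

def dijkstra(graph: Graph, start: str, goal: str) -> List[str]:
    """Shortest path between two code positions by travel distance."""
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, {}).items():
            if nxt not in seen:
                heapq.heappush(queue, (cost + w, nxt, path + [nxt]))
    return []

def plan_task_path(graph: Graph, current: str, waypoints: List[str], obstacles: List[str]) -> List[str]:
    """Remove obstacle nodes, then chain shortest segments current -> A -> B -> C."""
    usable = {n: {m: w for m, w in nbrs.items() if m not in obstacles}
              for n, nbrs in graph.items() if n not in obstacles}
    full: List[str] = [current]
    for target in waypoints:
        segment = dijkstra(usable, full[-1], target)
        full += segment[1:]  # skip the repeated start node of each segment
    return full

# Example worksite: codes laid at irregular spacings, distances as edge weights
graph = {"S": {"A": 3.0, "X": 1.5}, "X": {"S": 1.5, "A": 2.0},
         "A": {"S": 3.0, "X": 2.0, "B": 4.0}, "B": {"A": 4.0, "C": 2.5}, "C": {"B": 2.5}}
print(plan_task_path(graph, "S", ["A", "B", "C"], obstacles=["X"]))  # ['S', 'A', 'B', 'C']
```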
In step S130, according to the task execution path, position offset information of the current position and the next position of the automated guided vehicle is obtained in real time.
In an exemplary embodiment of the present disclosure, a next position of the automated guided vehicle may be determined according to the task execution path described above. The next position is the next position that the automated guided vehicle needs to reach after leaving the current position during the transportation of the automated guided vehicle along the task execution path.
In an exemplary embodiment of the present disclosure, after the next position is determined, the position offset information of the current position and the next position may be acquired in real time. For example, referring to fig. 3, which shows a schematic diagram of position offset information, 301 is the current position and 302 is the next position; a coordinate system may be established with the current position 301 as the origin, and the position offset information between the current position and the next position 302 is the displacement angle q between the line from 301 to 302 and the X axis.
In step S140, a safety boundary of the automatic guided vehicle is determined according to the position offset information and a preset safe driving distance of the automatic guided vehicle.
In an exemplary embodiment of the present disclosure, the preset safe driving distance of the automatic guided vehicle may be obtained from the equipment model of the vehicle. The equipment model is a designation, composed of letters and numbers, that indicates the specification of the equipment. The safe driving distance is a distance calibrated before the equipment leaves the factory within which the vehicle must not approach other vehicles or objects during operation; different equipment models correspond to different safe driving distances. The safe driving distance may be, for example, the clearance required in the four directions front, rear, left and right of the feature point representing the current position of the automatic guided vehicle, denoted f, b, l and r respectively. Referring to fig. 4, which schematically illustrates the safe driving distances of an automatic guided vehicle, the rectangle acde is the automatic guided vehicle, point h is the feature point indicating the current position (x, y), the length of line nh is the front safe distance f, the length of line hm is the rear safe distance b, the length of line kh is the left safe distance l, and the length of line hg is the right safe distance r.
In an exemplary embodiment of the present disclosure, reference may be made to fig. 5, and fig. 5 schematically illustrates a flowchart of a control method of an automatic guided vehicle according to still another exemplary embodiment of the present disclosure, specifically illustrating a flowchart for determining a safety boundary of the automatic guided vehicle according to the above-mentioned position offset information and a preset safety driving distance of the automatic guided vehicle. Step S140 is explained below in conjunction with fig. 5.
In step S501, based on the position offset information, a projection distance of the safe driving distance on a preset coordinate axis is obtained.
In an exemplary embodiment of the present disclosure, after the position offset information is determined, the projection distance of the safe driving distance on a preset coordinate axis may be obtained. Referring to fig. 6, which schematically illustrates the projection of the safe driving distances on the preset coordinate axes, and taking the calculation of the coordinates of boundary point z as an example: when the position offset angle is q, the projection of the front safe distance f on the X axis is hi = f×cos q and the projection of the right safe distance r on the X axis is jz = r×sin q; the projection of f on the Y axis is ni = f×sin q and the projection of r on the Y axis is nj = r×cos q.
In step S502, according to the projection distance and the coordinate information of the current position, the coordinates of the boundary point corresponding to the automated guided vehicle are determined.
In an exemplary embodiment of the present disclosure, according to the above projection distances and the coordinate information of the current position, and with continued reference to fig. 6, the abscissa of point z is x + f×cos q + r×sin q and the ordinate of point z is y + f×sin q - r×cos q. That is, the coordinates of the boundary point z corresponding to the automatic guided vehicle are (x + f×cos q + r×sin q, y + f×sin q - r×cos q).
In an exemplary embodiment of the disclosure, by the same calculation as in the above steps, the four boundary point coordinates of the safety boundary may be determined as point z (x + f×cos q + r×sin q, y + f×sin q - r×cos q), point v (x + f×cos q - l×sin q, y + f×sin q + l×cos q), point u (x - b×cos q - l×sin q, y - b×sin q + l×cos q) and point w (x - b×cos q + r×sin q, y - b×sin q - r×cos q). The position coordinates of the center point o of the automatic guided vehicle are the average of the four boundary points, namely (x + ((f - b)/2)×cos q + ((r - l)/2)×sin q, y + ((f - b)/2)×sin q - ((r - l)/2)×cos q).
In step S503, a safety boundary of the automated guided vehicle is constructed according to the boundary point coordinates.
In an exemplary embodiment of the present disclosure, the safety boundary is a rectangular boundary of the automated guided vehicle determined according to the boundary point coordinates, and after determining the boundary point coordinates, the safety boundary of the automated guided vehicle may be constructed according to the boundary point coordinates. With reference to the relevant explanation of the above steps, and with continued reference to fig. 6, the safety boundary constructed from the four boundary point coordinates may be a rectangle wzvu shown as a dotted line, having a length f+b and a width l+r.
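The boundary-point formulas of steps S501-S503 can be summarized in the following sketch; the function names safety_boundary and boundary_center are assumptions for illustration, and the coordinate conventions follow figs. 4 and 6.

```python
# Minimal sketch of steps S501-S503: (x, y) is the current position, q the offset angle to
# the next position, and f, b, l, r the front/back/left/right safe driving distances.
import math
from typing import List, Tuple

Point = Tuple[float, float]

def safety_boundary(x: float, y: float, q: float, f: float, b: float, l: float, r: float) -> List[Point]:
    """Return the four boundary points z, v, u, w of the rectangular safety boundary."""
    cos_q, sin_q = math.cos(q), math.sin(q)
    z = (x + f * cos_q + r * sin_q, y + f * sin_q - r * cos_q)
    v = (x + f * cos_q - l * sin_q, y + f * sin_q + l * cos_q)
    u = (x - b * cos_q - l * sin_q, y - b * sin_q + l * cos_q)
    w = (x - b * cos_q + r * sin_q, y - b * sin_q - r * cos_q)
    return [z, v, u, w]

def boundary_center(corners: List[Point]) -> Point:
    """Center point o of the safety boundary (average of the four boundary points)."""
    xs, ys = zip(*corners)
    return (sum(xs) / 4.0, sum(ys) / 4.0)

# Example: vehicle at (10, 5) with a 30-degree offset angle and safe distances f=1.2, b=0.8, l=0.5, r=0.5
corners = safety_boundary(10.0, 5.0, math.radians(30), 1.2, 0.8, 0.5, 0.5)
print(corners, boundary_center(corners))
```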
In an exemplary embodiment of the present disclosure, after determining the safety boundary of the automated guided vehicle, collision detection may be performed on the vehicle according to that boundary. Because automated guided vehicles of different sizes correspond to safety boundaries of different sizes, performing collision detection according to the safety boundary takes into account mixed operation of vehicles of different sizes in the same field. This solves the technical problem that collision detection cannot be performed, or gives inaccurate results, because of differences in vehicle size, and improves the validity and reliability of the collision detection results.
In step S150, the automatic guided vehicle is subjected to real-time collision detection according to the safety boundary.
In an exemplary embodiment of the present disclosure, fig. 7 schematically illustrates a flowchart of a control method of an automatic guided vehicle in yet another exemplary embodiment of the present disclosure, specifically illustrates a flowchart of real-time collision detection of the automatic guided vehicle after determining a safety boundary, and a specific embodiment is explained below with reference to fig. 7.
In step S701, layering processing is performed on the automatic guided vehicles located in the same working site, so as to obtain a layering result.
In an exemplary embodiment of the present disclosure, automated guided vehicles located within the same worksite may be layered based on a quadtree algorithm to obtain the layering result. It should be noted that the layering may also be performed based on a ternary tree algorithm, a binary tree algorithm, etc., and may be chosen according to the actual situation; all such choices fall within the protection scope of the present disclosure.
In an exemplary embodiment of the present disclosure, a quadtree is a tree data structure in which every parent node has four child nodes. Reference may be made to fig. 8, which illustrates layering of the automated guided vehicles in the same work site. There may be multiple automated guided vehicles in the same work site, and as illustrated in fig. 8, the site may be divided into four areas to distinguish objects in different locations; the four child nodes of the quadtree naturally represent these four areas, which may be named quadrants 1, 2, 3 and 4 for convenience. By way of example, each quadrant can hold at most a preset number of automated guided vehicles (e.g., 2); a quadrant containing more than 2 vehicles is divided again into 4 sub-quadrants, and so on. Finally, all automated guided vehicles in the work site are divided into a number of small areas, which is the layering result. This avoids the technical problem in the prior art that, when many automated guided vehicles are in a work site, collision detection is performed between every pair of vehicles (a vehicle in the lower-left corner of the site and a vehicle in the upper-right corner obviously cannot collide), causing high algorithmic complexity and low efficiency; the layering therefore improves collision detection efficiency.
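A minimal quadtree sketch of step S701 follows. It is not the exact patented data structure: the capacity value of 2, the class layout and the method names are illustrative assumptions; it only shows how a crowded quadrant is split so that collision detection compares only vehicles sharing a leaf quadrant.

```python
# Minimal quadtree layering sketch (illustrative assumptions).
from typing import List, Tuple

Point = Tuple[float, float]

class QuadTree:
    def __init__(self, x: float, y: float, w: float, h: float, capacity: int = 2):
        self.x, self.y, self.w, self.h = x, y, w, h   # lower-left corner and size of this quadrant
        self.capacity = capacity
        self.vehicles: List[Point] = []
        self.children: List["QuadTree"] = []

    def insert(self, p: Point) -> None:
        if self.children:                            # already split: pass the vehicle down
            self._child_for(p).insert(p)
            return
        self.vehicles.append(p)
        if len(self.vehicles) > self.capacity:       # too crowded: split into 4 sub-quadrants
            hw, hh = self.w / 2, self.h / 2
            self.children = [QuadTree(self.x + dx, self.y + dy, hw, hh, self.capacity)
                             for dx in (0, hw) for dy in (0, hh)]
            for v in self.vehicles:
                self._child_for(v).insert(v)
            self.vehicles = []

    def _child_for(self, p: Point) -> "QuadTree":
        hw, hh = self.w / 2, self.h / 2
        col = 1 if p[0] >= self.x + hw else 0
        row = 1 if p[1] >= self.y + hh else 0
        return self.children[col * 2 + row]

    def leaves(self) -> List[List[Point]]:
        """Layering result: groups of vehicles that share a leaf quadrant."""
        if not self.children:
            return [self.vehicles] if self.vehicles else []
        return [g for c in self.children for g in c.leaves()]

# Example: a 100 m x 100 m worksite with five vehicles
tree = QuadTree(0, 0, 100, 100)
for pos in [(10, 10), (12, 15), (15, 12), (80, 85), (90, 95)]:
    tree.insert(pos)
print(tree.leaves())
```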
In step S702, a collision detection range corresponding to the automated guided vehicle is determined according to the layering result.
In an exemplary embodiment of the present disclosure, after the layering result is obtained and the safety boundary is determined, a collision detection range corresponding to the automated guided vehicle may be determined based on the layering result. The collision detection range may be, for example, a detection range of 20 meters in radius centered on the automated guided vehicle concerned. It should be noted that the specific collision detection range may be set according to the actual situation; all such settings fall within the protection scope of the present disclosure.
In step S703, a target automated guided vehicle that is within the collision detection range is acquired.
In an exemplary embodiment of the present disclosure, after the above collision detection range is determined, an automated guided vehicle located within that range may be taken as the target automated guided vehicle.
In step S704, according to the safety boundary between the automated guided vehicle and the target automated guided vehicle, real-time collision detection is performed on the automated guided vehicle based on a direction bounding box algorithm.
In the exemplary embodiment of the present disclosure, following the explanation of step S703, the collision detection between any two automated guided vehicles (an automated guided vehicle and a target automated guided vehicle) in the same work site is described as an example. Referring to fig. 9, which schematically illustrates the safety boundaries of an automated guided vehicle A and a target automated guided vehicle B, rectangle A denotes the safety boundary of automated guided vehicle A and rectangle B denotes the safety boundary of target automated guided vehicle B. PA represents the center point of rectangle A and PB the center point of rectangle B. Ax and Ay denote the two unit axis vectors of rectangle A; Bx and By denote the two unit axis vectors of rectangle B. WA represents half the width of rectangle A and HA half its height; WB represents half the width of rectangle B and HB half its height. The vector T equals PB - PA and points from point PA to point PB. Referring to steps S502-S503 and the explanation of fig. 6, and taking automated guided vehicle A as an example, PA is the coordinate of the center point o determined above, WA is 0.5×(l + r), HA is 0.5×(f + b), the ratio of the difference between the coordinates of point v and point u (or between the coordinates of point z and point w) to its modulus is the vector Ax, and the ratio of the difference between the coordinates of point u and point w (or between the coordinates of point v and point z) to its modulus is the vector Ay. Similarly, the corresponding parameters of rectangle B may be determined from the position offset information and the preset safe driving distance of the target automated guided vehicle B (see steps S110-S130 and S501-S503), after which the vector T can be determined.
In an exemplary embodiment of the present disclosure, a direction bounding box algorithm (Oriented Bounding Box, abbreviated as OBB) may be used for collision detection, for example, in the following manner: if the projections of the two polygons overlap on all axes, it is judged that the two polygons collide; otherwise, it is judged that no collision occurs. Reference may be made to fig. 10, which schematically illustrates a case in which two polygons overlap on one axis but do not collide: the projection of side P5P6 of the rectangular boundary P5P6P7P8 of automated guided vehicle 1 on the X axis is P1P3, and the projection of side P9P10 of the rectangular boundary P9P10P11P12 of automated guided vehicle 2 on the X axis is P2P4; P1P3 and P2P4 overlap in P2P3, yet the two automated guided vehicles do not collide.
In an exemplary embodiment of the present disclosure, and with reference to the above steps, the judgment can be expressed as formula 1 (no collision occurs on an axis when the projection length of the line between the center points on that axis is greater than the sum of the projection radii of the two rectangles; Proj denotes the projection operation): when |Proj(T)| > 0.5×|Proj(rectangle A)| + 0.5×|Proj(rectangle B)|, no collision occurs on that axis; otherwise, a collision may occur. Substituting the parameters shown in fig. 9 into formula 1 and expanding it gives formula 2: |T·L| > |(WA*Ax)·L| + |(HA*Ay)·L| + |(WB*Bx)·L| + |(HB*By)·L|, where "*" denotes multiplication of a scalar by a vector and "·" denotes the dot product of two vectors. The axis L can take four values: Ax, Ay, Bx and By. These four cases are substituted into formula 2 in turn: if formula 2 is satisfied for any one of the four axes, it can be determined that the automated guided vehicle A and the target automated guided vehicle B do not collide; if none of the four cases satisfies formula 2, it can be determined that the automated guided vehicle A and the target automated guided vehicle B collide.
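The separating-axis test of formula 2 can be sketched as follows. The dataclass layout, the vector helpers and the convention that axis_x/axis_y are the unit vectors along each rectangle's width and height are assumptions for illustration; only the test itself follows formula 2.

```python
# Minimal OBB separating-axis sketch of formula 2 (illustrative assumptions).
import math
from dataclasses import dataclass
from typing import Tuple

Vec = Tuple[float, float]

def dot(a: Vec, b: Vec) -> float:
    return a[0] * b[0] + a[1] * b[1]

@dataclass
class OrientedBox:
    center: Vec      # PA or PB
    axis_x: Vec      # unit vector Ax / Bx
    axis_y: Vec      # unit vector Ay / By
    half_w: float    # WA or WB (half of the extent along axis_x)
    half_h: float    # HA or HB (half of the extent along axis_y)

def boxes_collide(a: OrientedBox, b: OrientedBox) -> bool:
    """Formula 2: the boxes are separated on axis L when
    |T.L| > |(WA*Ax).L| + |(HA*Ay).L| + |(WB*Bx).L| + |(HB*By).L|."""
    t = (b.center[0] - a.center[0], b.center[1] - a.center[1])
    for axis in (a.axis_x, a.axis_y, b.axis_x, b.axis_y):
        radius_a = abs(a.half_w * dot(a.axis_x, axis)) + abs(a.half_h * dot(a.axis_y, axis))
        radius_b = abs(b.half_w * dot(b.axis_x, axis)) + abs(b.half_h * dot(b.axis_y, axis))
        if abs(dot(t, axis)) > radius_a + radius_b:
            return False   # a separating axis exists: no collision
    return True            # projections overlap on all four axes: collision

# Example: an axis-aligned safety boundary and one rotated 45 degrees close by
box_a = OrientedBox((0.0, 0.0), (1.0, 0.0), (0.0, 1.0), 1.0, 2.0)
c, s = math.cos(math.pi / 4), math.sin(math.pi / 4)
box_b = OrientedBox((2.5, 0.0), (c, s), (-s, c), 1.0, 2.0)
print(boxes_collide(box_a, box_b))  # True (the boundaries overlap)
```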
In the exemplary embodiment of the present disclosure, when it is detected that the automated guided vehicle A and the target automated guided vehicle B may collide, the travel path of automated guided vehicle A, of target automated guided vehicle B, or of both may be modified to avoid a collision.
In an exemplary embodiment of the present disclosure, the distance between the real-time position of the automated guided vehicle and the preset operation point position may also be obtained, and the traveling speed of the vehicle may be controlled based on this distance. For example, when the vehicle is close to the preset operation point (for example, within 10 meters), its traveling speed can be reduced (for example, to 10 meters per minute); when the vehicle is far from the preset operation point (for example, 500 meters away), its traveling speed can be increased (for example, to 50 meters per minute). It should be noted that the specific distance and speed values can be set according to the actual situation; all such settings fall within the protection scope of the present disclosure.
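A minimal sketch of this speed adjustment is given below; the thresholds (10 m / 500 m) and speeds (10 / 50 meters per minute) follow the example in the text, while the function name and the intermediate cruising speed are assumptions.

```python
# Minimal speed-adjustment sketch (thresholds from the example above; intermediate speed assumed).
def adjust_speed(remaining_distance_m: float) -> float:
    """Return a travel speed in metres per minute based on the distance to the operation point."""
    if remaining_distance_m <= 10:     # close to the operation point: slow down
        return 10.0
    if remaining_distance_m >= 500:    # far from the operation point: speed up
        return 50.0
    return 30.0                        # assumed intermediate cruising speed

print(adjust_speed(8), adjust_speed(120), adjust_speed(600))  # 10.0 30.0 50.0
```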
In an exemplary embodiment of the present disclosure, the travel distance of the automated guided vehicle may also be obtained, and the real-time position of the vehicle (the position that changes as the vehicle moves) may be updated according to the travel distance. For example, if the initial position of the automated guided vehicle is (x, y) and its travel distance is t, then, referring to fig. 11 (which schematically illustrates updating the real-time position of the vehicle), the real-time position may be updated to h1 (x + t×cos q, y + t×sin q). A new safety boundary can then be calculated from the updated real-time position, and collision detection can be performed again with the new boundary. In this way, collision detection is performed in real time according to the position of the automated guided vehicle, ensuring its running safety.
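The position update above can be sketched in a few lines; the function name update_position is an assumption, and the updated position could then be fed back into a boundary calculation such as the safety_boundary sketch given earlier.

```python
# Minimal real-time position update sketch: with offset angle q and travel distance t,
# the position (x, y) advances to h1 = (x + t*cos q, y + t*sin q).
import math
from typing import Tuple

def update_position(x: float, y: float, q: float, t: float) -> Tuple[float, float]:
    return (x + t * math.cos(q), y + t * math.sin(q))

# Example: vehicle at (10, 5) moving 2 m along a 30-degree offset angle
print(update_position(10.0, 5.0, math.radians(30), 2.0))  # approx (11.73, 6.0)
```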
In an exemplary embodiment of the present disclosure, fig. 12 schematically illustrates an overall flowchart of a control method of an automated guided vehicle in an exemplary embodiment of the present disclosure, and a specific implementation is explained below in connection with fig. 12.
In step S1201, scanning a two-dimensional code label laid randomly in advance to obtain a current position of an automatic guided vehicle;
in step S1202, a preset job point position is determined according to a currently executed job task, and a task execution path of the automated guided vehicle is planned according to the current position and the preset job point position.
In step S1203, a distance (remaining distance) between the real-time position of the automated guided vehicle and the position of the preset working point is acquired, and the traveling speed of the automated guided vehicle is controlled based on the distance.
In step S1204, determining a safety boundary of the automated guided vehicle according to the task execution path and the preset safety distance of the automated guided vehicle;
In step S1205, collision detection is performed on the automated guided vehicles located in the same work site based on the safety boundary;
In step S1206, the automated guided vehicle is controlled to travel the remaining task execution path until the job ends.
The present disclosure also provides a control device of an automatic guided vehicle, and fig. 13 shows a schematic structural diagram of the control device of the automatic guided vehicle in an exemplary embodiment of the present disclosure; as shown in fig. 13, the control apparatus 1300 of the automated guided vehicle may include a label laying module 1301, a path planning module 1302, an information acquisition module 1303, a boundary determination module 1304, and a collision detection module 1305. Wherein:
The label laying module 1301 is configured to obtain two-dimensional code labels randomly laid in a worksite of the automated guided vehicle, where the two-dimensional code labels carry position information of the worksite.
In an exemplary embodiment of the present disclosure, a tag laying module is configured to obtain two-dimensional code tags randomly laid in a worksite of an automatic guided vehicle, where the two-dimensional code tags carry position information of the worksite.
The path planning module 1302 is configured to plan a task execution path of the automated guided vehicle based on the two-dimensional code tag.
In an exemplary embodiment of the present disclosure, a path planning module is configured to obtain a current position of an automatic guided vehicle based on a two-dimensional code tag; and planning a task execution path of the automatic guided vehicle according to the position relation between the current position and the position of the preset operation point.
The information obtaining module 1303 is configured to obtain, in real time, positional offset information of a current position and a next position of the automated guided vehicle according to the task execution path.
In an exemplary embodiment of the present disclosure, the information obtaining module is configured to obtain, in real time, position offset information of a current position and a next position of the automated guided vehicle according to the task execution path.
The boundary determining module 1304 is configured to determine a safety boundary of the automated guided vehicle according to the position offset information and a preset safe driving distance of the automated guided vehicle.
In an exemplary embodiment of the present disclosure, the boundary determining module is configured to obtain a projection distance of the safe driving distance on a preset coordinate axis based on the position offset information; determining boundary point coordinates corresponding to the automatic guided vehicle according to the projection distance and the coordinate information of the current position; and constructing the safety boundary of the automatic guided vehicle according to the boundary point coordinates.
And the collision detection module 1305 is used for carrying out real-time collision detection on the automatic guided vehicle according to the safety boundary.
In an exemplary embodiment of the present disclosure, a collision detection module is configured to perform layering processing on an automatic guided vehicle located in the same working site, so as to obtain a layering result; determining a collision detection range corresponding to the automatic guided vehicle according to the layering result; acquiring a target automatic guided vehicle in a collision detection range; and carrying out real-time collision detection on the automatic guided vehicle based on a direction bounding box algorithm according to the safety boundary of the automatic guided vehicle and the target automatic guided vehicle.
In an exemplary embodiment of the present disclosure, the collision detection module is further configured to perform layering processing on the automated guided vehicles located in the same working site based on a quadtree algorithm, so as to obtain a layering result.
In an exemplary embodiment of the present disclosure, the collision detection module is further configured to obtain a separation distance between a real-time position of the automated guided vehicle and a preset operation point position; and controlling the running speed of the automatic guided vehicle based on the interval distance.
In an exemplary embodiment of the present disclosure, the collision detection module is further configured to obtain a travel distance of the automated guided vehicle, and update the real-time position of the automated guided vehicle according to the travel distance.
The specific details of each module in the above-mentioned control device of the automatic guided vehicle are already described in detail in the corresponding control method of the automatic guided vehicle, so that they will not be described herein.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Furthermore, although the steps of the methods in the present disclosure are depicted in a particular order in the drawings, this does not require or imply that the steps must be performed in that particular order, or that all illustrated steps be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step to perform, and/or one step decomposed into multiple steps to perform, etc.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, including several instructions to cause a computing device (may be a personal computer, a server, a mobile terminal, or a network device, etc.) to perform the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, a computer storage medium capable of implementing the above method is also provided. On which a program product is stored which enables the implementation of the method described above in the present specification. In some possible embodiments, the various aspects of the present disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the disclosure as described in the "exemplary methods" section of this specification, when the program product is run on the terminal device.
Referring to fig. 14, a program product 1400 for implementing the above-described method according to an embodiment of the present disclosure is described, which may employ a portable compact disc read-only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, in which readable program code is embodied. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electromagnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium, other than a readable storage medium, that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's computing device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, a method, or a program product. Accordingly, various aspects of the present disclosure may be embodied in the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining software and hardware aspects, which may be referred to herein generally as a "circuit," a "module," or a "system."
An electronic device 1500 according to such an embodiment of the present disclosure is described below with reference to fig. 15. The electronic device 1500 shown in fig. 15 is merely an example and should not be construed to limit the functionality and scope of use of embodiments of the present disclosure in any way.
As shown in fig. 15, the electronic device 1500 is embodied in the form of a general-purpose computing device. The components of the electronic device 1500 may include, but are not limited to: the at least one processing unit 1510, the at least one storage unit 1520, and a bus 1530 that connects the different system components (including the storage unit 1520 and the processing unit 1510).
The storage unit stores program code that is executable by the processing unit 1510, such that the processing unit 1510 performs the steps according to various exemplary embodiments of the present disclosure described in the "exemplary methods" section above in this specification. For example, the processing unit 1510 may perform the steps shown in fig. 1: step S110, acquiring two-dimensional code labels randomly paved in a working place of the automatic guided vehicle, wherein the two-dimensional code labels carry position information of the working place; step S120, planning a task execution path of the automatic guided vehicle based on the two-dimensional code labels; step S130, acquiring position offset information of the current position and the next position of the automatic guided vehicle in real time according to the task execution path; step S140, determining a safety boundary of the automatic guided vehicle according to the position offset information and a preset safe driving distance of the automatic guided vehicle; and step S150, carrying out real-time collision detection on the automatic guided vehicle according to the safety boundary.
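For illustration only, the following Python sketch outlines one possible arrangement of steps S110 to S150. The AGV interface, every method name, and the data types used here are assumptions made for this sketch and are not taken from the disclosure.

# Illustrative sketch only: the AGV interface and every name below are
# assumptions made for this example, not the patented implementation.
from typing import List, Protocol, Sequence, Tuple

Point = Tuple[float, float]

class AGV(Protocol):
    def read_current_tag(self) -> Point: ...                                # S110
    def plan_path(self, start: Point, goal: Point) -> Sequence[Point]: ...  # S120
    def safety_boundary(self, offset: Point, position: Point,
                        safe_distance: float) -> List[Point]: ...           # S140
    def detect_collision(self, boundary: List[Point]) -> bool: ...          # S150
    def move_to(self, target: Point) -> Point: ...
    def stop(self) -> None: ...

def run_task(agv: AGV, goal: Point, safe_distance: float) -> None:
    # S110: obtain the current position from a randomly paved two-dimensional code tag.
    current = agv.read_current_tag()
    # S120: plan the task execution path from the current position to the goal.
    for next_point in agv.plan_path(current, goal):
        # S130: position offset between the current position and the next position.
        offset = (next_point[0] - current[0], next_point[1] - current[1])
        # S140: safety boundary from the offset and the preset safe driving distance.
        boundary = agv.safety_boundary(offset, current, safe_distance)
        # S150: real-time collision detection against the safety boundary.
        if agv.detect_collision(boundary):
            agv.stop()
            return
        current = agv.move_to(next_point)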
The storage unit 1520 may include readable media in the form of volatile memory units such as Random Access Memory (RAM) 15201 and/or cache memory 15202, and may further include Read Only Memory (ROM) 15203.
The storage unit 1520 may also include a program/utility 15204 having a set (at least one) of program modules 15205; such program modules 15205 include, but are not limited to: an operating system, one or more application programs, other program modules, and program data, and each of these examples, or some combination thereof, may include an implementation of a network environment.
The bus 1530 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures.
The electronic device 1500 may also communicate with one or more external devices 1600 (e.g., a keyboard, a pointing device, a Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 1500, and/or with any device (e.g., a router, a modem, etc.) that enables the electronic device 1500 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 1550. Moreover, the electronic device 1500 may communicate with one or more networks, such as a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet, through a network adapter 1560. As shown, the network adapter 1560 communicates with the other modules of the electronic device 1500 over the bus 1530. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 1500, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
From the above description of the embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and which includes several instructions that cause a computing device (which may be a personal computer, a server, a terminal device, a network device, etc.) to perform the method according to the embodiments of the present disclosure.
Furthermore, the above-described figures are only schematic illustrations of processes included in the method according to the exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following the general principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (9)

1. A method of controlling an automated guided vehicle, comprising:
acquiring two-dimensional code labels randomly paved in a working place of the automatic guided vehicle, wherein the two-dimensional code labels carry position information of the working place;
planning a task execution path of the automatic guided vehicle based on the two-dimensional code label;
acquiring the position offset information of the current position and the next position of the automatic guided vehicle in real time according to the task execution path;
determining a safety boundary of the automatic guided vehicle according to the position offset information and a preset safe driving distance of the automatic guided vehicle;
wherein the determining the safety boundary of the automatic guided vehicle according to the position offset information and the preset safe driving distance of the automatic guided vehicle comprises the following steps:
based on the position offset information, acquiring a projection distance of the preset safe driving distance on a preset coordinate axis, wherein the preset safe driving distance is determined according to the equipment model of the automatic guided vehicle;
determining boundary point coordinates corresponding to the automatic guided vehicle according to the projection distance and the coordinate information of the current position;
constructing the safety boundary of the automatic guided vehicle according to the boundary point coordinates; and
carrying out real-time collision detection on the automatic guided vehicle according to the safety boundary.
2. The method according to claim 1, wherein the method further comprises:
layering the automatic guided vehicles in the same working place to obtain layering results;
determining a collision detection range corresponding to the automatic guided vehicle according to the layering result;
acquiring a target automatic guided vehicle in the collision detection range; and
carrying out real-time collision detection on the automatic guided vehicle based on a direction bounding box algorithm according to the safety boundary of the automatic guided vehicle and the target automatic guided vehicle.
3. The method of claim 1, wherein the planning the mission execution path of the automated guided vehicle based on the two-dimensional code tag comprises:
acquiring the current position of the automatic guided vehicle based on the two-dimensional code tag;
and planning a task execution path of the automatic guided vehicle according to the position relation between the current position and the position of the preset operation point.
4. The method according to claim 1, wherein the method further comprises:
layering the automatic guided vehicles in the same working site based on a quadtree algorithm to obtain the layering result.
5. The method according to claim 3, wherein the method further comprises:
acquiring the interval distance between the real-time position of the automatic guided vehicle and the position of the preset operation point;
and adjusting the running speed of the automatic guided vehicle based on the interval distance.
6. The method of claim 5, wherein the method further comprises:
and acquiring the travelling distance of the automatic guided vehicle, and updating the real-time position of the automatic guided vehicle according to the travelling distance.
7. A control device for an automated guided vehicle, comprising:
the label laying module is used for obtaining two-dimensional code labels randomly laid in a working place of the automatic guided vehicle, and the two-dimensional code labels carry position information of the working place;
the path planning module is used for planning a task execution path of the automatic guided vehicle based on the two-dimensional code label;
the information acquisition module is used for acquiring the position offset information of the current position and the next position of the automatic guided vehicle in real time according to the task execution path;
the boundary determining module is used for acquiring the projection distance of the preset safe driving distance on a preset coordinate axis based on the position offset information, wherein the preset safe driving distance is determined according to the equipment model of the automatic guided vehicle;
determining boundary point coordinates corresponding to the automatic guided vehicle according to the projection distance and the coordinate information of the current position; and
constructing a safety boundary of the automatic guided vehicle according to the boundary point coordinates;
and the collision detection module is used for carrying out real-time collision detection on the automatic guided vehicle according to the safety boundary.
8. A computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method of controlling an automated guided vehicle according to any one of claims 1 to 6.
9. An electronic device, comprising:
A processor; and
A memory for storing executable instructions of the processor;
wherein the processor is configured to execute the method of controlling an automated guided vehicle according to any one of claims 1-6 via execution of the executable instructions.
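For claims 1 and 7, the boundary construction can be pictured with the following Python sketch. The claims do not fix the exact geometry, so the rectangle oriented along the travel direction, the function name safety_boundary, and the use of atan2 are assumptions made only for this illustration.

import math
from typing import List, Tuple

Point = Tuple[float, float]

def safety_boundary(current: Point, next_pos: Point, safe_distance: float) -> List[Point]:
    # Position offset between the current position and the next position.
    dx, dy = next_pos[0] - current[0], next_pos[1] - current[1]
    heading = math.atan2(dy, dx)
    # Projection of the preset safe driving distance on the preset coordinate axes.
    proj_x = safe_distance * math.cos(heading)
    proj_y = safe_distance * math.sin(heading)
    # Boundary point coordinates from the projection and the current position;
    # an oriented square centred on the current position is assumed here.
    cx, cy = current
    return [
        (cx + proj_x - proj_y, cy + proj_y + proj_x),
        (cx + proj_x + proj_y, cy + proj_y - proj_x),
        (cx - proj_x + proj_y, cy - proj_y - proj_x),
        (cx - proj_x - proj_y, cy - proj_y + proj_x),
    ]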
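Claim 2 recites collision detection based on a direction bounding box algorithm, which reads like an oriented-bounding-box overlap test; that reading is an assumption. A standard separating-axis check over two convex safety boundaries, such as the rectangles built above, could look like this sketch.

from typing import List, Tuple

Point = Tuple[float, float]

def _project(points: List[Point], axis: Point) -> Tuple[float, float]:
    # Project all vertices onto the axis and return the covered interval.
    dots = [p[0] * axis[0] + p[1] * axis[1] for p in points]
    return min(dots), max(dots)

def boundaries_collide(a: List[Point], b: List[Point]) -> bool:
    # Separating-axis test: two convex boundaries overlap only if their
    # projections overlap on every edge normal of both polygons.
    for poly in (a, b):
        n = len(poly)
        for i in range(n):
            ex = poly[(i + 1) % n][0] - poly[i][0]
            ey = poly[(i + 1) % n][1] - poly[i][1]
            axis = (-ey, ex)                       # edge normal as candidate axis
            a_min, a_max = _project(a, axis)
            b_min, b_max = _project(b, axis)
            if a_max < b_min or b_max < a_min:
                return False                       # separating axis found, no collision
    return True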
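Claims 2 and 4 layer the automatic guided vehicles in the same working site, for example with a quadtree, so that only nearby vehicles enter the collision detection range. The cell depth and the keying scheme below are assumptions; the sketch only shows the grouping idea.

from typing import Dict, List, Tuple

def quadtree_layers(positions: Dict[str, Tuple[float, float]],
                    bounds: Tuple[float, float, float, float],
                    depth: int = 3) -> Dict[Tuple[int, ...], List[str]]:
    # Assign every AGV to a quadtree cell of the working site; AGVs that share
    # a cell form one candidate set for the collision detection range.
    x0, y0, x1, y1 = bounds
    layers: Dict[Tuple[int, ...], List[str]] = {}
    for agv_id, (x, y) in positions.items():
        lo_x, lo_y, hi_x, hi_y = x0, y0, x1, y1
        path = []
        for _ in range(depth):
            mid_x, mid_y = (lo_x + hi_x) / 2, (lo_y + hi_y) / 2
            qx, qy = int(x >= mid_x), int(y >= mid_y)
            path.append(qx * 2 + qy)
            lo_x, hi_x = (mid_x, hi_x) if qx else (lo_x, mid_x)
            lo_y, hi_y = (mid_y, hi_y) if qy else (lo_y, mid_y)
        layers.setdefault(tuple(path), []).append(agv_id)
    return layers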
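Claims 5 and 6 adjust the running speed from the distance to the preset work point and update the real-time position from the travelled distance. The ramp shape, the numeric thresholds, and the dead-reckoning model below are assumptions used only to make those two steps concrete.

import math
from typing import Tuple

def adjust_speed(distance_to_work_point: float, max_speed: float = 1.5,
                 slow_zone: float = 3.0, creep_speed: float = 0.2) -> float:
    # Run at full speed far from the work point, then ramp down linearly
    # inside the slow zone, never below a creeping speed.
    if distance_to_work_point >= slow_zone:
        return max_speed
    return max(creep_speed, max_speed * distance_to_work_point / slow_zone)

def update_position(last_known: Tuple[float, float], heading: float,
                    travelled: float) -> Tuple[float, float]:
    # Dead-reckon the real-time position from the distance travelled since the
    # last two-dimensional code tag was read.
    return (last_known[0] + travelled * math.cos(heading),
            last_known[1] + travelled * math.sin(heading))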
CN201910817729.0A 2019-08-30 2019-08-30 Automatic guided vehicle control method and device, storage medium and electronic equipment Active CN112445220B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910817729.0A CN112445220B (en) 2019-08-30 2019-08-30 Automatic guided vehicle control method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910817729.0A CN112445220B (en) 2019-08-30 2019-08-30 Automatic guided vehicle control method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN112445220A CN112445220A (en) 2021-03-05
CN112445220B true CN112445220B (en) 2024-09-24

Family

ID=74733836

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910817729.0A Active CN112445220B (en) 2019-08-30 2019-08-30 Automatic guided vehicle control method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN112445220B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113928306B (en) * 2021-11-30 2023-05-02 合肥工业大学 Automobile integrated stability augmentation control method and system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105388899A (en) * 2015-12-17 2016-03-09 中国科学院合肥物质科学研究院 An AGV navigation control method based on two-dimension code image tags
CN107703940A (en) * 2017-09-25 2018-02-16 芜湖智久机器人有限公司 A kind of air navigation aid based on ceiling Quick Response Code

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007188158A (en) * 2006-01-11 2007-07-26 Mitsui Eng & Shipbuild Co Ltd Component supply system
ITBO20090046A1 (en) * 2009-02-02 2010-08-02 Elettric 80 Spa POSITIONING SYSTEM FOR AUTOMATIC DRIVEN VEHICLES OPERATING WITH RADIO FREQUENCY WITH DIRECT ANTENNAS
JP2012225878A (en) * 2011-04-22 2012-11-15 Mitsubishi Heavy Ind Ltd Damage detection device and method
CN102495612B (en) * 2011-12-24 2014-08-13 长春艾希技术有限公司 Electrical automatic control system device of automatic guided vehicle adopting non-contact power supply technology
KR101318560B1 (en) * 2012-02-29 2013-10-16 부산대학교 산학협력단 Vision based guideline interpretation method for stable driving control of guideline tracing AGVs
CN104407615B (en) * 2014-11-03 2017-01-25 上海电器科学研究所(集团)有限公司 AGV robot guide deviation correction method
US9704043B2 (en) * 2014-12-16 2017-07-11 Irobot Corporation Systems and methods for capturing images and annotating the captured images with information
CN107168338B (en) * 2017-07-07 2023-09-15 中国计量大学 Inertial guided vehicle navigation method based on millimeter wave radar and inertial guided vehicle
CN107678433B (en) * 2017-10-20 2020-05-29 上海海事大学 Loading and unloading equipment scheduling method considering AGV collision avoidance
CN107844119A (en) * 2017-12-14 2018-03-27 中国计量大学 Visual guidance method and visual guidance car based on space-time conversion
CN108279026A (en) * 2018-01-19 2018-07-13 浙江科钛机器人股份有限公司 A kind of AGV inertial navigation systems and method based on T-type RFID beacons
CN108128155A (en) * 2018-02-01 2018-06-08 西华大学 Active protection type omnidirectional wheel walking and anti-collision system
CN108584258A (en) * 2018-02-01 2018-09-28 西华大学 Novel AGV dolly system
CN109189076B (en) * 2018-10-24 2021-08-31 湖北三江航天万山特种车辆有限公司 Heavy guided vehicle positioning method based on visual sensor and heavy guided vehicle

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105388899A (en) * 2015-12-17 2016-03-09 中国科学院合肥物质科学研究院 An AGV navigation control method based on two-dimension code image tags
CN107703940A (en) * 2017-09-25 2018-02-16 芜湖智久机器人有限公司 A kind of air navigation aid based on ceiling Quick Response Code

Also Published As

Publication number Publication date
CN112445220A (en) 2021-03-05

Similar Documents

Publication Publication Date Title
US20210109521A1 (en) Redundant pose generation system
KR102539942B1 (en) Method and apparatus for training trajectory planning model, electronic device, storage medium and program
EP4141599B1 (en) Multi-robot route planning
US12051325B2 (en) Method and apparatus for coordinating multiple cooperative vehicle trajectories on shared road networks
CN110782092A (en) Trajectory planning method and device of unmanned distribution vehicle in unstructured scene
JPH07129238A (en) Generation system for obstacle avoiding path
EP4170581A1 (en) Method, device and system for cooperatively constructing point cloud map
CN111308996A (en) Training device and cooperative operation control method thereof
CN112466111B (en) Vehicle driving control method and device, storage medium and electronic equipment
US20200088536A1 (en) Method for trajectory planning of a movable object
CN113168189A (en) Flight operation method, unmanned aerial vehicle and storage medium
CN114322799A (en) Vehicle driving method and device, electronic equipment and storage medium
CN112445220B (en) Automatic guided vehicle control method and device, storage medium and electronic equipment
US20230242142A1 (en) Systems, methods, and computer-readable media for spatio-temporal motion planning
CN111796589A (en) Navigation control method, intelligent warehousing system and automatic guide vehicle
US20210365043A1 (en) System and method for guiding vehicles and computer program product
US20210398014A1 (en) Reinforcement learning based control of imitative policies for autonomous driving
CN112346480A (en) Indoor unmanned aerial vehicle, control method thereof and computer-readable storage medium
CN115416693A (en) Automatic driving trajectory planning method and system based on space-time corridor
CN113449798A (en) Port unmanned driving map generation method and device, electronic equipment and storage medium
CN112197763B (en) Map construction method, device, equipment and storage medium
CN113483770A (en) Path planning method and device in closed scene, electronic equipment and storage medium
CN114115293A (en) Robot obstacle avoidance method, device, equipment and storage medium
CN117719500B (en) Vehicle collision detection method, device, electronic equipment and storage medium
Neuman et al. Anytime policy planning in large dynamic environments with interactive uncertainty

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant