CN112445220A - Control method and device for automatic guided vehicle, storage medium and electronic equipment - Google Patents

Control method and device for automatic guided vehicle, storage medium and electronic equipment Download PDF

Info

Publication number
CN112445220A
Authority
CN
China
Prior art keywords: guided vehicle; automatic guided; automated guided; present disclosure; acquiring
Prior art date
Legal status: Pending (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Application number
CN201910817729.0A
Other languages
Chinese (zh)
Inventor
侯锡锋
Current Assignee (the listed assignees may be inaccurate)
Beijing Jingdong Qianshi Technology Co Ltd
Original Assignee
Beijing Jingdong Qianshi Technology Co Ltd
Priority date (an assumption, not a legal conclusion)
Filing date
Publication date
Application filed by Beijing Jingdong Qianshi Technology Co Ltd filed Critical Beijing Jingdong Qianshi Technology Co Ltd
Priority claimed from CN201910817729.0A
Publication of CN112445220A

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/02: Control of position or course in two dimensions

Abstract

The disclosure relates to the technical field of intelligent vehicles and provides a control method for an automated guided vehicle, a control device for an automated guided vehicle, a computer storage medium, and an electronic device. The control method comprises the following steps: acquiring a two-dimensional code tag randomly laid in the work site of the automated guided vehicle, the tag carrying position information of the work site; planning a task execution path for the automated guided vehicle based on the tag; acquiring, in real time according to the task execution path, the position offset information between the current position and the next position of the vehicle; determining a safety boundary for the vehicle according to the position offset information and a preset safe driving distance of the vehicle; and performing real-time collision detection on the vehicle according to the safety boundary. The method avoids the high implementation difficulty that laying equidistant two-dimensional codes causes in the prior art, and reduces the cost of laying the codes.

Description

Control method and device for automatic guided vehicle, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of intelligent vehicle technologies, and in particular, to a control method for an automated guided vehicle, a control device for an automated guided vehicle, a computer storage medium, and an electronic device.
Background
With the rapid development of computer and internet technologies, intelligent vehicle technology is also advancing quickly. For example, an automated guided vehicle (AGV) capable of carrying a load is a transport vehicle equipped with an electromagnetic or optical automatic guidance device; it can travel along a predetermined guidance route and provides safety protection and various transfer functions.
At present, two-dimensional code image tags oriented in the same direction are generally pasted at equal intervals, in matrix form, in the traffic area of a warehouse, and a travel path is planned from the position information associated with the codes in order to control the operation of the automated guided vehicle. However, the distances between the operation points of an actual operating scene are arbitrary, and site constraints can make it impossible to lay equidistant codes, so field implementation is difficult.
In view of the above, there is a need in the art to develop a new control method and device for an automated guided vehicle.
It is to be noted that the information disclosed in the background section above is only used to enhance understanding of the background of the present disclosure.
Disclosure of Invention
The present disclosure aims to provide a control method for an automated guided vehicle, a control device for an automated guided vehicle, a computer storage medium, and an electronic device, so as to avoid, at least to some extent, the high field-implementation difficulty that laying equidistant two-dimensional codes causes in prior-art control methods for automated guided vehicles.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided a control method for an automated guided vehicle, including: acquiring a two-dimensional code tag randomly laid in the work site of the automated guided vehicle, the tag carrying position information of the work site; planning a task execution path of the vehicle based on the tag; acquiring, in real time according to the task execution path, the position offset information between the current position and the next position of the vehicle; determining a safety boundary of the vehicle according to the position offset information and a preset safe driving distance of the vehicle; and performing real-time collision detection on the vehicle according to the safety boundary.
In an exemplary embodiment of the present disclosure, the method further comprises: layering the automated guided vehicles in the same work site to obtain a layering result; determining the collision detection range of the vehicle according to the layering result; acquiring the target automated guided vehicles within that range; and performing real-time collision detection, based on an oriented bounding box (OBB) algorithm, between the safety boundary of the vehicle and those of the target vehicles.
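For illustration, an oriented-bounding-box overlap test can be written from scratch with the separating-axis theorem. This is a generic sketch of the OBB idea the embodiment refers to, not the disclosure's exact procedure; all names and parameters are invented.

```python
import math

def obb_corners(cx, cy, hx, hy, q):
    # Corners of an oriented box: centre (cx, cy), half-extents hx, hy, heading q.
    c, s = math.cos(q), math.sin(q)
    return [(cx + dx * c - dy * s, cy + dx * s + dy * c)
            for dx, dy in [(hx, hy), (hx, -hy), (-hx, -hy), (-hx, hy)]]

def obbs_collide(a, b):
    # Separating-axis test for two convex quadrilaterals: the boxes overlap
    # unless some edge normal of either box separates their projections.
    for poly in (a, b):
        for i in range(len(poly)):
            x1, y1 = poly[i]
            x2, y2 = poly[(i + 1) % len(poly)]
            nx, ny = y2 - y1, x1 - x2          # normal to the edge
            proj_a = [nx * x + ny * y for x, y in a]
            proj_b = [nx * x + ny * y for x, y in b]
            if max(proj_a) < min(proj_b) or max(proj_b) < min(proj_a):
                return False                    # found a separating axis
    return True
```

The safety boundary of each vehicle would be passed in as its corner list; only vehicles inside the collision detection range from the layering step need to be tested.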
In an exemplary embodiment of the present disclosure, planning the task execution path of the automated guided vehicle based on the two-dimensional code tag includes: acquiring the current position of the vehicle based on the tag; and planning the task execution path of the vehicle according to the positional relationship between the current position and the preset operation point positions.
In an exemplary embodiment of the present disclosure, the determining a safety boundary of the automated guided vehicle according to the position offset information and a preset safety driving distance of the automated guided vehicle includes: acquiring the projection distance of the safe driving distance on a preset coordinate axis based on the position deviation information; determining boundary point coordinates corresponding to the automatic guided vehicle according to the projection distance and the coordinate information of the current position; and constructing a safety boundary of the automatic guided vehicle according to the boundary point coordinates.
In an exemplary embodiment of the present disclosure, the method further comprises: and carrying out layering processing on the automatic guided vehicles in the same working site based on a quadtree algorithm to obtain the layering result.
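A quadtree "layers" the work site by recursively splitting it into quadrants, so each vehicle only needs collision checks against vehicles in nearby cells. The minimal point quadtree below (capacity and API invented for illustration) sketches that idea:

```python
class Quadtree:
    # Minimal point quadtree: stores vehicle positions, splitting a cell into
    # four children once it holds more than `cap` points.
    def __init__(self, x0, y0, x1, y1, cap=2):
        self.bounds = (x0, y0, x1, y1)
        self.cap, self.points, self.children = cap, [], None

    def insert(self, p):
        x0, y0, x1, y1 = self.bounds
        if not (x0 <= p[0] <= x1 and y0 <= p[1] <= y1):
            return False
        if self.children is None:
            if len(self.points) < self.cap:
                self.points.append(p)
                return True
            self._split()
        return any(c.insert(p) for c in self.children)

    def _split(self):
        x0, y0, x1, y1 = self.bounds
        mx, my = (x0 + x1) / 2, (y0 + y1) / 2
        self.children = [Quadtree(x0, y0, mx, my, self.cap),
                         Quadtree(mx, y0, x1, my, self.cap),
                         Quadtree(x0, my, mx, y1, self.cap),
                         Quadtree(mx, my, x1, y1, self.cap)]
        for old in self.points:                 # push existing points down
            any(c.insert(old) for c in self.children)
        self.points = []

    def query(self, x0, y0, x1, y1, out=None):
        # All stored points inside the axis-aligned window [x0,x1] x [y0,y1].
        out = [] if out is None else out
        bx0, by0, bx1, by1 = self.bounds
        if bx1 < x0 or bx0 > x1 or by1 < y0 or by0 > y1:
            return out
        out.extend(p for p in self.points if x0 <= p[0] <= x1 and y0 <= p[1] <= y1)
        for c in self.children or []:
            c.query(x0, y0, x1, y1, out)
        return out
```

A `query` window around a vehicle then yields the target vehicles whose safety boundaries need a detailed collision test.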
In an exemplary embodiment of the present disclosure, the method further comprises: acquiring the spacing distance between the real-time position of the automatic guided vehicle and the position of the preset operation point; and adjusting the running speed of the automatic guided vehicle based on the spacing distance.
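A simple way to adjust the running speed from the separation distance is a linear ramp-down inside a slow zone; all thresholds below are invented, since the disclosure does not give concrete values:

```python
def adjust_speed(distance_to_point, v_max=2.0, v_creep=0.3, slow_zone=1.5):
    # Full speed far from the operation point; ramp down linearly inside the
    # slow zone but never below a creep speed; stop once the point is reached.
    if distance_to_point <= 0:
        return 0.0
    if distance_to_point < slow_zone:
        return max(v_creep, v_max * distance_to_point / slow_zone)
    return v_max
```

For example, at 0.75 m from the point the vehicle would run at half of v_max, and within the last few centimetres it would hold the creep speed until arrival.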
In an exemplary embodiment of the present disclosure, the method further comprises: and acquiring the travelling distance of the automatic guided vehicle, and updating the real-time position of the automatic guided vehicle according to the travelling distance.
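Between two tag reads, the real-time position can be updated by dead reckoning: advance the last known position by the travelled (odometry) distance along the current heading. A sketch, with the heading angle q assumed known:

```python
import math

def update_position(pos, heading_q, travelled):
    # Dead-reckoned position: last known tag position advanced by the
    # odometry distance along the heading angle.
    x, y = pos
    return (x + travelled * math.cos(heading_q),
            y + travelled * math.sin(heading_q))
```

The next scanned tag would then correct any accumulated odometry drift.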
According to a second aspect of the present disclosure, there is provided a control device for an automated guided vehicle, comprising: a tag acquisition module for acquiring a two-dimensional code tag randomly laid in the work site of the vehicle, the tag carrying position information of the work site; a path planning module for planning a task execution path of the vehicle based on the tag; an information acquisition module for acquiring, in real time according to the task execution path, the position offset information between the current position and the next position of the vehicle; a boundary determination module for determining the safety boundary of the vehicle according to the position offset information and a preset safe driving distance of the vehicle; and a collision detection module for performing real-time collision detection on the vehicle according to the safety boundary.
According to a third aspect of the present disclosure, there is provided a computer storage medium having stored thereon a computer program which, when executed by a processor, implements the method of controlling an automated guided vehicle of the first aspect described above.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to execute the control method of the automated guided vehicle of the first aspect via execution of the executable instructions.
As can be seen from the foregoing technical solutions, the control method of the automated guided vehicle, the control device of the automated guided vehicle, the computer storage medium, and the electronic device in the exemplary embodiment of the present disclosure have at least the following advantages and positive effects:
In the technical solutions provided by some embodiments of the present disclosure, on the one hand, a two-dimensional code tag randomly laid in the work site of the automated guided vehicle is acquired, the tag carrying position information of the work site, and the task execution path of the vehicle is planned based on the tag. This avoids the problems that planning against equidistant codes causes in the prior art, namely difficult field implementation, long roundabout paths that cannot run directly from one operation point to another, and slow operation; it thereby improves the operating efficiency of the vehicle and reduces the cost of laying the codes. On the other hand, the position offset information between the current position and the next position of the vehicle is acquired in real time according to the task execution path, the safety boundary of the vehicle is determined according to the position offset information and a preset safe driving distance, and real-time collision detection is performed on the vehicle according to the safety boundary; this improves the accuracy of both the safety-boundary calculation and the collision detection, ensuring the driving safety of the vehicle.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 illustrates a flow diagram of a method of controlling an automated guided vehicle in an exemplary embodiment of the present disclosure;
fig. 2 shows a schematic flow diagram of a method of controlling an automated guided vehicle according to another exemplary embodiment of the present disclosure;
FIG. 3 illustrates a schematic diagram of position offset information in an exemplary embodiment of the present disclosure;
fig. 4 schematically illustrates a safe driving distance corresponding to an automated guided vehicle in an exemplary embodiment of the disclosure;
fig. 5 shows a flow diagram of a method of controlling an automated guided vehicle in yet another exemplary embodiment of the present disclosure;
FIG. 6 is a schematic diagram illustrating a method of controlling an automated guided vehicle according to an exemplary embodiment of the present disclosure;
fig. 7 shows a flow diagram of a method of controlling an automated guided vehicle in yet another exemplary embodiment of the present disclosure;
FIG. 8 is a schematic illustration of layering the automated guided vehicles in the same work site in an exemplary embodiment of the disclosure;
figure 9 illustrates a safety boundary schematic of automated guided vehicle a and target automated guided vehicle B in an exemplary embodiment of the present disclosure;
FIG. 10 is a schematic diagram illustrating a method of controlling an automated guided vehicle according to another exemplary embodiment of the present disclosure;
fig. 11 shows a schematic view of a control method of an automated guided vehicle in yet another exemplary embodiment of the present disclosure;
fig. 12 is a schematic overall flow chart illustrating a control method of an automated guided vehicle according to an exemplary embodiment of the present disclosure;
fig. 13 is a schematic structural view showing a control device of an automated guided vehicle according to an exemplary embodiment of the present disclosure;
FIG. 14 shows a schematic diagram of a structure of a computer storage medium in an exemplary embodiment of the disclosure;
fig. 15 shows a schematic structural diagram of an electronic device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
The terms "a," "an," "the," and "said" are used in this specification to denote the presence of one or more elements/components/parts/etc.; the terms "comprising" and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. other than the listed elements/components/etc.; the terms "first" and "second", etc. are used merely as labels, and are not limiting on the number of their objects.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities.
At present, two-dimensional code image tags oriented in the same direction are generally pasted at equal intervals, in matrix form, in the traffic area of a warehouse, and a travel path is planned from the position information associated with the codes in order to control the operation of the automated guided vehicle. However, the distances between the operation points of an actual operating scene are arbitrary, and site constraints can make it impossible to lay equidistant codes, so field implementation is difficult. In addition, laying the codes at equal distances may require more codes than necessary and waste cost. The field-implementation difficulty and the cost of prior-art control methods for automated guided vehicles therefore both need to be reduced.
The embodiments of the present disclosure first provide a control method for an automated guided vehicle that overcomes, at least to a certain extent, the high field-implementation difficulty that laying equidistant two-dimensional codes causes in prior-art control methods.
Fig. 1 is a flowchart illustrating a control method of an automated guided vehicle according to an exemplary embodiment of the present disclosure, where an execution subject of the control method of the automated guided vehicle may be a server that controls the automated guided vehicle.
Referring to fig. 1, a control method of an automated guided vehicle according to one embodiment of the present disclosure includes the steps of:
step S110, acquiring a two-dimension code label randomly laid in a work site of the automatic guided vehicle, wherein the two-dimension code label carries position information of the work site;
step S120, planning a task execution path of the automatic guided vehicle based on the two-dimension code label;
step S130, acquiring position deviation information of the current position and the next position of the automatic guided vehicle in real time according to the task execution path;
step S140, determining a safety boundary of the automatic guided vehicle according to the position deviation information and a preset safety driving distance of the automatic guided vehicle;
and S150, performing real-time collision detection on the automatic guided vehicle according to the safety boundary.
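For illustration only, the five steps above can be sketched end to end as follows. The tag map, the safe driving distances, and the nearest-first planner are invented stand-ins (the disclosure does not fix these details), so this is a toy model of the method rather than the claimed implementation.

```python
import math

# Hypothetical site: tag id -> the position carried in the QR payload (S110).
TAGS = {"T1": (0.0, 0.0), "T2": (4.0, 0.0), "T3": (4.0, 3.0)}
SAFE = {"f": 0.6, "b": 0.4, "l": 0.3, "r": 0.3}  # assumed safe distances, metres

def plan_path(start, stops):
    # S120: a stand-in planner that simply visits the stops nearest-first.
    todo, path, pos = list(stops), [], start
    while todo:
        nxt = min(todo, key=lambda p: math.dist(pos, p))
        todo.remove(nxt)
        path.append(nxt)
        pos = nxt
    return path

def offset_angle(cur, nxt):
    # S130: offset angle q between the segment to the next position and the X axis.
    return math.atan2(nxt[1] - cur[1], nxt[0] - cur[0])

def safety_corners(pos, q, d):
    # S140: the four corners of the safety rectangle around the vehicle.
    (x, y), c, s = pos, math.cos(q), math.sin(q)
    f, b, l, r = d["f"], d["b"], d["l"], d["r"]
    return [(x + f*c + r*s, y + f*s - r*c),   # front-right (z)
            (x + f*c - l*s, y + f*s + l*c),   # front-left  (v)
            (x - b*c - l*s, y - b*s + l*c),   # rear-left
            (x - b*c + r*s, y - b*s - r*c)]   # rear-right  (w)

cur = TAGS["T1"]                               # S110: tag read under the vehicle
path = plan_path(cur, [TAGS["T3"], TAGS["T2"]])
q = offset_angle(cur, path[0])
boundary = safety_corners(cur, q, SAFE)
# S150 would test `boundary` against the boundaries of nearby vehicles.
```

With q = 0 the boundary reduces to an axis-aligned rectangle extending f ahead, b behind, l left of, and r right of the current position.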
In the technical scheme provided by the embodiment shown in fig. 1, on the one hand, two-dimensional code tags randomly laid in the work site of the automated guided vehicle are acquired, the tags carrying position information of the work site, and the task execution path of the vehicle is planned based on the tags. This avoids the prior-art problems, caused by planning against equidistant codes, of difficult field implementation, long roundabout paths that cannot run directly from one operation point to another, and slow operation, thereby improving the operating efficiency of the vehicle and reducing the cost of laying the codes. On the other hand, the position offset information between the current position and the next position of the vehicle is acquired in real time according to the task execution path, the safety boundary of the vehicle is determined according to the position offset information and a preset safe driving distance, and real-time collision detection is performed according to the safety boundary, which improves the accuracy of both the safety-boundary calculation and the collision detection and ensures the driving safety of the vehicle.
The following describes the specific implementation of each step in fig. 1 in detail:
In step S110, a two-dimensional code tag randomly laid in the work site of the automated guided vehicle is acquired, where the tag carries position information of the work site.
In an exemplary embodiment of the present disclosure, an automated guided vehicle (AGV) refers to an unmanned vehicle that carries an automatic guidance device such as a magnetic stripe, a track, or a laser, travels along a planned path, is powered by a battery, and is equipped with safety protection and various auxiliary mechanisms (for example, transfer and assembly mechanisms).
In an exemplary embodiment of the present disclosure, the two-dimensional code tags are codes carrying position information about the work site: each tag stores different position information (for example, position coordinates), so scanning a tag reads out the position at which it is laid. The code may, for example, be a QR (Quick Response) code. On the one hand, a QR code holds a large amount of information and reads and writes quickly; on the other hand, it can represent many kinds of character information, including Chinese characters and images, offers strong confidentiality and anti-counterfeiting properties, is highly reliable, and is very convenient to use. If the automated guided vehicle misses a code while running, it can scan a nearby code carrying the relevant information, which satisfies the system's requirements for recognition speed and accuracy.
In an example embodiment of the present disclosure, the two-dimensional code tags can be laid randomly according to the actual conditions of the work site. For example, tags may be placed wherever the actual operation points require, and the spacing between tags can be set freely according to actual needs: the distances between codes may be equal or unequal. This solves the prior-art problem that site constraints make equidistant codes impossible, or expensive, to lay; it improves the flexibility of the related technique and reduces both the laying cost and the field-implementation difficulty.
In step S120, a task execution path of the automated guided vehicle is planned based on the two-dimensional code tag.
In an exemplary embodiment of the present disclosure, fig. 2 schematically illustrates a flow chart of a control method of an automated guided vehicle in another exemplary embodiment of the present disclosure, and specifically illustrates a flow chart of planning a task execution path of the automated guided vehicle based on the two-dimensional code tag, and a specific implementation is explained below with reference to fig. 2.
In step S201, a current position of the automated guided vehicle is acquired based on the two-dimensional code tag.
In an exemplary embodiment of the present disclosure, the current position of the automated guided vehicle may be acquired based on the two-dimensional code tag.
In an exemplary embodiment of the present disclosure, the current position, that is, the position of the automated guided vehicle at the current time point, may be, for example, a position coordinate of the automated guided vehicle in the work site.
In an exemplary embodiment of the disclosure, a two-dimensional code scanning device may, for example, be provided on the automated guided vehicle. While the vehicle runs, the scanning device reads the tag beneath it and thereby acquires the position information stored in the tag. That information can then be sent to a traffic scheduling system (a comprehensive transport-and-management system built on advanced information, communication, sensing, control, and computer technologies, which can accurately and efficiently track, in real time, the running route and position of every automated guided vehicle across the whole work site), and the current position of the vehicle is obtained from the system's feedback. For example, the acquired current position may be the coordinates (x, y); note that (x, y) may denote the centre point of the vehicle or any feature point on it, set according to the actual situation, all of which falls within the protection scope of the present disclosure.
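As a minimal sketch of the tag-to-position step, assume, hypothetically (the disclosure does not specify a payload format), that each tag stores its site coordinates as a plain "x,y" string:

```python
def decode_tag(payload):
    # Parse a hypothetical "x,y" payload read from a two-dimensional code tag.
    x, y = payload.split(",")
    return (float(x), float(y))
```

The decoded pair would then be reported to the traffic scheduling system as the vehicle's current position (x, y).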
In step S202, a task execution path of the automated guided vehicle is planned according to a position relationship between the current position and a preset operation point position.
In an exemplary embodiment of the present disclosure, after the current position is determined, a task execution path of the automated guided vehicle may be planned according to a position relationship between the current position and a preset operation point position.
In an exemplary embodiment of the present disclosure, a preset operation point position is a position where the automated guided vehicle needs to pick up or put down cargo, for example the end point at which the cargo is to be placed.
In an exemplary embodiment of the present disclosure, the task execution path is the route the automated guided vehicle travels while performing a transport task, i.e., the route from the current position of the vehicle to the preset operation point positions. For example, with the current position as the starting point, the preset operation points A and B as must-pass points, and the preset operation point C as the end point, the task execution path can be determined from the positional relationship of the three points: the route that passes through A, B, and C with the shortest total distance is found, and after the route is determined, obstacle points are removed to obtain the task execution path.
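The start / must-pass / end example above can be sketched as a brute-force search over orderings of the must-pass points. This is an illustrative stand-in, since the disclosure does not name a specific planning algorithm, and obstacle removal is omitted here:

```python
import itertools
import math

def shortest_route(start, must_pass, end):
    # Try every ordering of the must-pass points and keep the route
    # start -> ... -> end with the smallest total straight-line length.
    best, best_len = None, float("inf")
    for order in itertools.permutations(must_pass):
        pts = [start, *order, end]
        length = sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))
        if length < best_len:
            best, best_len = pts, length
    return best, best_len
```

Brute force is fine for the handful of must-pass points in the example; a site with many operation points would need a heuristic planner instead.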
In an exemplary embodiment of the present disclosure, after the two-dimensional code tags are randomly laid, the task execution path of the automated guided vehicle may be planned based on the tags. This avoids the prior-art problems that planning against equidistant codes produces rigid, long paths with many turns which cannot run directly from one operation point to another and therefore operate slowly, and it improves the operating efficiency of the automated guided vehicle.
In step S130, position offset information of the current position and the next position of the automated guided vehicle is obtained in real time according to the task execution path.
In an exemplary embodiment of the present disclosure, the next position of the automated guided vehicle may be determined according to the task execution path. And the next position is the next position which needs to be reached after leaving the current position in the process of transporting the automatic guided vehicle along the task execution path.
In an exemplary embodiment of the present disclosure, after the next position is determined, the position offset information between the current position and the next position may be acquired in real time. For example, referring to fig. 3, which shows a schematic diagram of position offset information in an exemplary embodiment of the present disclosure, 301 is the current position and 302 is the next position; a coordinate system may be established with the current position 301 as the origin, so that the position offset information between the next position 302 and the current position is the offset angle q between the direction toward 302 and the X axis.
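In the frame of fig. 3 the offset angle q is simply the polar angle of the next position, which `atan2` gives directly (a one-line sketch, not the patent's wording):

```python
import math

def position_offset(cur, nxt):
    # Offset angle q of the next position, measured from the X axis of a
    # coordinate system whose origin is the current position (cf. fig. 3).
    return math.atan2(nxt[1] - cur[1], nxt[0] - cur[0])
```

Using `atan2` rather than `atan` keeps the angle correct in all four quadrants, including straight up or down.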
In step S140, a safety boundary of the automated guided vehicle is determined according to the position offset information and a preset safety driving distance of the automated guided vehicle.
In an exemplary embodiment of the present disclosure, the preset safe driving distance of the automated guided vehicle may be obtained from the device model of the vehicle, i.e., the designation in letters and digits that indicates the equipment's specification. The safe driving distance is the separation, calibrated before the equipment leaves the factory, that the vehicle must keep from other vehicles or objects while running, and different device models correspond to different safe driving distances. It may, for example, consist of the distances f, b, l, and r within which the vehicle must not collide in the front, rear, left, and right directions respectively, measured from an arbitrary feature point representing the current position of the vehicle. Referring to fig. 4, which schematically illustrates the safe driving distance corresponding to an automated guided vehicle in an exemplary embodiment of the present disclosure, the rectangle acde is the automated guided vehicle, point h is an arbitrary feature point representing its current position (x, y), the length of segment nh is the front safe driving distance f, the length of hm is the rear distance b, the length of kh is the left distance l, and the length of hg is the right distance r.
In an exemplary embodiment of the present disclosure, reference may be made to fig. 5, where fig. 5 schematically illustrates a flowchart of a control method of an automated guided vehicle in yet another exemplary embodiment of the present disclosure, and specifically illustrates a flowchart of determining a safety boundary of the automated guided vehicle according to the above-mentioned position offset information and a preset safety driving distance of the automated guided vehicle. Step S140 is explained below with reference to fig. 5.
In step S501, based on the position offset information, a projection distance of the safe driving distance on a preset coordinate axis is acquired.
In an exemplary embodiment of the present disclosure, after the position offset information is determined, the projection distances of the safe driving distances on the preset coordinate axes may be acquired. For example, fig. 6 schematically illustrates a control method of an automated guided vehicle in an exemplary embodiment of the present disclosure, specifically the projection distances of the safe driving distances on the preset coordinate axes, taking the calculation of the coordinates of boundary point z as an example. When the position offset information is q, the projection distance of the safe driving distance f on the X axis of the preset coordinate axes is hi = f·cos q, and the projection distance of the safe driving distance r on the X axis is jz = r·sin q. The projection distance of the safe driving distance f on the Y axis of the preset coordinate axes is ni = f·sin q, and the projection distance of the safe driving distance r on the Y axis is nj = r·cos q.
In step S502, boundary point coordinates corresponding to the automated guided vehicle are determined according to the projection distance and the coordinate information of the current position.
In an exemplary embodiment of the present disclosure, with continued reference to fig. 6, according to the above projection distances and the coordinate information of the current position, the abscissa of point z is x + f·cos q + r·sin q, and the ordinate of point z is y + f·sin q − r·cos q. That is, the coordinates of the boundary point z corresponding to the automated guided vehicle are (x + f·cos q + r·sin q, y + f·sin q − r·cos q).
In an exemplary embodiment of the present disclosure, similarly, the four boundary point coordinates of the safety boundary may be determined as point z (x + f·cos q + r·sin q, y + f·sin q − r·cos q), point v (x + f·cos q − l·sin q, y + f·sin q + l·cos q), point w (x − b·cos q + r·sin q, y − b·sin q − r·cos q), and point u (x − b·cos q − l·sin q, y − b·sin q + l·cos q). The position coordinates of the center point o of the automated guided vehicle, the average of the four boundary points, are (x + ((f − b)/2)·cos q + ((r − l)/2)·sin q, y + ((f − b)/2)·sin q + ((l − r)/2)·cos q).
In step S503, a safety boundary of the automated guided vehicle is constructed according to the boundary point coordinates.
In an exemplary embodiment of the present disclosure, the safety boundary is a rectangular boundary of the automated guided vehicle determined from the boundary point coordinates; after the boundary point coordinates are determined, the safety boundary of the automated guided vehicle may be constructed from them. With continued reference to fig. 6, the safety boundary constructed from the above four boundary points may be the rectangle wzvu shown by the dotted line, with length f + b and width l + r.
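The corner computation of steps S501 to S503 can be sketched as follows; this is a minimal illustration assuming the position offset q is in radians, and the function and variable names are the editor's, not the disclosure's:

```python
import math

def safety_boundary(x, y, q, f, b, l, r):
    """Return the four corners (z, v, w, u) of the safety boundary for an
    AGV at feature point (x, y) with position offset q (radians), given
    front/rear/left/right safe driving distances f, b, l, r."""
    c, s = math.cos(q), math.sin(q)
    z = (x + f * c + r * s, y + f * s - r * c)  # front-right corner
    v = (x + f * c - l * s, y + f * s + l * c)  # front-left corner
    w = (x - b * c + r * s, y - b * s - r * c)  # rear-right corner
    u = (x - b * c - l * s, y - b * s + l * c)  # rear-left corner
    return z, v, w, u
```

With q = 0 the rectangle is axis-aligned, so the corners reduce to (x + f, y − r), (x + f, y + l), (x − b, y − r), and (x − b, y + l), matching the length f + b and width l + r noted above.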
In an exemplary embodiment of the present disclosure, after the safety boundary of the automated guided vehicle is determined, collision detection may be performed on the automated guided vehicle according to the safety boundary. Because collision detection is performed according to the safety boundaries, and different safety boundaries correspond to automated guided vehicles of different sizes, the mixed operation of automated guided vehicles of different sizes can be taken into account. This solves the technical problem that collision detection cannot be performed, or the detection result is inaccurate, when vehicle sizes differ, and improves the effectiveness and reliability of the collision detection results.
In step S150, real-time collision detection is performed on the automated guided vehicle according to the safety boundary.
In an exemplary embodiment of the present disclosure, fig. 7 schematically illustrates a flowchart of a control method of an automated guided vehicle in yet another exemplary embodiment of the present disclosure, and specifically illustrates a flowchart of performing real-time collision detection on the automated guided vehicle after a safety boundary is determined, and a specific implementation is explained below with reference to fig. 7.
In step S701, the automated guided vehicles in the same work site are layered to obtain a layering result.
In an exemplary embodiment of the present disclosure, automated guided vehicles in the same work site may be layered based on a quadtree algorithm, resulting in a layering result. It should be noted that the hierarchical processing may also be performed based on a ternary tree algorithm, a binary tree algorithm, or the like, and may be set according to actual situations, which belongs to the protection scope of the present disclosure.
In an exemplary embodiment of the present disclosure, a quadtree is a tree-shaped data structure in which each parent node has four child nodes. For example, fig. 8 schematically illustrates layering the automated guided vehicles in the same work site in an exemplary embodiment of the present disclosure. As shown in fig. 8, there may be multiple automated guided vehicles in the same work site; to distinguish objects at different positions, the work site may be divided into four regions, and the four child nodes of a quadtree conveniently represent these four regions, which may be designated quadrants 1, 2, 3, and 4. For example, each quadrant may be set to accommodate a preset number of automated guided vehicles (e.g., 2); for any quadrant containing more than 2 automated guided vehicles, the division continues, generating another 4 sub-quadrants, and so on. Finally, all automated guided vehicles in the work site are divided into a number of small regions, giving the layering result. This solves the technical problem in the prior art that, when there are many automated guided vehicles in a work site, performing collision detection on every pair of vehicles (a vehicle in the lower-left corner of the figure obviously cannot collide with one in the upper-right corner) leads to high algorithmic complexity and low efficiency, thereby improving collision detection efficiency.
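A minimal quadtree subdivision along these lines might look like the sketch below; the capacity of 2 follows the example in the text, while the class and method names are the editor's assumptions:

```python
CAPACITY = 2  # max AGVs per quadrant before subdividing (from the text's example)

class QuadNode:
    """One quadrant of the work site; subdivides into four children
    when it holds more than CAPACITY automated guided vehicles."""
    def __init__(self, x, y, w, h):
        self.x, self.y, self.w, self.h = x, y, w, h  # region rectangle
        self.agvs = []
        self.children = []

    def contains(self, agv):
        _, px, py = agv  # agv is a (vehicle_id, px, py) tuple
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

    def insert(self, agv):
        if self.children:  # already subdivided: route to the matching quadrant
            for child in self.children:
                if child.contains(agv):
                    child.insert(agv)
                    return
        else:
            self.agvs.append(agv)
            if len(self.agvs) > CAPACITY:
                self.subdivide()

    def subdivide(self):
        hw, hh = self.w / 2, self.h / 2
        self.children = [QuadNode(self.x + dx, self.y + dy, hw, hh)
                         for dx in (0, hw) for dy in (0, hh)]
        pending, self.agvs = self.agvs, []
        for a in pending:  # redistribute the vehicles into the sub-quadrants
            self.insert(a)
```

The leaf node containing a vehicle then delimits the small region used as its layering result.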
In step S702, a collision detection range corresponding to the automated guided vehicle is determined according to the layering result.
In an exemplary embodiment of the present disclosure, after the layering result is obtained and the safety boundary is determined, a collision detection range corresponding to the automated guided vehicle may be determined based on the layering result. For example, the collision detection range may be a detection range with a radius of 20 meters and centered on any one of the automated guided vehicles based on the above layering result. It should be noted that, the specific collision detection range may be set according to the actual situation, and belongs to the protection scope of the present disclosure.
In step S703, a target automated guided vehicle that is within the collision detection range is acquired.
In an exemplary embodiment of the present disclosure, after the collision detection range is determined, the automated guided vehicle located within the collision detection range may be taken as the target automated guided vehicle.
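Selecting the target automated guided vehicles can be sketched as a simple radius filter; the 20-meter radius follows the earlier example, and the tuple layout for a vehicle is an assumption:

```python
import math

def targets_in_range(agv_pos, others, radius=20.0):
    """Return the vehicles within the collision detection range of the
    given AGV. others: iterable of (vehicle_id, px, py) tuples."""
    ax, ay = agv_pos
    return [o for o in others
            if math.hypot(o[1] - ax, o[2] - ay) <= radius]
```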
In step S704, real-time collision detection is performed on the automated guided vehicle based on a directional bounding box algorithm according to a safety boundary between the automated guided vehicle and the target automated guided vehicle.
In an exemplary embodiment of the present disclosure, the process of performing collision detection on any two automated guided vehicles in the same work site (the automated guided vehicle and the target automated guided vehicle) is explained using the example of step S703. Fig. 9 schematically shows the safety boundaries of an automated guided vehicle A and a target automated guided vehicle B in an exemplary embodiment of the present disclosure. Referring to fig. 9, rectangle A represents the safety boundary of the automated guided vehicle A, and rectangle B represents the safety boundary of the target automated guided vehicle B. P_A denotes the center point of rectangle A, and P_B denotes the center point of rectangle B. A_x and A_y denote the unit vectors of rectangle A; B_x and B_y denote the unit vectors of rectangle B. W_A denotes half the width of rectangle A, and H_A half its height; W_B denotes half the width of rectangle B, and H_B half its height. The magnitude of the vector T is |P_B − P_A|, and its direction points from point P_A to point P_B. Referring to steps S502-S503 and the related explanation of fig. 6, and taking the automated guided vehicle A as an example: P_A is the coordinate of the center point o of the automated guided vehicle; W_A = 0.5(l + r); H_A = 0.5(f + b); the unit vector A_x is the difference between the coordinates of points v and u (or between points z and w) divided by its modulus; and the unit vector A_y is the difference between the coordinates of points u and w (or between points v and z) divided by its modulus. Similarly, referring to steps S110-S130 and S501-S503, the relevant parameters of rectangle B may be determined from the position offset information and the preset safe driving distance of the target automated guided vehicle B, and the vector T may then be determined.
In an exemplary embodiment of the present disclosure, the collision detection may be performed using an Oriented Bounding Box (OBB) algorithm, where the determination method is as follows: if the projections of the two polygons overlap on all axes, it is determined that the polygons collide; otherwise, it is determined that no collision has occurred. For example, fig. 10 schematically illustrates a control method of an automated guided vehicle in another exemplary embodiment of the present disclosure, specifically two polygons whose projections on one axis overlap but which do not collide with each other. As shown in fig. 10, the side P_5P_6 of the rectangular boundary P_5P_6P_7P_8 of automated guided vehicle 1 projects onto the X-axis as P_1P_3, and the side P_9P_10 of the rectangular boundary P_9P_10P_11P_12 of automated guided vehicle 2 projects onto the X-axis as P_2P_4; P_1P_3 and P_2P_4 produce an overlap P_2P_3, and yet the two automated guided vehicles do not collide.
In an exemplary embodiment of the present disclosure, the above determination may be expressed as formula 1, which compares the projection length of the line between center points on a coordinate axis with the sum of the projection radii of the two rectangles (Proj denotes the projection operation): if |Proj(T)| > 0.5·|Proj(rectangle A)| + 0.5·|Proj(rectangle B)|, the projections do not overlap on that axis. Substituting the parameters shown in fig. 9 into formula 1 and expanding yields formula 2: |T · L| > |(W_A * A_x) · L| + |(H_A * A_y) · L| + |(W_B * B_x) · L| + |(H_B * B_y) · L|, where "*" denotes the multiplication of a scalar by a vector and "·" denotes the dot product of two vectors. The axis vector L takes four cases: A_x, A_y, B_x, B_y. Substituting the four cases into formula 2: if none of the four cases satisfies formula 2 (that is, the projections overlap on every axis), it is determined that the automated guided vehicle A and the target automated guided vehicle B collide; if any one of the four cases satisfies formula 2, a separating axis exists and it is determined that they do not collide.
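The four-axis test of formula 2 can be sketched as follows; this is an illustrative separating-axis implementation, not the patent's exact code, with each rectangle given by its center, unit axis vectors, and half-extents:

```python
def obb_collide(center_a, axes_a, half_a, center_b, axes_b, half_b):
    """Separating-axis test for two oriented rectangles (formula 2).
    axes_*: unit vectors (Ax, Ay) or (Bx, By); half_*: half-extents (W, H)."""
    dot = lambda p, q: p[0] * q[0] + p[1] * q[1]
    # T points from the center of rectangle A to the center of rectangle B
    t = (center_b[0] - center_a[0], center_b[1] - center_a[1])
    for axis in (*axes_a, *axes_b):  # the four candidate axes L
        radius = (abs(half_a[0] * dot(axes_a[0], axis))
                  + abs(half_a[1] * dot(axes_a[1], axis))
                  + abs(half_b[0] * dot(axes_b[0], axis))
                  + abs(half_b[1] * dot(axes_b[1], axis)))
        if abs(dot(t, axis)) > radius:
            return False  # formula 2 holds: separating axis, no collision
    return True  # projections overlap on every axis: collision
```

For example, two axis-aligned unit-half-extent squares whose centers are 3 units apart do not collide, while the same squares 1.5 units apart do.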
In an exemplary embodiment of the present disclosure, for example, when it is detected that the automated guided vehicle a and the target automated guided vehicle B may collide with each other, the traveling paths of the automated guided vehicle a, the target automated guided vehicle B, or both the automated guided vehicle a and the target automated guided vehicle B may be changed to avoid the collision.
In an exemplary embodiment of the disclosure, the distance between the real-time position of the automated guided vehicle and the preset operation point may also be acquired, and the running speed of the automated guided vehicle controlled based on that distance. For example, when the automated guided vehicle is close to the preset operation point (for example, within 10 meters), its running speed may be reduced (for example, to 10 meters per minute); when it is far from the preset operation point (for example, 500 meters away), its running speed may be increased (for example, to 50 meters per minute). It should be noted that the specific values of the separation distance and the running speed may be set according to the actual situation, which belongs to the protection scope of the present disclosure.
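The distance-based speed control can be sketched with the example values from the text (10 meters / 10 meters per minute near the operation point, 500 meters / 50 meters per minute far away); the intermediate cruising speed is an assumed placeholder:

```python
def control_running_speed(remaining_distance):
    """Map the distance to the preset operation point (metres) to a
    running speed (metres per minute). Thresholds follow the text's
    examples; the cruising speed of 30 in between is an assumption."""
    if remaining_distance <= 10:
        return 10  # slow down when close to the operation point
    if remaining_distance >= 500:
        return 50  # speed up when far from the operation point
    return 30
```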
In an exemplary embodiment of the present disclosure, the travel distance of the automated guided vehicle may also be acquired, and the real-time position of the automated guided vehicle (a position that changes as the vehicle moves) updated according to the travel distance. For example, if the initial position of the automated guided vehicle is (x, y) and the acquired travel distance is t, then, referring to fig. 11, which schematically illustrates updating the real-time position of the automated guided vehicle in another exemplary embodiment of the present disclosure, the real-time position may be updated to h1 = (x + t·cos q, y + t·sin q). A new safety boundary may then be calculated from the real-time position, and collision detection performed again according to the new safety boundary. In this way, collision detection can be carried out in real time according to the position of the automated guided vehicle, ensuring its driving safety.
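The position update h1 = (x + t·cos q, y + t·sin q) can be sketched directly; the function name is the editor's:

```python
import math

def update_position(x, y, q, t):
    """Advance the AGV a travel distance t along position offset q
    (radians) from (x, y): h1 = (x + t*cos q, y + t*sin q)."""
    return x + t * math.cos(q), y + t * math.sin(q)
```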
In an exemplary embodiment of the present disclosure, fig. 12 schematically illustrates an overall flowchart of a control method of an automated guided vehicle in an exemplary embodiment of the present disclosure, and a specific implementation is explained below with reference to fig. 12.
In step S1201, scanning a two-dimensional code label laid randomly in advance to obtain a current position of the automated guided vehicle;
in step S1202, a preset operation point position is determined according to the currently executed operation task, and a task execution path of the automated guided vehicle is planned according to the current position and the preset operation point position.
In step S1203, a distance (remaining distance) between the real-time position of the automated guided vehicle and the preset working point position is acquired, and the traveling speed of the automated guided vehicle is controlled based on the distance.
In step S1204, determining a safety boundary of the automated guided vehicle according to the task execution path and a preset safety interval of the automated guided vehicle;
in step S1205, collision detection is performed on automated guided vehicles in the same work site based on the safety boundary;
in step S1206, the automated guided vehicle is controlled to travel the remaining task execution path until the end of the work.
The present disclosure also provides a control device of an automated guided vehicle, and fig. 13 shows a schematic structural diagram of the control device of the automated guided vehicle in an exemplary embodiment of the present disclosure; as shown in fig. 13, the control device 1300 of the automated guided vehicle may include a label laying module 1301, a path planning module 1302, an information acquisition module 1303, a boundary determination module 1304, and a collision detection module 1305. Wherein:
the label laying module 1301 is configured to acquire a two-dimensional code label laid randomly in a work site of the automated guided vehicle, where the two-dimensional code label carries position information of the work site.
In an exemplary embodiment of the disclosure, the label laying module is configured to acquire a two-dimensional code label randomly laid in a work site of the automated guided vehicle, where the two-dimensional code label carries position information of the work site.
And a path planning module 1302, configured to plan a task execution path of the automated guided vehicle based on the two-dimensional code tag.
In an exemplary embodiment of the present disclosure, the path planning module is configured to obtain a current position of the automated guided vehicle based on the two-dimensional code tag; and planning a task execution path of the automatic guided vehicle according to the position relation between the current position and the position of the preset operation point.
And the information obtaining module 1303 is configured to obtain, in real time, position offset information of the current position and the next position of the automated guided vehicle according to the task execution path.
In an exemplary embodiment of the disclosure, the information obtaining module is configured to obtain, in real time, position offset information of a current position and a next position of the automated guided vehicle according to the task execution path.
And a boundary determining module 1304, configured to determine a safety boundary of the automated guided vehicle according to the position offset information and a preset safety driving distance of the automated guided vehicle.
In an exemplary embodiment of the present disclosure, the boundary determining module is configured to obtain a projection distance of the safe driving distance on a preset coordinate axis based on the position offset information; determining boundary point coordinates corresponding to the automatic guided vehicle according to the projection distance and the coordinate information of the current position; and constructing a safety boundary of the automatic guided vehicle according to the boundary point coordinates.
And the collision detection module 1305 is used for performing real-time collision detection on the automatic guided vehicle according to the safety boundary.
In an exemplary embodiment of the disclosure, the collision detection module is used for performing layering processing on automatic guided vehicles in the same working site to obtain a layering result; determining a collision detection range corresponding to the automatic guided vehicle according to the layering result; acquiring a target automatic guided vehicle in a collision detection range; and performing real-time collision detection on the automatic guided vehicle based on a direction bounding box algorithm according to the safety boundary of the automatic guided vehicle and the target automatic guided vehicle.
In an exemplary embodiment of the disclosure, the collision detection module is further configured to perform a layering process on the automated guided vehicles in the same work site based on a quadtree algorithm, so as to obtain a layering result.
In an exemplary embodiment of the disclosure, the collision detection module is further configured to obtain a separation distance between a real-time position of the automated guided vehicle and a position of a preset operation point; and controlling the running speed of the automatic guided vehicle based on the separation distance.
In an exemplary embodiment of the disclosure, the collision detection module is further configured to obtain a travel distance of the automated guided vehicle, and update a real-time position of the automated guided vehicle according to the travel distance.
The specific details of each module in the control device of the automated guided vehicle have been described in detail in the corresponding control method of the automated guided vehicle, and therefore are not described herein again.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Moreover, although the steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a mobile terminal, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer storage medium capable of implementing the above method. On which a program product capable of implementing the above-described method of the present specification is stored. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to perform the steps according to various exemplary embodiments of the disclosure described in the "exemplary methods" section above of this specification, when the program product is run on the terminal device.
Referring to fig. 14, a program product 1400 for implementing the above method according to an embodiment of the present disclosure is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," module "or" system.
An electronic device 1500 according to such an embodiment of the disclosure is described below with reference to fig. 15. The electronic device 1500 shown in fig. 15 is only an example and should not bring any limitations to the function and scope of use of the embodiments of the present disclosure.
As shown in fig. 15, electronic device 1500 is in the form of a general purpose computing device. Components of electronic device 1500 may include, but are not limited to: the at least one processing unit 1510, the at least one memory unit 1520, and the bus 1530 that connects the various system components (including the memory unit 1520 and the processing unit 1510).
Wherein the memory unit stores program code that is executable by the processing unit 1510 to cause the processing unit 1510 to perform steps according to various exemplary embodiments of the present disclosure as described in the above section "exemplary methods" of this specification. For example, the processing unit 1510 may perform the following as shown in fig. 1: step S110, acquiring a two-dimension code label randomly laid in a work site of the automatic guided vehicle, wherein the two-dimension code label carries position information of the work site; step S120, planning a task execution path of the automatic guided vehicle based on the two-dimension code label; step S130, acquiring position deviation information of the current position and the next position of the automatic guided vehicle in real time according to the task execution path; step S140, determining a safety boundary of the automatic guided vehicle according to the position deviation information and a preset safety driving distance of the automatic guided vehicle; and S150, performing real-time collision detection on the automatic guided vehicle according to the safety boundary.
The storage unit 1520 may include readable media in the form of volatile storage units, such as a random access memory unit (RAM) 15201 and/or a cache memory unit 15202, and may further include a read-only memory unit (ROM) 15203.
Storage unit 1520 may also include a program/utility 15204 having a set (at least one) of program modules 15205, such program modules 15205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 1530 may be any bus representing one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 1500 can also communicate with one or more external devices 1600 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 1500, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 1500 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interface 1550. Also, the electronic device 1500 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 1560. As shown, the network adapter 1560 communicates with the other modules of the electronic device 1500 over the bus 1530. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 1500, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Furthermore, the above-described figures are merely schematic illustrations of processes included in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (10)

1. A control method of an automated guided vehicle, comprising:
acquiring a two-dimensional code tag randomly laid in a work site of the automated guided vehicle, wherein the two-dimensional code tag carries position information of the work site;
planning a task execution path of the automated guided vehicle based on the two-dimensional code tag;
acquiring, in real time according to the task execution path, position offset information between the current position and the next position of the automated guided vehicle;
determining a safety boundary of the automated guided vehicle according to the position offset information and a preset safe driving distance of the automated guided vehicle;
and performing real-time collision detection on the automated guided vehicle according to the safety boundary.
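For illustration only, the per-step logic of claim 1 (offset to the next waypoint, a padded safety boundary, and an overlap check) can be sketched as follows. The 0.5 m safe distance, the axis-aligned boundary shape, and all function names are hypothetical, not taken from the patent:

```python
SAFE_DISTANCE = 0.5  # preset safe driving distance in metres (assumed value)

def position_offset(current, nxt):
    """Offset (dx, dy) between the current and next path positions."""
    return (nxt[0] - current[0], nxt[1] - current[1])

def safety_boundary(current, offset, safe=SAFE_DISTANCE):
    """Axis-aligned boundary covering one path step, padded by `safe`."""
    x0, y0 = current
    x1, y1 = x0 + offset[0], y0 + offset[1]
    return (min(x0, x1) - safe, min(y0, y1) - safe,
            max(x0, x1) + safe, max(y0, y1) + safe)

def boundaries_collide(a, b):
    """True if two (xmin, ymin, xmax, ymax) boundaries overlap."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]
```

A boundary built this way grows with the distance to the next waypoint, so faster or longer steps are checked against a larger area.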
2. The method of claim 1, further comprising:
performing layering processing on the automated guided vehicles in the same work site to obtain a layering result;
determining a collision detection range corresponding to the automated guided vehicle according to the layering result;
acquiring a target automated guided vehicle within the collision detection range;
and performing real-time collision detection on the automated guided vehicle based on an oriented bounding box algorithm according to the safety boundaries of the automated guided vehicle and the target automated guided vehicle.
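An oriented-bounding-box check of the kind claim 2 names is conventionally done with the separating-axis test: two convex boxes are disjoint exactly when some edge normal separates their projections. A minimal sketch, with vehicle dimensions and all names invented for illustration:

```python
import math

def obb_corners(cx, cy, half_w, half_h, heading):
    """Corners of an oriented bounding box centred at (cx, cy)."""
    c, s = math.cos(heading), math.sin(heading)
    pts = []
    for dx, dy in ((half_w, half_h), (-half_w, half_h),
                   (-half_w, -half_h), (half_w, -half_h)):
        pts.append((cx + dx * c - dy * s, cy + dx * s + dy * c))
    return pts

def obb_overlap(a, b):
    """Separating-axis test between two 4-corner boxes a and b."""
    for poly in (a, b):
        for i in range(4):
            # Each edge normal is a candidate separating axis.
            ex = poly[(i + 1) % 4][0] - poly[i][0]
            ey = poly[(i + 1) % 4][1] - poly[i][1]
            ax, ay = -ey, ex
            pa = [px * ax + py * ay for px, py in a]
            pb = [px * ax + py * ay for px, py in b]
            if max(pa) < min(pb) or max(pb) < min(pa):
                return False  # found a separating axis: no collision
    return True
```

Unlike an axis-aligned check, this remains tight when a vehicle travels at an angle to the grid, which is why OBB tests are the usual choice for heading-dependent safety boundaries.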
3. The method of claim 1, wherein planning the task execution path of the automated guided vehicle based on the two-dimensional code tag comprises:
acquiring the current position of the automated guided vehicle based on the two-dimensional code tag;
and planning the task execution path of the automated guided vehicle according to the positional relation between the current position and the position of a preset operation point.
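On a site tiled with code tags at grid positions, one simple stand-in for the planner of claim 3 is a Manhattan route: move along one axis, then the other. The patent does not disclose its planner; this routing policy and all names are assumptions for illustration:

```python
def plan_path(current, target):
    """Waypoint list from the current grid cell to the operation point.

    Moves along x first, then y (a hypothetical Manhattan policy).
    """
    path = [current]
    x, y = current
    tx, ty = target
    step = 1 if tx >= x else -1
    for nx in range(x + step, tx + step, step):
        path.append((nx, y))
    step = 1 if ty >= y else -1
    for ny in range(y + step, ty + step, step):
        path.append((tx, ny))
    return path
```

Each consecutive waypoint pair then feeds the per-step offset and safety-boundary computation of claim 1.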
4. The method according to claim 1 or 2, wherein determining the safety boundary of the automated guided vehicle according to the position offset information and the preset safe driving distance of the automated guided vehicle comprises:
acquiring, based on the position offset information, the projection distance of the safe driving distance on a preset coordinate axis;
determining boundary point coordinates corresponding to the automated guided vehicle according to the projection distance and the coordinate information of the current position;
and constructing the safety boundary of the automated guided vehicle according to the boundary point coordinates.
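One way to read claim 4 is: resolve the safe driving distance into per-axis components using the direction of the position offset, then pad the swept segment by those components to get the four boundary points. A sketch under that reading, with all names hypothetical:

```python
import math

def boundary_points(current, nxt, safe):
    """Corner coordinates of the safety boundary for one path step.

    Projects the safe driving distance onto the x and y axes along the
    direction of the position offset, then pads the swept segment.
    """
    x0, y0 = current
    x1, y1 = nxt
    dx, dy = x1 - x0, y1 - y0
    norm = math.hypot(dx, dy) or 1.0  # guard against a zero offset
    px = safe * abs(dx) / norm  # projection distance on the x axis
    py = safe * abs(dy) / norm  # projection distance on the y axis
    return [(min(x0, x1) - px, min(y0, y1) - py),
            (max(x0, x1) + px, min(y0, y1) - py),
            (max(x0, x1) + px, max(y0, y1) + py),
            (min(x0, x1) - px, max(y0, y1) + py)]
```

Note that for a purely axis-parallel step the lateral padding collapses to zero here; a real implementation would presumably also add the vehicle's own half-width, which the claim text does not specify.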
5. The method of claim 2, further comprising:
performing the layering processing on the automated guided vehicles in the same work site based on a quadtree algorithm to obtain the layering result.
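The quadtree layering of claim 5 can be illustrated by recursively splitting the site into quadrants until each holds few vehicles; a vehicle then only needs collision checks against others in its own bucket. The capacity, depth limit, and half-open quadrant boundaries below are illustrative assumptions:

```python
def quadtree_buckets(vehicles, x0, y0, x1, y1, capacity=2, depth=4):
    """Recursively split the site (x0,y0)-(x1,y1) into quadrants.

    Returns leaf buckets of vehicle positions; each bucket bounds the
    collision-detection range for the vehicles it contains.
    """
    if len(vehicles) <= capacity or depth == 0:
        return [vehicles]
    mx, my = (x0 + x1) / 2.0, (y0 + y1) / 2.0
    bounds = [(x0, y0, mx, my), (mx, y0, x1, my),
              (x0, my, mx, y1), (mx, my, x1, y1)]
    buckets = []
    for bx0, by0, bx1, by1 in bounds:
        inside = [(x, y) for x, y in vehicles
                  if bx0 <= x < bx1 and by0 <= y < by1]
        buckets += quadtree_buckets(inside, bx0, by0, bx1, by1,
                                    capacity, depth - 1)
    return buckets
```

With n vehicles spread over the site this cuts the pairwise check cost from O(n²) toward the sum of squared bucket sizes.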
6. The method of any of claims 1 to 3, further comprising:
acquiring the separation distance between the real-time position of the automated guided vehicle and the position of the preset operation point;
and adjusting the running speed of the automated guided vehicle based on the separation distance.
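Claim 6 leaves the speed law unspecified; a common choice, sketched here with invented tuning parameters, is to run at full speed until a slow-down zone around the operation point and then scale down linearly:

```python
def adjust_speed(distance, v_max=2.0, v_min=0.2, slow_zone=3.0):
    """Running speed (m/s) as a function of distance to the operation point.

    `v_max`, `v_min`, and `slow_zone` are assumed tuning parameters,
    not values from the patent.
    """
    if distance >= slow_zone:
        return v_max
    return max(v_min, v_max * distance / slow_zone)
```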
7. The method of any of claims 1 to 3, further comprising:
acquiring the travelling distance of the automated guided vehicle, and updating the real-time position of the automated guided vehicle according to the travelling distance.
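Between two code tags the position update of claim 7 amounts to dead reckoning: advance the last known position by the travelled distance along the current heading. A minimal sketch (heading source and names assumed):

```python
import math

def update_position(position, heading, travelled):
    """Dead-reckoning update: advance `travelled` metres along `heading`.

    `heading` is in radians; the odometry source is assumed, e.g. wheel
    encoders, and drift is corrected whenever the next tag is read.
    """
    x, y = position
    return (x + travelled * math.cos(heading),
            y + travelled * math.sin(heading))
```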
8. A control device for an automated guided vehicle, comprising:
a tag laying module, configured to acquire a two-dimensional code tag randomly laid in a work site of the automated guided vehicle, wherein the two-dimensional code tag carries position information of the work site;
a path planning module, configured to plan a task execution path of the automated guided vehicle based on the two-dimensional code tag;
an information acquisition module, configured to acquire, in real time according to the task execution path, position offset information between the current position and the next position of the automated guided vehicle;
a boundary determining module, configured to determine a safety boundary of the automated guided vehicle according to the position offset information and a preset safe driving distance of the automated guided vehicle;
and a collision detection module, configured to perform real-time collision detection on the automated guided vehicle according to the safety boundary.
9. A computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the control method of an automated guided vehicle of any of claims 1 to 7.
10. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to execute the control method of an automated guided vehicle of any of claims 1 to 7 via execution of the executable instructions.
CN201910817729.0A 2019-08-30 2019-08-30 Control method and device for automatic guided vehicle, storage medium and electronic equipment Pending CN112445220A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910817729.0A CN112445220A (en) 2019-08-30 2019-08-30 Control method and device for automatic guided vehicle, storage medium and electronic equipment


Publications (1)

Publication Number Publication Date
CN112445220A true CN112445220A (en) 2021-03-05

Family

ID=74733836

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910817729.0A Pending CN112445220A (en) 2019-08-30 2019-08-30 Control method and device for automatic guided vehicle, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN112445220A (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007188158A (en) * 2006-01-11 2007-07-26 Mitsui Eng & Shipbuild Co Ltd Component supply system
KR20130099683A (en) * 2012-02-29 2013-09-06 부산대학교 산학협력단 Vision based guideline interpretation method for stable driving control of guideline tracing agvs
CN105388899A (en) * 2015-12-17 2016-03-09 中国科学院合肥物质科学研究院 An AGV navigation control method based on two-dimension code image tags
US20160167226A1 (en) * 2014-12-16 2016-06-16 Irobot Corporation Systems and Methods for Capturing Images and Annotating the Captured Images with Information
CN107168338A (en) * 2017-07-07 2017-09-15 中国计量大学 Inertial guide car air navigation aid and inertial guide car based on millimetre-wave radar
CN107703940A (en) * 2017-09-25 2018-02-16 芜湖智久机器人有限公司 A kind of air navigation aid based on ceiling Quick Response Code
CN107844119A (en) * 2017-12-14 2018-03-27 中国计量大学 Visual guidance method and visual guidance car based on space-time conversion
CN109189076A (en) * 2018-10-24 2019-01-11 湖北三江航天万山特种车辆有限公司 A kind of heavy guiding vehicle localization method and heavy guiding vehicle of view-based access control model sensor


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
于娟: "Application design of a QR two-dimensional code based AGV system in warehousing", Journal of Tianjin University of Technology and Education, no. 03, 28 September 2015 (2015-09-28) *
石柏军; 章雪华; 陈奥林; 周友志: "Design and research of an automated stereoscopic warehouse scheme for small automobile parts", Machine Tool & Hydraulics, no. 23, 15 December 2017 (2017-12-15) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113928306A (en) * 2021-11-30 2022-01-14 合肥工业大学 Automobile integrated stability augmentation control method and system
CN113928306B (en) * 2021-11-30 2023-05-02 合肥工业大学 Automobile integrated stability augmentation control method and system

Similar Documents

Publication Publication Date Title
US11740626B2 (en) Redundant pose generation system
US20190310653A1 (en) Topological map generation apparatus for navigation of robot and method thereof
Pomerleau et al. Long-term 3D map maintenance in dynamic environments
Guibas et al. Visibility-based pursuit-evasion in a polygonal environment
US11222530B2 (en) Driving intention determining method and apparatus
CN111308996A (en) Training device and cooperative operation control method thereof
CN111401779B (en) Robot positioning deployment method, device, equipment and storage medium
US10982966B2 (en) Generation of route network data for movement
CN111338343A (en) Automatic guided vehicle scheduling method and device, electronic equipment and storage medium
Andersen et al. Trajectory optimization and situational analysis framework for autonomous overtaking with visibility maximization
US20200135014A1 (en) Cooperative mapping for autonomous vehicles, robots or multi-agent systems
KR20220163426A (en) Method and apparatus for implementing vehicle-road cooperation at intersections without traffic lights
CN113168189A (en) Flight operation method, unmanned aerial vehicle and storage medium
CN112466111B (en) Vehicle driving control method and device, storage medium and electronic equipment
US20230230475A1 (en) Method and apparatus for coordinating multiple cooperative vehicle trajectories on shared road networks
CN112445220A (en) Control method and device for automatic guided vehicle, storage medium and electronic equipment
Boeing et al. WAMbot: Team MAGICian's entry to the Multi Autonomous Ground‐robotic International Challenge 2010
De Rose LiDAR-based Dynamic Path Planning of a mobile robot adopting a costmap layer approach in ROS2
Politi et al. Path planning and landing for unmanned aerial vehicles using ai
Tas et al. High-definition map update framework for intelligent autonomous transfer vehicles
Xidias A decision algorithm for motion planning of car-like robots in dynamic environments
CN113483770A (en) Path planning method and device in closed scene, electronic equipment and storage medium
CN113449798A (en) Port unmanned driving map generation method and device, electronic equipment and storage medium
KR20220166784A (en) Riding method, device, facility and storage medium based on autonomous driving
CN116795087A (en) Scheduling method, scheduling system, electronic equipment and storage medium of autonomous mobile robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination