CN102608998A - Vision guiding AGV (Automatic Guided Vehicle) system and method of embedded system - Google Patents

Vision guiding AGV (Automatic Guided Vehicle) system and method of embedded system Download PDF

Info

Publication number
CN102608998A
Authority
CN
China
Prior art keywords
trolley
camera
image
agv
ground
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2011104366723A
Other languages
Chinese (zh)
Inventor
楼佩煌
王龙军
喻俊
钱晓明
武星
杨旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN2011104366723A priority Critical patent/CN102608998A/en
Publication of CN102608998A publication Critical patent/CN102608998A/en
Pending legal-status Critical Current

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a vision-guided AGV (Automatic Guided Vehicle) system based on an embedded system and a corresponding method. Two cameras fixed on the trolley acquire guide-path information in real time: one camera is tilted forward at an angle to the ground and acquires a long-range image of the path ahead, while the other is mounted at the middle front part inside the trolley, perpendicular to the ground, and is used for secondary precise positioning. Anti-metal radio-frequency tags are embedded in the ground surface at key positions; when the trolley passes over a tag, the vehicle-mounted radio-frequency card reader obtains the information stored in it. A laser scanner scans for obstacles ahead in real time, providing obstacle detection and avoidance. The embedded-system control box serves as the core for image acquisition, image processing and control strategy: the acquired image is processed by Gaussian high-pass filtering, edge detection and a two-step Hough transform to calculate the position deviation and angle deviation of the trolley relative to the current path, and at a station the two-dimensional deviation of the AGV relative to the current positioning reference point is fed back.

Description

Visual guidance AGV system and method of embedded system
Technical Field
The invention relates to a visual navigation and positioning method for an Automatic Guided Vehicle (AGV) based on an embedded system, suitable for automatic material conveying in automated logistics and for production-line assembly.
Background
The automatic guided vehicle is an autonomous mobile robot widely used in factory automation lines, warehouse logistics, airports, ports, banks and hospitals. It carries materials along a planned route, and its most important feature compared with other kinds of robots is autonomous mobility: it can move by itself in a known, structured environment. It is therefore widely applied in national defence, aerospace, industry, transportation, scientific research and other civil fields. Automated guided vehicles are an effective means of logistics transport in today's Flexible Manufacturing Systems (FMS), Computer Integrated Manufacturing Systems (CIMS) and automated storage systems, and the core of such a system is an unmanned industrial handling vehicle. Common automatic guided vehicles are powered by storage batteries, carry loads from several kilograms to hundreds of tons, and are controlled by vehicle-mounted industrial controllers. A system composed of several automatic guided vehicles is provided with a centralized control platform that manages each vehicle and supervises and optimizes the operation of the AGVs; for example, a control computer creates tasks, generates maps, issues handling instructions, tracks parts in transit, and controls AGV routing and path planning. Such systems offer a wide service area, long transport routes, flexibility, low cost, high reliability, good predictability, improved logistics management and unmanned operation.
There are currently two main AGV navigation methods in large-scale use worldwide: electromagnetic guidance and laser guidance. Electromagnetic guidance is the earliest AGV navigation mode; its technology is mature but it is inflexible. Laser guidance has high positioning accuracy but high cost and is only suitable for indoor use. The buried-wire guidance method based on electromagnetic induction was the first successfully applied to trackless AGVs; the method is mature and is still the main guidance mode of trackless AGVs. It requires a special cable to be buried under the path the AGV must travel, and a sensor on the AGV tracks the cable by electromagnetic induction to achieve guidance. Its advantages are high reliability, economy and practicality; its main defects are that changing the AGV path is relatively difficult and costly, which greatly reduces the flexibility of the system.
Laser guidance works by installing a set of reflectors for laser or infrared beams at specific positions in the AGV's operating space. While travelling, the AGV continuously illuminates these mirror surfaces; using the angle and time-difference information of the incident and reflected beams, the absolute position coordinates of the AGV are computed from a mathematical model to achieve guidance. A laser-guided AGV must receive the reflected beams of at least four mirrors simultaneously during operation, and the position coordinates and mounting angles of the reflectors must be strictly corrected. The laser guidance method therefore needs enough reflectors and a wide scanning space, and is limited to indoor use; installing the reflectors is demanding, labour-intensive and costly. Its advantage is high guidance and positioning accuracy.
The guidance method based on ultrasonic measurement is similar to laser guidance, except that no dedicated reflectors are needed; ordinary wall surfaces or similar objects can be used, which provides a more flexible and low-cost solution in specific environments [13]. However, a large reflecting surface is required, so it is often difficult to apply in a manufacturing-shop environment.
Vision-guided automated guided vehicles (V-AGVs) have been a hot spot of AGV research at home and abroad in recent years. Guidance is typically provided by pasting or painting lines and symbols of a particular shape or colour in the environment; the vision system recognizes predefined path features, including the position and angle deviation of the guide path relative to the AGV, as well as path nodes, stations, and turning, stopping, acceleration and deceleration markers. The bottleneck in implementing visual guidance is the real-time performance of image processing. With the development of high-speed processors and the maturing of image-processing algorithms, this technology is attracting increasing attention from researchers worldwide. Visual guidance offers simple and flexible path setting, low cost, easy path changes, strong environmental adaptability and wide applicability, and therefore has broad application prospects.
The method described in Chinese patent publication No. CN1438138A uses a forward-tilted camera to photograph the guide path; because the field of view is large, each pixel spacing corresponds to a relatively large actual distance, so the navigation and positioning accuracy is low. The core system, built by adding expansion cards to an industrial control computer, is costly, power-hungry, bulky and has poor real-time performance. The system uses station address-code marks composed of Arabic numerals together with acceleration, deceleration and parking marks; approached from the opposite direction these can be misread or take on the opposite meaning, so the trolley can only travel along the path in one direction.
The method described in Chinese patent publication No. CN101183265A also uses an industrial control computer as the core system processor and uses workstation address-coding marks composed of Arabic numerals. In addition, radio-frequency technology is introduced: spatial coordinate information is written in advance into radio-frequency identification tags laid under the guide belt, and when the trolley runs into a tag area the radio-frequency card reader obtains the information in the tag and sends it to the main control computer through the industrial computer's wireless module. This method does not make good use of the abundant memory resources of the radio-frequency tags, and its real-time performance is not high.
The method described in Chinese patent publication No. CN101197000A uses radio-frequency positioning with a short-range card reader; its positioning accuracy of 1-5 cm is not high.
The method described in Chinese patent publication No. CN1651863A uses an ultrasonic sensor for positioning, and the short range of Bluetooth wireless communication cannot meet the requirement for long-distance communication.
In summary, the existing methods have two obvious disadvantages: 1. the positioning accuracy is not high; 2. control based on a board-card industrial personal computer is costly and power-hungry.
Disclosure of Invention
The invention provides an image acquisition, processing and control method for an automatic guided vehicle implemented with an embedded system, suitable for automatic material transfer in automated logistics and for production-line assembly.
A visual-guidance AGV system based on an embedded system adopts embedded hardware composed of a DSP (digital signal processor) as the image processor, an ARM (advanced RISC machine) as the task manager and an FPGA (field programmable gate array) as a coprocessor. Camera I is installed tilted forward at an angle to the ground and acquires the path information ahead, allowing deceleration to be anticipated before turning, parking and positioning; camera II is installed at the centre of the trolley, perpendicular to the ground, and acquires the current path information while the trolley is running, including the position and angle deviation of the trolley relative to the guide path during normal travel, precise positioning at a parking point and precise turning at a turning position.
The installation height of camera I and its angle to the ground are determined by the maximum running speed of the trolley; the farthest point of the field of view is 1-1.2 times the distance travelled per second at maximum speed.
The calibration method of the visual-guidance AGV system of the embedded system comprises the following steps: two cameras (7) and (12) fixed on the trolley acquire guide-path information in real time, camera I (12) being tilted forward at an angle to the ground to acquire a long-range image and predict the path the trolley is about to travel, and camera II (7) being installed at the middle front part of the trolley, perpendicular to the ground, for secondary precise positioning; anti-metal radio-frequency tags are embedded in the ground surface at specific positions, and when the trolley passes over a tag the vehicle-mounted radio-frequency card reader (11) obtains the information in the tag; the system uses a laser scanner (14) to scan obstacles within 180° and 3 metres ahead in real time, realizing obstacle detection and avoidance; the embedded-system control box (25) serves as the core of image acquisition, image processing and control strategy; the position deviation and angle deviation of the trolley relative to the current path are calculated from the acquired image by Gaussian high-pass filtering, edge detection and a two-step Hough transform, and at a station the two-dimensional deviation of the AGV relative to the current positioning reference point is fed back.
The embedded-system control box (25) comprises a DSP processor and an ARM microcontroller. The DSP processor acquires only the grey-level information of the images, and three buffer areas are allocated in the external SDRAM for the images acquired by each camera: the current processing frame, the last valid acquisition frame and the current acquisition frame. The DSP acquires images in EDMA mode; no management is needed during data acquisition, and only in the hardware interrupt generated after each frame is acquired is the target buffer of the next EDMA transfer updated and the frame just acquired marked as the last valid acquisition frame.
During normal running of the trolley, the image acquired by the camera is processed by Gaussian high-pass filtering and edge detection based on the horizontal and vertical gradient directions; after adaptive binarization, (ρ, θ) is first estimated coarsely with a Hough transform using a 2-pixel distance step and a 2° angle step, and on the basis of this result a precise solution is computed within the range (ρ ± 3, θ ± 3) with a Hough transform using a 1-pixel distance step and a 0.1° angle step. Near a station point, a radio-frequency tag embedded in the ground triggers the vehicle-mounted card reader to obtain the current station information; the DSP receives this information through a serial-port receive interrupt, and the camera is started to acquire images and feed back the two-dimensional deviation of the AGV relative to the positioning reference point.
The image acquisition, processing and control embedded system realized by this method is mainly applied to material conveying and assembly on factory automated-logistics production lines, and can also be used to transport patients or articles in environments unsuitable for manual handling, such as hospitals and chemical plants.
The invention provides a visual-guidance embedded method for an automatic guided vehicle. Compared with designs that extend an industrial personal computer with an image-acquisition card, the embedded platform with a DSP at its core reduces the cost and power consumption of the AGV system and improves its integration and reliability while preserving real-time performance. On this basis, the invention also realizes image calibration and guidance with a dual-camera arrangement and an RFID map-building method, greatly improving the navigation accuracy, predictability and flexibility of the whole system.
Drawings
FIG. 1 is a top view of the automated guided vehicle of the present invention;
FIG. 2 is a front view of the automated guided vehicle of the present invention;
FIG. 3 is a camera mount of the present invention;
FIG. 4 is a diagram of a backplane based embedded design architecture of the present invention;
FIG. 5 is a flow chart of image processing according to the present invention;
FIG. 6 is a schematic diagram of the image acquisition of the present invention;
fig. 7 is a camera calibration template according to the present invention.
FIG. 8 is a calibration image of the camera of the present invention.
FIG. 9 is the image after the radial distortion and nonlinear correction of the camera lens of the invention.
Fig. 10 is a schematic diagram of an embodiment of the present invention.
Detailed Description
Referring to figs. 1 to 4: 1. automatic guided vehicle; 2. planetary reduction gear; 3. coupling; 4. left driving wheel; 5. coupling; 6. left-drive DC servo motor; 7. camera; 8. audible and visual alarm; 9. emergency stop button 1; 10. left turn signal lamp; 11. radio frequency card reader; 12. camera; 13. camera view; 14. laser scanner; 15. front universal wheel; 16. right turn signal lamp; 17. contact-sensing buffer sensor; 18. camera view; 19. planetary reduction gear; 20. coupling; 21. left driving wheel; 22. coupling; 23. right-drive DC servo motor; 24. emergency stop button 2; 25. battery pack; 26. power switch; 27. power-on protection key switch; 28. emergency stop button 3; 29. rear universal wheel; 30. embedded system control box; 31. camera support; 32. camera support; 33. radio frequency tag protector; 34. charging slot; 35. overload protector; 36. light source; 37. bracket end surface 1; 38. bracket connecting rod; 39. spherical hinge 2; 40. bracket end surface 2; 41. fastening knob; 42. spherical hinge 1; 43. power supply board; 44. sensor signal input board; 45. control output board; 46. image acquisition and processing board; 47. motor control board; 48. wireless communication board; 49. reserved upgrade board; 50. circuit board slot; 51. golden-finger slot; 52. backplane.
As shown in figs. 1 to 4, the automatic guided vehicle uses a vehicle-mounted battery pack as its power supply; the number of storage batteries is determined by the actual load and the required running time per charge, and the necessary battery capacity and voltage are obtained by series and parallel connection. Different parts of the system need different supply voltages: the DC servo motors generally run at 48 V or 24 V, with suitable torque and power chosen for the maximum load of the system; the embedded control system has low power consumption, in total below 10 W, and 12 V, 5 V and 3.3 V rails can be derived from the battery through ordinary DC-DC isolated converter modules. The cameras operate at 12 V and the card reader at 5 V, and the power rating of each DC-DC converter module is chosen according to the power consumption of the corresponding components. The power switch and the power-on protection key switch are connected in series to control powering-up of the system.
An embedded system is used to realize image acquisition, image processing, the human-machine interface, motor control, path tracking, positioning, obstacle avoidance and wireless communication for the two cameras. Camera I is placed tilted forward at an angle to the ground to acquire a long-range image; camera II is installed at the centre of the trolley, perpendicular to the ground, for secondary precise positioning, and generally has a small field of view and a high magnification. Image acquisition and processing are controlled by the DSP digital signal processor TMS320DM642; an ARM microcontroller LPC2220 handles the human-machine interface, motor control, wireless communication, obstacle avoidance and battery-level detection.
As shown in fig. 7, the mounting height of camera I and its angle to the ground are determined by the maximum running speed of the trolley; the farthest point of the field of view is generally 1-1.2 times the distance travelled per second at maximum speed. The object distance, focal length and effective pixel count of camera II determine the secondary-positioning accuracy. The pose, focal length and geometric distortion of the two cameras relative to the trolley are each calibrated nonlinearly with the two-step planar-grid calibration method; the accuracy of this calibration affects the precision of the secondary positioning.
As shown in figs. 5-6, in the program flow the VP1 channel is connected to camera II, and accurate displacement and angle information is obtained through filtering, edge extraction and related methods; the VP2 channel is connected to camera I, and rough information about the path ahead is obtained through binarization, curve fitting and similar algorithms and used to predict the path in front. The two image streams are transferred rapidly by EDMA with interrupts, and interrupt switching is used to guarantee the consistency of the images. RFID information is read through the serial port to obtain the vehicle's global position, number, turning information and so on. The laser obstacle sensor and the rest of the vehicle are connected to the ARM controller through the CAN bus to complete the navigation-information communication. The running route of the AGV is formed by sticking ordinary black or white adhesive tape on the floor (chosen for a large grey-level contrast with the ground); the two cameras monitor the route, and through them the AGV can change direction and travel at any angle. A 1/3-inch CCD camera and a zoom camera output a standard PAL analog television signal, which is converted by the decoding chip TVP5150 into standard 8-bit ITU-R BT.656 YCbCr 4:2:2 (720 × 576) embedded-sync digital image data. The I2C interface inside the DSP configures the TVP5150 and receives the horizontal, vertical and field synchronization signals of the YUV (4:2:2) stream. Because the amount of colour data is large, and in order to improve the real-time performance of image processing, this design works only on grey-level images and therefore acquires only the luminance component Y of the YCbCr image data.
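As a concrete illustration of discarding the chroma components, the sketch below pulls the luminance plane out of a packed 4:2:2 frame in software. It is a minimal Python/NumPy sketch, not the DSP/EDMA code path; the Cb Y Cr Y (UYVY) byte order and the 720 × 576 frame size are assumptions based on the BT.656 description above.

```python
import numpy as np

WIDTH, HEIGHT = 720, 576  # PAL frame size from the BT.656 stream

def extract_luma(packed_frame: bytes) -> np.ndarray:
    """Return the 8-bit luminance (Y) plane of a packed 4:2:2 frame.

    Assumes UYVY byte order (Cb Y0 Cr Y1 ...), i.e. every second byte
    is a luma sample; the chroma bytes are simply discarded.
    """
    raw = np.frombuffer(packed_frame, dtype=np.uint8)
    if raw.size != WIDTH * HEIGHT * 2:
        raise ValueError("unexpected frame size for 4:2:2 packing")
    luma = raw[1::2]                      # keep Y, drop Cb/Cr
    return luma.reshape(HEIGHT, WIDTH)

# Example: a synthetic mid-grey frame (Y = 128, Cb = Cr = 128)
frame = bytes([128, 128] * (WIDTH * HEIGHT))
y_plane = extract_luma(frame)
print(y_plane.shape, int(y_plane.mean()))   # (576, 720) 128
```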
In order to improve the real-time performance of the system as far as possible, the DSP acquires only the grey-level information of the images, and three buffer areas are allocated in the external SDRAM for the images acquired by each camera: the current processing frame, the last valid acquisition frame and the current acquisition frame. The DSP acquires images in EDMA mode; no management is needed during data acquisition, and only in the hardware interrupt generated after each frame is acquired is the target buffer of the next EDMA transfer updated and the frame just acquired marked as the last valid acquisition frame.
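The buffer rotation described above can be expressed compactly. The following is a minimal Python model of the three-buffer scheme (current processing frame, last valid acquisition frame, current acquisition frame); the class and method names are illustrative, and the real implementation runs inside the EDMA frame-complete interrupt on the DSP rather than in Python.

```python
class TripleBuffer:
    """Model of the three SDRAM buffers used per camera.

    processing : frame currently being processed
    last_valid : most recently completed acquisition frame
    acquiring  : frame currently being filled by EDMA
    """
    def __init__(self, buffers):
        assert len(buffers) == 3
        self.processing, self.last_valid, self.acquiring = buffers

    def on_frame_complete(self):
        """Called from the per-frame hardware interrupt.

        The buffer EDMA just filled becomes the last valid frame, and the
        previous 'last valid' buffer is recycled as the target of the next
        EDMA transfer. The processing buffer is untouched, so the algorithm
        never sees a half-written image.
        """
        self.last_valid, self.acquiring = self.acquiring, self.last_valid
        return self.acquiring          # next EDMA destination

    def start_processing(self):
        """Swap in the newest complete frame before running the algorithms."""
        self.processing, self.last_valid = self.last_valid, self.processing
        return self.processing
```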
In the normal running process of the trolley, an image acquired by a camera is subjected to Gaussian high-pass filtering and edge detection based on two directions of horizontal and vertical gradients, after self-adaptive binarization, in order to improve the real-time performance of the system and ensure the calculation accuracy, firstly, 2 pixel distance step length and 2-degree angle step length Hough transformation are adopted to roughly calculate (rho, theta), and on the basis of the result, a calculation accurate solution is calculated within the range of (rho +/-3, theta +/-3) by the Hough transformation with 1 pixel distance step length and 0.1-degree angle step length. And in the vicinity of the station point, a radio frequency card embedded in the ground triggers a vehicle-mounted card reader to acquire current station information, the DSP acquires the current station information in a serial port receiving interruption mode, and a camera is started to acquire images and feed back the two-dimensional deviation value of the AGV currently relative to a positioning reference point.
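A coarse-then-fine search like the one above can be written directly against the edge map. The following Python/NumPy sketch accumulates a standard (ρ, θ) Hough transform twice: first with 2-pixel / 2° steps over the full range, then with 1-pixel / 0.1° steps in a ±3 window around the coarse peak. The function names and thresholds are illustrative only, not taken from the patent.

```python
import numpy as np

def hough_peak(edge_img, rho_step, theta_step_deg,
               theta_range=(0.0, 180.0), rho_range=None):
    """Return the (rho, theta_deg) with the most votes for a binary edge image."""
    ys, xs = np.nonzero(edge_img)
    thetas = np.deg2rad(np.arange(theta_range[0], theta_range[1], theta_step_deg))
    diag = np.hypot(*edge_img.shape)
    lo, hi = rho_range if rho_range is not None else (-diag, diag)
    rhos = np.arange(lo, hi, rho_step)
    acc = np.zeros((len(rhos), len(thetas)), dtype=np.int32)
    # vote: rho = x*cos(theta) + y*sin(theta) for every edge point
    for t_idx, t in enumerate(thetas):
        r = xs * np.cos(t) + ys * np.sin(t)
        bins = np.clip(((r - lo) / rho_step).astype(int), 0, len(rhos) - 1)
        acc[:, t_idx] = np.bincount(bins, minlength=len(rhos))
    r_idx, t_idx = np.unravel_index(np.argmax(acc), acc.shape)
    return rhos[r_idx], np.rad2deg(thetas[t_idx])

def two_step_hough(edge_img):
    # Step 1: coarse search, 2-pixel rho step and 2 degree theta step
    rho_c, theta_c = hough_peak(edge_img, rho_step=2.0, theta_step_deg=2.0)
    # Step 2: fine search, 1-pixel / 0.1 degree steps in a +/-3 window
    return hough_peak(edge_img, rho_step=1.0, theta_step_deg=0.1,
                      theta_range=(theta_c - 3.0, theta_c + 3.0),
                      rho_range=(rho_c - 3.0, rho_c + 3.0))
```

The coarse pass keeps the accumulator small so it stays cheap on every frame, and the fine pass only revisits the narrow window around the strongest line, which is the intent of the two-step transform described above.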
Radio-frequency tags are embedded in the ground at key positions along the trolley's running path, and the card reader is mounted at the front of the trolley; the reading range can be adjusted through the antenna power as required, generally 20 mm to 200 mm. When the trolley passes over a tag, the tag triggers the card reader to obtain its information, and the amount of information can be set according to system requirements. A tag can hold a large amount of information such as speed level, path number, station number, warehouse number, parking time and intersection number, which solves the problems of insufficient real-time performance and ambiguity that arise when such marks are identified visually. Because the ground absorbs the electromagnetic waves emitted by the card reader, reliable radio-frequency reading can only be ensured by using 905-925 MHz anti-metal tags.
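The patent only lists the kinds of fields a tag may carry (speed level, path number, station number, warehouse number, parking time, intersection number) without fixing a byte layout, so the sketch below assumes a hypothetical fixed-width layout purely for illustration; the field order, widths, units and serial framing are not taken from the source.

```python
import struct
from collections import namedtuple

# Hypothetical 7-byte payload: field order, widths and units are assumptions.
TagInfo = namedtuple(
    "TagInfo", "speed_level path_no station_no warehouse_no park_time_s crossing_no")
_TAG_STRUCT = struct.Struct(">BBBBHB")   # big-endian: 4 x uint8, uint16, uint8

def decode_tag(payload: bytes) -> TagInfo:
    """Decode the information read from an anti-metal RFID tag.

    In the real system the DSP receives such a payload in its serial-port
    receive interrupt after the on-board reader is triggered by a tag.
    """
    return TagInfo(*_TAG_STRUCT.unpack(payload))

# Example: speed level 2, path 5, station 12, warehouse 3, park 30 s, crossing 7
sample = _TAG_STRUCT.pack(2, 5, 12, 3, 30, 7)
print(decode_tag(sample))
```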
The automatic guided vehicle exchanges data with the host-computer control centre and a handheld remote-control PDA through a wireless Ethernet module. The host computer can store and view the state of any trolley through a database and a human-machine interface; the PDA can control a target trolley remotely in real time over the wireless Ethernet and perform data acquisition and fault diagnosis on its current running state.
As shown in fig. 10, the hardware of the embedded image acquisition, processing and control system is combined on a backplane bus and divided into several function boards, including a power board, a sensor-signal input board, a control output board, an image acquisition and processing board, a motor control board, a wireless communication board and reserved upgrade boards, the latter being kept for expansion according to different application requirements. Each function board plugs into the backplane, which makes system upgrades and maintenance convenient. Function boards that need to communicate with each other do so over an industrial field bus; this communication mode has good real-time performance, strong anti-interference capability, is not limited by the number of nodes, and facilitates system upgrades.
As shown in fig. 10, the embedded-system control box (2) designed around the backplane bus serves as the core of image acquisition, image processing and control strategy, in which image acquisition and processing are controlled by the DSP processor TMS320DM642; an ARM microcontroller LPC2210 handles the human-machine interface, motor control, wireless communication, obstacle avoidance and battery-level detection, and the DSP and the ARM communicate through an HPI interface. To improve real-time performance as far as possible, the DSP allocates three buffer areas for the images acquired by each camera and transfers the images to the external SDRAM in EDMA mode; the acquired image is processed by Gaussian high-pass filtering, edge detection and a two-step Hough transform to calculate the position deviation and angle deviation of the trolley relative to the current path, and at a station the two-dimensional deviation of the AGV relative to the current positioning reference point is fed back. An automatic guided vehicle using this method has low cost, low power consumption, good real-time performance and high reliability; the number of paths and stations in the system can reach tens of thousands; the secondary-positioning accuracy reaches ±1 mm. The system uses the laser scanner (6) to scan obstacles within 180° and 3 metres ahead in real time, realizing obstacle detection and avoidance.
An embedded control box serves as the core of image acquisition, image processing, path tracking, obstacle detection and radio-frequency identification. The left and right driving wheels are arranged on the two sides of the trolley's centre line and provide the power for the trolley to move; a universal wheel at the front and another at the rear provide support. The left and right driving wheels are driven by the left and right DC servo motors respectively, through couplings and planetary gear reduction. The advantage of this design is that when the two driving wheels rotate at the same speed in opposite directions the automatic guided vehicle can turn on the spot; this two-wheel differential steering mode brings out the advantage of a small turning radius in narrow working areas.
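The in-place turning property follows from standard differential-drive kinematics: the body's forward speed is the mean of the two wheel speeds and its yaw rate is their difference over the track width. The sketch below illustrates this relation; the track width value is a placeholder, not a parameter from the patent.

```python
def differential_drive(v_left, v_right, track_width_m=0.5):
    """Body velocities of a two-wheel differential-drive vehicle.

    Returns (v, omega): forward speed in m/s and yaw rate in rad/s.
    """
    v = (v_right + v_left) / 2.0
    omega = (v_right - v_left) / track_width_m
    return v, omega

# Equal speeds in opposite directions -> pure rotation about the centre
print(differential_drive(-0.3, 0.3))   # (0.0, 1.2): turning on the spot
```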
As shown in figs. 1-4, the safety protection system of the automatic guided vehicle comprises the laser scanner, the contact-sensing buffer sensor, the audible and visual alarm, a real-time voltage detection device, the overload protector, emergency stop buttons 1, 2 and 3, and the safety key switch. The green light is on when the trolley runs normally; when an obstacle is encountered the red light comes on and the buzzer sounds; when the battery level is low the yellow light comes on and the buzzer sounds. Two stages of safety protection are provided at the front of the vehicle: the laser scanner and the contact-sensing buffer sensor. The laser scanner scans for obstacles within a 180° sector of a plane in the direction of travel 30 times per second; the scanning angle interval can be set between 1° and 5°, and the scanning distance range between 0 and 30 metres. The contact-sensing buffer sensor detects objects that touch the front of the trolley and triggers the ARM microcontroller through an I/O port to respond with an emergency-stop task; the sensor provides a certain amount of cushioning so that severe collisions are avoided.
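The first protection stage amounts to checking each laser scan against a stop distance. The small sketch below illustrates that check; the 1° angular spacing and the 3 m threshold are taken from the ranges mentioned in the text, while the function and variable names are illustrative.

```python
import numpy as np

def obstacle_in_path(ranges_m, angles_deg, stop_distance_m=3.0,
                     half_width_deg=90.0):
    """Return True if any scan point within the forward 180 degree sector
    is closer than the stop distance.

    ranges_m   : distances returned by the scanner, one per beam
    angles_deg : beam angles, 0 degrees = straight ahead
    """
    ranges = np.asarray(ranges_m, dtype=float)
    angles = np.asarray(angles_deg, dtype=float)
    in_sector = np.abs(angles) <= half_width_deg
    valid = ranges > 0.0                      # zero usually means "no return"
    return bool(np.any(in_sector & valid & (ranges < stop_distance_m)))

# Example: a 181-beam scan at 1 degree spacing with one object 2.4 m ahead
angles = np.arange(-90, 91, 1.0)
scan = np.full_like(angles, 10.0)
scan[90] = 2.4                                # the beam pointing straight ahead
print(obstacle_in_path(scan, angles))         # True
```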
The automatic guiding trolley adopts a design similar to an automobile and is provided with a left turning indicator light and a right turning indicator light. The charging slot is used for charging the storage battery pack.
The camera is fixed with screws to end surface 1 of the camera bracket shown in fig. 3, which has spherical hinges at both ends; end surface 2 of the bracket is fixed to the trolley. The bracket has 6 degrees of freedom, so the mounting position of the camera can be adjusted conveniently. Camera I is tilted forward at an angle to the ground to acquire long-range images; its mounting height and attitude relative to the centre of the trolley are adjusted so that its field of view is directly ahead of the trolley, that is, the transverse centre line of the trolley coincides with the horizontal centre line of the camera's field of view. The mounting height and tilt angle of camera I are related to the maximum running speed of the trolley; the depth of the field of view is 1-1.2 times the distance moved per second. The illuminating lamp provides illumination for camera II, which is installed at the middle front part of the trolley, perpendicular to the ground, for secondary precise positioning. The mounting height and focal length of camera II are adjusted so that its field of view relative to the trolley is as shown in fig. 1, with the centre of the trolley lying inside the field of view close to its lower edge. The imaging magnification of camera II (the actual physical size represented by each pixel) determines the precision of the secondary positioning, and by the principle of optical imaging the object distance, focal length and effective pixel count determine this magnification.
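The magnification mentioned above fixes the secondary-positioning resolution: under the pinhole relations given later, the ground size covered by one pixel is the pixel pitch scaled by the ratio of object distance to focal length. The numbers below are placeholders chosen only to show the arithmetic, not parameters from the patent.

```python
# Ground resolution of camera II (placeholder numbers, pinhole model)
pixel_pitch_mm = 0.0065        # e.g. a 6.5 um pixel on a 1/3-inch sensor
focal_length_mm = 8.0          # lens focal length
object_distance_mm = 400.0     # camera height above the floor

mm_per_pixel = pixel_pitch_mm * object_distance_mm / focal_length_mm
print(f"each pixel covers about {mm_per_pixel:.2f} mm on the ground")  # ~0.33 mm
```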
As shown in figs. 7-9, the pose, focal length and geometric distortion of the two cameras relative to the trolley are each calibrated nonlinearly with the two-step planar-grid calibration method; the accuracy of this calibration affects the precision of the secondary positioning, which can thereby be markedly improved.
Camera calibration is divided into two steps. First, a mathematical model of camera imaging is established from intrinsic and extrinsic parameters according to physical optics and the camera imaging principle; second, these parameters are solved with a direct or an iterative algorithm. The intrinsic parameters reflect the internal geometry of the camera and the optical characteristics of the lens; they are independent of the scene and, once computed, can be reused as long as the focal length is fixed. The extrinsic parameters reflect the position and orientation of the camera relative to a fixed scene in the reference world coordinate system, or equivalently the rigid motion of objects around the camera. Depending on whether lens distortion is taken into account, camera calibration can be divided into linear and nonlinear calibration. Linear calibration solves the parameters by least squares; although simple, it ignores distortion, so the resulting systematic error is large and it has limited practical value. Nonlinear calibration accounts for the radial and tangential distortion of the image caused by a non-ideal lens; the parameters are usually solved by iterative optimization that minimizes a constrained objective function, normally expressed as the sum of the distances between the ideal model values and the actual values of all feature points. The advantage of iterative solution is that almost any nonlinear model can be calibrated, and as the iterations proceed and the function converges, the calibration accuracy keeps improving; however, a set of approximate initial values must be found to guarantee convergence. Tsai, Weng and Wei proposed two-step methods for solving the parameters: a linear model is first used to obtain estimates of the parameters, which are then used as initial values for the iterative method, ensuring rapid convergence.
The following four coordinate systems are involved in a pinhole-model vision system:
Pixel coordinate system (u, v): represents the position of an image pixel in the image array. For an M × N digital image, the coordinate origin is at the upper-left corner of the image; u increases along the columns to the right and v increases along the rows downward, so (u, v) gives the column and row indices of a pixel in the array.
Image-plane coordinate system (x, y): represents the projection of a scene point onto the image plane, in millimetres. The optical axis is perpendicular to the image plane, and the coordinate origin is at the intersection of the optical axis with the image plane, i.e. the centre of the CCD image plane; x is horizontal and y is vertical.
Camera coordinate system $(X_c, Y_c, Z_c)$: its origin is at the camera's optical centre; the $X_c$ and $Y_c$ axes are parallel to the image x and y axes, and the $Z_c$ axis is the optical axis of the camera.
World coordinate system $(X_w, Y_w, Z_w)$: used to describe the coordinates of any object in the environment; here it is mainly used for the coordinates of a space point P.
Conversion from pixel coordinates to image-plane coordinates: in the pixel coordinate system, (u, v) only gives the column and row indices of a pixel in the array and does not express its physical position in the image. In the (x, y) coordinate system the origin $O_1$ is defined at the intersection of the camera's optical axis with the image plane; this point is normally at the centre of the image but may be slightly offset because of camera manufacturing tolerances. If the coordinates of $O_1$ in the (u, v) system are $(u_0, v_0)$ and the size of each pixel along the x and y axes is $dx$, $dy$, then the coordinates of any pixel in the two coordinate systems are related, in matrix form, by

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} 1/dx & 0 & u_0 \\ 0 & 1/dy & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}$$
Conversion between the camera coordinate system and the image-plane coordinate system: under the pinhole approximation, the projection P' of any space point P onto the image is the intersection of the line OP joining the optical centre O and the point P with the image plane. From the proportional relationship, the following holds:

$$Z_c \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & 0 \\ 0 & f & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} \qquad (2)$$

where (x, y) are the image coordinates of point P and $(X_c, Y_c, Z_c)$ are its coordinates in the camera coordinate system.
The relationship between the camera coordinate system and the world coordinate system can be described by a rotation matrix S and a translation vector R. Since the homogeneous coordinates of a space point P in the world and camera coordinate systems are $(X_w, Y_w, Z_w, 1)^T$ and $(X_c, Y_c, Z_c, 1)^T$ respectively, the following relationship holds:

$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} = \begin{bmatrix} S_{11} & S_{12} & S_{13} & R_1 \\ S_{21} & S_{22} & S_{23} & R_2 \\ S_{31} & S_{32} & S_{33} & R_3 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} = M_2 \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \qquad (3)$$

where $S = \begin{bmatrix} S_{11} & S_{12} & S_{13} \\ S_{21} & S_{22} & S_{23} \\ S_{31} & S_{32} & S_{33} \end{bmatrix}$ is a 3 × 3 orthonormal rotation matrix, $R = (R_1, R_2, R_3)^T$ is a three-dimensional translation vector, $0 = (0, 0, 0)^T$, and $M_2$ is a 3 × 4 matrix.
In the world coordinate system established here, $Z_w = 0$, so equation (3) can be written as

$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} = \begin{bmatrix} S_{11} & S_{12} & R_1 \\ S_{21} & S_{22} & R_2 \\ S_{31} & S_{32} & R_3 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ 1 \end{bmatrix}$$
Substituting the pixel-coordinate and projection relations above into equation (3) gives the relation between the coordinates of point P in the world coordinate system and the pixel coordinates (u, v) of its projection P':

$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} a_x & 0 & u_0 \\ 0 & a_y & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} S \mid R \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} = M_1 M_2 \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} = M \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}$$

and, with $Z_w = 0$,

$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} a_x & 0 & u_0 \\ 0 & a_y & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} S_{11} & S_{12} & R_1 \\ S_{21} & S_{22} & R_2 \\ S_{31} & S_{32} & R_3 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ 1 \end{bmatrix}$$

where $a_x = f/dx$ and $a_y = f/dy$ are the focal length expressed in pixels in the horizontal and vertical directions; M is a 3 × 4 matrix called the projection matrix; $M_1$ is determined entirely by $a_x$, $a_y$, $u_0$, $v_0$, which depend only on the internal structure of the camera and are therefore called the intrinsic parameters; $M_2$ is determined entirely by the orientation of the camera relative to the world coordinate system and is called the extrinsic parameters. Determining the intrinsic and extrinsic parameters of a given camera is called camera calibration.
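To make the chain of transformations concrete, the sketch below projects a ground point ($Z_w = 0$) into pixel coordinates using an intrinsic matrix and extrinsic parameters, exactly as in the last equation above; the numeric parameter values are placeholders for illustration, not calibration results from the patent.

```python
import numpy as np

def project_ground_point(Xw, Yw, intrinsics, S, R):
    """Project a world point on the ground plane (Zw = 0) to pixel coordinates.

    intrinsics : 3x3 matrix [[ax, 0, u0], [0, ay, v0], [0, 0, 1]]
    S          : 3x3 rotation matrix (world -> camera)
    R          : 3-vector translation
    """
    extrinsics = np.hstack([S[:, :2], R.reshape(3, 1)])   # Zw = 0 drops one column
    homog = intrinsics @ extrinsics @ np.array([Xw, Yw, 1.0])
    u, v = homog[:2] / homog[2]                            # divide by Zc
    return u, v

# Placeholder parameters (illustrative only)
K = np.array([[800.0, 0.0, 360.0],
              [0.0, 800.0, 288.0],
              [0.0, 0.0, 1.0]])
S = np.eye(3)                       # camera axes aligned with the world axes
R = np.array([0.0, 0.0, 500.0])     # camera 500 mm from the ground plane
print(project_ground_point(100.0, 50.0, K, S, R))   # (520.0, 368.0)
```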
Real lenses usually have some distortion, the main distortion being radial distortion, and also have slight tangential distortion. The above model can be extended to:
$$\begin{cases} x' = x\,(1 + k_1 r^2 + k_2 r^4) + 2 p_1 x y + p_2 (r^2 + 2 x^2) \\ y' = y\,(1 + k_1 r^2 + k_2 r^4) + 2 p_2 x y + p_1 (r^2 + 2 y^2) \end{cases}$$

where $r^2 = x^2 + y^2$; $k_1$ and $k_2$ are the radial distortion coefficients, and $p_1$ and $p_2$ are the tangential distortion coefficients.
A standard grid template (the method uses a 20 × 20 mm square-grid array), shown in fig. 7, is placed in the field of view of each camera, and images are acquired as shown in figs. 8 and 9; the coordinate transformation of the two cameras relative to the trolley is then computed with the two-step plane-model method. Because camera II is used for precise positioning, the radial distortion of its lens is computed with the distortion model above, and the values of $k_1$ and $k_2$ are obtained once the iterative computation has converged.
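One practical way to carry out such a planar two-step calibration today is OpenCV's calibrateCamera, which internally uses a closed-form initial estimate followed by iterative refinement of the distortion model above. The sketch below is an assumption-laden illustration, not the patent's procedure: it uses a chessboard detector as a stand-in for the 20 × 20 mm square-grid template, and the file names and board dimensions are placeholders.

```python
import cv2
import numpy as np

# Placeholder grid: 9 x 6 inner corners, 20 mm pitch (stand-in for the template)
PATTERN = (9, 6)
SQUARE_MM = 20.0

# Planar object points: Zw = 0 for every corner, as in the model above
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

obj_points, img_points = [], []
for name in ["calib_01.png", "calib_02.png", "calib_03.png"]:   # placeholder files
    gray = cv2.imread(name, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        continue
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001))
        obj_points.append(objp)
        img_points.append(corners)

# Step 1 (closed-form estimate) and step 2 (iterative refinement) both happen
# inside calibrateCamera; dist holds [k1, k2, p1, p2, k3].
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("reprojection error:", rms)
print("k1, k2 =", dist[0][0], dist[0][1])

# Undistort an image with the recovered model (cf. fig. 9)
corrected = cv2.undistort(gray, K, dist)
```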

Claims (5)

1. A visual-guidance AGV system based on an embedded system, adopting embedded hardware composed of a DSP (digital signal processor) as the image processor, an ARM as the task manager and an FPGA (field programmable gate array) as a coprocessor, characterized in that camera I is installed tilted forward at an angle to the ground and is used for acquiring the path information ahead and anticipating deceleration before turning, parking and positioning; and camera II is installed at the centre of the trolley, perpendicular to the ground, and is used for acquiring the current path information while the trolley is running, including the position and angle deviation of the trolley relative to the guide path during normal travel, precise positioning at a parking point and precise turning at a turning position.
2. The visual-guidance AGV system based on an embedded system according to claim 1, characterized in that the installation height of camera I and its angle to the ground are determined by the maximum running speed of the trolley, the farthest point of the field of view being 1-1.2 times the distance travelled per second at maximum speed.
3. A method for calibrating the visual-guidance AGV system according to claim 1, characterized in that: two cameras (7) and (12) fixed on the trolley acquire guide-path information in real time, camera I (12) being tilted forward at an angle to the ground to acquire a long-range image and predict the path the trolley is about to travel, and camera II (7) being installed at the middle front part of the trolley, perpendicular to the ground, for secondary precise positioning; anti-metal radio-frequency tags are embedded in the ground surface at specific positions, and when the trolley passes over a tag the vehicle-mounted radio-frequency card reader (11) obtains the information in the tag; the system uses a laser scanner (14) to scan obstacles within 180° and 3 metres ahead in real time, realizing obstacle detection and avoidance; the embedded-system control box (25) serves as the core of image acquisition, image processing and control strategy; the position deviation and angle deviation of the trolley relative to the current path are calculated from the acquired image by Gaussian high-pass filtering, edge detection and a two-step Hough transform, and at a station the two-dimensional deviation of the AGV relative to the current positioning reference point is fed back.
4. The method for calibrating the visual-guidance AGV system according to claim 3, characterized in that the embedded-system control box (25) comprises a DSP processor and an ARM microcontroller; the DSP processor acquires only the grey-level information of the images, and three buffer areas are allocated in the external SDRAM for the images acquired by each camera: the current processing frame, the last valid acquisition frame and the current acquisition frame; the DSP acquires images in EDMA mode, no management is needed during data acquisition, and only in the hardware interrupt generated after each frame is acquired is the target buffer of the next EDMA transfer updated and the frame just acquired marked as the last valid acquisition frame.
5. The method for calibrating the visual-guidance AGV system according to claim 4, characterized in that during normal running of the trolley the image acquired by the camera is processed by Gaussian high-pass filtering and edge detection based on the horizontal and vertical gradient directions; after adaptive binarization, the line parameters (ρ, θ) are first estimated coarsely with a Hough transform using a 2-pixel distance step and a 2° angle step, and on the basis of this result a precise solution is computed within the range (ρ ± 3, θ ± 3) with a Hough transform using a 1-pixel distance step and a 0.1° angle step; near a station point, a radio-frequency tag embedded in the ground triggers the vehicle-mounted card reader to obtain the current station information, the DSP receives this information through a serial-port receive interrupt, and the camera is started to acquire images and feed back the two-dimensional deviation of the AGV relative to the positioning reference point.
CN2011104366723A 2011-12-23 2011-12-23 Vision guiding AGV (Automatic Guided Vehicle) system and method of embedded system Pending CN102608998A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011104366723A CN102608998A (en) 2011-12-23 2011-12-23 Vision guiding AGV (Automatic Guided Vehicle) system and method of embedded system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2011104366723A CN102608998A (en) 2011-12-23 2011-12-23 Vision guiding AGV (Automatic Guided Vehicle) system and method of embedded system

Publications (1)

Publication Number Publication Date
CN102608998A true CN102608998A (en) 2012-07-25

Family

ID=46526450

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2011104366723A Pending CN102608998A (en) 2011-12-23 2011-12-23 Vision guiding AGV (Automatic Guided Vehicle) system and method of embedded system

Country Status (1)

Country Link
CN (1) CN102608998A (en)

Cited By (61)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102749922A (en) * 2012-07-26 2012-10-24 苏州工业园区职业技术学院 Artificially assembled and disassembled automatic guided vehicle control system
CN102830704A (en) * 2012-09-19 2012-12-19 苏州工业园区职业技术学院 Single drive manual loading and unloading automatic guided vehicle (AGV) control system
CN102955476A (en) * 2012-11-12 2013-03-06 宁波韵升股份有限公司 Automatic guided vehicle (AGV) path planning method based on radio frequency identification (RFID) technology
CN102968119A (en) * 2012-11-22 2013-03-13 日东电子发展(深圳)有限公司 Automatic visual guide vehicle for constant illumination
CN102980555A (en) * 2012-12-06 2013-03-20 紫光股份有限公司 Method and device for detecting direction of optical imaging type wheeled mobile robot
CN103064417A (en) * 2012-12-21 2013-04-24 上海交通大学 Global localization guiding system and method based on multiple sensors
CN103196678A (en) * 2013-03-26 2013-07-10 北京嘉悦和汽车科技有限公司 Real-time revise device and method of input picture of four-wheel position finder based on digital signal processor (DSP)
CN103294059A (en) * 2013-05-21 2013-09-11 无锡普智联科高新技术有限公司 Hybrid navigation belt based mobile robot positioning system and method thereof
CN103488174A (en) * 2013-09-16 2014-01-01 北京邮电大学 Automatic guiding control method, device and system
CN103488176A (en) * 2013-09-29 2014-01-01 中国科学院深圳先进技术研究院 Automatic guided vehicle scheduling method and system
CN103558855A (en) * 2013-11-10 2014-02-05 广西柳工路创制造科技有限公司 Intelligent logistics tracking trolley and control system thereof
CN103713633A (en) * 2012-10-04 2014-04-09 财团法人工业技术研究院 Travel control device and automatic guide vehicle with same
CN104029207A (en) * 2013-03-08 2014-09-10 科沃斯机器人科技(苏州)有限公司 Laser-guided walking operation system for self-moving robot and control method for same
CN104050729A (en) * 2013-03-12 2014-09-17 雷蒙德股份有限公司 System and Method for Gathering Video Data Related to Operation of an Autonomous Industrial Vehicle
CN104503451A (en) * 2014-11-27 2015-04-08 华南农业大学 Obstacle-avoidance automatic guidance method and automatic guided vehicle based on vision and ultrasonic sensing
CN104635735A (en) * 2014-12-03 2015-05-20 上海好创机电工程有限公司 Novel AGV visual navigation control method
CN104699104A (en) * 2015-03-17 2015-06-10 武汉纺织大学 Self-adaptive AGV (Automatic Guided Vehicle) visual navigation sight adjusting device and trace tracking method
CN105446333A (en) * 2015-11-10 2016-03-30 中辰环能技术(株洲)有限公司 Visual agv navigation system
CN105467991A (en) * 2014-09-29 2016-04-06 日立建机株式会社 Stop position determining device for transport vehicle and transport vehicle with the same
CN105468005A (en) * 2016-02-03 2016-04-06 天津市乐图软件科技有限公司 Automatic trolley guiding system and method based on RFID and CCD
CN105679168A (en) * 2015-12-04 2016-06-15 南京航空航天大学 Teaching experimental platform simulating ramp vehicle dispatching
CN105737838A (en) * 2016-02-22 2016-07-06 广东嘉腾机器人自动化有限公司 AGV path tracking method
CN105843229A (en) * 2016-05-17 2016-08-10 中外合资沃得重工(中国)有限公司 Unmanned intelligent vehicle and control method
CN106094815A (en) * 2016-05-31 2016-11-09 芜湖智久机器人有限公司 A kind of control system of AGV
CN106125738A (en) * 2016-08-26 2016-11-16 北京航空航天大学 A kind of identification of pallets device and method based on AGV
CN106325267A (en) * 2015-06-26 2017-01-11 北京卫星环境工程研究所 Omnidirectional mobile platform vehicle with automatic line patrolling and obstacle avoiding functions
CN106394244A (en) * 2015-07-29 2017-02-15 无锡美驱科技有限公司 Wireless control system of electric vehicle drive device
CN106444758A (en) * 2016-09-27 2017-02-22 华南农业大学 Road identification and path optimization AGV (automatic guided vehicle) based on machine vision and control system of AGV
CN106774335A (en) * 2017-01-03 2017-05-31 南京航空航天大学 Guiding device based on multi-vision visual and inertial navigation, terrestrial reference layout and guidance method
TWI585561B (en) * 2013-06-03 2017-06-01 新智控私人有限公司 Method and apparatus for offboard navigation of a robotic device
CN106843218A (en) * 2017-02-16 2017-06-13 上海理工大学 Workshop homing guidance device dispatching method
CN106873590A (en) * 2017-02-21 2017-06-20 广州大学 A kind of carrier robot positioning and task management method and device
CN106950985A (en) * 2017-03-20 2017-07-14 成都通甲优博科技有限责任公司 A kind of automatic delivery method and device
CN107045677A (en) * 2016-10-14 2017-08-15 北京石油化工学院 A kind of harmful influence warehouse barrier Scan orientation restoring method, apparatus and system
CN107422730A (en) * 2017-06-09 2017-12-01 武汉市众向科技有限公司 The AGV transportation systems of view-based access control model guiding and its driving control method
CN107491076A (en) * 2017-09-21 2017-12-19 南京中高知识产权股份有限公司 Intelligent storage robot
CN107601202A (en) * 2017-11-09 2018-01-19 上海木爷机器人技术有限公司 Lift space detection method and device
CN107831675A (en) * 2016-09-16 2018-03-23 天津思博科科技发展有限公司 Online robot control device based on intelligence learning technology
CN107885199A (en) * 2017-10-11 2018-04-06 上海艾崇机器人有限公司 A kind of AGV arrives at a station positioner and its method
CN107918840A (en) * 2016-10-09 2018-04-17 浙江国自机器人技术有限公司 A kind of mobile unit, stock article management system and the method for positioning mobile unit
CN108021113A (en) * 2017-12-14 2018-05-11 大连四达高技术发展有限公司 Cutter transport system
CN108255195A (en) * 2018-01-19 2018-07-06 启迪国信科技有限公司 Unmanned plane and UAV system
CN108267139A (en) * 2018-03-07 2018-07-10 广州大学 A kind of positioning device and localization method of AGV trolleies
CN108767933A (en) * 2018-07-30 2018-11-06 杭州迦智科技有限公司 A kind of control method and its device, storage medium and charging equipment for charging
CN108845573A (en) * 2018-05-30 2018-11-20 上海懒书智能科技有限公司 A kind of laying of AGV visual track and optimization method
CN108955519A (en) * 2018-04-09 2018-12-07 江苏金海湾智能制造有限公司 Express delivery living object detection system and method
CN108983777A (en) * 2018-07-23 2018-12-11 浙江工业大学 A kind of autonomous exploration and barrier-avoiding method based on the selection of adaptive forward position goal seeking point
CN109214483A (en) * 2018-09-13 2019-01-15 公安部道路交通安全研究中心 A kind of motor vehicle checking system and method
CN109571411A (en) * 2019-01-08 2019-04-05 贵州大学 It is a kind of to be convenient for fixed mobile robot platform
CN109656252A (en) * 2018-12-29 2019-04-19 广州市申迪计算机系统有限公司 A kind of middle control degree system and positioning navigation method based on AGV
FR3073508A1 (en) * 2017-11-14 2019-05-17 Renault S.A.S. DELIVERY LINE COMPRISING AN IMPROVED TRANSPORT TROLLEY
CN110070581A (en) * 2019-04-29 2019-07-30 达泊(东莞)智能科技有限公司 Double vision open country localization method, apparatus and system
CN110162066A (en) * 2019-06-27 2019-08-23 广东利元亨智能装备股份有限公司 Intelligent cruise control system
CN110182509A (en) * 2019-05-09 2019-08-30 盐城品迅智能科技服务有限公司 A kind of track guidance van and the barrier-avoiding method of logistic storage intelligent barrier avoiding
CN110347160A (en) * 2019-07-17 2019-10-18 武汉工程大学 A kind of automatic guide vehicle and its air navigation aid based on dual camera barcode scanning
WO2020191711A1 (en) * 2019-03-28 2020-10-01 Baidu.Com Times Technology (Beijing) Co., Ltd. A camera-based low-cost lateral position calibration method for level-3 autonomous vehicles
CN112607293A (en) * 2017-01-16 2021-04-06 浙江国自机器人技术股份有限公司 Safety protection method and safety protection structure of AGV robot
CN112896546A (en) * 2021-02-22 2021-06-04 浙江大学 Aircraft mobile support platform layout
CN114056863A (en) * 2020-08-03 2022-02-18 中西金属工业株式会社 Unmanned transport vehicle system
CN114296443A (en) * 2021-11-24 2022-04-08 贵州理工学院 Unmanned modular combine harvester
CN114578772A (en) * 2021-04-16 2022-06-03 西南交通大学 AGV cluster control system design framework and method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1696854A (en) * 2004-05-14 2005-11-16 三星光州电子株式会社 Mobile robot and system and method of compensating for path diversions
CN1721142A (en) * 2004-07-15 2006-01-18 中国科学院自动化研究所 A kind of stereoscopic vision monitoring device with five degrees of freedom
CN101183265A (en) * 2007-11-15 2008-05-21 浙江大学 Automatic guidance system based on radio frequency identification tag and vision and method thereof
CN101561680A (en) * 2009-05-11 2009-10-21 南京航空航天大学 Embedded guidance device of autonomous vehicle and intelligent composite guidance method thereof
EP2354877A1 (en) * 2010-02-02 2011-08-10 Firac Method for controlling an automatically guided vehicle and associated vehicle

Cited By (79)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102749922A (en) * 2012-07-26 2012-10-24 苏州工业园区职业技术学院 Artificially assembled and disassembled automatic guided vehicle control system
CN102830704A (en) * 2012-09-19 2012-12-19 苏州工业园区职业技术学院 Single drive manual loading and unloading automatic guided vehicle (AGV) control system
CN103713633A (en) * 2012-10-04 2014-04-09 财团法人工业技术研究院 Travel control device and automatic guide vehicle with same
CN102955476A (en) * 2012-11-12 2013-03-06 宁波韵升股份有限公司 Automatic guided vehicle (AGV) path planning method based on radio frequency identification (RFID) technology
CN102968119A (en) * 2012-11-22 2013-03-13 日东电子发展(深圳)有限公司 Automatic visual guide vehicle for constant illumination
CN102980555A (en) * 2012-12-06 2013-03-20 紫光股份有限公司 Method and device for detecting direction of optical imaging type wheeled mobile robot
CN103064417A (en) * 2012-12-21 2013-04-24 上海交通大学 Global localization guiding system and method based on multiple sensors
CN103064417B (en) * 2012-12-21 2016-06-01 上海交通大学 A kind of Global localization based on many sensors guiding system and method
CN104029207A (en) * 2013-03-08 2014-09-10 科沃斯机器人科技(苏州)有限公司 Laser-guided walking operation system for self-moving robot and control method for same
CN104050729B (en) * 2013-03-12 2017-12-01 雷蒙德股份有限公司 The system and method for collecting the video data relevant with autonomous industrial vehicle operating
CN104050729A (en) * 2013-03-12 2014-09-17 雷蒙德股份有限公司 System and Method for Gathering Video Data Related to Operation of an Autonomous Industrial Vehicle
CN103196678A (en) * 2013-03-26 2013-07-10 北京嘉悦和汽车科技有限公司 Real-time revise device and method of input picture of four-wheel position finder based on digital signal processor (DSP)
CN103294059A (en) * 2013-05-21 2013-09-11 无锡普智联科高新技术有限公司 Hybrid navigation belt based mobile robot positioning system and method thereof
TWI585561B (en) * 2013-06-03 2017-06-01 新智控私人有限公司 Method and apparatus for offboard navigation of a robotic device
CN103488174A (en) * 2013-09-16 2014-01-01 北京邮电大学 Automatic guiding control method, device and system
CN103488174B (en) * 2013-09-16 2015-11-25 北京邮电大学 Homing guidance control method, control device and system
CN103488176A (en) * 2013-09-29 2014-01-01 中国科学院深圳先进技术研究院 Automatic guided vehicle scheduling method and system
CN103558855A (en) * 2013-11-10 2014-02-05 广西柳工路创制造科技有限公司 Intelligent logistics tracking trolley and control system thereof
CN105467991A (en) * 2014-09-29 2016-04-06 日立建机株式会社 Stop position determining device for transport vehicle and transport vehicle with the same
CN105467991B (en) * 2014-09-29 2020-07-31 日立建机株式会社 Stop position calculation device for transport vehicle and transport vehicle having the same
CN104503451B (en) * 2014-11-27 2017-05-24 华南农业大学 Obstacle-avoidance automatic guidance method and automatic guided vehicle based on vision and ultrasonic sensing
CN104503451A (en) * 2014-11-27 2015-04-08 华南农业大学 Obstacle-avoidance automatic guidance method and automatic guided vehicle based on vision and ultrasonic sensing
CN104635735A (en) * 2014-12-03 2015-05-20 上海好创机电工程有限公司 Novel AGV visual navigation control method
CN104699104A (en) * 2015-03-17 2015-06-10 武汉纺织大学 Self-adaptive AGV (Automatic Guided Vehicle) visual navigation sight adjusting device and trace tracking method
CN104699104B (en) * 2015-03-17 2018-02-02 武汉纺织大学 A kind of stitching tracking of adaptive AGV vision guided navigation sight adjusting apparatus
CN106325267A (en) * 2015-06-26 2017-01-11 北京卫星环境工程研究所 Omnidirectional mobile platform vehicle with automatic line patrolling and obstacle avoiding functions
CN106394244A (en) * 2015-07-29 2017-02-15 无锡美驱科技有限公司 Wireless control system of electric vehicle drive device
CN105446333A (en) * 2015-11-10 2016-03-30 中辰环能技术(株洲)有限公司 Visual agv navigation system
CN105679168A (en) * 2015-12-04 2016-06-15 南京航空航天大学 Teaching experimental platform simulating ramp vehicle dispatching
CN105468005A (en) * 2016-02-03 2016-04-06 天津市乐图软件科技有限公司 Automatic trolley guiding system and method based on RFID and CCD
CN105737838B (en) * 2016-02-22 2019-04-05 广东嘉腾机器人自动化有限公司 A kind of AGV path following method
CN105737838A (en) * 2016-02-22 2016-07-06 广东嘉腾机器人自动化有限公司 AGV path tracking method
CN105843229A (en) * 2016-05-17 2016-08-10 中外合资沃得重工(中国)有限公司 Unmanned intelligent vehicle and control method
CN105843229B (en) * 2016-05-17 2018-12-18 中外合资沃得重工(中国)有限公司 Unmanned intelligent carriage and control method
CN106094815A (en) * 2016-05-31 2016-11-09 芜湖智久机器人有限公司 A kind of control system of AGV
CN106125738A (en) * 2016-08-26 2016-11-16 北京航空航天大学 A kind of identification of pallets device and method based on AGV
CN107831675A (en) * 2016-09-16 2018-03-23 天津思博科科技发展有限公司 Online robot control device based on intelligence learning technology
CN106444758B (en) * 2016-09-27 2019-07-23 华南农业大学 Road identification and path optimization AGV transport vehicle based on machine vision
CN106444758A (en) * 2016-09-27 2017-02-22 华南农业大学 Road identification and path optimization AGV (automatic guided vehicle) based on machine vision and control system of AGV
CN107918840A (en) * 2016-10-09 2018-04-17 浙江国自机器人技术有限公司 A kind of mobile unit, stock article management system and the method for positioning mobile unit
CN107045677A (en) * 2016-10-14 2017-08-15 北京石油化工学院 Hazardous chemical warehouse obstacle scanning, positioning and restoration method, apparatus and system
CN106774335A (en) * 2017-01-03 2017-05-31 南京航空航天大学 Guiding device based on multi-vision visual and inertial navigation, terrestrial reference layout and guidance method
CN112607293A (en) * 2017-01-16 2021-04-06 浙江国自机器人技术股份有限公司 Safety protection method and safety protection structure of AGV robot
CN112607293B (en) * 2017-01-16 2022-05-03 浙江国自机器人技术股份有限公司 Safety protection method and safety protection structure of AGV robot
CN106843218A (en) * 2017-02-16 2017-06-13 上海理工大学 Workshop homing guidance device dispatching method
CN106873590B (en) * 2017-02-21 2020-04-14 广州大学 Method and device for positioning and task management of conveying robot
CN106873590A (en) * 2017-02-21 2017-06-20 广州大学 A kind of carrier robot positioning and task management method and device
CN106950985B (en) * 2017-03-20 2020-07-03 成都通甲优博科技有限责任公司 Automatic delivery method and device
CN106950985A (en) * 2017-03-20 2017-07-14 成都通甲优博科技有限责任公司 A kind of automatic delivery method and device
CN107422730A (en) * 2017-06-09 2017-12-01 武汉市众向科技有限公司 Vision-guided AGV transportation system and driving control method thereof
CN107491076A (en) * 2017-09-21 2017-12-19 南京中高知识产权股份有限公司 Intelligent storage robot
CN107885199A (en) * 2017-10-11 2018-04-06 上海艾崇机器人有限公司 A kind of AGV arrives at a station positioner and its method
CN107601202A (en) * 2017-11-09 2018-01-19 上海木爷机器人技术有限公司 Lift space detection method and device
FR3073508A1 (en) * 2017-11-14 2019-05-17 Renault S.A.S. DELIVERY LINE COMPRISING AN IMPROVED TRANSPORT TROLLEY
CN108021113A (en) * 2017-12-14 2018-05-11 大连四达高技术发展有限公司 Cutter transport system
CN108255195B (en) * 2018-01-19 2019-03-05 启迪国信科技有限公司 Unmanned plane and UAV system
CN108255195A (en) * 2018-01-19 2018-07-06 启迪国信科技有限公司 Unmanned plane and UAV system
CN108267139B (en) * 2018-03-07 2023-05-16 广州大学 AGV trolley positioning device and positioning method
CN108267139A (en) * 2018-03-07 2018-07-10 广州大学 A kind of positioning device and localization method of AGV trolleys
CN108955519A (en) * 2018-04-09 2018-12-07 江苏金海湾智能制造有限公司 Express delivery living object detection system and method
CN108845573A (en) * 2018-05-30 2018-11-20 上海懒书智能科技有限公司 A kind of laying of AGV visual track and optimization method
CN108983777A (en) * 2018-07-23 2018-12-11 浙江工业大学 A kind of autonomous exploration and barrier-avoiding method based on the selection of adaptive forward position goal seeking point
CN108767933A (en) * 2018-07-30 2018-11-06 杭州迦智科技有限公司 A kind of control method and its device, storage medium and charging equipment for charging
CN109214483A (en) * 2018-09-13 2019-01-15 公安部道路交通安全研究中心 A kind of motor vehicle checking system and method
CN109656252A (en) * 2018-12-29 2019-04-19 广州市申迪计算机系统有限公司 Central control and dispatching system and positioning and navigation method based on AGV
CN109571411A (en) * 2019-01-08 2019-04-05 贵州大学 Mobile robot platform convenient to fix in position
WO2020191711A1 (en) * 2019-03-28 2020-10-01 Baidu.Com Times Technology (Beijing) Co., Ltd. A camera-based low-cost lateral position calibration method for level-3 autonomous vehicles
CN110070581A (en) * 2019-04-29 2019-07-30 达泊(东莞)智能科技有限公司 Dual-field-of-view positioning method, apparatus and system
CN110182509A (en) * 2019-05-09 2019-08-30 盐城品迅智能科技服务有限公司 Intelligent obstacle-avoiding tracking guided vehicle for logistics storage and obstacle avoidance method thereof
CN110182509B (en) * 2019-05-09 2021-07-13 杭州京机科技有限公司 Intelligent obstacle avoidance tracking guide carrier for logistics storage and obstacle avoidance method
CN110162066A (en) * 2019-06-27 2019-08-23 广东利元亨智能装备股份有限公司 Intelligent cruise control system
CN110347160A (en) * 2019-07-17 2019-10-18 武汉工程大学 Automatic guided vehicle based on dual-camera barcode scanning and navigation method thereof
CN114056863A (en) * 2020-08-03 2022-02-18 中西金属工业株式会社 Unmanned transport vehicle system
CN114056863B (en) * 2020-08-03 2023-10-20 中西金属工业株式会社 Unmanned carrier system
CN112896546A (en) * 2021-02-22 2021-06-04 浙江大学 Aircraft mobile support platform layout
CN114578772A (en) * 2021-04-16 2022-06-03 西南交通大学 AGV cluster control system design framework and method
CN114578772B (en) * 2021-04-16 2023-08-11 青岛中车四方车辆物流有限公司 AGV cluster control system design framework and method
CN114296443A (en) * 2021-11-24 2022-04-08 贵州理工学院 Unmanned modular combine harvester
CN114296443B (en) * 2021-11-24 2023-09-12 贵州理工学院 Unmanned modularized combine harvester

Similar Documents

Publication Publication Date Title
CN102608998A (en) Vision guiding AGV (Automatic Guided Vehicle) system and method of embedded system
CN108388245B (en) AGV trolley indoor positioning navigation system and control method thereof
US10916035B1 (en) Camera calibration using dense depth maps
CN108571971B (en) AGV visual positioning system and method
US11555903B1 (en) Sensor calibration using dense depth maps
CN106680290B (en) Multifunctional detection vehicle in narrow space
JP5588812B2 (en) Image processing apparatus and imaging apparatus using the same
CN113085896B (en) Auxiliary automatic driving system and method for modern rail cleaning vehicle
CN104217439A (en) Indoor visual positioning system and method
CN214520204U (en) Port area intelligent inspection robot based on depth camera and laser radar
JP2015212942A (en) Methods and systems for object detection using laser point clouds
CN108388244A (en) Mobile-robot system, parking scheme based on artificial landmark and storage medium
Liu et al. Deep learning-based localization and perception systems: approaches for autonomous cargo transportation vehicles in large-scale, semiclosed environments
CN111413963B (en) Multifunctional robot autonomous delivery method and system
CN108022448A (en) Reverse vehicle searching system and managing system of car parking
CN207037462U (en) AGV dolly embedded control systems based on ROS
CN112567264A (en) Apparatus and method for acquiring coordinate transformation information
CN110162066A (en) Intelligent cruise control system
CN110766761B (en) Method, apparatus, device and storage medium for camera calibration
CN115223039A (en) Robot semi-autonomous control method and system for complex environment
CN114397877A (en) Intelligent automobile automatic driving system
CN111776942A (en) Tire crane running control system, method and device and computer equipment
CN113081525B (en) Intelligent walking aid equipment and control method thereof
CN111121639B (en) Rigid-flexible integrated crack detection system for narrow building space
Tsukiyama Global navigation system with RFID tags

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20120725