CN110980084A - Warehousing system and related method - Google Patents

Warehousing system and related method

Info

Publication number
CN110980084A
CN110980084A (application CN201911284029.6A)
Authority
CN
China
Prior art keywords
mobile device
server
movable device
image
warehousing system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911284029.6A
Other languages
Chinese (zh)
Inventor
边铁栋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lingdong Technology Beijing Co Ltd
Original Assignee
Lingdong Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lingdong Technology Beijing Co Ltd filed Critical Lingdong Technology Beijing Co Ltd
Priority to CN201911284029.6A priority Critical patent/CN110980084A/en
Publication of CN110980084A publication Critical patent/CN110980084A/en
Priority to PCT/CN2020/133552 priority patent/WO2021115189A1/en
Pending legal-status Critical Current

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B65: CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G: TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G 1/00: Storing articles, individually or in orderly arrangement, in warehouses or magazines
    • B65G 1/02: Storage devices
    • B65G 1/04: Storage devices mechanical
    • B65G 1/0492: Storage devices mechanical with cars adapted to travel in storage aisles
    • B65G 1/137: Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed
    • B65G 1/1373: Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed, for fulfilling orders in warehouses

Abstract

A warehousing system and a related method are disclosed. The warehousing system includes a movable device, a server, and an image sensor. The movable device is used for moving within a warehouse. The server is electrically connected to the movable device and is used for monitoring it. The image sensor is electrically connected to the server and is used for capturing images of the movable device. The server judges the position of the movable device in real time at least according to the size and position of a feature point of the movable device in the image and the shooting angle and position of the image sensor. By positioning the movable device through the image sensor, the disclosed warehousing system and related method allow the movable device to smoothly perform functions such as moving and obstacle avoidance with fewer installed sensors, thereby greatly reducing cost.

Description

Warehousing system and related method
Technical Field
The present invention relates to a system, and more particularly, to a warehousing system and related methods.
Background
When an existing autonomous mobile device (such as an unmanned logistics vehicle) moves within a warehouse, it usually positions itself with its own sensors and transmits the positioning result back to a server. However, the sensors of such devices often suffer electromagnetic interference from one another and fail to operate normally, so personnel must be dispatched to clear the fault, which is time-consuming, labor-intensive, and reduces work efficiency. Yet removing all the sensors would cause the autonomous mobile device to lose its functions of autonomous movement, obstacle avoidance, and cargo handling.
Disclosure of Invention
It is therefore an object of the present invention to provide a warehousing system and a related method that solve the above problems by positioning the movable device with fewer sensors while still enabling it to smoothly move, avoid obstacles, and transport goods.
According to an embodiment of the present invention, a warehousing system is disclosed, which includes a movable device, a server, and an image sensor. The movable device is used for moving within a warehouse. The server is electrically connected to the movable device and is used for monitoring it. The image sensor is electrically connected to the server and is used for capturing images of the movable device. The server judges the position of the movable device in real time at least according to the size and position of a feature point of the movable device in the image and the shooting angle and position of the image sensor.
According to an embodiment of the invention, the characteristic point is an identification mark on the body of the mobile device.
According to an embodiment of the present invention, the mobile device includes a transmission port for connecting an external sensing device, wherein the mobile device obtains the position of the mobile device in the warehouse through the external sensing device and transmits the position to the server.
According to an embodiment of the present invention, the server optimizes the accuracy of determining the position of the mobile device in real time according to the position transmitted by the mobile device, the size and the position of the feature point in the image, and the shooting angle and the position of the image sensor.
According to an embodiment of the present invention, the external sensing device includes at least one of an optical radar, a depth camera, an ultrasonic sensor, and an infrared sensor.
According to an embodiment of the present invention, the movable device includes an odometer for measuring a moving distance of the movable device and an inertial measurement unit for measuring a deflection angle, a velocity, and an acceleration of the movable device; the movable device transmits auxiliary positioning information to the server, so that the server optimizes and judges the accuracy of the position of the movable device in real time according to the auxiliary positioning information, the size and the position of the feature point in the image and the shooting angle and the position of the image sensor; wherein the auxiliary positioning information comprises the movement distance, the deflection angle, the velocity and the acceleration.
According to an embodiment of the present invention, when the moving distance reaches a preset value, the movable device moves to a preset position in the warehouse and zeros the odometer.
According to an embodiment of the present invention, the server determines the position of the movable device in real time according to the relative position of the movable device in the image and the fixed mark in the warehouse, the size and the position of the feature point of the movable device in the image, and the shooting angle and the position of the image sensor.
According to an embodiment of the present invention, the fixed mark is a mark line marked on the storage ground.
According to an embodiment of the present invention, the server controls the movable device to stop moving when a predetermined percentage of the movable device in the image is blocked.
According to an embodiment of the present invention, when an obstacle appears in a sensing range of the image sensor, the server determines a distance between the obstacle and the movable device from an image captured by the image sensor, and when the distance between the obstacle and the movable device is less than a preset distance, the server controls the movable device to stop moving until the movable device receives a movement instruction.
According to an embodiment of the invention, the movement indication is transmitted from the server to the mobile device.
According to an embodiment of the present invention, the movement indication is transmitted from an external electronic device to the mobile device.
According to an embodiment of the present invention, the image sensor is integrated into a security camera.
According to an embodiment of the present invention, a method applied to a warehousing system is disclosed, the method comprising: capturing an image of the movable device through an image sensor; and judging the position of the movable device in real time at least according to the size and the position of the characteristic point of the movable device in the image and the shooting angle and the position of the image sensor.
The warehousing system and the related method disclosed by the invention position the movable device through the image sensor, so that the movable device can still smoothly perform the functions of moving, obstacle avoidance, goods carrying and the like under the condition of reducing the number of installed sensors, and the cost can be greatly reduced.
Drawings
FIG. 1 is a schematic diagram of a warehousing system according to an embodiment of the invention.
FIG. 2 is a diagram of a mobile device according to an embodiment of the invention.
Fig. 3A and 3B are schematic diagrams of positioning a mobile device according to an embodiment of the invention.
FIG. 4A is a diagram of a mobile device according to another embodiment of the invention.
FIG. 4B is a diagram illustrating a mobile device connected to an external sensing device according to an embodiment of the invention.
FIG. 5 is a diagram of a mobile device according to another embodiment of the invention.
FIG. 6 is a diagram of a reset odometer according to an embodiment of the invention.
FIG. 7 is a diagram illustrating positioning of a mobile device according to a fixed index in a warehouse, according to an embodiment of the present invention.
FIG. 8 is a diagram illustrating a stop travel command being issued to a mobile device according to an embodiment of the invention.
FIGS. 9A and 9B are schematic diagrams illustrating a stop travel command being issued to a mobile device according to another embodiment of the invention.
FIG. 10 is a system block diagram of a warehousing system according to one embodiment of the invention.
FIG. 11 is a flowchart illustrating a method applied to a warehousing system according to an embodiment of the invention.
Detailed Description
The following disclosure provides various embodiments or illustrations that can be used to implement various features of the disclosure. The embodiments of components and arrangements described below serve to simplify the present disclosure. It is to be understood that such descriptions are merely illustrative and are not intended to limit the present disclosure. For example, in the description that follows, forming a first feature on or over a second feature may include certain embodiments in which the first and second features are in direct contact with each other; and may also include embodiments in which additional elements are formed between the first and second features described above, such that the first and second features may not be in direct contact. In addition, the present disclosure may repeat reference numerals and/or characters in the various embodiments. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.
Moreover, spatially relative terms, such as "under," "below," "over," "above," and the like, may be used herein to facilitate describing a relationship between one element or feature relative to another element or feature as illustrated in the figures. These spatially relative terms are intended to encompass a variety of different orientations of the device in use or operation in addition to the orientation depicted in the figures. The device may be otherwise oriented (e.g., rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
Although numerical ranges and parameters setting forth the broad scope of the application are approximations, the numerical values set forth in the specific examples are reported as precisely as possible. Any numerical value, however, inherently contains certain standard deviations found in their respective testing measurements. As used herein, "about" generally refers to actual values within plus or minus 10%, 5%, 1%, or 0.5% of a particular value or range. Alternatively, the term "about" means that the actual value falls within the acceptable standard error of the mean, subject to consideration by those of ordinary skill in the art to which this application pertains. It is understood that all ranges, amounts, values and percentages used herein (e.g., to describe amounts of materials, length of time, temperature, operating conditions, quantitative ratios, and the like) are modified by the term "about" in addition to the experimental examples or unless otherwise expressly stated. Accordingly, unless indicated to the contrary, the numerical parameters set forth in the specification and attached claims are approximations that may vary depending upon the desired properties sought to be obtained. At the very least, these numerical parameters are to be understood as meaning the number of significant digits recited and the number resulting from applying ordinary carry notation. Herein, numerical ranges are expressed from one end to the other or between the two ends; unless otherwise indicated, all numerical ranges set forth herein are inclusive of the endpoints.
At present, autonomous mobile devices (such as unmanned logistics vehicles) on the market are often applied in warehousing to transport goods, relying on their own sensors, such as distance sensors, lidar (Light Detection and Ranging), and depth cameras (also called RGB-D cameras), to realize obstacle avoidance, movement, cargo handling, and similar functions. Such a device can also position itself with these sensors and transmit the positioning information to a server through a communication device for control and management. However, the sensors mentioned above are prone to electromagnetic interference and then fail to operate normally, and dispatching personnel to clear the fault takes time and labor and reduces work efficiency. Yet if all the sensors were removed, the autonomous mobile device would lose not only its movement and obstacle avoidance functions but also its positioning function. Therefore, the present invention provides a warehousing system in which the movable device can perform the functions of moving, obstacle avoidance, positioning, and so on with fewer installed sensors.
FIG. 1 is a diagram illustrating a warehousing system 100 according to an embodiment of the present invention. The warehousing system 100 includes a plurality of movable devices for moving within the warehouse, such as the movable device 110 in FIG. 1. In the present embodiment, the movable device 110 is used for moving between storage racks of the warehouse, for example the racks H1 to H4 in the figure, so as to carry goods.
The warehousing system 100 further includes a server 130 and an image sensor 140. The server 130 is electrically connected to the movable device 110, and the movable device 110 can communicate with the server 130 through a communication device disposed therein, so that the server 130 can manage and monitor the movable device 110 as well as the cargo handling conditions, order status, and the like. In this embodiment, the image sensor 140 is integrated into a security camera, is electrically connected to the server 130, and is used for capturing images in the warehouse. In particular, the image sensor 140 captures images of the movable device 110, so that the server 130 can position the movable device 110 through the image sensor 140 and control its movement, thereby accomplishing the warehouse tasks of moving, avoiding obstacles, and carrying goods.
It should be noted that the above "electrical connection" is not limited to a physical connection through physical lines; two components that communicate wirelessly to transmit signals also fall within the scope of electrical connection. For example, the image sensor 140 may transmit the captured image to the server 130 through a cable, or it may transmit the captured image to the server 130 through a wireless access point, thereby implementing wireless communication.
FIG. 2 is a diagram of the movable device 110 according to an embodiment of the invention. As in the embodiment of FIG. 1, the movable device 110 is used to move in the warehouse to handle goods. The movable device 110 includes a communication device 201, a processor 202, a driving assembly 203, and a body 204. The communication device 201 is used for wireless communication with the server 130; through it, the movable device 110 can receive various commands from the server 130, such as a continue command, a pause command, and a turn command. It should be noted that the present invention does not limit the type of communication between the movable device 110 and the server 130; for example, Bluetooth, Wi-Fi, ZigBee, or other wireless communication technologies may be used. The processor 202 processes the commands transmitted by the server 130 and controls the movable device 110 to operate accordingly. The driving assembly 203 drives the movable device 110 to move; in this embodiment, it includes a motor and a powered wheel, the motor providing kinetic energy to the powered wheel. The body 204 carries a plurality of identification marks. For example, a two-dimensional identification code corresponding to the movable device 110 may be affixed to the top surface and the side surfaces of the body 204, as may a model identification (e.g., the characters "110") corresponding to the movable device 110. The identification marks on the body 204 can be used to identify the model of the movable device 110 within the warehouse.
In detail, the movable device 110 communicates with the server 130 through the communication device 201. After the security camera (more precisely, the image sensor 140) disposed in the warehouse captures an image of the movable device 110, the server 130 judges the position of the movable device 110 in real time according to the size and position of the feature point of the movable device 110 in the image and the shooting angle and position of the security camera, and, according to the judgment result, transmits instructions to the movable device 110 to control its traveling, stopping, turning, and so on. In this way, the movable device 110 can be positioned with fewer sensors (such as image sensors, lidar, depth cameras, ultrasonic sensors, and infrared sensors for movement and obstacle avoidance), and the warehouse tasks of moving and transporting goods can still be accomplished, thereby greatly reducing the manufacturing cost of the movable device 110.
In one embodiment of the present invention, the characteristic point is an identification mark on the body 204 of the mobile device 110. In other words, the server 130 determines the accurate position of the mobile device 110 in real time according to the size and position of the identification mark on the body 204 in the image captured by the security camera (more precisely, the image sensor 140) and the shooting angle and position of the security camera (more precisely, the image sensor 140).
It should be noted that the mobile device 110 may also include other components and elements to implement other functions of the mobile device 110. For example, the mobile device 110 further includes a storage device for storing information, a battery for supplying power, and a power distribution module for distributing power to the respective components. It should be noted that the mobile device 110 shown in fig. 2 is only an example, and the invention is not limited to the detailed architecture of the mobile device 110, and similarly, the invention is not limited to the detailed architecture of other mobile devices in the stocker system 100.
Fig. 3A and 3B are schematic diagrams of positioning the movable device 110 according to an embodiment of the invention. In the embodiment of FIG. 3A, the warehousing system 100 further includes a plurality of security cameras disposed within the warehouse, for example the security cameras 310, 320, 330, 340, and 350, into each of which an image sensor 140 is integrated. These security cameras serve as monitors for warehouse security, and the server 130 also positions the movable device 110 through them. For example, if the server 130 detects the movable device 110 in the monitoring picture of the security camera 330, and the security camera 330 is set to capture the area between the rack H2 and the rack H3, the server 130 determines that the movable device 110 is located between the rack H2 and the rack H3.
Then, the server 130 determines the accurate position of the movable device 110 according to the size and position of the feature point of the movable device 110 in the image and the shooting angle of the security camera 330. Referring to FIG. 3B, which shows an image of the movable device 110 captured by the security camera 330, the server 130 determines the actual distance between the movable device 110 and the security camera 330 according to the pixel size occupied by the identification mark on the body 204 in the image. For example, if the width of the two-dimensional identification code on the body 204 occupies 55 to 60 pixels in the image, it can be inferred that the code is about 6 meters from the security camera 330; if it occupies 65 to 70 pixels, the code is about 5 meters away. Different cameras, hardware, and site configurations will affect this distance determination; the figures here merely illustrate that the server 130 can judge the distance from the feature point, and that the pixel size occupied in the image is inversely related to the actual distance between the movable device 110 and the security camera 330. In this embodiment, the actual distances corresponding to various pixel sizes of the two-dimensional identification code can be input into the server 130 in advance, so that when the server 130 examines an image captured by the security camera 330, the actual distance from the code to the security camera 330 can be obtained by interpolation.
It should be noted that the pixel size and the corresponding actual distance of the two-dimensional identification code in the image are only exemplary. The actual correspondence depends on the resolution of the security camera 330 (or more precisely, the image sensor 140), and so on.
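As a concrete illustration of the interpolation just described, a pixel-width-to-distance lookup might be sketched as follows. The function name and the calibration-table values are assumptions for illustration only, not figures taken from this disclosure:

```python
# Hypothetical sketch of the pixel-width -> distance lookup described above.
# The calibration table values are illustrative assumptions, not from the patent.
def estimate_distance(pixel_width, calibration):
    """Interpolate camera-to-marker distance from the pixel width of the
    2-D identification code. `calibration` maps pixel widths to measured
    distances; a larger width means the marker is closer."""
    widths = sorted(calibration)
    # Clamp readings outside the calibrated range.
    if pixel_width <= widths[0]:
        return calibration[widths[0]]
    if pixel_width >= widths[-1]:
        return calibration[widths[-1]]
    # Linear interpolation between the two bracketing calibration points.
    for lo, hi in zip(widths, widths[1:]):
        if lo <= pixel_width <= hi:
            t = (pixel_width - lo) / (hi - lo)
            return calibration[lo] + t * (calibration[hi] - calibration[lo])

table = {55: 6.0, 60: 5.7, 65: 5.3, 70: 5.0}  # px -> metres (assumed values)
print(estimate_distance(57.5, table))  # midway between the 55 px and 60 px entries
```

In practice such a table would be measured per camera, since resolution and lens parameters change the pixel-to-distance correspondence, as the note above explains.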
Then, the server 130 determines the relative position of the movable device 110 according to the position of the identification mark on the body 204 in the image and the shooting angle of the security camera 330. For example, if the shooting angle of the security camera 330 is deflected downward by θ, the server 130 has determined that the movable device 110 is located in the aisle between the racks H2 and H3, and the identification mark on the body 204 appears in the lower right corner of the image, then the server 130 can determine that the movable device 110 is relatively close to the rack H2.
In summary, the server 130 determines the accurate position of the mobile device 110 according to the size and the position of the identification mark of the mobile device 110 in the image and the shooting angle and the position of the security camera 330.
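A minimal sketch of combining the two judgments (distance from the pixel size of the mark, direction from the camera's mounting pose) might look like the following. The planar geometry and all names are assumptions, as the disclosure does not spell out the computation:

```python
import math

# Hypothetical sketch (names assumed): project the device onto the floor
# plane along the camera's horizontal viewing direction, i.e.
# position = camera position + distance * (cos yaw, sin yaw).
def device_position(cam_xy, cam_yaw_deg, distance_m):
    """Estimate the device's floor coordinate from the camera's planar
    position, its yaw angle in degrees, and the interpolated distance."""
    yaw = math.radians(cam_yaw_deg)
    return (cam_xy[0] + distance_m * math.cos(yaw),
            cam_xy[1] + distance_m * math.sin(yaw))
```

A full treatment would also account for the downward tilt θ and the camera's mounting height, which this planar sketch omits.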
After determining the accurate position of the mobile device 110, the server 130 sends instructions to the mobile device 110 to continue to move, stop moving, turn, etc. according to the task currently performed by the mobile device 110 and the determined position, so as to realize the functions of moving and carrying goods in storage under the condition that the number of sensors (such as image sensor, optical radar, depth camera, ultrasonic sensor, infrared sensor, etc.) of the mobile device 110 is reduced, thereby greatly reducing the manufacturing cost of the mobile device 110.
It should be noted that the feature point is not limited to the identification marks on the body 204. In another embodiment of the present invention, the feature point is the whole or a part of the movable device 110, for example the whole or a part of the body 204. The server 130 may then infer the distance between the movable device 110 and the security camera 330 from the pixel size occupied by that whole or part in the image. The details of determining the distance according to different feature points can be readily understood by those skilled in the art, so a detailed description is omitted here for brevity.
In order to optimize the accuracy of determining the position of the movable device 110, calibration may be performed through an external sensing device. FIG. 4A is a diagram of the movable device 110 according to another embodiment of the invention. The movable device 110 shown in FIG. 4A is substantially the same as that shown in FIG. 2, except that it further includes a transmission port 410 for connecting an external sensing device, which may be, for example, a distance sensor, a lidar, a depth camera, an ultrasonic sensor, or an infrared sensor. Referring to fig. 4B, which shows the movable device 110 connected to the external sensing device: once connected, the movable device 110 can sense the distance to surrounding objects or capture surrounding images through the external sensing device, thereby completing self-positioning. It can then transmit the positioning information to the server 130 so that the server 130 can perform calibration accordingly.
For example, when the security camera 330 captures an image of the movable device 110 connected to the external sensing device, the server 130 first determines, as in the embodiments above, the position of the movable device 110 according to the size and position of its feature point in the image and the shooting angle and position of the security camera 330. The positioning information transmitted by the movable device 110 to the server 130 is then compared with the determined position so as to correct the server's judgment. The correction process can be summarized by the following formula:
P_AMR(x, y) = f(α·P_camera, β·Angle, γ·Depth, κ)

where f() is a function; P_AMR is the positioning information transmitted by the movable device 110 itself to the server 130; P_camera is the installation position of the security camera 330; Angle is the shooting angle of the security camera 330; Depth is the distance between the security camera 330 and the movable device 110 as judged by the server 130; and α, β, γ, and κ are constants. The server 130 adjusts the constants α, β, γ, and κ so that the value of f() approximates the positioning information P_AMR transmitted by the movable device 110 itself, thereby achieving the correction. In this way, whenever the movable device 110 connected to the external sensing device moves in the warehouse, the server 130 can perform the calibration with every security camera that captures the movable device 110.
In the present invention, the movable device 110 may be connected to the external sensing device for calibration before being actually put into use, and calibration may also be performed at fixed intervals to maintain the accuracy of determining the position of the movable device 110.
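Under the assumption that f() is linear (the disclosure leaves f() unspecified), the constants α, β, γ, and κ could be fit by least squares from samples logged while the externally sensed position P_AMR is available. Everything below is an illustrative sketch:

```python
import numpy as np

# Hypothetical sketch: if f() in P_AMR = f(alpha*P_camera, beta*Angle,
# gamma*Depth, kappa) is assumed linear (the patent does not specify f),
# the constants can be fit by least squares from logged samples.
def fit_constants(p_camera, angle, depth, p_amr):
    """Solve alpha*p_camera + beta*angle + gamma*depth + kappa ~= p_amr
    in the least-squares sense; returns [alpha, beta, gamma, kappa]."""
    A = np.column_stack([p_camera, angle, depth, np.ones(len(p_amr))])
    coeffs, *_ = np.linalg.lstsq(A, p_amr, rcond=None)
    return coeffs

# Synthetic demo: samples generated with known constants are recovered.
p_cam = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
ang   = np.array([0.5, 0.1, 0.4, 0.2, 0.3])
dep   = np.array([2.0, 7.0, 1.0, 8.0, 3.0])
p_amr = 2.0 * p_cam + 3.0 * ang + 0.5 * dep + 1.0
print(fit_constants(p_cam, ang, dep, p_amr))  # close to [2.0, 3.0, 0.5, 1.0]
```

In a deployment, each per-coordinate fit would be repeated for every security camera, since α, β, γ, and κ depend on the individual camera's pose.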
It should be noted that the present invention is not limited to the above external sensing device as the means of optimizing the accuracy of determining the position of the movable device 110. Referring to fig. 5, fig. 5 is a schematic diagram of the movable device 110 according to another embodiment of the invention. The movable device 110 shown in fig. 5 is substantially the same as that shown in fig. 2, except that it further includes an odometer 510 and an inertial measurement unit (IMU) 520, wherein the odometer 510 measures the moving distance of the movable device 110, and the inertial measurement unit 520 measures the deflection angle, velocity, and acceleration of the movable device 110.
In detail, the mobile device 110 generates the assistant positioning information according to the moving distance measured by the odometer 510 and the deflection angle, velocity and acceleration measured by the inertia measurement unit 520, and transmits the assistant positioning information to the server 130. Then, the server 130 determines the accurate position of the mobile device 110 according to the size and position of the feature point in the image captured by the security camera 330 and the shooting angle and position of the security camera 330, and performs calibration according to the auxiliary positioning information transmitted by the mobile device 110. The formula used in the calibration process can refer to the foregoing embodiments, and the detailed description is omitted here for brevity.
It should be understood by those having ordinary skill in the art that the odometer 510 accumulates error over long use, so resetting it periodically maintains its accuracy. Referring to fig. 6, in this embodiment the warehouse has a plurality of preset reset regions, such as the illustrated regions R1, R2, and R3. When the moving distance sensed by the odometer 510 reaches a preset value, the movable device 110 moves to one of the reset regions R1, R2, and R3 and resets the odometer 510 to zero. After the reset, the movable device 110 starts again from a reset region of known position, so the auxiliary positioning information generated from the recalculated moving distance has higher accuracy.
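The reset policy described above might be sketched as follows; the threshold value, region coordinates, and function names are all illustrative assumptions:

```python
# Minimal sketch of the reset policy (all names and values are assumed):
# when accumulated travel reaches a preset value, head to the nearest
# reset region of known position and zero the odometer there.
RESET_THRESHOLD_M = 500.0  # assumed preset value, in metres

def nearest_reset_region(position, regions):
    """Pick the closest reset region (e.g. R1/R2/R3) by squared distance."""
    return min(regions,
               key=lambda r: (r[0] - position[0]) ** 2 + (r[1] - position[1]) ** 2)

def maybe_reset(odometer_m, position, regions):
    """Return (target_region, new_odometer). If the threshold is reached,
    the target is a reset region and the odometer restarts from zero;
    otherwise no detour is needed and the reading is kept."""
    if odometer_m >= RESET_THRESHOLD_M:
        return nearest_reset_region(position, regions), 0.0
    return None, odometer_m
```

Choosing the nearest region minimizes the detour; the disclosure only requires that the device move to one of the preset regions, so any selection rule would do.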
In the present invention, the server 130 may obtain auxiliary positioning information in other ways to optimize the accuracy of determining the position of the movable device 110. For example, fixed markings in the warehouse can serve as auxiliary positioning information. Referring to FIG. 7, in the embodiment of FIG. 7, the warehouse floor is marked with a plurality of marking lines, such as marking lines A1-A3 and B1-B6, which are fixed markings in the warehouse and can be used by the server 130 as auxiliary positioning information for locating the movable device 110. In detail, the server 130 determines the position of the movable device 110 in real time according to the relative position of the movable device 110 and the marking lines in the image, the size and position of the feature point in the image, and the shooting angle and position of the image sensor 140. In order for the server 130 to clearly determine which marking line the movable device 110 is located on, the marking lines A1-A3 and B1-B6 may have different characteristics; in other words, the dash spacing, pattern, and line color of the marking lines A1, A2, and A3 may differ from one another, and likewise for the marking lines B1-B6. In this way, the server 130 can also determine the position of the movable device 110 from the intersections of the marking lines in the image, thereby improving the accuracy of the determination. It should be noted that the pattern of the marking lines in fig. 7 is only for illustration and is not a limitation of the present invention. In addition, the number of marking lines in the warehouse can be changed according to actual requirements, and this is not a limitation of the present invention.
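As a purely illustrative sketch (the coordinates and the two-point line parameterization are assumptions), locating a device by the intersection of two identified marking lines amounts to solving two line equations in floor coordinates:

```python
def line_intersection(p1, p2, p3, p4):
    """Intersection of line p1-p2 with line p3-p4 in floor coordinates.

    Each marking line is given by two known points on it.
    Returns None if the lines are parallel.
    """
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if denom == 0:
        return None
    det1 = x1 * y2 - y1 * x2
    det2 = x3 * y4 - y3 * x4
    x = (det1 * (x3 - x4) - (x1 - x2) * det2) / denom
    y = (det1 * (y3 - y4) - (y1 - y2) * det2) / denom
    return (x, y)

# Say line A1 runs along y = 2 and line B3 along x = 5; they cross at (5, 2).
print(line_intersection((0, 2), (10, 2), (5, 0), (5, 10)))  # (5.0, 2.0)
```

Because each line's distinct pattern identifies it uniquely, a device observed at a known intersection is pinned to a known floor coordinate.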
It should be understood by those skilled in the art that if the marking lines are sufficiently numerous and sufficiently varied in pattern, the server 130 can determine the position of the movable device 110 solely from the intersections of the marking lines in the image; the detailed description is omitted here for brevity.
Since the server 130 positions the movable device 110 according to the image captured by the image sensor 140, when the view of the image sensor 140 is blocked and an image of the movable device 110 cannot be successfully captured, the server 130 cannot successfully determine the position of the movable device 110. Referring to fig. 8, in the embodiment of fig. 8, an obstacle (e.g., a worker) blocks the view of the image sensor 140 so that part of the movable device 110 is occluded in the captured image, and the server 130 cannot determine the accurate position of the movable device 110 from the size and position of the feature point in the image. In this case, if the movable device 110 continues to move, it may collide with other devices, obstacles, or persons. Therefore, in an embodiment of the present invention, when an obstacle occludes more than a preset proportion of the movable device 110 in the image captured by the image sensor 140, the server 130 issues a stop-traveling command to the movable device 110 to avoid a collision. For example, if an obstacle occludes 50% of the movable device 110 in the captured image, the server 130 sends a stop-traveling command to the movable device 110. It should be noted that the actual value of the preset proportion is not a limitation of the present invention.
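A minimal sketch of this occlusion check (the 50% threshold follows the example above; representing the device as a pixel count is our assumption, since the patent does not specify how occlusion is measured):

```python
STOP_OCCLUSION_RATIO = 0.5  # preset proportion from the example above

def occluded_ratio(expected_pixels, visible_pixels):
    """Fraction of the device's expected pixel footprint that is hidden."""
    if expected_pixels == 0:
        return 1.0  # device not detected at all: treat as fully occluded
    return 1.0 - visible_pixels / expected_pixels

def should_stop(expected_pixels, visible_pixels):
    """True when the server should issue a stop-traveling command."""
    return occluded_ratio(expected_pixels, visible_pixels) >= STOP_OCCLUSION_RATIO

print(should_stop(1000, 400))  # 60% occluded -> True
print(should_stop(1000, 900))  # 10% occluded -> False
```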
In addition, if the obstacle is a moving person or another movable device, the server 130 can estimate the moving speed and travel path of that person or device according to images captured by the image sensor 140 at two time points. Referring to fig. 9A and 9B, fig. 9A is a diagram of an image captured by the image sensor 140 at a first time point, and fig. 9B is a diagram of an image captured by the image sensor 140 at a second time point. If, from the images captured at the two time points, the server 130 determines that the worker may collide with the movable device 110, the server 130 sends a stop-traveling command to the movable device 110 when the distance between the worker and the movable device 110 is less than a preset distance, so as to avoid a collision. Taking fig. 9B as an example, if the server 130 determines from the worker's speed and travel path that a collision with the movable device 110 is possible, the server 130 sends a stop-traveling command to the movable device 110 when the movable device 110 is 1 meter away from the worker. It should be noted that the actual value of the preset distance is not a limitation of the present invention.
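The two-frame estimate can be sketched as follows. The 1 m threshold follows the example above; the straight-line-motion assumption and the dot-product heading test are ours, not the patent's:

```python
import math

STOP_DISTANCE_M = 1.0  # preset distance from the example above

def estimate_velocity(pos_t1, pos_t2, dt):
    """Velocity (m/s) assuming straight-line motion between the two frames."""
    return ((pos_t2[0] - pos_t1[0]) / dt, (pos_t2[1] - pos_t1[1]) / dt)

def heading_toward(pos, velocity, target):
    """True if the velocity vector points toward the target."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    return velocity[0] * dx + velocity[1] * dy > 0

def should_stop(worker_t1, worker_t2, dt, device_pos):
    """Stop when the worker is on a collision path and within the preset distance."""
    v = estimate_velocity(worker_t1, worker_t2, dt)
    close = math.hypot(worker_t2[0] - device_pos[0],
                       worker_t2[1] - device_pos[1]) < STOP_DISTANCE_M
    return heading_toward(worker_t2, v, device_pos) and close

# Worker moved from (0,0) to (0.5,0) in 0.5 s, heading at a device 0.8 m away.
print(should_stop((0.0, 0.0), (0.5, 0.0), 0.5, (1.3, 0.0)))  # True
```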
In the above embodiment, when the server 130 issues a stop-traveling command to the movable device 110, the movable device stops traveling until it receives a movement instruction. In one embodiment of the present invention, the movement instruction is sent by the server 130. In detail, when the server 130 determines that the line of sight of the image sensor 140 is no longer obstructed by the obstacle, or when the server 130 determines that the obstacle (a worker or another movable device) has moved away from the movable device 110, the server 130 sends a movement instruction to the movable device 110 to control the movable device 110 to continue moving.
In another embodiment of the present invention, the movement instruction is sent by an external electronic device. For example, the server 130 determines from the worker's speed and travel path that a collision with the movable device 110 may occur, and therefore sends a stop-traveling command to the movable device 110. The worker can then send a movement instruction to the movable device 110 through an electronic device (such as a mobile phone) that the worker carries, controlling the movable device 110 to continue traveling.
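The stop-then-resume behavior described in the last two paragraphs amounts to a small state machine. The command names and sender labels below are assumptions for illustration; the patent does not define a message format:

```python
class MovableDevice:
    """Minimal stop/resume state machine for the commands described above."""

    def __init__(self):
        self.moving = True  # device travels until told to stop

    def on_command(self, command, sender):
        # A stop command halts the device; a movement instruction from either
        # the server or an external electronic device resumes travel.
        if command == "stop":
            self.moving = False
        elif command == "move" and sender in ("server", "external"):
            self.moving = True

device = MovableDevice()
device.on_command("stop", "server")
device.on_command("move", "external")  # worker's phone resumes travel
print(device.moving)  # True
```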
The warehousing system of the present invention is summarized in FIG. 10 to facilitate understanding of the invention. Fig. 10 is a system block diagram of a warehousing system according to an embodiment of the present invention. The warehousing system 100 includes a plurality of movable devices, such as the movable device 110, that move within the warehouse. Each movable device includes a communication device 201, a processor 202, and a drive assembly 203. The processor 202 is configured to process the instructions received by the communication device 201, and the drive assembly 203 is used for driving the movable device to move. The warehousing system 100 further includes a server 130 and an image sensor 140. The server 130 is electrically connected to the plurality of movable devices, which communicate with the server 130 through their respective communication devices (such as the communication device 201) and receive various instructions from the server 130, such as instructions to continue traveling, stop traveling, turn, and the like. The image sensor 140 is integrated in the security camera and electrically connected to the server 130; it is used for capturing images in the warehouse, in particular images of the movable device 110, so that the server 130 can position the movable device 110 and control its movement, thereby accomplishing the tasks of moving and carrying goods in the warehouse.
FIG. 11 is a flow chart of a method 600 applied to a warehousing system in an embodiment of the invention. As long as substantially the same result is achieved, the present invention is not limited to performing the method exactly as shown in the flow chart of fig. 11. The method 600 can be summarized as follows:
Step 610: capture an image of the movable device through the image sensor.
Step 620: determine the position of the movable device in real time at least according to the size and position of the feature point of the movable device in the image and the shooting angle and position of the image sensor.
The operation of the method 600 can be easily understood by those skilled in the art after reading the above embodiments, and the detailed description is omitted here for brevity.
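Under a pinhole-camera assumption (the focal length, marker size, and mounting geometry below are illustrative values, not from the patent), step 620 can be sketched as recovering range from the apparent size of the feature point and bearing from its pixel position, combined with the camera's known shooting angle and position:

```python
import math

FOCAL_PX = 800.0      # assumed focal length in pixels
MARKER_SIZE_M = 0.30  # assumed real-world size of the identification mark

def locate(marker_px, center_offset_px, cam_pos, cam_yaw_rad):
    """Estimate the device position on the floor plane.

    marker_px:        apparent marker width in pixels (size -> range)
    center_offset_px: horizontal pixel offset of the marker from image center
    cam_pos:          (x, y) of the camera; cam_yaw_rad: its shooting angle
    """
    rng = FOCAL_PX * MARKER_SIZE_M / marker_px              # pinhole model
    bearing = cam_yaw_rad + math.atan2(center_offset_px, FOCAL_PX)
    return (cam_pos[0] + rng * math.cos(bearing),
            cam_pos[1] + rng * math.sin(bearing))

# Marker 60 px wide, centered, camera at origin facing +x -> 4 m ahead.
print(locate(60.0, 0.0, (0.0, 0.0), 0.0))  # (4.0, 0.0)
```

A smaller apparent marker means a more distant device, and the pixel offset rotates that range around the camera's known pose, which is exactly the information the claims enumerate.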
The foregoing outlines features of several embodiments so that those skilled in the art may better understand the aspects of the present disclosure. Those skilled in the art should appreciate that they may readily use the present application as a basis for designing or modifying other processes and structures for carrying out the same purposes and/or achieving the same advantages of the embodiments introduced herein. Those skilled in the art should also realize that such equivalent constructions do not depart from the spirit and scope of the present disclosure and that they may make various changes, substitutions, and alterations herein without departing from the spirit and scope of the present disclosure.
Moreover, the scope of the present application is not intended to be limited to the particular embodiments of the process, machine, manufacture, composition of matter, means, methods and steps described in the specification. As one of ordinary skill in the art will readily appreciate from the disclosure of the present disclosure, processes, machines, manufacture, compositions of matter, means, methods, or steps, presently existing or later to be developed that perform substantially the same function or achieve substantially the same result as the corresponding embodiments described herein may be utilized according to the present disclosure. Accordingly, the appended claims are intended to include within their scope such processes, machines, manufacture, compositions of matter, means, methods, or steps.

Claims (15)

1. A warehousing system, comprising:
a movable device for moving within a warehouse;
a server electrically connected to the movable device for monitoring the movable device; and
an image sensor electrically connected to the server for capturing an image of the movable device;
wherein the server determines the position of the movable device in real time at least according to the size and position of a feature point of the movable device in the image and the shooting angle and position of the image sensor.
2. The warehousing system of claim 1, wherein the feature point is an identification mark on a body of the movable device.
3. The warehousing system of claim 1, wherein the movable device includes a transfer port for connection to an external sensing device, wherein the movable device obtains its position within the warehouse via the external sensing device and transmits that position to the server.
4. The warehousing system of claim 3, wherein the server optimizes the accuracy of determining the position of the movable device in real time according to the position transmitted by the movable device, the size and position of the feature point in the image, and the shooting angle and position of the image sensor.
5. The warehousing system of claim 3, wherein the external sensing device comprises at least one of a lidar, a depth camera, an ultrasonic sensor, and an infrared sensor.
6. The warehousing system of claim 1, wherein the movable device includes an odometer for measuring a moving distance of the movable device and an inertial measurement unit for measuring a deflection angle, velocity, and acceleration of the movable device; the movable device transmits auxiliary positioning information to the server, so that the server optimizes the accuracy of determining the position of the movable device in real time according to the auxiliary positioning information, the size and position of the feature point in the image, and the shooting angle and position of the image sensor; wherein the auxiliary positioning information comprises the moving distance, the deflection angle, the velocity, and the acceleration.
7. The warehousing system of claim 6, wherein when the moving distance reaches a preset value, the movable device moves to a preset position within the warehouse and zeroes the odometer.
8. The warehousing system of claim 1, wherein the server determines the position of the movable device in real time according to the relative position of the movable device and a fixed marking within the warehouse in the image, the size and position of the feature point of the movable device in the image, and the shooting angle and position of the image sensor.
9. The warehousing system of claim 8, wherein the fixed marking is a marking line marked on the warehouse floor.
10. The warehousing system of claim 1, wherein when a preset proportion of the movable device in the image is occluded, the server controls the movable device to stop moving.
11. The warehousing system of claim 1, wherein when an obstacle is present in a sensing range of the image sensor, the server determines a distance from the obstacle to the movable device from the image captured by the image sensor, and when the distance from the obstacle to the movable device is less than a preset distance, the server controls the movable device to stop moving until the movable device receives a movement instruction.
12. The warehousing system of claim 11, wherein the movement instruction is transmitted to the movable device by the server.
13. The warehousing system of claim 11, wherein the movement instruction is transmitted to the movable device by an external electronic device.
14. The warehousing system of claim 1, wherein the image sensor is integrated onto a security camera.
15. A method for use in a warehousing system, comprising:
capturing an image of a movable device through an image sensor; and
determining the position of the movable device in real time at least according to the size and position of a feature point of the movable device in the image and the shooting angle and position of the image sensor.
CN201911284029.6A 2019-12-13 2019-12-13 Warehousing system and related method Pending CN110980084A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911284029.6A CN110980084A (en) 2019-12-13 2019-12-13 Warehousing system and related method
PCT/CN2020/133552 WO2021115189A1 (en) 2019-12-13 2020-12-03 Warehouse system and related method

Publications (1)

Publication Number Publication Date
CN110980084A true CN110980084A (en) 2020-04-10

Family

ID=70093475

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911284029.6A Pending CN110980084A (en) 2019-12-13 2019-12-13 Warehousing system and related method

Country Status (2)

Country Link
CN (1) CN110980084A (en)
WO (1) WO2021115189A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111605943A (en) * 2020-05-27 2020-09-01 新石器慧通(北京)科技有限公司 Logistics equipment

Citations (6)

Publication number Priority date Publication date Assignee Title
JP2004355419A (en) * 2003-05-30 2004-12-16 Hitachi Industries Co Ltd Physical distribution system
CN202255404U (en) * 2011-08-24 2012-05-30 国营红林机械厂 Binocular vision navigation system of indoor mobile robot
CN104881028A (en) * 2015-05-11 2015-09-02 皖西学院 Intelligent vehicle
CN104991560A (en) * 2015-07-12 2015-10-21 仲恺农业工程学院 Autonomous mobile intelligent robot
CN107885198A (en) * 2017-09-25 2018-04-06 湖南大学 AGV dispatching methods
CN108955683A (en) * 2018-04-28 2018-12-07 温州大学激光与光电智能制造研究院 Localization method based on overall Vision

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
CN105716611B (en) * 2016-01-29 2018-06-05 西安电子科技大学 Indoor mobile robot and its localization method based on environmental information
CN107067794B (en) * 2016-11-18 2023-04-18 安徽超清科技股份有限公司 Indoor vehicle positioning and navigation system and method based on video image processing
US11314254B2 (en) * 2019-03-26 2022-04-26 Intel Corporation Methods and apparatus for dynamically routing robots based on exploratory on-board mapping
CN110186451B (en) * 2019-06-12 2023-04-18 英业达科技有限公司 Navigation system suitable for warehousing system and navigation method of material conveying carrier
CN111661550B (en) * 2019-06-24 2021-04-16 灵动科技(北京)有限公司 Automatic conveyer
CN110286682A (en) * 2019-07-08 2019-09-27 国网山东省电力公司枣庄供电公司 A kind of electric power storage Multifunctional security sniffing robot, method and system
CN111105455B (en) * 2019-12-13 2024-04-16 灵动科技(北京)有限公司 Warehouse system and related method

Non-Patent Citations (1)

Title
GUO Wenliang, "Research on Mobile Robot Navigation System Based on Global Vision," Information Science and Technology Series *

Also Published As

Publication number Publication date
WO2021115189A1 (en) 2021-06-17

Similar Documents

Publication Publication Date Title
EP3304491B1 (en) Systems and methods for image capture device calibration for a materials handling vehicle
AU2020203481B2 (en) Systems and methods for materials handling vehicle odometry calibration
US11681300B2 (en) Systems and methods for out of aisle localization and vehicle position calibration using rack leg identification
US10850927B2 (en) Work system, method for executing work on object, and robot
Kelly et al. Field and service applications-an infrastructure-free automated guided vehicle based on computer vision-an effort to make an industrial robot vehicle that can operate without supporting infrastructure
KR101323705B1 (en) Autonomous freight transportation system using mobile robot for autonomous freight transportation
CN106227212B (en) The controllable indoor navigation system of precision and method based on grating map and dynamic calibration
KR100447308B1 (en) Method and device for detecting the position of a vehicle a given area
Galasso et al. Efficient calibration of four wheel industrial AGVs
CN111891927A (en) First floor container placement method and computer readable storage medium
CN112214012A (en) Navigation method, mobile carrier and navigation system
CN110980084A (en) Warehousing system and related method
WO2021115185A1 (en) Warehousing system and related method
CN111552297B (en) Navigation method and navigation device
EP4227256A1 (en) Substrate conveying hand, substrate conveying apparatus, and substrate conveying method
CN112224793B (en) Intelligent logistics selection path planning system
EP4137906A1 (en) Navigation method and navigation apparatus
CA3184664A1 (en) Material handling vehicle guidance systems and methods

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination