WO2017188292A1 - Mobile body management system, method, and computer program - Google Patents
Mobile body management system, method, and computer program
- Publication number
- WO2017188292A1 WO2017188292A1 PCT/JP2017/016460 JP2017016460W WO2017188292A1 WO 2017188292 A1 WO2017188292 A1 WO 2017188292A1 JP 2017016460 W JP2017016460 W JP 2017016460W WO 2017188292 A1 WO2017188292 A1 WO 2017188292A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- travel
- marker
- image
- travel route
- management system
- Prior art date
Images
Classifications
- G—PHYSICS
  - G05—CONTROLLING; REGULATING
    - G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
      - G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
        - G05D1/0011—associated with a remote control arrangement
          - G05D1/0022—characterised by the communication link
          - G05D1/0044—by providing the operator with a computer generated representation of the environment of the vehicle, e.g. virtual reality, maps
        - G05D1/02—Control of position or course in two dimensions
          - G05D1/021—specially adapted to land vehicles
            - G05D1/0231—using optical position detecting means
              - G05D1/0238—using obstacle or wall sensors
                - G05D1/024—in combination with a laser
              - G05D1/0246—using a video camera in combination with image processing means
            - G05D1/0268—using internal positioning means
              - G05D1/0274—using mapping information stored in a memory device
            - G05D1/0276—using signals provided by a source external to the vehicle
Definitions
- the present disclosure relates to a management system, a method, and a computer program for managing traveling of a moving object.
- Automated guided vehicles are sometimes referred to as “AGV” (Automatic Guided Vehicle).
- Patent Document 1 discloses a mobile body having a tag communication unit.
- In the travel target area, a plurality of IC tags, each holding position information, are arranged in a distributed manner.
- The tag communication unit wirelessly communicates with an IC tag and reads its position information. The mobile body can thereby acquire its current position and travel automatically.
- Patent Document 2 discloses a system for moving an AGV to a designated position.
- The AGV reads a location marker representing a position and moves to a designated position.
- If the position is shifted, the AGV corrects it using its own navigation system.
- Patent Document 3 discloses a technique for determining the position of an address mark by simulation prior to laying the address mark on a course on which an AGV runs.
- In these prior techniques, the IC tags or location markers needed for position detection are placed in advance in the traveling area of the AGV, which determines the routes on which the AGV can travel. When the positions of the IC tags or location markers must be changed at the site after the AGV has started operating, the change requires a great deal of work.
- One non-limiting exemplary embodiment of the present application provides an AGV management system that can make it easier to change a driving route on site.
- the management system of the present disclosure is a management system that includes at least one mobile body and a travel management device, and manages the travel of the mobile body using the travel management device.
- The mobile body includes a plurality of motors, a plurality of drive wheels respectively connected to the plurality of motors, and a drive device that independently controls the voltage applied to each motor in accordance with a control signal to rotate each of the drive wheels.
- The mobile body further includes a first communication circuit that communicates with the travel management device and receives data indicating a travel route, and a control circuit that generates the control signal for causing the mobile body to travel along the travel route.
- the travel management device includes an image display device, an input device that accepts a user operation, and an image processing circuit that generates an image to be displayed on the image display device.
- The image processing circuit, when the input device accepts designation of a plurality of positions on the image display device from the user, generates an image including a plurality of marker objects indicating the plurality of positions.
- The travel management device also includes a signal processing circuit that converts the coordinates of each marker object on the image into coordinates in the space in which the mobile body travels, and sets a line segment or a curve on the image display device connecting the plurality of marker objects as the travel route in the space.
- a second communication circuit that transmits data indicating each coordinate in the space and the travel route to the moving body.
- Another exemplary management system includes a plurality of mobile bodies and a management computer, and manages the travel of each mobile body using the management computer. Each mobile body travels on a plurality of drive wheels and can communicate with the management computer. The management computer can create a travel route for a mobile body from a plurality of markers arranged on a map image. When the travel route passes a second marker next after a first marker among the plurality of markers, the first marker carries, as attribute information, information on its own coordinate position and information specifying the second marker to be passed next.
- The image processing circuit of the travel management device generates an image containing a plurality of marker objects indicating the plurality of designated positions.
- the marker object corresponds to a position through which the moving body passes.
- the signal processing circuit of the travel management device converts the coordinates of each marker object on the image into coordinates in the space where the moving body travels, and converts line segments or curves on the image display device connecting the plurality of marker objects, It is set as the travel route of the moving body in the space.
- the mobile body receives the set travel route data and moves along the travel route.
- Taking the positions of the marker objects displayed on the image display device as the passing positions of the mobile body, the user can recognize the line segments or curves connecting those positions as a virtual travel route of the mobile body. There is therefore no need to place IC tags or the like storing position information in the traveling area of the mobile body. Furthermore, because the travel route can be changed simply by moving marker objects on the image, the travel route of the mobile body can be changed easily.
- FIG. 1 is a diagram illustrating an overview of a management system 100 that manages traveling of each AGV according to the present disclosure.
- FIG. 2 is a diagram illustrating an example in which the user 3 runs the AGV 10 using the tablet computer 4.
- FIG. 3 is an external view of an exemplary AGV 10 according to the present embodiment.
- FIG. 4 is a diagram illustrating a hardware configuration of the AGV 10.
- FIG. 5 is a diagram showing the self-position (x, y, ⁇ ) and reliability data of the AGV 10 displayed in the screen area 7 of the tablet computer 4.
- FIG. 6 is a diagram illustrating a hardware configuration of the travel management device 20.
- FIG. 7 is a diagram illustrating an example of an image 60 displayed on the monitor 30 when the travel management device 20 is activated.
- FIG. 8 is a diagram illustrating an example of an image 70 displayed on the monitor 30 after the button object 63a (FIG. 7) is selected.
- FIG. 9 is a diagram illustrating an example of an image 80 displayed on the monitor 30 after the button object 63b (FIG. 7) is selected.
- FIG. 10 is a diagram illustrating an example of an image 110 displayed on the monitor 30 after the button object 63c (FIG. 7) is selected.
- FIG. 11 is a diagram illustrating an example of the marker objects 116a, 116b, and 116c displayed at the positions 114a, 114b, and 114c selected by the user 3, respectively.
- FIG. 12 is a diagram illustrating an example of the first image 120 displayed on the monitor 30 after the button object 63d (FIG. 7) is selected.
- FIG. 13 is a diagram illustrating an example of the second image 130 displayed on the monitor 30 after the button object 63d (FIG. 7) is selected.
- FIG. 14A is a diagram illustrating a moving route of the AGV 10 when traveling straight.
- FIG. 14B is a diagram illustrating a movement route of the AGV 10 that makes a left turn at the position Mn+1 and moves toward the position Mn+2.
- FIG. 14C is a diagram illustrating a movement path of the AGV 10 when moving in an arc from the position Mn+1 to the position Mn+2.
- FIG. 15 is a flowchart showing the steps of the process of the travel management device 20 and the process of the AGV 10 that travels in response to the result of the process.
- an automatic guided vehicle is mentioned as an example of a moving body.
- the automatic guided vehicle is called AGV (Automated Guided Vehicle) and is also described as “AGV” in this specification.
- FIG. 1 shows an overview of a management system 100 that manages the running of each AGV according to the present disclosure.
- the AGVs 10 and 11 have map data and travel while recognizing which position they are currently traveling.
- Each of the AGVs 10 and 11 receives the travel route data transmitted from the travel management device 20, and travels in the space S according to the travel route data.
- the AGVs 10 and 11 move by driving a plurality of built-in motors so as to travel along the travel route and rotating wheels (drive wheels) connected to the motors.
- the travel route data is transmitted from the travel management device 20 to the AGVs 10 and 11 by radio.
- Communication between the AGV 10 and the travel management device 20 and communication between the AGV 11 and the travel management device 20 are performed using wireless access points 2a and 2b provided near the ceiling of the factory, respectively.
- the communication conforms to, for example, the Wi-Fi (registered trademark) standard.
- the number of wireless access points is arbitrary.
- FIG. 1 shows two AGVs 10 and 11, but the number of AGVs may be one, three, four, or five or more.
- the travel management device 20 generates travel route data for each AGV and transmits the data to each AGV.
- the AGV managed by the traveling management device 20 is, for example, an AGV registered in the traveling management device 20 by the user.
- In addition to the travel-route management described above, "management" can include operation management of each AGV, management of states such as running and stopped, management of the error history, and management of the history of traveled routes.
- the AGV 10 is described as an example. The same description as the following description can be applied to the AGV 11 and other AGVs not shown.
- the outline of the operation of the management system 100 is as follows.
- the management system 100 includes at least one AGV 10 and a travel management device 20, and manages the travel of the AGV 10 using the travel management device 20.
- the travel management device 20 includes a monitor 30 that is an image display device, a keyboard 40a and a mouse 40b that are input devices that receive user operations, and a PC 50.
- the keyboard 40a and / or the mouse 40b are devices that accept designation of a plurality of positions on the monitor 30 from the user.
- the keyboard 40a and the mouse 40b are collectively referred to as the “input device 40”.
- the PC 50 includes a CPU (Central Processing Unit) that is a signal processing circuit, an image processing circuit that generates an image to be displayed on the monitor 30, and a communication circuit.
- the monitor 30, the keyboard 40a and the mouse 40b, and the PC 50 are collectively referred to as “PC” or “computer”.
- the travel management apparatus 20 illustrated in FIG. 1 may be referred to as a “management PC” or a “management computer”.
- the management PC may be a laptop PC.
- the planar map image of the space S acquired via a data terminal may be displayed on the monitor 30.
- the user can specify the position on the plane map image of the space S as the travel position of the AGV 10.
- the image processing circuit generates an image including a plurality of marker objects indicating a plurality of positions.
- the marker object image is, for example, “ ⁇ ”. Specific examples will be described later.
- The marker object images may be added to the image on the monitor 30 one by one each time a position is designated, or they may be displayed on the monitor 30 all at once after the user has designated the plurality of positions and completed the designation operation.
- arranging a marker object image on a planar map image may be referred to as “arranging a marker”.
- the marker object attribute information (described later) may be referred to as “marker attribute information”.
- The CPU 21 converts the coordinates of each marker object on the image into coordinates in the space S in which the AGV 10 travels. It then converts a line segment or curve on the monitor 30 connecting the marker objects into a route in the space S and sets it as the travel route of the AGV 10.
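- As a concrete illustration, this conversion can be a simple scale-and-offset mapping between the map image and the floor of the space S. The Python sketch below assumes a hypothetical image resolution and origin position; the actual conversion rule held by the travel management device is not specified beyond the description above.

```python
# Minimal sketch (assumed values) of converting a marker object's pixel position
# on the planar map image into coordinates in the space S.

PIXELS_PER_METER = 50.0      # assumed resolution of the map image
ORIGIN_PX = (40.0, 760.0)    # assumed pixel position of the space-S origin
                             # (image y axis points down, space y axis points up)

def image_to_space(px: float, py: float) -> tuple[float, float]:
    """Convert a pixel position (px, py) on the map image to (x, y) in meters."""
    x = (px - ORIGIN_PX[0]) / PIXELS_PER_METER
    y = (ORIGIN_PX[1] - py) / PIXELS_PER_METER   # flip the image y axis
    return x, y

# A marker object placed at pixel (540, 260) maps to (10.0, 10.0) in the space S.
print(image_to_space(540.0, 260.0))
```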
- the communication circuit transmits data indicating the travel route to the AGV 10.
- The AGV 10 has a communication circuit and receives the data indicating the travel route from the travel management device 20.
- the AGV 10 further includes a plurality of motors, a plurality of drive wheels respectively connected to the plurality of motors, a drive device for each motor, and a control circuit.
- When the control circuit generates a control signal, for example a PWM signal, for causing the AGV 10 to travel along the travel route, the drive device independently controls the voltage applied to each motor in accordance with the PWM signal. As a result, each motor rotates and the AGV 10 moves along the travel route received from the travel management device 20.
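- As a rough illustration of the relationship between the control signal and the motor voltages, the sketch below maps commanded wheel speeds to PWM duty cycles for the two motor drive circuits. The maximum wheel speed and the clamping behavior are assumptions for illustration, not details taken from the description.

```python
# Illustrative sketch (not the patent's implementation) of turning commanded
# wheel speeds into PWM duty cycles for two motor drive circuits.

MAX_WHEEL_SPEED = 1.0   # assumed wheel speed in m/s at 100% duty

def _clamp(duty: float) -> float:
    return max(0.0, min(1.0, duty))

def speeds_to_pwm(left_mps: float, right_mps: float) -> tuple[float, float]:
    """Map commanded wheel speeds [m/s] to PWM duty cycles in the range 0..1."""
    return _clamp(left_mps / MAX_WHEEL_SPEED), _clamp(right_mps / MAX_WHEEL_SPEED)

print(speeds_to_pwm(0.5, 0.5))   # equal duties -> straight travel
print(speeds_to_pwm(0.2, 0.5))   # right wheel faster -> gentle left turn
```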
- the travel management device 20 may be connected to the external system 5 so as to be communicable.
- the travel management device 20 can perform serial communication with the external system 5 in accordance with the Ethernet (registered trademark) standard.
- the travel management device 20 may communicate with the external system 5 via the PLC communication terminal 6.
- serial communication based on the Ethernet (registered trademark) standard is performed between the travel management device 20 and the PLC communication terminal 6, and serial communication using a power line is performed between the PLC communication terminal 6 and the external system 5. May be performed.
- FIG. 1 shows an example in which the travel management device 20 manages the travel of the AGV 10 by transmitting the travel route to the AGV 10.
- the user may directly operate the AGV 10 using a communication terminal such as a tablet computer.
- FIG. 2 shows an example in which the user 3 runs the AGV 10 using the tablet computer 4.
- The tablet computer 4 and the AGV 10 are connected, for example, on a one-to-one basis, and may communicate in conformity with the Bluetooth (registered trademark) standard, or may communicate in conformity with the Wi-Fi (registered trademark) standard via the wireless access points 2a and 2b.
- When the AGV 10 is directly operated using the tablet computer 4, the AGV 10 travels according to the operation of the user 3 even if it has received travel route data from the travel management device 20. When the connection with the tablet computer 4 is disconnected, the AGV 10 can travel according to the travel route data received from the travel management device 20.
- FIG. 3 is an external view of an exemplary AGV 10 according to the present embodiment.
- the AGV 10 includes four wheels 11a to 11d, a frame 12, a transport table 13, a travel control device 14, and a laser range finder 15.
- The AGV 10 also has a plurality of motors, which are not shown in FIG. 3. FIG. 3 shows the front wheel 11a, the rear wheel 11b, and the rear wheel 11c, but the front wheel 11d is hidden behind the frame 12 and is not clearly visible.
- the traveling control device 14 is a device that controls the operation of the AGV 10, and mainly includes an integrated circuit including a microcomputer (described later), electronic components, and a board on which they are mounted.
- The travel control device 14 transmits and receives data to and from the travel management device 20 and performs preprocessing computations.
- the laser range finder 15 is an optical device that measures the distance to a target by, for example, irradiating the target with infrared laser light 15a and detecting the reflected light of the laser light 15a.
- The laser range finder 15 of the AGV 10 emits a pulsed laser beam 15a while changing its direction in steps of 0.25 degrees over a range of, for example, 135 degrees to the left and right (270 degrees in total) of the front of the AGV 10, and detects the reflected light of each laser beam 15a. Data on the distance to the reflection point can thereby be obtained for a total of 1080 directions, one every 0.25 degrees.
- the arrangement of objects around the AGV can be obtained from the position and orientation of the AGV 10 and the scan result of the laser range finder 15.
- the position and posture of a moving object are called poses.
- the position and orientation of the moving body in the two-dimensional plane are expressed by position coordinates (x, y) in the XY orthogonal coordinate system and an angle ⁇ with respect to the X axis.
- the position and posture of the AGV 10, that is, the pose (x, y, ⁇ ) may be simply referred to as “position” hereinafter.
- the position of the reflection point seen from the radiation position of the laser beam 15a can be expressed using polar coordinates determined by the angle and the distance.
- the laser range finder 15 outputs sensor data expressed in polar coordinates.
- the laser range finder 15 may convert the position expressed in polar coordinates into orthogonal coordinates and output the result.
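- Using the scan parameters given above (a 270-degree field of view scanned in 0.25-degree steps), the polar-to-Cartesian conversion can be sketched as follows; the range readings in the example are synthetic.

```python
import math

# Sketch of converting one laser-range-finder scan (270-degree field of view,
# 0.25-degree steps, as described above) from polar to Cartesian coordinates
# in the sensor frame. The range readings used here are synthetic.

FOV_DEG = 270.0
STEP_DEG = 0.25
NUM_BEAMS = int(FOV_DEG / STEP_DEG) + 1   # 1081 readings covering 1080 steps

def scan_to_points(ranges_m):
    """ranges_m: distances ordered from -135 deg (right) to +135 deg (left)."""
    points = []
    for i, r in enumerate(ranges_m):
        angle = math.radians(-FOV_DEG / 2.0 + i * STEP_DEG)
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

synthetic_scan = [2.0] * NUM_BEAMS        # pretend every beam returns 2 m
print(scan_to_points(synthetic_scan)[0])  # first beam, 135 deg to the right
```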
- Examples of objects that can be detected by the laser range finder 15 are people, luggage, shelves, and walls.
- the laser range finder 15 is an example of an external sensor for sensing the surrounding space and acquiring sensor data.
- Other examples of such an external sensor include an image sensor and an ultrasonic sensor.
- the traveling control device 14 can estimate its current position by comparing the measurement result of the laser range finder 15 with the map data held by itself.
- the map data may be acquired by the AGV 10 itself using SLAM (Simultaneous Localization and Mapping) technology.
- FIG. 4 shows the hardware configuration of AGV10.
- FIG. 4 also shows a specific configuration of the travel control device 14.
- the AGV 10 includes a travel control device 14, a laser range finder 15, two motors 16 a and 16 b, and a drive device 17.
- the traveling control device 14 includes a microcomputer 14a, a memory 14b, a storage device 14c, a communication circuit 14d, and a positioning device 14e.
- the microcomputer 14a, the memory 14b, the storage device 14c, the communication circuit 14d, and the positioning device 14e are connected by a communication bus 14f and can exchange data with each other.
- the laser range finder 15 is also connected to the communication bus 14f via a communication interface (not shown), and transmits measurement data as a measurement result to the microcomputer 14a, the positioning device 14e, and / or the memory 14b.
- the microcomputer 14a is a processor or a control circuit (computer) that performs calculations for controlling the entire AGV 10 including the travel control device 14.
- the microcomputer 14a is a semiconductor integrated circuit.
- the microcomputer 14a transmits a PWM (Pulse Width Modulation) signal, which is a control signal, to the drive device 17 to control the drive device 17 and adjust the voltage applied to the motor.
- the memory 14b is a volatile storage device that stores a computer program executed by the microcomputer 14a.
- the memory 14b can also be used as a work memory when the microcomputer 14a and the positioning device 14e perform calculations.
- the storage device 14c is a non-volatile semiconductor memory device that stores map data.
- the storage device 14c may be a magnetic recording medium typified by a hard disk or an optical recording medium typified by an optical disk.
- the storage device 14c may include a head device for writing and / or reading data on any recording medium and a control device for the head device.
- the map data is acquired and stored in the storage device 14c prior to the start of traveling of the AGV 10.
- the communication circuit 14d is a wireless communication circuit that performs wireless communication based on, for example, Bluetooth (registered trademark) and / or Wi-Fi (registered trademark) standards. Each standard includes a wireless communication standard using a frequency of 2.4 GHz band.
- The positioning device 14e receives the sensor data from the laser range finder 15 and reads the map data stored in the storage device 14c. By comparing (matching) local map data created from the scan result of the laser range finder 15 with wider-range environment map data, the positioning device 14e identifies the self-position (x, y, θ) on the environment map. The positioning device 14e also generates a "reliability" indicating the degree to which the local map data matches the environment map data. The self-position (x, y, θ) and reliability data can be transmitted from the AGV 10 to the travel management device 20 and/or the tablet computer 4.
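- A heavily simplified sketch of this matching idea is shown below: candidate poses are scored by how many scan points, transformed by the pose, fall on occupied cells of a grid map, and the best score plays the role of the "reliability". The grid representation, resolution, and candidate poses are illustrative assumptions, not the positioning device's actual algorithm.

```python
import math

# Toy scan-to-map matching: score a candidate pose (x, y, theta) by the fraction
# of scan points that land on occupied cells of an occupancy-grid map.

RESOLUTION = 0.1   # assumed meters per grid cell

def match_score(scan_points, occupied_cells, pose):
    """scan_points: (x, y) in the sensor frame; occupied_cells: set of (col, row)."""
    x0, y0, theta = pose
    hits = 0
    for sx, sy in scan_points:
        wx = x0 + sx * math.cos(theta) - sy * math.sin(theta)
        wy = y0 + sx * math.sin(theta) + sy * math.cos(theta)
        cell = (int(round(wx / RESOLUTION)), int(round(wy / RESOLUTION)))
        hits += cell in occupied_cells
    return hits / len(scan_points)

def localize(scan_points, occupied_cells, candidate_poses):
    """Return the best-matching pose and its score (the 'reliability')."""
    return max(((p, match_score(scan_points, occupied_cells, p))
                for p in candidate_poses), key=lambda t: t[1])

wall = {(20, r) for r in range(-5, 6)}              # toy map: a wall at x = 2 m
scan = [(2.0, 0.0), (2.0, 0.2), (2.0, -0.2)]        # toy scan seen from the origin
print(localize(scan, wall, [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]))
```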
- the tablet computer 4 receives each data of its own position (x, y, ⁇ ) and reliability and displays them on a built-in display device.
- FIG. 5 shows the self-position (x, y, ⁇ ) and reliability data of the AGV 10 displayed in the screen area 7 of the tablet computer 4.
- the microcomputer 14a and the positioning device 14e are separate components, but this is an example. It may be a single chip circuit or a semiconductor integrated circuit capable of independently performing the operations of the microcomputer 14a and the positioning device 14e.
- FIG. 4 shows a chip circuit 14g including the microcomputer 14a and the positioning device 14e.
- the microcomputer 14a, the positioning device 14e, and / or the chip circuit 14g may be referred to as a computer or a signal processing circuit.
- Hereinafter, a case in which the microcomputer 14a and the positioning device 14e are provided separately will be described.
- the two motors 16a and 16b are attached to the two wheels 11b and 11c, respectively, and rotate each wheel. That is, the two wheels 11b and 11c are drive wheels, respectively.
- the drive device 17 has motor drive circuits 17a and 17b for adjusting the voltage applied to each of the two motors 16a and 16b.
- Each of the motor drive circuits 17a and 17b is a so-called inverter circuit, and the current applied to each motor is turned on or off by the PWM signal transmitted from the microcomputer 14a, thereby adjusting the voltage applied to the motor.
- FIG. 6 shows a hardware configuration of the travel management device 20.
- the travel management device 20 includes the monitor 30, the input device 40 such as the keyboard 40a and the mouse 40b, and the PC 50.
- the PC 50 includes a CPU 21, a memory 22, a marker database (marker DB) 23, a communication circuit 24, an AGV database (AGVDB) 25, and an image processing circuit 26.
- the CPU 21, the memory 22, the marker DB 23, the communication circuit 24, and the image processing circuit 26 are connected by a communication bus 27 and can exchange data with each other.
- the CPU 21 is a signal processing circuit (computer) that controls the operation of the travel management device 20.
- the CPU 21 is a semiconductor integrated circuit.
- the memory 22 is a volatile storage device that stores a computer program executed by the CPU 21.
- the memory 22 can also be used as a work memory when the CPU 21 performs calculations.
- the computer program may be stored in a nonvolatile storage device (not shown) such as an EEPROM.
- the CPU 21 reads out the computer program from the non-volatile storage device when the PC 50 is started up, develops it in the memory 22 and executes it.
- the marker DB 23 stores information on the position on the image designated by the user.
- a marker object is arranged at a position on the image designated by the user 3.
- the marker DB 23 stores various data related to the marker object.
- The marker DB 23 holds a rule for associating positions on the image with coordinates in the space S in which the AGV 10 travels. This rule may instead be held in the memory 22.
- the marker DB 23 may be constructed on a non-volatile semiconductor memory, or may be constructed on a magnetic recording medium represented by a hard disk or an optical recording medium represented by an optical disk.
- the communication circuit 24 performs wired communication based on, for example, the Ethernet (registered trademark) standard.
- the communication circuit 24 is connected to the wireless access points 2a, 2b and the like by wire, and can communicate with the AGV 10 via the wireless access points 2a, 2b and the like.
- the communication circuit 24 receives data to be transmitted to the AGV 10 from the CPU 21 via the bus 27.
- the communication circuit 24 transmits data (notification) received from the AGV 10 to the CPU 21 and / or the memory 22 via the bus 27.
- the AGVDB 25 stores state data of each AGV 10.
- the AGVDB 25 can be updated by receiving data from each AGV 10, and can also be updated when a travel route is generated by the CPU 21.
- the image processing circuit 26 is a circuit that generates an image to be displayed on the monitor 30.
- the image processing circuit 26 operates exclusively when the user 3 operates the travel management device 20.
- the monitor 30 and / or the input device 40 may be integrated with the travel management device 20. Further, the CPU 21 may perform the processing of the image processing circuit 26.
- the marker DB 23 and the AGVDB 25 may be data stored in a storage device or a combination of a computer program that functions as a database server and data.
- the marker DB 23 and the AGVDB 25 may be a combination of hardware that functions as a database server and data.
- the CPU 21 of the travel management device 20 operates in accordance with a user operation by executing a computer program stored in the memory 22, generates an image described below, and displays the image on the monitor 30.
- FIG. 7 shows an example of an image 60 displayed on the monitor 30 when the travel management device 20 is activated.
- the image 60 has a list area 60a, a state display area 60b, and an operation monitor area 60c.
- In the list area 60a, the AGVs 10 that have been registered by the user 3 and placed under the management of the management system 100 are displayed.
- In the status display area 60b, the status of the selected AGV 10 is displayed.
- Examples of the "status" include the number identifying the currently set travel route, the remaining battery level, whether the AGV 10 is currently traveling or stopped, and whether an error has occurred.
- In the operation monitor area 60c, a planar map image of the space S in which the AGV 10 travels is displayed.
- The travel management device 20 acquires the image via a data terminal (not shown) and displays it by incorporating it into the image 60.
- An object 10obj indicating the position of each AGV 10 listed in the list area 60a is displayed on the planar map image, allowing the user 3 to grasp where each AGV 10 currently is in the space S.
- the image 60 further includes a plurality of button objects 61a to 61c and 63a to 63d.
- the CPU 21 executes a process associated with the button object, and the image processing circuit 26 generates and displays a new image indicating the result of the process.
- the selection of the button object is realized, for example, when the user moves the cursor onto the button object using the mouse 40b and clicks the button of the mouse 40b. Alternatively, this is realized by the user moving the cursor on the button object using the up / down / left / right keys of the keyboard 40a and pressing the enter button of the keyboard 40a.
- the button objects 61a to 61c are provided to start, stop, and emergency stop the management system 100, respectively.
- An area 62 of the image 60 displays the current state of the management system 100. In the illustrated example, the system is shown to be currently operating.
- the button objects 63a to 63d are provided for displaying the error history of the selected AGV 10, displaying the route history, editing the course that is the travel route, and setting the operation, respectively.
- the operation of the travel management apparatus 20 when the button objects 63a to 63d are selected will be described.
- FIG. 8 shows an example of an image 70 displayed on the monitor 30 after the button object 63a (FIG. 7) is selected.
- an error history of the selected AGV 10 is displayed.
- The history shows, for example, the time at which the error occurred (year/month/day/hour/minute/second), the coordinates of the location where the error occurred, the error code, and the specific content of the error.
- the error information is transmitted from the AGV 10 to the travel management device 20 and stored in the AGVDB 25 of the travel management device 20.
- FIG. 9 shows an example of an image 80 displayed on the monitor 30 after the button object 63b (FIG. 7) is selected.
- the traveling history of the selected AGV 10 is displayed.
- The history shows, for example, the time (year/month/day/hour/minute/second) at which the AGV passed through the position in the actual space S corresponding to each marker object designated by the user 3, together with the name given to the corresponding marker object.
- FIG. 10 shows an example of an image 110 displayed on the monitor 30 after the button object 63c (FIG. 7) is selected.
- a planar map image 112 of the space S in which the AGV 10 travels is displayed in the image 110.
- the user 3 can determine the travel route of the selected AGV 10 by designating a position on the planar map image 112 using the input device 40.
- three positions 114a, 114b, and 114c designated by the user 3 are indicated by “X”.
- the user 3 can correct the position indicated by “X” as necessary.
- The user 3 designates the positions 114a, 114b, and 114c in order on the planar map image 112. Thereafter, when the user 3 selects a button object (not shown) indicating that position designation is finished, the CPU 21 sends the coordinates of the positions 114a, 114b, and 114c, together with an instruction to display marker objects at those coordinates, to the image processing circuit 26. On receiving the instruction, the image processing circuit 26 generates an image in which the marker objects are displayed at the designated coordinates.
- FIG. 11 shows an example of the marker objects 116a, 116b, and 116c displayed at the positions 114a, 114b, and 114c selected by the user 3, respectively.
- the shape of each marker object is “ ⁇ ”, but the shape is arbitrary.
- The CPU 21 determines the travel route so that the AGV 10 passes, in the order specified by the user 3, the coordinate positions in the space S corresponding to the marker objects 116a, 116b, and 116c. Specifically, the CPU 21 determines a route from the position 114a toward the position 114b and, once the AGV 10 reaches the position 114b, toward the position 114c. The path may be a straight line or a curve.
- the CPU 21 converts the position and route where each marker object is set on the image into coordinates and a travel route in the space S, respectively. For convenience, the coordinates in the space S converted from the position 116a are represented as “coordinates A”.
- Travel route data may be described according to a predetermined rule. For example, it is assumed that the user 3 designates another marker object following a certain marker object. For convenience, the previously designated marker object is called a “first marker object”, and the next designated marker object is called a “second marker object”.
- the travel route can be determined by “connection information” indicating the second marker object to be directed next to the first marker object and “trajectory information” indicating the shape of the trajectory from the first marker object to the second marker object.
- Information that determines the traveling conditions of the AGV 10 is referred to as "attribute information".
- the connection information and the trajectory information described above may be included in part of the attribute information of the first marker object.
- each position of the first marker object and the second marker object on the image is converted into “first coordinates” and “second coordinates” which are coordinates in the space S, respectively.
- The attribute information of the first marker object includes the pair of X-axis and Y-axis coordinate values that specify the first coordinates, and the attribute information of the second marker object includes the pair of X-axis and Y-axis coordinate values that specify the second coordinates.
- the names “first coordinate” and “second coordinate” are also given for convenience.
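- One possible way to represent this marker attribute information and the route built from it is sketched below. The field names and the dataclass layout are assumptions for illustration; the description specifies only what information is carried, not how it is encoded.

```python
from dataclasses import dataclass
from typing import Optional

# Sketch of a possible representation of marker attribute information and a
# travel route built from it. Field names are illustrative assumptions.

@dataclass
class Marker:
    name: str
    x: float                     # converted coordinate in the space S [m]
    y: float
    heading_deg: float           # traveling direction toward the next target position
    orientation: str             # "front" (forward) or "back" (backward)
    next_marker: Optional[str]   # connection information: name of the next marker
    trajectory: str = "line"     # trajectory information: "line" or "arc"
    speed: float = 0.5           # speed toward the next marker [m/s]

# A route through three markers, corresponding to marker objects 116a-116c.
route = [
    Marker("M1", 2.0, 1.0, 0.0, "front", next_marker="M2"),
    Marker("M2", 8.0, 1.0, 90.0, "front", next_marker="M3", trajectory="arc"),
    Marker("M3", 8.0, 6.0, 90.0, "front", next_marker=None),
]
```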
- the travel route data is transmitted to each AGV 10 by the communication circuit 24 via the wireless access points 2a, 2b and the like.
- FIG. 12 shows an example of the first image 120 displayed on the monitor 30 after the button object 63d (FIG. 7) is selected.
- the illustrated example shows a list of attribute information of a marker object.
- the attribute information is exemplified as follows.
- the coordinates of the space S obtained by converting the coordinates of the marker object specified next to the marker object are referred to as “the coordinates of the next target position” (hereinafter the same in this specification).
- the coordinates (x, y) in the space S converted from the coordinates of the marker object
- the angle (θ) indicating the traveling direction of the AGV 10 toward the coordinates of the next target position
- the orientation of the AGV 10 ("front" indicating forward, "back" indicating backward)
- information (e.g., a name) indicating the marker object designated next after the marker object
- the speed of the AGV 10 toward the next marker object
- the CPU 21 sets or changes the traveling condition for each AGV 10 for each marker object set as shown in FIG. 11 according to the operation of the user 3.
- FIG. 13 shows an example of the second image 130 displayed on the monitor 30 after the button object 63d (FIG. 7) is selected.
- the second image 130 can be displayed when a marker object indicating the travel start position of the AGV 10 and a marker object indicating the travel end position of the AGV 10 are set.
- The second image 130 includes three regions 130a, 130b, and 130c. The regions 130a and 130b show the attribute information of the marker objects indicating the travel start position and the travel end position, respectively: specifically, the name of each marker object and the coordinates (x, y) in the space S converted from the coordinates of each marker object.
- the area 130c shows detailed attribute information related to the travel route.
- the attribute information is exemplified as follows.
- the ID or name identifying the AGV 10 to which the traveling conditions apply
- the angle indicating the traveling direction of the AGV 10 toward the coordinates of the next target position
- the orientation of the AGV 10 ("front" indicating forward, "back" indicating backward)
- the speed of the AGV 10 toward the coordinates of the next target position
- the shape of the trajectory of the travel route (straight line, arc)
- the acceleration time and deceleration time of the AGV 10
- the in-position range
- the avoidance direction (right or left), the avoidance distance, and the length of time to attempt avoidance when an obstacle is detected
- The "in-position range" means a range (area) within which the AGV 10 can be regarded as having arrived even when it has not exactly reached the coordinates of the next target position.
- the size of the area can be set for each next target position. For example, if the area is a circular area centered on the next target position, the user 3 can set the value of the radius of the circular area as attribute information.
- the unit is, for example, millimeter.
- A charging condition that determines whether to charge in accordance with the remaining battery charge, an entry prohibition condition that sets an area into which the AGV 10 is prohibited from entering, and the like may also be set as attribute information.
- One example of a method for detecting whether the AGV 10 has reached the region is to use the output of the positioning device 14e (FIG. 4) provided in the AGV 10: the AGV 10 may collate the output of the positioning device 14e with the map data, estimate the best-matching position on the map data as its self-position, and determine whether the estimated self-position is within the area.
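- A minimal sketch of this arrival test, assuming a circular in-position range whose radius is given in millimeters as described above, might look as follows.

```python
import math

# Sketch of the arrival test described above: the AGV is regarded as having
# reached the next target position when its estimated self-position falls
# inside a circular in-position range around that position. Values are examples.

def reached(self_position, target, in_position_range_mm: float) -> bool:
    """self_position, target: (x, y) in meters; radius given in millimeters."""
    dx = self_position[0] - target[0]
    dy = self_position[1] - target[1]
    return math.hypot(dx, dy) <= in_position_range_mm / 1000.0

print(reached((3.02, 1.01), (3.0, 1.0), in_position_range_mm=50))   # True
print(reached((3.20, 1.01), (3.0, 1.0), in_position_range_mm=50))   # False
```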
- The user 3 can change the contents of the regions 130a, 130b, and 130c shown in FIG. 13. The CPU 21 stores the changed attribute information in the AGVDB 25 (FIG. 6) and sets or changes the traveling conditions for each AGV 10.
- FIG. 14A shows a moving path of the AGV 10 when traveling straight.
- The AGV 10 starts traveling from the position Mn and, after reaching the position Mn+1, can continue to move in a straight line to the next position Mn+2.
- FIG. 14B shows a movement path of the AGV 10 that makes a left turn at the position M n + 1 and moves toward the position M n + 2 .
- The AGV 10 starts traveling from the position Mn. At the position Mn+1, it rotates the motor on the right side with respect to the traveling direction while stopping the motor on the left side.
- After the AGV 10 has rotated counterclockwise by the angle θ on the spot, all the motors rotate at the same speed and the AGV 10 goes straight toward the position Mn+2.
- FIG. 14C shows a movement path of the AGV 10 when moving in a circular arc shape from the position M n + 1 to the position M n + 2 .
- After reaching the position Mn+1, the AGV 10 makes the rotation speed of the outer motor higher than that of the inner motor. As a result, the AGV 10 can move along an arc-shaped path toward the next position Mn+2.
- the driving device 17 generates a difference in relative rotational speed between the motors 16a and 16b according to the control signal, so that the AGV 10 can turn or rotate in a direction in which the rotational speed is relatively slow.
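- The three movement patterns of FIGS. 14A to 14C follow from standard differential-drive kinematics for two independently driven wheels. The sketch below assumes a hypothetical track width; it is not taken from the description.

```python
# Sketch of the differential-drive relationships behind FIGS. 14A-14C, assuming
# a track width (distance between the drive wheels 11b and 11c).

TRACK_WIDTH = 0.4   # assumed distance between the two drive wheels [m]

def wheel_speeds(linear: float, angular: float) -> tuple[float, float]:
    """Return (left, right) wheel speeds [m/s] for body speed `linear` [m/s]
    and yaw rate `angular` [rad/s] (positive = counterclockwise)."""
    left = linear - angular * TRACK_WIDTH / 2
    right = linear + angular * TRACK_WIDTH / 2
    return left, right

print(wheel_speeds(0.5, 0.0))   # FIG. 14A: equal wheel speeds -> straight line
print(wheel_speeds(0.0, 1.0))   # FIG. 14B: opposite speeds -> turn in place (counterclockwise)
print(wheel_speeds(0.5, 0.5))   # FIG. 14C: outer wheel faster -> arc-shaped path
```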
- FIG. 15 shows each procedure of the process of the travel management device 20 and the process of the AGV 10 that travels in response to the result of the process.
- the right column of FIG. 15 shows the procedure of processing executed by the CPU 21 of the travel management device 20, and the left column shows the procedure of processing executed by the microcomputer 14a of the AGV 10.
- In step S1, the CPU 21 receives designation of a passing position of the AGV 10 via the input device 40.
- In step S2, the CPU 21 instructs the image processing circuit 26 to place a marker object at the designated position.
- In step S3, the CPU 21 repeats steps S1 and S2 until position designation is completed. When the designation of positions ends, the process proceeds to step S4.
- In step S4, the CPU 21 converts the coordinates on the image at which the marker objects are placed into coordinates in the space in which the AGV travels.
- In step S5, the CPU 21 converts the virtual travel route passing through the plurality of marker objects into travel route data for the space in which the AGV travels.
- In step S6, the CPU 21 transmits the travel route data to the AGV 10 via the communication circuit 24.
- In step S11, the microcomputer 14a of the AGV 10 receives the travel route data.
- In step S12, the microcomputer 14a generates a control signal (PWM signal) according to the travel route data.
- In step S13, the microcomputer 14a causes the drive device 17 to independently control the voltage applied to each motor in accordance with the control signal and rotate each drive wheel (wheels 11b and 11c). The AGV 10 can thereby travel along the travel route.
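- Tying the two columns of FIG. 15 together, the overall flow can be sketched as follows. Only the ordering of steps S1 to S6 and S11 to S13 comes from the description; the function names and the stand-ins for the hardware-dependent parts are illustrative.

```python
# Compact sketch of the flow of FIG. 15, with the two sides reduced to plain
# functions. All bodies are placeholders; only the step ordering is from the text.

def travel_management_side(designated_pixels, image_to_space):
    markers = [image_to_space(px, py) for px, py in designated_pixels]  # S1-S4
    travel_route = list(zip(markers, markers[1:]))                      # S5: connect in order
    return travel_route                                                  # S6: transmit to the AGV

def agv_side(travel_route, generate_pwm, drive):
    for segment in travel_route:          # S11: received route, segment by segment
        pwm = generate_pwm(segment)       # S12: control signal from the route data
        drive(pwm)                        # S13: drive device applies motor voltages

# Example wiring with trivial stand-ins for the hardware-dependent parts:
route = travel_management_side([(100, 100), (300, 100), (300, 400)],
                               image_to_space=lambda px, py: (px / 50, py / 50))
agv_side(route, generate_pwm=lambda seg: f"PWM toward {seg[1]}", drive=print)
```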
- the mobile body management system of the present disclosure can be widely used for controlling the travel of a mobile body that moves indoors or outdoors.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Automation & Control Theory (AREA)
- Electromagnetism (AREA)
- General Engineering & Computer Science (AREA)
- Optics & Photonics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
Description
- Coordinates (x, y) in the space S converted from the coordinates of the marker object
- Angle (θ) indicating the traveling direction of the AGV 10 toward the coordinates of the next target position
- Orientation of the AGV 10 ("front" indicating forward, "back" indicating backward)
- Information (e.g., a name) indicating the marker object designated next after the marker object
- Speed of the AGV 10 toward the next marker object
- Angle indicating the traveling direction of the AGV 10 toward the coordinates of the next target position
- Orientation of the AGV 10 ("front" indicating forward, "back" indicating backward)
- Speed of the AGV 10 toward the coordinates of the next target position
- Shape of the trajectory of the travel route (straight line, arc)
- Acceleration time and deceleration time of the AGV 10
- In-position range
- Avoidance direction (right or left), avoidance distance, and length of time for which avoidance is attempted when an obstacle is detected

Reference signs list:
3 User
4 Tablet computer
10 AGV (mobile body)
20 Travel management device
21 CPU (computer)
22 Memory
23 Marker database (marker DB)
24 Communication circuit
25 AGV database (AGVDB)
26 Image processing circuit
27 Communication bus
30 Monitor
40 Input device
40a Keyboard
40b Mouse
50 PC (management computer)
100 Management system
Claims (24)
- 1. A management system comprising at least one mobile body and a travel management device, the travel of the mobile body being managed using the travel management device, wherein the mobile body comprises: a plurality of motors; a plurality of drive wheels respectively connected to the plurality of motors; a drive device that rotates each of the plurality of drive wheels by independently controlling a voltage applied to each motor in accordance with a control signal; a first communication circuit that communicates with the travel management device and receives data indicating a travel route; and a control circuit that generates the control signal for causing the mobile body to travel along the travel route, and the travel management device comprises: an image display device; an input device that accepts a user's operation; an image processing circuit that generates an image to be displayed on the image display device and that, when the input device accepts designation of a plurality of positions on the image display device from the user, generates an image including a plurality of marker objects indicating the plurality of positions; a signal processing circuit that converts the coordinates of each marker object on the image into coordinates in a space in which the mobile body travels, and sets a line segment or a curve on the image display device connecting the plurality of marker objects as the travel route in the space; and a second communication circuit that transmits data indicating the coordinates in the space and the travel route to the mobile body.
- 2. The management system according to claim 1, wherein the image processing circuit further generates a course setting image showing a list of attribute information for each of the plurality of marker objects, and in the list, attribute information of a first marker object among the plurality of marker objects includes connection information indicating a second marker object to which the line segment or the curve is connected.
- 3. The management system according to claim 2, wherein the attribute information of the first marker object includes coordinates in the space in which the mobile body travels, converted by the signal processing circuit.
- 4. The management system according to claim 3, wherein the signal processing circuit converts the coordinates of the first marker object into first coordinates in the space in which the mobile body travels, and the attribute information of the first marker object includes a value of the first coordinates.
- 5. The management system according to claim 4, wherein the attribute information of the first marker object includes a value of an angle indicating a traveling direction of the mobile body from the first coordinates.
- 6. The management system according to claim 4, wherein the attribute information of the first marker object includes a value of a traveling speed of the mobile body from the first coordinates.
- 7. The management system according to claim 1, wherein the plurality of marker objects include a first marker object and a second marker object, the signal processing circuit converts the coordinates of the first marker object into first coordinates in the space and converts the coordinates of the second marker object into second coordinates in the space, and, in a case where the mobile body travels from the first coordinates to the second coordinates, the image processing circuit further generates a course setting image including attribute information of the first marker object including a value of the first coordinates and attribute information of the second marker object including a value of the second coordinates.
- 8. The management system according to claim 7, wherein the course setting image further includes attribute information relating to the travel route.
- 9. The management system according to claim 8, wherein the attribute information relating to the travel route includes a value of an angle indicating a traveling direction of the mobile body.
- 10. The management system according to claim 8 or 9, wherein the attribute information relating to the travel route includes a value of a traveling speed of the mobile body.
- 11. The management system according to any one of claims 8 to 10, wherein the attribute information relating to the travel route includes values of an acceleration time and a deceleration time of the mobile body.
- 12. The management system according to any one of claims 8 to 11, wherein the attribute information relating to the travel route includes information defining a region determined in accordance with the second coordinates.
- 13. The management system according to any one of claims 8 to 12, wherein the attribute information relating to the travel route includes values of an avoidance direction, an avoidance distance, and a time for which avoidance is attempted when the mobile body encounters an obstacle.
- 14. The management system according to any one of claims 1 to 13, wherein the signal processing circuit sets an arc on the image display device connecting the plurality of marker objects as the travel route in the space.
- 15. The management system according to any one of claims 1 to 14, wherein, when the at least one mobile body is a plurality of mobile bodies, the input device of the travel management device accepts the user's operation for each mobile body, and the signal processing circuit sets the travel route for each mobile body.
- 16. The management system according to any one of claims 1 to 15, wherein the signal processing circuit sets traveling and stopping for each mobile body.
- 17. The management system according to any one of claims 1 to 16, further comprising a data terminal that acquires data of a map image of the space, wherein the image processing circuit displays the image including the plurality of marker objects and the map image of the space.
- 18. The management system according to any one of claims 1 to 17, wherein, when the travel route is a curve, the drive device turns the mobile body by causing a difference in relative rotation speed between the motors.
- 19. The management system according to any one of claims 1 to 18, wherein the mobile body further comprises a positioning device that estimates its own position and outputs values of the estimated coordinates, and the first communication circuit transmits the values of the coordinates.
- 20. The management system according to any one of claims 1 to 19, wherein the image further includes a button object for stopping management of the travel of the mobile body, and, when the input device accepts designation of the button object from the user, the signal processing circuit generates a signal for stopping the travel of the mobile body and the second communication circuit transmits the signal to the mobile body.
- 21. The management system according to any one of claims 1 to 20, wherein the image processing circuit further generates an image including one of an error history of the mobile body and a history of the travel route.
- 22. A method for managing travel of at least one mobile body using a travel management device, wherein the mobile body comprises: a plurality of motors; a plurality of drive wheels respectively connected to the plurality of motors; a drive device that rotates each of the plurality of drive wheels by independently controlling a voltage applied to each motor in accordance with a control signal; a first communication circuit that communicates with the travel management device and receives data indicating a travel route; and a control circuit that generates the control signal for causing the mobile body to travel along the travel route, and the travel management device comprises an image display device, an input device, an image processing circuit, a signal processing circuit that is a computer, and a second communication circuit, the method comprising, by the computer: accepting designation of a plurality of positions on the image display device from a user via the input device; causing the image processing circuit to generate an image including a plurality of marker objects indicating the plurality of positions; converting the coordinates of each marker object on the image into coordinates in a space in which the mobile body travels; setting a line segment or a curve on the image display device connecting the plurality of marker objects as the travel route in the space; and transmitting data indicating the coordinates in the space and the travel route to the mobile body via the second communication circuit.
- 23. A computer program executed by a computer mounted on a travel management device used in a management system that manages travel of at least one mobile body, wherein the mobile body comprises: a plurality of motors; a plurality of drive wheels respectively connected to the plurality of motors; a drive device that rotates each of the plurality of drive wheels by independently controlling a voltage applied to each motor in accordance with a control signal; a first communication circuit that communicates with the travel management device and receives data indicating a travel route; and a control circuit that generates the control signal for causing the mobile body to travel along the travel route, and the travel management device comprises an image display device, an input device, an image processing circuit, the computer, and a second communication circuit, the computer program causing the computer to: accept designation of a plurality of positions on the image display device from a user using the input device; generate, using the image processing circuit, an image including a plurality of marker objects indicating the plurality of positions; convert the coordinates of each marker object on the image into coordinates in a space in which the mobile body travels; set a line segment or a curve on the image display device connecting the plurality of marker objects as the travel route in the space; and transmit data indicating the coordinates in the space and the travel route to the mobile body using the second communication circuit.
- 24. A management system comprising a plurality of mobile bodies and a management computer, the travel of each mobile body being managed using the management computer, wherein each mobile body travels on a plurality of drive wheels and is capable of communicating with the management computer, the management computer is capable of creating a travel route for a mobile body from a plurality of markers arranged on a map image, and, when the travel route passes a second marker next after a first marker among the plurality of markers, the first marker has, as attribute information, information on the coordinate position of the first marker and information specifying the second marker to be passed next.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201780025890.3A CN109074079B (zh) | 2016-04-27 | 2017-04-26 | 移动体的管理系统、方法以及计算机程序 |
EP17789574.5A EP3451103A4 (en) | 2016-04-27 | 2017-04-26 | SYSTEM, METHOD AND COMPUTER PROGRAM FOR MOBILE BODY MANAGEMENT |
US16/095,000 US10866587B2 (en) | 2016-04-27 | 2017-04-26 | System, method, and computer program for mobile body management |
JP2018514651A JP6769659B2 (ja) | 2016-04-27 | 2017-04-26 | 移動体の管理システム、方法、およびコンピュータプログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662328177P | 2016-04-27 | 2016-04-27 | |
US62/328,177 | 2016-04-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017188292A1 true WO2017188292A1 (ja) | 2017-11-02 |
Family
ID=60159659
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/016460 WO2017188292A1 (ja) | 2016-04-27 | 2017-04-26 | 移動体の管理システム、方法、およびコンピュータプログラム |
Country Status (6)
Country | Link |
---|---|
US (1) | US10866587B2 (ja) |
EP (1) | EP3451103A4 (ja) |
JP (1) | JP6769659B2 (ja) |
CN (1) | CN109074079B (ja) |
TW (1) | TWI703833B (ja) |
WO (1) | WO2017188292A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110320903A (zh) * | 2018-03-30 | 2019-10-11 | 日本电产新宝株式会社 | 计算机系统及计算机可读取记录媒体 |
JP2020123196A (ja) * | 2019-01-31 | 2020-08-13 | 株式会社豊田自動織機 | 無人搬送システム |
KR20200108530A (ko) * | 2019-03-05 | 2020-09-21 | 조동욱 | 무인 원격 제어 자동차의 위치 파악을 위한 존 맵 생성 방법 및 이를 적용한 무인 원격 제어 자동차 제어 시스템 |
JPWO2021001987A1 (ja) * | 2019-07-04 | 2021-10-21 | 三菱電機株式会社 | 移動体測位装置および移動体測位システム |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11835343B1 (en) * | 2004-08-06 | 2023-12-05 | AI Incorporated | Method for constructing a map while performing work |
US10585440B1 (en) | 2017-01-23 | 2020-03-10 | Clearpath Robotics Inc. | Systems and methods for using human-operated material-transport vehicles with fleet-management systems |
US11243546B2 (en) * | 2017-03-27 | 2022-02-08 | Nidec-Shimpo Corporation | Moving body management system, moving body, traveling management device, and computer program |
US20180348792A1 (en) * | 2017-06-06 | 2018-12-06 | Walmart Apollo, Llc | Systems and methods for coupling autonomous ground vehicles delivering merchandise |
US11214437B1 (en) * | 2017-09-13 | 2022-01-04 | AI Incorporated | Autonomous mobile robotic device for the transportation of items |
WO2019144222A1 (en) | 2018-01-24 | 2019-08-01 | Clearpath Robotics Inc. | Systems and methods for maintaining vehicle state information |
TWI671610B (zh) | 2018-09-28 | 2019-09-11 | 財團法人工業技術研究院 | 自動引導車、自動引導車控制系統、以及自動引導車之控制方法 |
JP2020057307A (ja) * | 2018-10-04 | 2020-04-09 | 日本電産株式会社 | 自己位置推定のための地図データを加工する装置および方法、ならびに移動体およびその制御システム |
TWI679512B (zh) * | 2018-10-05 | 2019-12-11 | 東元電機股份有限公司 | 無人自走車 |
US12084328B2 (en) * | 2019-03-04 | 2024-09-10 | Panasonic Intellectual Property Management Co., Ltd. | Mover control method, mover control system, and program |
JP6764138B2 (ja) * | 2019-03-28 | 2020-09-30 | 日本電気株式会社 | 管理方法、管理装置、プログラム |
JP7040809B2 (ja) * | 2020-03-19 | 2022-03-23 | Totalmasters株式会社 | 建設現場管理装置 |
JP7411897B2 (ja) * | 2020-04-10 | 2024-01-12 | パナソニックIpマネジメント株式会社 | 掃除機システム、および掃除機 |
JP7453102B2 (ja) * | 2020-09-09 | 2024-03-19 | シャープ株式会社 | 移動時間予想装置および移動時間予想方法 |
US20220132722A1 (en) * | 2020-11-02 | 2022-05-05 | Deere & Company | Topographic confidence and control |
US20220132723A1 (en) * | 2020-11-02 | 2022-05-05 | Deere & Company | Agricultural characteristic confidence and control |
TWI770966B (zh) * | 2021-04-27 | 2022-07-11 | 陽程科技股份有限公司 | 無人自走車之導引控制方法 |
US20230004170A1 (en) * | 2021-06-30 | 2023-01-05 | Delta Electronics Int'l (Singapore) Pte Ltd | Modular control system and method for controlling automated guided vehicle |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6092010A (en) | 1997-09-03 | 2000-07-18 | Jervis B. Webb Company | Method and system for describing, generating and checking non-wire guidepaths for automatic guided vehicles |
JP3715420B2 (ja) | 1997-11-12 | 2005-11-09 | マツダエース株式会社 | 無人搬送車の走行プログラム作成装置 |
JP4117971B2 (ja) * | 1999-04-30 | 2008-07-16 | 本田技研工業株式会社 | 移動体用地図情報表示システム |
US20060010844A1 (en) * | 2004-06-30 | 2006-01-19 | Self Guided Systems, L.L.C. | Unmanned utility vehicle |
JP4282662B2 (ja) * | 2004-12-14 | 2009-06-24 | 本田技研工業株式会社 | 自律移動ロボットの移動経路生成装置 |
US20070156321A1 (en) * | 2005-12-29 | 2007-07-05 | Schad Jahan N | Speed regulation system for vehicles |
US8355818B2 (en) * | 2009-09-03 | 2013-01-15 | Battelle Energy Alliance, Llc | Robots, systems, and methods for hazard evaluation and visualization |
WO2008035433A1 (fr) | 2006-09-22 | 2008-03-27 | Fujitsu Limited | Unité mobile et procédé de commande |
JP4811454B2 (ja) * | 2008-12-02 | 2011-11-09 | 村田機械株式会社 | 搬送台車システム及び搬送台車への走行経路の指示方法 |
CN202929478U (zh) * | 2012-07-26 | 2013-05-08 | 苏州工业园区职业技术学院 | 人工装卸自动导引车辆控制系统 |
KR101990439B1 (ko) * | 2012-10-10 | 2019-06-18 | 삼성전자주식회사 | 단말 장치, 이동 장치, 이동 장치의 제어 방법, 이동 장치의 구동 방법 및 컴퓨터 판독가능 기록 매체 |
US9389614B2 (en) * | 2014-04-08 | 2016-07-12 | Unitronics Automated Solutions Ltd | System and method for tracking guiding lines by an autonomous vehicle |
US9720418B2 (en) * | 2014-05-27 | 2017-08-01 | Here Global B.V. | Autonomous vehicle monitoring and control |
KR20160015987A (ko) | 2014-08-01 | 2016-02-15 | 한국전자통신연구원 | 실내 인프라 지도 및 센서를 이용한 위치 인식 기반 원격 자율주행 시스템 및 그 방법 |
US10249088B2 (en) * | 2014-11-20 | 2019-04-02 | Honda Motor Co., Ltd. | System and method for remote virtual reality control of movable vehicle partitions |
- 2017
- 2017-04-26 CN CN201780025890.3A patent/CN109074079B/zh active Active
- 2017-04-26 EP EP17789574.5A patent/EP3451103A4/en not_active Withdrawn
- 2017-04-26 US US16/095,000 patent/US10866587B2/en active Active
- 2017-04-26 TW TW106113920A patent/TWI703833B/zh active
- 2017-04-26 WO PCT/JP2017/016460 patent/WO2017188292A1/ja active Application Filing
- 2017-04-26 JP JP2018514651A patent/JP6769659B2/ja active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002085305A (ja) * | 2000-09-12 | 2002-03-26 | Toshiba Tec Corp | ロボットクリーナ及びロボットクリーナシステム |
JP2002244731A (ja) * | 2001-02-21 | 2002-08-30 | Matsushita Electric Ind Co Ltd | 移動作業ロボット |
JP2007226322A (ja) * | 2006-02-21 | 2007-09-06 | Sharp Corp | ロボットコントロールシステム |
JP2010198064A (ja) * | 2009-02-23 | 2010-09-09 | Japan Science & Technology Agency | ロボット制御システム及びロボット制御方法 |
JP2012089078A (ja) * | 2010-10-22 | 2012-05-10 | Symtec Hozumi:Kk | 自動搬送車の制御方法 |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110320903A (zh) * | 2018-03-30 | 2019-10-11 | 日本电产新宝株式会社 | 计算机系统及计算机可读取记录媒体 |
JP2019179496A (ja) * | 2018-03-30 | 2019-10-17 | 日本電産シンポ株式会社 | コンピュータシステムおよびコンピュータプログラム |
JP2020123196A (ja) * | 2019-01-31 | 2020-08-13 | 株式会社豊田自動織機 | 無人搬送システム |
KR20200108530A (ko) * | 2019-03-05 | 2020-09-21 | 조동욱 | 무인 원격 제어 자동차의 위치 파악을 위한 존 맵 생성 방법 및 이를 적용한 무인 원격 제어 자동차 제어 시스템 |
KR102202743B1 (ko) | 2019-03-05 | 2021-01-12 | 조동욱 | 무인 원격 제어 자동차의 위치 파악을 위한 존 맵 생성 방법 및 이를 적용한 무인 원격 제어 자동차 제어 시스템 |
JPWO2021001987A1 (ja) * | 2019-07-04 | 2021-10-21 | 三菱電機株式会社 | 移動体測位装置および移動体測位システム |
Also Published As
Publication number | Publication date |
---|---|
JP6769659B2 (ja) | 2020-10-14 |
EP3451103A1 (en) | 2019-03-06 |
JPWO2017188292A1 (ja) | 2019-02-28 |
CN109074079A (zh) | 2018-12-21 |
US20190155275A1 (en) | 2019-05-23 |
CN109074079B (zh) | 2021-09-28 |
US10866587B2 (en) | 2020-12-15 |
TW201739194A (zh) | 2017-11-01 |
EP3451103A4 (en) | 2019-12-11 |
TWI703833B (zh) | 2020-09-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017188292A1 (ja) | 移動体の管理システム、方法、およびコンピュータプログラム | |
US20170347979A1 (en) | Method and device for motion control of a mobile medical device | |
TWI665538B (zh) | 進行障礙物之迴避動作的移動體及記錄其之電腦程式的記錄媒體 | |
CN110998472A (zh) | 移动体以及计算机程序 | |
US20190360835A1 (en) | Stand-alone self-driving material-transport vehicle | |
WO2019187816A1 (ja) | 移動体および移動体システム | |
US11537140B2 (en) | Mobile body, location estimation device, and computer program | |
JPWO2019044498A1 (ja) | 移動体、位置推定装置、およびコンピュータプログラム | |
CN108458712A (zh) | 无人驾驶小车导航系统及导航方法、无人驾驶小车 | |
JP2020184148A (ja) | 情報処理装置及び情報処理方法 | |
JPWO2018179960A1 (ja) | 移動体および自己位置推定装置 | |
JP7103585B2 (ja) | 移動体、移動体管理システムおよびコンピュータプログラム | |
JP2019079171A (ja) | 移動体 | |
JP2020166702A (ja) | 移動体システム、地図作成システム、経路作成プログラムおよび地図作成プログラム | |
CN113885506A (zh) | 机器人避障方法、装置、电子设备及存储介质 | |
JP2019179496A (ja) | コンピュータシステムおよびコンピュータプログラム | |
CN113534810A (zh) | 一种物流机器人及物流机器人系统 | |
CN110825083B (zh) | 车辆的控制方法、设备及计算机可读存储介质 | |
KR20230070175A (ko) | 증강현실 뷰를 사용하는 경로 안내 방법 및 장치 | |
WO2018212099A1 (ja) | 移動体の動作を制御するモバイルコンピュータ、移動体制御システムおよびコンピュータプログラム | |
US11619727B2 (en) | Determining multi-degree-of-freedom pose for sensor calibration | |
JPWO2018180175A1 (ja) | 移動体、信号処理装置およびコンピュータプログラム | |
JP7098267B2 (ja) | ロボット起動測位方法、デバイス、電子装置及び記憶媒体 | |
US11243546B2 (en) | Moving body management system, moving body, traveling management device, and computer program | |
US20230297121A1 (en) | Moving body control system, control apparatus, and moving body control method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2018514651 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2017789574 Country of ref document: EP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17789574 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017789574 Country of ref document: EP Effective date: 20181127 |