WO2018212099A1 - Mobile computer for controlling movement of a mobile body, mobile body control system, and computer program - Google Patents


Info

Publication number
WO2018212099A1
Authority
WO
WIPO (PCT)
Prior art keywords
mobile
widget
travel
data
touch
Application number
PCT/JP2018/018362
Other languages
English (en)
Japanese (ja)
Inventor
信也 安達
麻衣子 龍
悠平 角宮
赤松 政弘
Original Assignee
日本電産シンポ株式会社
Application filed by 日本電産シンポ株式会社
Priority to JP2019518752A (JP6794539B2)
Publication of WO2018212099A1

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot

Definitions

  • The present disclosure relates to a mobile computer, a mobile body control system, and a computer program that control the operation of a mobile body.
  • AGV: Automated Guided Vehicle
  • Patent Document 1 discloses a mobile body having a tag communication unit.
  • In the travel target area, a plurality of IC tags, each holding position information, are arranged in a distributed manner.
  • The tag communication unit wirelessly reads the position information of each IC tag, which allows the moving body to determine its current position and travel automatically.
  • Patent Document 2 discloses a system for moving an AGV to a designated position.
  • The AGV reads a location marker representing a position and moves to the designated position; if its position deviates, the AGV corrects it using its own navigation system.
  • Patent Document 3 discloses a technique for determining the position of an address mark by simulation prior to laying the address mark on a course on which an AGV runs.
  • In these conventional techniques, the IC tags or location markers necessary for detecting the position are placed in advance in the traveling area of the AGV, and the routes on which the AGV can travel are thereby fixed. When it becomes necessary to change the position of an IC tag or location marker on site after the AGV has started operating, the change takes a great deal of work.
  • One non-limiting exemplary embodiment of the present application provides a technique that makes it easy to perform AGV setup and travel-route setting and changes on site.
  • The mobile computer of the present disclosure is a mobile computer that receives input from a user via a graphical user interface (GUI) and controls the operation of a mobile body. It includes a communication circuit that communicates with the mobile body, a display device that displays the GUI, a touch screen panel that detects a touch on the display device by the user and outputs data of the detected position, and a processing circuit that, in response to detection of the touch, performs the travel control or setting process of the mobile body associated with the widget arranged at the detected position.
  • The GUI includes at least one operation widget for travel control of the mobile body, a map creation widget for causing the mobile body to create a map of the space, a capture widget for creating a travel route by specifying passing positions of the mobile body, and a route selection widget for selecting one travel route from the one or more created travel routes.
  • the mobile computer includes a display device that displays a GUI, and a touch screen panel that detects a touch on the display device by a user and outputs data of a detection position. Since the GUI includes a plurality of widgets associated with traveling control or setting processing of the moving body, the user can control various operations of the moving body with intuitive operations.
  • FIG. 1 is a diagram illustrating an overview of a control system 100 that controls traveling of each AGV according to the present disclosure.
  • FIG. 2 is an external view of an exemplary AGV 10 according to the present embodiment.
  • FIG. 3 is a diagram illustrating a hardware configuration of the AGV 10.
  • FIG. 4 is a diagram illustrating a hardware configuration of the tablet computer 20.
  • FIG. 5 is a view showing an example of a GUI image displayed on the display 25 of the tablet computer 20.
  • FIG. 6A is a diagram illustrating an example of manual operation of the AGV 10 using the forward button 30a.
  • FIG. 6B is a diagram illustrating an example of manual operation of the AGV 10 using the forward button 30a.
  • FIG. 7A is a diagram illustrating an example of manual operation of the AGV 10 using the right turn button 30c.
  • FIG. 7B is a diagram illustrating an example of manual operation of the AGV 10 using the right turn button 30c.
  • FIG. 8A is a diagram illustrating an example of manual operation of the AGV 10 using the joystick type slider 31.
  • FIG. 8B is a diagram illustrating an example of manual operation of the AGV 10 using the joystick type slider 31.
  • FIG. 9 is a diagram illustrating an example of the sliding direction θ and the sliding amount d of the joystick-type slider 31.
  • FIG. 10A is a diagram showing an AGV 10 that generates a map while moving.
  • FIG. 10B is a diagram illustrating the AGV 10 that generates a map while moving.
  • FIG. 10C is a diagram illustrating the AGV 10 that generates a map while moving.
  • FIG. 10D is a diagram illustrating the AGV 10 that generates a map while moving.
  • FIG. 10E is a diagram showing the AGV 10 that generates a map while moving.
  • FIG. 10F is a diagram schematically showing a part of the completed map 60.
  • FIG. 11 is a diagram showing an example of a procedure for creating a travel route using the capture button 33 (FIG. 5).
  • FIG. 12 is a diagram showing the capture button 33 on which a number icon 33a indicating the number of markers set is displayed.
  • FIG. 13A is a diagram illustrating an example of marker data included in certain travel route data R.
  • FIG. 13B is a diagram illustrating an example of marker data in which richer data than the example of FIG. 13A is set.
  • FIG. 14 is a diagram showing a display example of a plurality of travel routes R1 to R3 displayed on the display 25 after touching the route selection button 35 (FIG. 5).
  • FIG. 15 is a diagram showing an example of an image of a GUI (initial GUI) immediately after an application for controlling the AGV 10 is started on the tablet computer 20.
  • a tablet computer 20 is given as an example of a mobile computer.
  • Other examples of the mobile computer are a smartphone and a laptop PC.
  • an automatic guided vehicle is mentioned as an example of a moving body.
  • The automatic guided vehicle is called an AGV (Automated Guided Vehicle) and is referred to as "AGV" in this specification.
  • FIG. 1 shows an overview of a control system 100 that controls the running of each AGV according to the present disclosure.
  • the control system 100 includes an AGV 10 and a tablet computer 20.
  • The AGV 10 and the tablet computer 20 are connected, for example, one-to-one and communicate in conformity with the Bluetooth (registered trademark) standard, or communicate in conformity with the Wi-Fi (registered trademark) standard using one or a plurality of access points 2a, 2b, and so on.
  • The plurality of access points 2a and 2b are connected to the switching hub 3. By transferring data frames via the switching hub 3, bidirectional communication between the AGV 10 and the tablet computer 20 is realized.
  • User 1 uses the tablet computer 20 to control the operation of the AGV 10. Specifically, using the tablet computer 20, the user 1 can cause the AGV 10 to create a map of the traveling space S, set or change the travel route of the AGV 10 after the map is created, and run the AGV 10 manually.
  • the operation of the tablet computer 20 by the user 1 is performed via a graphical user interface (hereinafter referred to as “GUI”).
  • the GUI is realized by the display of the tablet computer 20 and the touch screen panel.
  • the GUI includes a plurality of widgets.
  • “Widget” means a user interface component displayed on a display such as a GUI button, slider, icon, or the like.
  • the user interface component is sometimes called a “UI part”.
  • Each widget is associated with a travel control operation for traveling the AGV 10, or setting processing such as map creation, travel route setting or change.
  • the tablet computer 20 receives an input from the user 1 via the GUI and performs a traveling control operation or a setting process. Details of the GUI will be described later.
  • Although one AGV 10 is shown in FIG. 1, a plurality of AGVs may be used.
  • the user 1 can select one AGV 10 from a plurality of registered AGVs via the GUI of the tablet computer 20 and perform a travel control operation or a setting process.
  • FIG. 2 is an external view of an exemplary AGV 10 according to the present embodiment.
  • the AGV 10 includes four wheels 11a to 11d, a frame 12, a transport table 13, a travel control device 14, and a laser range finder 15.
  • The AGV 10 also has a plurality of motors, which are not shown. FIG. 2 shows the front wheel 11a, the rear wheel 11b, and the rear wheel 11c, but the front wheel 11d is hidden behind the frame 12 and is not clearly shown.
  • the traveling control device 14 is a device that controls the operation of the AGV 10, and mainly includes an integrated circuit including a microcomputer (described later), electronic components, and a board on which they are mounted.
  • the traveling control device 14 performs the above-described data transmission / reception with the tablet computer 20 and the preprocessing calculation.
  • the laser range finder 15 is an optical device that measures the distance to a target by, for example, irradiating the target with infrared laser light 15a and detecting the reflected light of the laser light 15a.
  • For example, the laser range finder 15 of the AGV 10 emits a pulsed laser beam 15a while changing its direction in steps of 0.25 degrees over a range of 135 degrees to each of the left and right (270 degrees in total) with respect to the front of the AGV 10, and detects the reflected light of each laser beam 15a. This yields distance data to the reflection point in each direction, for a total of 1080 steps of 0.25 degrees each.
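  • The scan geometry described above can be sketched as follows. This is an illustration, not part of the patent; the constant names are ours, and whether the range endpoints are inclusive is an assumption made to match the stated count of 1080 directions.

```python
# Sketch: enumerate the beam directions of a scanner covering 135 degrees
# to each side of the front (270 degrees in total) in 0.25-degree steps.
STEP_DEG = 0.25
HALF_RANGE_DEG = 135.0

def scan_angles():
    """Beam angles in degrees, from -135 (right) up to +135 (exclusive),
    giving the 1080 directions mentioned in the text."""
    n = int(2 * HALF_RANGE_DEG / STEP_DEG)  # 270 / 0.25 = 1080 steps
    return [-HALF_RANGE_DEG + i * STEP_DEG for i in range(n)]

angles = scan_angles()
# len(angles) == 1080; angles[0] == -135.0; angles[-1] == 134.75
```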
  • the AGV 10 can create a map of the space S based on the position and orientation of the AGV 10 and the scan result of the laser range finder 15.
  • the map may reflect the arrangement of walls, pillars and other structures around the AGV, and objects placed on the floor.
  • the map data is stored in a storage device provided in the AGV 10.
  • The position and posture of a moving body are called its pose.
  • The position and orientation of the moving body in a two-dimensional plane are expressed by position coordinates (x, y) in an XY orthogonal coordinate system and an angle θ with respect to the X axis.
  • The position and posture of the AGV 10, that is, the pose (x, y, θ), may hereinafter be referred to simply as "position".
  • the position of the reflection point seen from the radiation position of the laser beam 15a can be expressed using polar coordinates determined by the angle and the distance.
  • the laser range finder 15 outputs sensor data expressed in polar coordinates.
  • the laser range finder 15 may convert the position expressed in polar coordinates into orthogonal coordinates and output the result.
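  • The polar-to-orthogonal conversion, combined with the pose (x, y, θ), can be sketched as follows. The patent does not give the exact transform, so this is only an illustrative assumption: the beam angle is measured in the sensor frame and rotated into the world frame by the pose angle.

```python
import math

def polar_to_world(pose, beam_angle_rad, distance):
    """Convert one range reading (polar, in the sensor frame) into world
    coordinates, given the AGV pose (x, y, theta). Sketch only; the exact
    frame conventions are assumptions, not taken from the patent."""
    x, y, theta = pose
    a = theta + beam_angle_rad          # beam direction in the world frame
    return (x + distance * math.cos(a),
            y + distance * math.sin(a))

# A beam straight ahead (sensor angle 0) of a robot at the origin facing
# the +Y direction (theta = 90 degrees) hits a point 1 m away near (0, 1).
px, py = polar_to_world((0.0, 0.0, math.pi / 2), 0.0, 1.0)
```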
  • Examples of objects that can be detected by the laser range finder 15 are people, luggage, shelves, and walls.
  • the laser range finder 15 is an example of an external sensor for sensing the surrounding space and acquiring sensor data.
  • Other examples of such an external sensor include an image sensor and an ultrasonic sensor.
  • the traveling control device 14 can estimate its current position by comparing the measurement result of the laser range finder 15 with the map data held by itself.
  • the map data may be acquired by the AGV 10 itself using a SLAM (Simultaneous Localization and Mapping) technique.
  • FIG. 3 shows the hardware configuration of AGV10.
  • FIG. 3 also shows a specific configuration of the travel control device 14.
  • the AGV 10 includes a travel control device 14, a laser range finder 15, two motors 16 a and 16 b, and a drive device 17.
  • the traveling control device 14 includes a microcomputer 14a, a memory 14b, a storage device 14c, a communication circuit 14d, and a positioning device 14e.
  • the microcomputer 14a, the memory 14b, the storage device 14c, the communication circuit 14d, and the positioning device 14e are connected by a communication bus 14f and can exchange data with each other.
  • the laser range finder 15 is also connected to the communication bus 14f via a communication interface (not shown), and transmits measurement data as a measurement result to the microcomputer 14a, the positioning device 14e, and / or the memory 14b.
  • the microcomputer 14a is a processor or a control circuit (computer) that performs calculations for controlling the entire AGV 10 including the travel control device 14.
  • the microcomputer 14a is a semiconductor integrated circuit.
  • the microcomputer 14a transmits a PWM (Pulse Width Modulation) signal, which is a control signal, to the drive device 17 to control the drive device 17 and adjust the voltage applied to the motor.
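  • The way a PWM signal "adjusts the voltage applied to the motor" can be illustrated with the standard duty-cycle relation. The supply voltage used below is a made-up example, not a value from the patent.

```python
def average_voltage(supply_voltage, duty_cycle):
    """Average voltage delivered by a PWM drive: the supply voltage scaled
    by the fraction of each period the signal is on. Illustration only;
    real motor-drive behavior also depends on the inverter and the load."""
    assert 0.0 <= duty_cycle <= 1.0
    return supply_voltage * duty_cycle

# e.g. a hypothetical 24 V supply driven at 50% duty averages 12 V
v = average_voltage(24.0, 0.5)
```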
  • the memory 14b is a volatile storage device that stores a computer program executed by the microcomputer 14a.
  • the memory 14b can also be used as a work memory when the microcomputer 14a and the positioning device 14e perform calculations.
  • the storage device 14c is a non-volatile semiconductor memory device.
  • the storage device 14c may be a magnetic recording medium typified by a hard disk or an optical recording medium typified by an optical disk.
  • the storage device 14c may include a head device for writing and / or reading data on any recording medium and a control device for the head device.
  • the storage device 14c stores map data M of the traveling space S and data (travel route data) R of one or more travel routes.
  • the map data M is created by the AGV 10 operating in the map creation mode and stored in the storage device 14c.
  • the travel route data R is created by the AGV 10 operating in the route creation mode after the map data M is created, and stored in the storage device 14c.
  • the traveling route data R includes marker data indicating the marker position. “Marker” indicates the passing position (route point) of the traveling AGV 10.
  • the travel route data R includes at least a start marker indicating a travel start position and an end marker indicating a travel end position.
  • The travel route data R may further include one or more intermediate waypoints. When one or more intermediate waypoints are included, the route from the start marker through the intermediate waypoints in order to the end marker is defined as the travel route.
  • In addition to coordinates, each marker's data can include the orientation (angle) of the AGV 10 until it moves to the next marker, the traveling speed, the acceleration time for reaching that speed, and/or the deceleration time for slowing down from it.
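  • A marker and travel route holding the fields listed above might be represented as follows. The patent lists the data items but no schema, so the field names and types here are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Marker:
    """One passing position (route point); names are assumptions."""
    x: float
    y: float
    theta: float                         # orientation until the next marker
    speed: Optional[float] = None        # traveling speed toward the next marker
    accel_time: Optional[float] = None   # time to accelerate up to `speed`
    decel_time: Optional[float] = None   # time to decelerate from `speed`

@dataclass
class TravelRoute:
    """Start marker, optional intermediate waypoints, then end marker."""
    markers: List[Marker]

    @property
    def start(self) -> Marker:
        return self.markers[0]

    @property
    def end(self) -> Marker:
        return self.markers[-1]

route = TravelRoute([
    Marker(0.0, 0.0, 0.0, speed=0.5),    # start marker
    Marker(2.0, 0.0, 90.0, speed=0.5),   # intermediate waypoint
    Marker(2.0, 3.0, 90.0),              # end marker
])
```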
  • the AGV 10 can travel along the selected travel route while estimating its own position using the created map and the sensor data output from the laser range finder 15 acquired during travel.
  • the map data M and the travel route data R are stored in the same storage device 14c, but may be stored in different storage devices.
  • the communication circuit 14d is a wireless communication circuit that performs wireless communication based on, for example, Bluetooth (registered trademark) and / or Wi-Fi (registered trademark) standards.
  • Both standards include wireless communication using frequencies in the 2.4 GHz band.
  • In this specification, wireless communication conforming to the Bluetooth (registered trademark) standard is performed, and the AGV 10 communicates one-to-one with the tablet computer 20.
  • The positioning device 14e receives sensor data from the laser range finder 15 and reads the map data M stored in the storage device 14c. By comparing (matching) local map data created from the scan results of the laser range finder 15 against the wider-range map data M, it identifies the self position (x, y, θ) on the map data M. The positioning device 14e also generates a "reliability" value indicating the degree to which the local map data matches the map data M. The self position (x, y, θ) and the reliability data can be transmitted from the AGV 10 to the tablet computer 20, which can receive them and display them on its built-in display device.
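  • One toy way to picture such a "reliability" value is the fraction of locally scanned points that land on occupied cells of the stored map after quantizing to a grid. The patent only says reliability reflects how well the local map matches the map data M; this particular scoring rule, and all names in it, are assumptions.

```python
def reliability(local_points, occupied_cells, cell=0.05):
    """Fraction of scanned (x, y) points that fall on occupied grid cells
    of the stored map. A sketch of a matching score, not the patent's
    actual algorithm."""
    if not local_points:
        return 0.0
    hits = sum(
        (round(x / cell), round(y / cell)) in occupied_cells
        for x, y in local_points
    )
    return hits / len(local_points)

occupied = {(0, 0), (1, 0), (2, 0)}   # a short wall stored in the map
# two of the three scanned points coincide with the wall
score = reliability([(0.0, 0.0), (0.05, 0.0), (0.2, 0.0)], occupied)
```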
  • the microcomputer 14a and the positioning device 14e are separate components, but this is an example.
  • the microcomputer 14a and the positioning device 14e may be integrated, and a single chip circuit or a semiconductor integrated circuit capable of performing each operation of the microcomputer 14a and the positioning device 14e independently may be provided.
  • FIG. 3 shows a chip circuit 14g including the microcomputer 14a and the positioning device 14e.
  • Although the positioning device 14e and the laser range finder 15 are assumed here to be separate components, this is also merely an example.
  • a laser positioning system in which the positioning device 14e and the laser range finder 15 are integrated may be employed.
  • The two motors 16a and 16b are attached to the wheels 11b and 11c, respectively, and rotate them. That is, the two wheels 11b and 11c are drive wheels.
  • the motor 16a and the motor 16b are described as being motors that drive the right wheel and the left wheel of the AGV 10, respectively.
  • the drive device 17 has motor drive circuits 17a and 17b for adjusting the voltage applied to each of the two motors 16a and 16b.
  • Each of the motor drive circuits 17a and 17b is a so-called inverter circuit, and the current applied to each motor is turned on or off by the PWM signal transmitted from the microcomputer 14a, thereby adjusting the voltage applied to the motor.
  • FIG. 4 shows the hardware configuration of the tablet computer 20.
  • the tablet computer 20 includes a CPU 21, a memory 22, a communication circuit 23, an image processing circuit 24, a display 25, a touch screen panel 26, and a communication bus 27.
  • the CPU 21, the memory 22, the communication circuit 23, the image processing circuit 24, and the touch screen panel 26 are connected by a communication bus 27 and can exchange data with each other via the communication bus 27.
  • the CPU 21 is a signal processing circuit (computer) that controls the operation of the tablet computer 20.
  • the CPU 21 is a semiconductor integrated circuit.
  • the CPU 21 may be simply referred to as a “processing circuit”.
  • the memory 22 is a volatile storage device that stores a computer program executed by the CPU 21.
  • the memory 22 can also be used as a work memory when the CPU 21 performs calculations.
  • the computer program may be stored in a nonvolatile storage device (not shown) such as an EEPROM.
  • the CPU 21 reads out a computer program from the non-volatile storage device when the tablet computer 20 is activated, expands it in the memory 22 and executes it.
  • the communication circuit 23 is a wireless communication circuit that performs wireless communication based on, for example, Bluetooth (registered trademark) and / or Wi-Fi (registered trademark) standards. Similar to the communication circuit 14d of the AGV 10, in this specification, the tablet computer 20 performs wireless communication in accordance with the Bluetooth (registered trademark) standard and communicates with the AGV 10 on a one-to-one basis.
  • the communication circuit 23 receives data to be transmitted to the AGV 10 from the CPU 21 via the bus 27.
  • the communication circuit 23 transmits data (notification) received from the AGV 10 to the CPU 21 and / or the memory 22 via the bus 27.
  • the image processing circuit 24 generates an image to be displayed on the display 25 in accordance with an instruction from the CPU 21. For example, the image processing circuit 24 displays an image for the GUI, and rewrites the image on the display 25 according to the touch operation of the user 1 received via the touch screen panel 26.
  • the touch screen panel 26 can detect the touch of the user 1 performed with a finger or a pen.
  • As detection methods, capacitive, resistive-film, optical, ultrasonic, and electromagnetic methods, among others, are known.
  • the touch screen panel 26 detects a change in capacitance at a specific position, and transmits data related to the change to the CPU 21 via the communication bus 27.
  • The CPU 21 judges whether the user has touched the panel based on the transmitted data.
  • An example of "data related to the change" is the position where the capacitance changed and the length of time over which it changed.
  • “Touch” includes various operations such as short press (or tap), long press, and slide.
  • the short press is an operation of releasing the finger within a predetermined reference time after the user 1 touches the touch screen panel 26 with the finger.
  • the long press is an operation of maintaining the state without moving the finger after the user 1 touches the touch screen panel 26 and releasing the finger after a time longer than the reference time has elapsed.
  • The slide is an operation in which, after touching the touch screen panel 26 with a finger, the user 1 moves the finger across the panel (for example, left or right) without releasing it.
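  • The short press / long press / slide distinction above can be sketched as a classifier over the touch's duration and finger movement. The threshold values are assumptions; the patent only refers to a "predetermined reference time".

```python
REFERENCE_TIME_S = 0.5   # "predetermined reference time"; value is an assumption
SLIDE_THRESHOLD_PX = 10.0  # movement beyond this counts as a slide (assumption)

def classify_touch(duration_s, moved_px):
    """Classify a finished touch, mirroring the three operations described
    in the text: slide, long press, short press."""
    if moved_px > SLIDE_THRESHOLD_PX:
        return "slide"
    if duration_s >= REFERENCE_TIME_S:
        return "long press"
    return "short press"
```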
  • the touch screen panel 26 is provided so as to overlap the display 25.
  • the user 1 touches the image while viewing the image displayed on the display 25.
  • the CPU 21 determines which position of the image displayed on the display 25 the detected position data output from the touch screen panel 26 indicates. As a result of the determination, the CPU 21 can execute a function associated with the image displayed at the position.
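  • The hit-testing the CPU 21 is described as performing, i.e. mapping a detected touch position to the widget displayed there, can be sketched as below. The widget names and rectangles are made up for illustration.

```python
# Hypothetical widget layout: name -> (x1, y1, x2, y2) in screen pixels.
widgets = {
    "forward_button": (20, 20, 120, 80),
    "capture_button": (20, 100, 120, 160),
}

def widget_at(px, py):
    """Return the name of the widget whose rectangle contains the detected
    position, or None if the touch landed outside every widget."""
    for name, (x1, y1, x2, y2) in widgets.items():
        if x1 <= px <= x2 and y1 <= py <= y2:
            return name
    return None

hit = widget_at(60, 50)   # inside the forward button's rectangle
```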
  • the CPU 21 of the tablet computer 20 can perform the travel control or setting process of the AGV 10.
  • the AGV 10 can perform manual travel that travels according to a real-time operation from the user 1 using the tablet computer 20 and automatic travel that travels according to the created travel route.
  • the user 1 can select whether to run the AGV 10 manually or automatically from the GUI of the tablet computer 20.
  • the AGV 10 and the tablet computer 20 perform communication every several hundred milliseconds to confirm that the connection is maintained. Thereby, at the time of manual driving, it is possible to realize control such as starting and stopping of driving of the AGV 10 from the tablet computer 20 in almost real time.
  • If this communication is interrupted, the AGV 10 stops traveling. While control from the tablet computer 20 remains possible, the AGV 10 can perform manual travel via the tablet computer 20.
  • FIG. 5 shows an example of a GUI image displayed on the display 25 of the tablet computer 20.
  • the GUI includes a plurality of widgets associated with the traveling control or setting process of the AGV 10.
  • The GUI includes a forward button 30a, a backward button 30b, a right turn button 30c, a left turn button 30d, a joystick-type slider 31, a map creation button 32, a capture button 33, an option setting button 34, and a route selection button 35.
  • The GUI also provides an area 36 for displaying the estimated self position (x, y, θ) of the AGV 10, the reliability of the estimation, and other information received from the positioning device 14e of the AGV 10.
  • the forward button 30a, the backward button 30b, the right turn button 30c, the left turn button 30d, and the joystick type slider 31 are operation widgets for controlling manual travel of the AGV 10.
  • The AGV 10 operates only while the user 1 continues to touch one of the buttons 30a to 30d or the slider 31.
  • the joystick type slider 31 can be slid in an arbitrary direction.
  • the tablet computer 20 controls the traveling direction of the AGV 10 according to the sliding direction and sliding amount of the joystick-type slider 31 by the user 1.
  • the map creation button 32 is a widget for shifting the operation mode of the AGV 10 to a map creation mode for creating a map of the space S.
  • the capture button 33 is a widget for shifting the operation mode of the AGV 10 to a route creation mode for creating a travel route of the AGV 10.
  • the option setting button 34 is a widget for shifting to a setting mode for setting various parameters applied to the AGV 10.
  • the route selection button 35 is a widget for selecting one travel route from the created one or more travel routes.
  • FIGS. 6A and 6B show an example of manual operation of the AGV 10 using the forward button 30a.
  • the user 1 touches the forward button 30a displayed on the display 25 with a finger.
  • The CPU 21 of the tablet computer 20 determines that the coordinates of the touch position output from the touch screen panel 26 fall within the forward button 30a, and thereby detects that the user 1 has touched the forward button 30a.
  • When detecting a touch on the forward button 30a, the CPU 21 generates a control signal including a command for causing the AGV 10 to travel straight forward.
  • the communication circuit 23 transmits the generated control signal to the AGV 10.
  • The CPU 21 continues to generate the control signal while the touch on the forward button 30a is detected.
  • the microcomputer 14a of the AGV 10 receives a control signal from the tablet computer 20 via the communication circuit 14d.
  • the microcomputer 14a transmits a PWM signal to each of the motor drive circuits 17a and 17b in response to a command for running the AGV 10 forward included in the control signal.
  • the PWM signal is a signal for causing the motors 16a and 16b to rotate forward at the same rotational speed. Note that “forward rotation” means rotation in a direction in which the AGV 10 travels forward.
  • the AGV 10 goes straight as shown in FIG. 6B.
  • When the touch on the forward button 30a is no longer detected, the CPU 21 generates a control signal including a command for stopping the traveling of the AGV 10 and transmits it to the AGV 10. Thereby, the AGV 10 stops traveling when the user 1 releases the finger.
  • The speed when traveling straight forward or backward may be the maximum speed of the AGV 10, or may be set in advance by the user.
  • FIGS. 7A and 7B show an example of manual operation of the AGV 10 using the right turn button 30c.
  • the CPU 21 detects that the user 1 has touched the right turn button 30c. Since the detection process is similar to the detection process of the touch on the forward button 30a, the description is omitted. The description of the subsequent detection processing is also omitted.
  • When detecting a touch on the right turn button 30c, the CPU 21 generates a control signal including a command for turning the AGV 10 right on the spot.
  • the communication circuit 23 transmits the generated control signal to the AGV 10.
  • The CPU 21 continues to generate the control signal while the touch on the right turn button 30c is detected.
  • the microcomputer 14a of the AGV 10 receives a control signal from the tablet computer 20 via the communication circuit 14d.
  • the microcomputer 14a transmits a PWM signal to each of the motor drive circuits 17a and 17b in response to a command to turn the AGV 10 included in the control signal to the right.
  • The PWM signal is a signal for rotating the motors 16a and 16b in mutually opposite directions.
  • Specifically, the control signal rotates the motor 16a in the reverse direction and the motor 16b in the forward direction at the same rotational speed. Thereby, as shown in FIG. 7B, the AGV 10 turns right on the spot.
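  • The turn-in-place command can be summarized as a pair of wheel speeds of equal magnitude and opposite sign. The sign convention below is an assumption; the patent only states that the right wheel reverses and the left wheel runs forward for a right turn.

```python
def turn_in_place(direction, speed):
    """Wheel speed pair (right, left) for an on-the-spot turn: both wheels
    spin at the same speed in opposite directions. Sketch; the sign
    convention is ours, not the patent's."""
    if direction == "right":
        return (-speed, +speed)   # right wheel reverse, left wheel forward
    if direction == "left":
        return (+speed, -speed)
    raise ValueError(direction)

vr, vl = turn_in_place("right", 0.3)
```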
  • When the touch on the right turn button 30c is no longer detected, the CPU 21 generates a control signal including a command for stopping the turning of the AGV 10 and transmits it to the AGV 10. Thereby, the AGV 10 stops turning when the user 1 releases the finger.
  • When turning right or left, each motor may rotate at its highest speed, or each motor may rotate so as to achieve a turning speed (angular speed) set in advance by the user.
  • the setting by the user can be performed using the option setting button 34, for example.
  • In the setting mode, the CPU 21 receives an input of the turning speed of the AGV 10. The CPU 21 may also receive an input of the maximum speed for straight travel.
  • FIGS. 8A and 8B show an example of manual operation of the AGV 10 using the joystick-type slider 31.
  • When detecting a touch on the joystick-type slider 31, the CPU 21 generates a control signal for causing the AGV 10 to travel in a direction corresponding to the sliding direction of the slider and at a speed corresponding to the sliding amount.
  • the user 1 slides the joystick-type slider 31 in the upper right direction. Then, the AGV 10 travels diagonally right forward as shown in FIG. 8B.
  • the joystick type slider 31 is provided so that the user 1 can intuitively operate the traveling direction and traveling speed of the AGV 10.
  • FIG. 9 shows an example of the sliding direction θ and the sliding amount d of the joystick-type slider 31.
  • the X axis, the Y axis, and the origin O are set.
  • the right side of the origin O is the + X direction and the upper side is the + Y direction.
  • The +Y direction is the straight-ahead direction of the AGV 10, and the +X direction is the direction directly to its right.
  • the CPU 21 acquires the center coordinates P (X, Y) of the joystick slider 31 that has been slid from the touch screen panel 26.
  • The base speed is determined as Vbase = Vmax × d / 100, where Vmax is the maximum speed and d is the sliding amount.
  • the motor 16a and the motor 16b are rotated at the speed shown in the following table according to the quadrant where the coordinate P exists.
  • In the table, the right-wheel rotation speed Vr and the left-wheel rotation speed Vl mean the rotation speeds of the motor 16a and the motor 16b, respectively.
  • The user 1 slides the joystick-type slider 31 in the +Y direction, the −Y direction, or another direction. The CPU 21 then generates a control signal for causing the AGV 10 to travel forward, backward, along an arc, and so on, according to the slide direction.
  • the user 1 can intuitively control the traveling direction of the AGV 10 and can travel at a speed corresponding to the amount of slide.
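  • Since the quadrant table is not reproduced in the text above, the mapping from the slide (θ, d) to wheel speeds can only be sketched. The arcade-style mix below is one plausible differential-drive mapping consistent with Vbase = Vmax × d / 100, not the patent's actual table; Vmax's value and all names are assumptions.

```python
import math

V_MAX = 1.0   # maximum speed; value is an assumption

def slider_to_wheels(theta_deg, d_percent):
    """Map slide direction theta (degrees, 90 = +Y straight ahead) and
    slide amount d (0-100%) to (right, left) wheel speeds. One plausible
    mapping only; the patent's quadrant table is not reproduced here."""
    v_base = V_MAX * d_percent / 100.0   # Vbase = Vmax * d / 100
    theta = math.radians(theta_deg)
    forward = v_base * math.sin(theta)   # +Y component drives straight
    turn = v_base * math.cos(theta)      # +X component steers to the right
    v_r = max(-v_base, min(v_base, forward - turn))
    v_l = max(-v_base, min(v_base, forward + turn))
    return v_r, v_l

vr, vl = slider_to_wheels(90.0, 50.0)    # straight up at half slide amount
```

With this mapping, a slide straight up drives both wheels forward equally, while a slide toward the upper right slows the right wheel relative to the left, curving the AGV to the right.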
  • When the CPU 21 detects a touch on the map creation button 32, it shifts the AGV 10 to the map creation mode for creating a map of the space S.
  • the AGV 10 scans the space S using the laser range finder 15 and creates a map using the positioning device 14e.
  • FIGS. 10A to 10F show the AGV 10 that generates a map while moving.
  • The user 1 may move the AGV 10 using the joystick-type slider 31 described above, or using the forward button 30a, the backward button 30b, the right turn button 30c, and the left turn button 30d.
  • FIG. 10A shows an AGV 10 that scans the surrounding space using the laser range finder 15. Laser light is emitted at every predetermined step angle, and scanning is performed.
  • the illustrated scan range is an example schematically shown, and is different from the above-described scan range of 270 degrees in total.
  • The positions of the reflection points of the laser beam are shown using a plurality of black dots 4 represented by the symbol "●".
  • The positioning device 14e accumulates the positions of the black dots 4 obtained as a result of traveling, for example in the memory 14b. By continuing to scan while the AGV 10 travels, the map is gradually completed.
  • In FIGS. 10B to 10E, only the scan range is shown for simplicity. The scan range shown is an example and differs from the above-described total of 270 degrees.
  • FIG. 10F schematically shows a part of the completed map 60.
  • the positioning device 14e accumulates the data of the map 60 (map data M) in the memory 14b or the storage device 14c.
  • The number and density of the black dots shown in the figure are an example.
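How the reflection points (the black dots 4) could be accumulated into a map while the AGV travels can be sketched as follows. The patent does not spell out this computation; the Python sketch below assumes the usual conversion of each range reading, taken at a known step angle, into world coordinates using the AGV's pose (x, y, θ). All function and variable names are illustrative.

```python
import math

def scan_to_points(pose, ranges, start_angle, step_angle):
    """Convert one laser scan into world-frame reflection points.

    pose: (x, y, theta) of the AGV at scan time.
    ranges: one distance reading per step angle.
    Illustrative only; the patent does not describe this computation.
    """
    x, y, theta = pose
    points = []
    for i, r in enumerate(ranges):
        a = theta + start_angle + i * step_angle   # beam direction in world frame
        points.append((x + r * math.cos(a), y + r * math.sin(a)))
    return points

# The growing map, analogous to accumulating the black dots 4 in memory 14b.
map_points = []
pose = (0.0, 0.0, 0.0)
map_points.extend(scan_to_points(pose, [1.0, 2.0], 0.0, math.pi / 2))
```

Repeating the `extend` call for every scan taken as the AGV moves gradually fills in the map, as described for FIGS. 10A to 10F.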
  • the user 1 can set the travel route of the AGV 10.
  • FIG. 11 shows an example of a procedure for creating a travel route using the capture button 33 (FIG. 5).
  • the CPU 21 shifts the AGV 10 to the route creation mode.
  • the user 1 moves the AGV 10 to the start position of the travel route to be newly created, and further touches the capture button 33.
  • The CPU 21 transmits to the AGV 10 a command for acquiring the pose (x, y, θ) of the AGV 10 at that time.
  • The microcomputer 14a of the AGV 10 stores the data of the pose (x, y, θ) at that time as a "marker" in the memory 14b or the storage device 14c.
  • a marker M1 in FIG. 11 indicates the travel start position of the AGV 10.
  • FIG. 12 shows the capture button 33 on which a number icon 33a indicating the number of markers set is displayed. At the time when the marker M1 is acquired, the number icon 33a indicates “1”.
  • the user 1 moves the AGV 10 to the next passing position on the travel route and touches the capture button 33.
  • the microcomputer 14a of the AGV 10 stores the data of the pose (x, y, ⁇ ) at that time in the memory 14b or the storage device 14c in accordance with the command of the tablet computer 20.
  • the AGV 10 sequentially acquires the poses (x, y, ⁇ ) of the AGV 10 at each passing position on the travel route.
  • the markers M2 to M4 in FIG. 11 indicate the passing points acquired in this way.
  • the creation of the travel route is completed.
  • the marker M5 represents the travel end position.
  • The user 1 may be allowed to attach a route name for identifying the travel route.
  • The markers M1, M2, ..., M5 indicate that the AGV 10 passes from the marker M1 to the marker M5 via the markers M2 to M4.
  • The AGV 10 moves from the travel start position to the travel end position by changing its pose (position and orientation) in the order in which the marker data were acquired.
  • the travel route data R can be defined as a set of a plurality of markers.
  • the created travel route data R is stored in the storage device 14c.
  • FIG. 13A shows an example of marker data included in a certain travel route data R.
  • Each marker indicated by numbers M1 to M5 has marker data including an X coordinate, a Y coordinate, and an angle ⁇ .
  • The position of the marker M3 is the same as the position of the marker M2.
  • The marker M3 is nevertheless acquired as a different marker, because the AGV 10 has turned left from the pose at the marker M2, changing only its angle θ.
  • FIG. 13B shows an example of marker data in which abundant data is set as compared with the example of FIG. 13A.
  • the marker data of some markers has data on travel speed, acceleration time, and deceleration time.
  • The "traveling speed" indicates the traveling speed of the AGV 10 from the position indicated by the k-th (k ≥ 1) acquired marker data to the position indicated by the (k + 1)-th acquired marker data.
  • The "acceleration time" is the time taken to accelerate until the traveling speed is reached.
  • The "deceleration time" is the time taken to decelerate from the traveling speed. Note that the traveling speed, acceleration time, and deceleration time need not always be set simultaneously. As with the marker M2, none of them may be set, or any one or more of the traveling speed, acceleration time, and deceleration time may be set.
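A marker record of the kind shown in FIGS. 13A and 13B can be sketched as a small data structure in which the traveling speed, acceleration time, and deceleration time are optional, exactly as the text describes. The field names and the fallback default speed below are assumptions for illustration, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Marker:
    """One acquired pose (x, y, theta); the speed and timing fields are
    optional, as in FIG. 13B. Field names are illustrative."""
    x: float
    y: float
    theta: float
    travel_speed: Optional[float] = None   # speed toward the next marker
    accel_time: Optional[float] = None     # time to reach travel_speed
    decel_time: Optional[float] = None     # time to slow down from travel_speed

DEFAULT_SPEED = 0.5  # assumed fallback for markers that set no speed

def segment_speed(marker: Marker) -> float:
    """Speed to use from this marker to the next one."""
    return marker.travel_speed if marker.travel_speed is not None else DEFAULT_SPEED

# A travel route R is then simply an ordered collection of such markers.
route = [Marker(0, 0, 0), Marker(2, 0, 0, travel_speed=1.0, accel_time=0.5)]
```

The first marker, like M2 in the text, sets none of the optional fields and falls back to a default; the second sets a traveling speed and an acceleration time only.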
  • the user 1 can create one or a plurality of travel routes.
  • The travel route data R is stored in the storage device 14c of the AGV 10; when a connection is established between the AGV 10 and the tablet computer 20, the travel route data R is transferred from the AGV 10 to the tablet computer 20.
  • the user 1 can edit the marker data constituting the transferred travel route data R on the tablet computer 20.
  • Editing the marker data means, for example, deleting part of the marker data or changing the value of the X coordinate, the Y coordinate, and/or the angle θ.
  • FIG. 14 shows a display example of a plurality of travel routes R1 to R3 displayed on the display 25 after touching the route selection button 35 (FIG. 5). Each route number and the route name given by the user 1 are displayed.
  • User 1 touches the displayed route number or route name to select it.
  • the selected travel route R2 is highlighted.
  • The CPU 21 transmits a command designating the selected travel route R2 to the AGV 10.
  • the microcomputer 14a of the AGV 10 that has received the command reads each marker data of the travel route R2 from the plurality of travel routes stored in the storage device 14c.
  • When the user 1 touches a start button (not shown), the microcomputer 14a of the AGV 10 starts automatic operation along the travel route R2.
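The select-and-start exchange between the tablet computer 20 and the AGV 10 can be sketched as follows. The patent does not specify any wire format, so the JSON message shape and the function names here are purely illustrative assumptions; only the flow (the tablet sends a route designation, and the microcomputer 14a side looks up the stored marker data for that route) follows the text.

```python
import json

def make_route_command(route_name):
    """Tablet side: build a 'travel this route' command.
    The JSON wire format is an assumption, not taken from the patent."""
    return json.dumps({"cmd": "select_route", "route": route_name})

def handle_command(message, stored_routes):
    """AGV side (cf. microcomputer 14a): return the marker data of the
    designated route, or None for an unrecognized command."""
    msg = json.loads(message)
    if msg.get("cmd") == "select_route":
        return stored_routes[msg["route"]]
    return None

# Routes stored on the AGV, keyed by route name; values are marker poses.
routes = {"R2": [(0, 0, 0), (2, 0, 90)]}
markers = handle_command(make_route_command("R2"), routes)
```

After this lookup, the AGV would begin automatic operation by traversing the returned markers in order.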
  • FIG. 14 shows a menu 40 for editing the selected travel route.
  • the menu 40 includes an edit button 40a for editing individual marker data, a delete button, and the like.
  • the convenience of the user 1 can be improved by providing on the display 25 a menu 40 for enabling not only selection relating to the travel route but also editing.
  • The following method may further be adopted, because after setting one or more travel routes, the user 1 may find it troublesome to select a travel route each time.
  • FIG. 15 shows an example of the GUI (initial GUI) immediately after the application for controlling the AGV 10 is started on the tablet computer 20.
  • The initial GUI displays a plurality of travel routes 50a and 50b registered in advance; these correspond to the travel routes R1 and R2 described above. A start button 51 is also displayed on the initial GUI. When the user 1 completes the selection of a travel route and touches the start button 51, the CPU 21 transmits a travel route designation and a travel start instruction to the AGV 10.
  • the AGV 10 reads each marker data of the selected travel route, and then travels while passing through each marker.
  • the user 1 can run the AGV 10 immediately after starting the application.
  • the technique of the present disclosure can be widely used for controlling the operation of a moving object.
  • 1 user; 2a, 2b wireless access point; 10 AGV (mobile body); 14a microcomputer; 14b memory; 14c storage device; 14d communication circuit; 14e positioning device; 15 laser range finder; 16a, 16b motor; 17a, 17b motor drive circuit; 20 tablet computer (mobile computer); 21 CPU; 22 memory; 23 communication circuit; 24 image processing circuit; 25 display; 26 touch screen panel; 100 control system

Abstract

A mobile computer (20) receives input from a user via a graphical user interface (GUI) to control the movement of a mobile body. The mobile computer comprises: a communication circuit (23); a display (25) for displaying the GUI; a touch screen panel (26) for outputting data of a position at which a touch interaction on the display is detected; and a processing circuit (21) which, in response to detection of a touch interaction, performs travel control or a setting process for the mobile body associated with a widget placed at the detected position. The GUI includes: one or more operation widgets (30a-30d, 31) for travel control of the mobile body; a map creation widget (32) for causing the mobile body to generate a map of a space; a capture widget (33) for generating a travel route of the mobile body; and a route selection widget (35) for selecting a generated route.
PCT/JP2018/018362 2017-05-19 2018-05-11 Ordinateur mobile pour commander le mouvement d'un corps mobile, système de commande de corps mobile et programme informatique WO2018212099A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019518752A JP6794539B2 (ja) 2017-05-19 2018-05-11 Mobile computer for controlling movement of a moving body, moving body control system, and computer program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-099592 2017-05-19
JP2017099592 2017-05-19

Publications (1)

Publication Number Publication Date
WO2018212099A1 true WO2018212099A1 (fr) 2018-11-22

Family

ID=64274424

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/018362 WO2018212099A1 (fr) 2017-05-19 2018-05-11 Ordinateur mobile pour commander le mouvement d'un corps mobile, système de commande de corps mobile et programme informatique

Country Status (2)

Country Link
JP (1) JP6794539B2 (fr)
WO (1) WO2018212099A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020047168A (ja) * 2018-09-21 2020-03-26 Sharp Corporation Conveyance system, conveyance method, and program
CN111086572A (zh) * 2020-01-16 2020-05-01 Shenzhen Kezhao Technology Co., Ltd. Intelligent lifting-column steering AGV robot
JP7478506B2 (ja) 2018-09-21 2024-05-07 Sharp Corporation Conveyance system, conveyance method, and program

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6122703A (ja) * 1984-07-10 1986-01-31 Toyoda Autom Loom Works Ltd Overspeed detection device for unmanned vehicles and the like
JPH03148708A (ja) * 1989-11-02 1991-06-25 Ishikawajima Shibaura Kikai Kk Steering control device for automatically traveling work vehicle
JPH08170438A (ja) * 1994-12-20 1996-07-02 Sumitomo Heavy Ind Ltd Stop position control system for traveling device
JPH08272439A (ja) * 1995-03-30 1996-10-18 Nissan Diesel Motor Co Ltd Wireless control system for vehicle
JP2001142533A (ja) * 1999-11-12 2001-05-25 Nissan Diesel Motor Co Ltd Operation control system for automated guided vehicle
WO2014129042A1 (fr) * 2013-02-21 2014-08-28 Sony Corporation Information processing device, information processing method, and program
JP2016067800A (ja) * 2014-10-01 2016-05-09 Panasonic IP Management Co., Ltd. Electric appliance system
JP2016122278A (ja) * 2014-12-24 2016-07-07 Yamaha Motor Co., Ltd. Operation device and autonomous movement system
JP2017021445A (ja) * 2015-07-07 2017-01-26 Canon Inc. Communication device, control method therefor, and program
JP2017509034A (ja) * 2013-07-31 2017-03-30 SZ DJI Technology Co., Ltd. Remote control method and terminal



Also Published As

Publication number Publication date
JPWO2018212099A1 (ja) 2020-02-27
JP6794539B2 (ja) 2020-12-02

Similar Documents

Publication Publication Date Title
US10866587B2 (en) System, method, and computer program for mobile body management
JP7103585B2 (ja) Moving body, moving body management system, and computer program
JP6665506B2 (ja) Remote operation device, method, and program
US9002535B2 (en) Navigation portals for a remote vehicle control user interface
CN108016497A (zh) Device and method for scanning parking spaces
US20150130759A1 (en) Display apparatus, vehicle equipped with the display apparatus and control method for the display apparatus
WO2018110568A1 (fr) Mobile body performing obstacle avoidance operation and associated computer program
JP5805841B1 (ja) Autonomous mobile body and autonomous mobile body system
US20190360835A1 (en) Stand-alone self-driving material-transport vehicle
JP6104715B2 (ja) Route generation method and device
JP6074205B2 (ja) Autonomous mobile body
JP6025814B2 (ja) Operation device and autonomous movement system
WO2018212099A1 (fr) Mobile computer for controlling movement of a moving body, moving body control system, and computer program
Hachet et al. Navidget for 3d interaction: Camera positioning and further uses
KR102637701B1 (ko) Route guidance method and device using augmented reality view
KR20100099489A (ko) Travel control device based on collision prediction, and method therefor
JPWO2018179960A1 (ja) Moving body and self-position estimation device
US20190302757A1 (en) Computer system and computer program
US20190224842A1 (en) Teaching Device, Robot Control Device, And Robot System
KR102117338B1 (ko) Cylindrical-coordinate-based method for steering an unmanned vehicle, recording medium storing a program for implementing the same, and computer program stored in a medium for implementing the same
JP6801243B2 (ja) Movement target determination device and movement target determination method
JP4975693B2 (ja) Mobile robot device and mobile robot control method
JP6192506B2 (ja) Moving body
EP4258077A1 (fr) Dispositif et procédé permettant de simuler un robot mobile sur un site de travail
JP2021184014A (ja) Map display device and map display program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18801280

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019518752

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18801280

Country of ref document: EP

Kind code of ref document: A1