WO2022091787A1 - Communication system, robot, and storage medium - Google Patents

Communication system, robot, and storage medium

Info

Publication number
WO2022091787A1
WO2022091787A1 · PCT/JP2021/037934
Authority
WO
WIPO (PCT)
Prior art keywords
data
robot
unit
peripheral
terminal device
Application number
PCT/JP2021/037934
Other languages
French (fr)
Japanese (ja)
Inventor
昂 深堀
ケビン 梶谷
Original Assignee
avatarin株式会社
Application filed by avatarin株式会社
Publication of WO2022091787A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00: Manipulators mounted on wheels or on carriages
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02: Control of position or course in two dimensions
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M11/00: Telephonic communication systems specially adapted for combination with other electrical systems

Definitions

  • The present invention relates to a communication system, a robot, and a storage medium.
  • Patent Document 1 describes a moving body with a camera that provides captured images to a user in a remote place.
  • With a conventional telepresence robot, when the user remotely operates the robot, the robot functions as the user's alter ego, and the user can experience being in the place where the robot is.
  • On the other hand, since a robot equipped with a camera and a drive unit is highly functional, there is room to use it for purposes other than providing an experience to the user.
  • One of the objects of the present invention is to provide a technique related to a robot that can be used for multiple purposes.
  • The communication system according to one aspect of the present invention is a communication system including one or more robots and a server device. The robot includes a positioning unit that acquires position information of the robot; a drive unit that can move the robot in response to remote operation via a terminal device; a peripheral data acquisition unit that acquires peripheral data of the robot; a first processing unit that generates first data based on the peripheral data and second data based on the peripheral data and the position information; and a communication unit that transmits the first data to the terminal device and the second data to the server device. The server device includes a second processing unit that generates map data based on the second data received from the robot, and a storage unit that stores the map data.
  • The robot according to one aspect of the present invention includes a positioning unit that acquires position information of the robot itself; a drive unit that can move the robot in response to remote operation via a terminal device; a peripheral data acquisition unit that acquires peripheral data; a first processing unit that generates first data based on the peripheral data and second data based on the peripheral data and the position information; and a communication unit that transmits the first data to the terminal device and the second data to a server device.
  • The storage medium according to one aspect of the present invention stores a program for causing a computer to execute a method of controlling a robot. The control method includes acquiring position information of the robot with a positioning unit; controlling movement of the robot in response to remote operation via a terminal device; acquiring peripheral data with a camera or a sensor; generating first data based on the peripheral data and second data based on the peripheral data and the position information; transmitting the first data to the terminal device; and transmitting the second data to a server device.
  • <System configuration> An exemplary configuration of the system 1 according to an embodiment will be described with reference to FIGS. 1 and 2.
  • In the present embodiment, the system 1 is a communication system that enables a user to remotely operate one or more robots via a terminal device. The robots are placed in different locations.
  • As shown in FIG. 1, the system 1 includes a server device 10, terminal devices 20a, 20b, and 20c, and robots 30a, 30b, and 30c.
  • Each device or robot is configured to communicate with the other devices or robots wirelessly, by wire, or both.
  • In the example shown in FIG. 1, the system 1 includes three terminal devices, but the number of terminal devices is arbitrary and may be two or fewer or four or more.
  • The terminal devices 20a, 20b, and 20c may have the same configuration or different configurations. In the present embodiment, when the terminal devices 20a, 20b, and 20c are referred to without distinction, they are collectively called the terminal device 20.
  • Likewise, the system 1 includes three robots, but the number of robots is arbitrary and may be two or fewer or four or more.
  • The robots 30a, 30b, and 30c may have the same configuration or different configurations. When the robots 30a, 30b, and 30c are referred to without distinction, they are collectively called the robot 30. An outline of each device and robot follows.
  • The server device 10 is a device that executes various processes related to the remote operation of the robots 30 from the terminal devices 20.
  • The server device 10 also performs processes such as searching for available robots 30 and managing reservation registrations for operating the robots 30.
  • The server device 10 is composed of an information processing device such as a server computer, and may be configured as one information processing device or as a plurality of information processing devices (for example, cloud computing or edge computing).
  • The terminal device 20 is an information processing device used by the user to operate the robot 30 and to reserve such operation.
  • The terminal device 20 is a general-purpose or dedicated information processing device such as a smartphone, a tablet terminal, a PDA (Personal Digital Assistant), a personal computer, a head-mounted display, or a special-purpose operation console.
  • The terminal device 20 used for reserving the operation of the robot 30 may be the same device as, or a different device from, the terminal device 20 used for the operation itself.
  • The robot 30 is a non-fixed robot. "Non-fixed" covers both a mobile type, which has a drive unit for movement by wheels or the like, and a wearable type, which can be worn by a person and has a drive unit for operating a manipulator or the like.
  • A mobile robot is shown in, for example, Patent Document 1. The moving part of a mobile robot includes those that travel on one, two, or many wheels, on caterpillar tracks, or on rails; those that move by jumping; those that walk on two, four, or many legs; those that navigate on or under water with a screw propeller; and those that fly with a propeller or the like.
  • Wearable robots are described in, for example, MHD Yamen Saraiji, Tomoya Sasaki, Reo Matsumura, Kouta Minamizawa and Masahiko Inami, "Fusion: full body surrogacy for collaborative communication," Proceedings of SIGGRAPH '18, ACM SIGGRAPH 2018 Emerging Technologies, Article No. 7.
  • The robot 30 also includes vehicles and heavy machinery capable of automatic or semi-automatic travel, drones, and airplanes; robots installed in sports stadiums or the like and equipped with cameras that move on rails; and satellite robots launched into outer space whose attitude and camera shooting direction can be controlled. The robot 30 may also be a so-called telepresence robot or avatar robot.
  • As shown in FIG. 2, the user can remotely operate the robot 30 via the terminal device 20 (for example, move the robot 30 or operate the camera mounted on it).
  • Any communication architecture can be adopted for transmitting and receiving operation signals and data between the terminal device 20 and the robot 30, for example P2P (Peer-to-Peer) or client-server.
  • The robot 30 operates in response to signals received from the terminal device 20 and transmits to the terminal device 20 the data it acquires or detects about the place where it is located, such as image data and audio data obtained through the camera, sensors, and other devices mounted on the robot 30.
  • In this way, the user can, through the terminal device 20 and the robot 30, experience the place where the robot 30 is located as if the user were there.
  • Furthermore, the peripheral data of the robot 30 acquired by the camera or sensors mounted on the robot 30 is used for creating map data.
  • Specifically, the robot 30 first transmits the peripheral data acquired by its camera or sensors (or data based on that peripheral data; the same applies in the following description) to the server device 10, and the server device 10 stores the received peripheral data.
  • For creating the map data, predetermined processing is applied to the peripheral data before transmission by the robot 30 or before storage in the server device 10.
  • In this way, the robot 30 (that is, the peripheral data acquired by its camera or sensors) is also used for purposes other than letting the user experience a remote place.
  • <Functional configuration> With reference to FIG. 3, the configuration of the main functions of the computer included in the robot 30 will be described. These functions are realized by the control unit (processor) of the robot 30 reading and executing a computer program stored in the storage unit. The hardware configuration of the robot 30 will be described later.
  • The robot 30 has, as its main functional configuration, a positioning unit 31, a drive control unit 32, a data processing unit 33, a communication control unit 34, a database 35, and a peripheral data acquisition unit 36.
  • The functions of the robot 30 are not limited to these; it may have other functions, such as those robots generally have. For example, the robot 30 may further have a camera control unit that controls the operation of the camera mounted on the robot 30.
  • The positioning unit 31 acquires the position information of the robot 30. The position information may be acquired from an external device or calculated by a positioning process performed by the positioning unit 31.
  • The positioning process may be carried out by GNSS (Global Navigation Satellite System), by using base stations or communication devices (for example, routers), or by other methods.
  • For example, the position information of the robot 30 may be position information derived from the communication address information (for example, the IP address) of a nearby device (for example, a router) used for the robot 30's communication.
  • Further, the position information of the robot 30 may be calculated from the distance and direction moved by the drive unit of the robot 30, as sketched below.
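  • The publication does not disclose an implementation of the positioning unit 31; the following Python sketch merely illustrates how such a unit might combine an external fix (for example, from GNSS or an address-based lookup) with dead reckoning from the distance and direction the drive unit reports. All class and method names are hypothetical.

```python
import math
from dataclasses import dataclass

@dataclass
class Position:
    lat: float  # degrees
    lon: float  # degrees

class PositioningUnit:
    """Sketch of a positioning unit: it prefers an external fix and
    falls back to dead reckoning based on drive-unit movement."""

    EARTH_RADIUS_M = 6_371_000.0

    def __init__(self, initial: Position):
        self.position = initial

    def update_from_external_fix(self, fix: Position) -> None:
        # An external fix (GNSS, base station, or router-address lookup)
        # simply replaces the current estimate.
        self.position = fix

    def update_by_dead_reckoning(self, distance_m: float, heading_deg: float) -> None:
        # Advance the estimate by the distance and direction the drive
        # unit reports having moved (flat-earth approximation).
        heading = math.radians(heading_deg)
        dlat = (distance_m * math.cos(heading)) / self.EARTH_RADIUS_M
        dlon = (distance_m * math.sin(heading)) / (
            self.EARTH_RADIUS_M * math.cos(math.radians(self.position.lat)))
        self.position = Position(self.position.lat + math.degrees(dlat),
                                 self.position.lon + math.degrees(dlon))
```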
  • The drive control unit 32 controls the movement, stopping, and other operations of the robot 30 by controlling the drive unit of the robot 30.
  • The drive control unit 32 controls the drive unit so that the robot moves in response to remote-operation signals that the robot 30 receives from the terminal device 20. The drive control unit 32 can also control the drive unit so that the robot moves automatically according to a preset program.
  • The peripheral data acquisition unit 36 acquires peripheral data of the robot 30.
  • The peripheral data includes, for example, image data of the surrounding environment acquired by the camera mounted on the robot 30. It also includes two-dimensional or three-dimensional data of the surrounding environment sensed by sensors mounted on the robot 30 (for example, an ultrasonic sensor, a laser sensor, a radar, or a microphone), audio data, and the like.
  • The two-dimensional or three-dimensional data of the surrounding environment includes, for example, data on the distance from the robot 30 to objects in the surrounding environment and on the shapes of those objects. One possible bundling of such data is sketched below.
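  • As a minimal sketch only (the publication does not define a data format), the peripheral data described above might be bundled as follows; the field names are assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class PeripheralData:
    """Hypothetical container for one capture of peripheral data."""
    image: Optional[bytes] = None   # camera frame (e.g. encoded JPEG bytes)
    audio: Optional[bytes] = None   # microphone capture
    # 2D/3D sensing: (distance_m, azimuth_deg, elevation_deg) returns from
    # an ultrasonic sensor, laser sensor, or radar.
    range_returns: List[Tuple[float, float, float]] = field(default_factory=list)
    timestamp: float = 0.0          # capture time, seconds since epoch
```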
  • The data processing unit 33 performs various kinds of processing on the peripheral data acquired by the peripheral data acquisition unit 36.
  • The processing by the data processing unit 33 includes, for example, correction (for example, color correction), editing (for example, noise removal), and compression of the image data included in the peripheral data.
  • The data processing unit 33 can apply different processing depending on the use or destination of the image data. For example, data generated by the above correction, editing, and compression can serve as data for display on the display unit of the terminal device 20.
  • For other uses, the data processing unit 33 can generate data by applying other processing to the peripheral data, in addition to or instead of the processing above.
  • For example, to create map data, the data processing unit 33 generates data based on the peripheral data acquired by the peripheral data acquisition unit 36 and the position information acquired by the positioning unit 31.
  • For example, the data processing unit 33 generates data in which the coordinates in the space represented by the peripheral data acquired by the camera of the robot 30 are associated with the position information acquired by the positioning unit 31 so as to match the real world.
  • To generate this data, the data processing unit 33 may use the three-dimensional data of the surrounding environment acquired by the peripheral data acquisition unit 36 (for example, data sensed by an ultrasonic sensor, a laser sensor, or a radar) together with SLAM (Simultaneous Localization and Mapping) technology.
  • The data processing unit 33 may also generate a point cloud based on the peripheral data acquired by the peripheral data acquisition unit 36. Further, the data processing unit 33 can perform image recognition on the image data included in the peripheral data and apply masking (for example, processing that removes a predetermined object) to images of predetermined objects that are unrelated to the spatial information to be included in the map (for example, people, cars, and bicycles).
  • The data processing unit 33 can also perform image recognition on the masked objects and the other objects in the image, and attach the attribute information of each object to the image data of the peripheral data as metadata. The attribute information of an object includes the type of the object (person, car, bicycle, animal, and so on) and, for a person, age, gender, and the like.
  • To create map data, the data processing unit 33 may also process the image data using LiDAR (light detection and ranging) technology in accordance with the three-dimensional data of the surrounding environment acquired by the peripheral data acquisition unit 36. A minimal coordinate-association sketch follows this list.
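  • The following sketch illustrates the coordinate association described above under simplifying assumptions (planar projection, no SLAM): sensed range returns are placed in a world frame anchored at the robot's measured position. The function and its arguments are illustrative, not from the publication.

```python
import math

def to_world_points(robot_xy_m, heading_deg, range_returns):
    """Convert (distance_m, azimuth_deg, elevation_deg) sensor returns
    into (x, y, z) points in a shared world frame, so that coordinates
    in the peripheral data correspond to the robot's measured position.
    robot_xy_m is the robot position already projected to planar metres
    (x east, y north); azimuth is measured clockwise from the heading."""
    x0, y0 = robot_xy_m
    points = []
    for dist, az, el in range_returns:
        az_rad = math.radians(heading_deg + az)
        horiz = dist * math.cos(math.radians(el))   # ground-plane range
        points.append((x0 + horiz * math.sin(az_rad),
                       y0 + horiz * math.cos(az_rad),
                       dist * math.sin(math.radians(el))))
    return points
```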
  • The communication control unit 34 controls various communications with external devices, such as the server device 10 and the terminal device 20, via the communication unit mounted on the robot 30.
  • The communication control unit 34 controls transmission of data based on the peripheral data acquired by the peripheral data acquisition unit 36 (for example, data processed by the data processing unit 33) to the server device 10 and the terminal device 20. That is, it controls the transmission to external devices of peripheral data to which the data processing unit 33 has applied different processing depending on the use or destination.
  • For example, the communication control unit 34 controls transmission to the terminal device 20 of first data based on the peripheral data of the robot 30 acquired by the peripheral data acquisition unit 36, and controls transmission to the server device 10 of second data, different from the first data, also based on that peripheral data.
  • The first data is, for example, data obtained by the data processing unit 33 processing the image data of the peripheral data for display on the display unit of the terminal device 20.
  • The second data is, for example, data generated for creating map data, based on the peripheral data acquired by the peripheral data acquisition unit 36 (including, for example, data sensed by an ultrasonic sensor, a laser sensor, a radar, or a microphone) and the position information acquired by the positioning unit 31. A sketch of this split follows.
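  • A minimal sketch of the first-data/second-data split described above, assuming dictionary payloads and stand-in `terminal.send` / `server.send` transports (the system could use P2P or client-server underneath); none of these names come from the publication.

```python
def handle_peripheral_data(data, position, terminal, server):
    """Route one capture of peripheral data two ways: a display-oriented
    payload to the terminal device and a map-oriented payload to the
    server device."""
    first_data = {
        "type": "display",
        "image": data.image,   # corrected / denoised / compressed image
        "audio": data.audio,
    }
    second_data = {
        "type": "mapping",
        "range_returns": data.range_returns,  # sensor returns for the map
        "position": position,                 # from the positioning unit
    }
    terminal.send(first_data)  # first data -> terminal device 20
    server.send(second_data)   # second data -> server device 10
```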
  • The database 35 stores various data, such as data necessary for the processing executed by the robot 30 and data generated or set by that processing. For example, it stores the peripheral data acquired by the peripheral data acquisition unit 36 and the data produced by processing that peripheral data.
  • Next, referring to FIG. 4, the configuration of the main functions of the server device 10 will be described. These functions are realized by the control unit (processor) of the server device 10 reading and executing a computer program stored in the storage unit. The hardware configuration of the server device 10 will be described later.
  • The server device 10 has, as its main functional configuration, a processing unit 11, a communication control unit 12, and a database 13. The functions of the server device 10 are not limited to these; it may have other functions, such as those such a device generally has.
  • The processing unit 11 carries out various processes in the server device 10. For example, the processing unit 11 generates map data based on the data received from the robots 30 and stores it in the database 13, or updates the map data by referring to the database 13.
  • Map data based on the data received from a robot 30 is data generated or updated from the data acquired by that robot 30 (for example, the peripheral data and position data described above); it is map data for the vicinity of places the robot 30 has passed through or stayed at.
  • The processing unit 11 may carry out, on behalf of the robot 30, part of the processing performed by the data processing unit 33 described above (for example, at least one of point cloud generation, masking, and metadata attachment).
  • The processing unit 11 may generate or update map data based on data received from a plurality of robots 30. For example, it may integrate data received from a plurality of robots 30 located at different positions to generate or update the map data; compared with generating map data from the data of a single robot 30, this makes it possible to generate and update map data covering a wider area in a shorter time, as sketched below.
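  • As an illustration of integrating data from several robots, the sketch below bins world-frame points from each robot into a shared occupancy-style grid. The map format is an assumption; the publication does not specify one.

```python
from collections import defaultdict

def merge_into_map(world_points_by_robot, cell_size_m=0.5):
    """Integrate points from multiple robots (already expressed in the
    shared world frame) into one occupancy-style grid keyed by cell
    coordinates; more hits in a cell suggests it is occupied."""
    occupancy = defaultdict(int)
    for robot_id, points in world_points_by_robot.items():
        for x, y, _z in points:
            cell = (int(x // cell_size_m), int(y // cell_size_m))
            occupancy[cell] += 1
    return occupancy
```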
  • The communication control unit 12 controls various communications with external devices, such as the terminal devices 20 and the robots 30, via the communication unit of the server device 10.
  • The database 13 stores various data, such as data required for the processing executed by the server device 10, data generated or set by that processing, and data acquired from external devices. For example, the database 13 stores the map data generated or updated by the processing unit 11.
  • Next, an exemplary processing flow of the system 1 will be described with reference to FIG. 5. In step S501, the terminal device 20 starts transmitting control signals to the robot 30 in response to user operations in order to remotely operate the robot 30. The remote operation includes, for example, operating the camera mounted on the robot 30 and moving the robot 30.
  • In step S502, the robot 30 starts operating in response to the control signal for starting remote operation received in step S501. The operation includes photographing with the camera of the robot 30, sensing with its sensors, and acquiring position information on the current position of the robot 30.
  • In step S503, the robot 30 performs predetermined processing on the peripheral data of the robot 30 acquired by its camera and sensors. For example, the robot 30 performs processing to generate the image data and audio data to be reproduced by an output unit (for example, a display device or a speaker) of the terminal device 20 (for example, the first data described above). The robot 30 also performs processing for creating map data to generate data (for example, the second data described above).
  • In step S504, the robot 30 starts transmitting the data generated in step S503 to the terminal device 20 for reproduction by the output unit of the terminal device 20. The terminal device 20 starts displaying images on its display unit and reproducing sound through its speaker based on the data received from the robot 30. The transmission of data from the robot 30 to the terminal device 20 continues, for example, until the end of image display on the display unit of the terminal device 20 is instructed via the terminal device 20.
  • In step S505, the robot 30 starts moving in response to remote-operation control signals received from the terminal device 20. The movement includes indoor or outdoor movement for shopping, sightseeing, or taking a walk with the robot 30, as instructed by the user via the terminal device 20. Through this movement, the robot 30 acquires, with its camera and sensors, peripheral data in the vicinity of the places it passes through or stays at.
  • In step S506, the robot 30 starts transmitting the data generated in step S503 and processed for creating map data to the server device 10 at a predetermined timing. The data transmitted to the server device 10 in step S506 includes data generated based on the peripheral data acquired by the camera and sensors and the position information of the robot 30.
  • The predetermined timing for starting transmission to the server device 10 includes, for example, the timing at which the robot 30 finishes generating the data for transmission to the terminal device 20 in step S503, or the timing at which the data transmission to the terminal device 20 started in step S504 ends. The predetermined timing also includes a timing at which the robot 30 is not being remotely operated from the terminal device 20, a timing at which the robot 30 has stopped moving, or a timing at which the robot 30 is located at a predetermined place (for example, its charging dock). By transmitting data to the server device 10 at such a timing, the robot 30 can transmit data to the server device 10 at a timing different from the communication timing at which it transmits data to the terminal device 20, which distributes the processing load of the robot 30.
  • Note that part of the image processing for creating the map data performed in step S503, such as the masking process and the attribute-information attachment process, may instead be carried out at the predetermined timing described above.
  • Alternatively, rather than starting transmission to the server device 10 at a predetermined timing in step S506, the robot 30 may sequentially transmit the data to the server device 10 as its processing is completed in step S503. A sketch of the deferred-upload timing follows.
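  • The deferred transmission timing described above might be realized as in the following sketch, which queues second data and flushes it only when the robot is idle or docked; the class and method names are hypothetical.

```python
import queue

class MapDataUploader:
    """Queue second data while the robot is being remotely operated and
    flush it to the server when the robot is idle, stopped, or docked."""

    def __init__(self, server):
        self.server = server
        self.pending = queue.Queue()

    def enqueue(self, second_data):
        self.pending.put(second_data)

    def maybe_flush(self, remote_operated: bool, moving: bool, docked: bool):
        # Flush when the robot is not being remotely operated and has
        # stopped moving, or when it sits at a predetermined place such
        # as its charging dock, so that uploads to the server do not
        # compete with the live stream to the terminal device.
        idle = (not remote_operated) and (not moving)
        if idle or docked:
            while not self.pending.empty():
                self.server.send(self.pending.get())
```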
  • In step S507, the server device 10 generates map data based on the data received from the robot 30 and stores it in the storage unit, or updates the map data by referring to the storage unit.
  • As described above, the robot 30 is remotely operated via the terminal device 20 and moves in order to provide the user with an experience of the place where the robot 30 is located, and the peripheral data acquired by the camera or sensors of the robot 30 during that movement is also used for generating map data. Therefore, the more the robot 30 is remotely operated to provide experiences to users, the more frequently and the more abundantly peripheral data for generating map data is acquired.
  • By contrast, data for generating map data has conventionally been collected only about once every few years. According to the present embodiment, since the robot 30 collects peripheral data for purposes beyond generating map data, peripheral data can be collected with high frequency, and as a result map data that better matches the real space can be generated.
  • Next, referring to FIG. 6, the hardware configuration of a computer 700 used by the system will be described. The computer 700 mainly includes a processor 701, a memory 703, a storage device 705, an operation unit 707, an input unit 709, a communication unit 711, and an output unit 713.
  • The computer 700 may include other components that general-purpose or dedicated computers commonly provide, and conversely does not have to include some of the components shown in FIG. 6.
  • The processor 701 is a control unit that controls various processes in the computer 700 by executing programs stored in the memory 703. The processor 701 therefore realizes the functions of each device described in the embodiment above, and controls the execution of the processing described above, in cooperation with the programs and the other components of the computer 700.
  • The memory 703 is a storage medium such as a RAM (Random Access Memory). The program code of the programs executed by the processor 701 and the data required for their execution are temporarily read into the memory 703 from the storage device 705 or the like, or are stored there in advance.
  • The storage device 705 is a non-volatile storage medium such as a hard disk drive (HDD). It stores an operating system, various programs for realizing the configurations described above, data of the processing results described above, and the like.
  • The operation unit 707 is a device for receiving input from the user. Specific examples of the operation unit 707 include a keyboard, a mouse, a touch panel, a joystick, various sensors, and wearable devices. The operation unit 707 may be detachably connected to the computer 700 via an interface such as USB (Universal Serial Bus).
  • The input unit 709 is a device for inputting data from outside the computer 700. Specific examples of the input unit 709 include drive devices for reading data stored on various storage media. The input unit 709 may also include a microphone that picks up surrounding sound, converts it into audio data, and inputs it. The input unit 709 may be detachably connected to the computer 700, in which case it is connected via an interface such as USB.
  • The communication unit 711 is a device for performing wired or wireless data communication over a network with devices external to the computer 700. The communication unit 711 may be detachably connected to the computer 700, in which case it is connected via an interface such as USB.
  • The output unit 713 is a device that outputs various data, for example a display device for displaying data or a speaker for outputting audio. Specific examples of the display device include a liquid crystal display, an organic EL display, and the display of a wearable device.
  • The output unit 713 may be detachably connected to the outside of the computer 700; for example, a display device serving as the output unit 713 is connected to the computer 700 via a display cable or the like. The output unit 713 can also be configured integrally with the operation unit 707.
  • Next, referring to FIG. 7, the hardware configuration of the robot 30 will be described. The robot 30 includes a processor 901, a RAM (Random Access Memory) 902, a ROM (Read Only Memory) 903, a communication unit 904, an input unit 905, a display unit 906, a drive unit 907, a camera 908, and a sensor 909.
  • The configuration shown in FIG. 7 is an example; the robot 30 may have components other than these or may lack some of them. For example, the robot 30 may further include a speaker, or a unit for identifying its own position.
  • The processor 901 is the arithmetic unit of the robot 30, for example a CPU (Central Processing Unit). The RAM 902 and the ROM 903 are storage units that store the data required for various processes and the data of processing results. The robot 30 may also include a large-capacity storage unit such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive).
  • The communication unit 904 is a device that communicates with external devices and includes, for example, an antenna or a NIC (Network Interface Card).
  • The input unit 905 is a device for inputting data from outside the robot 30. It accepts data input from the user and may include, for example, a keyboard, a touch panel, and a microphone for voice input.
  • The display unit 906 is a device for displaying various information. It visually presents the results of computation by the processor 901 and may be configured with, for example, an LCD (Liquid Crystal Display). The display unit 906 may display images taken by the camera 908 of the robot 30.
  • The processor 901 is a control unit that controls the execution of programs stored in the RAM 902 or the ROM 903 and that computes and processes data. The processor 901 executes a program for controlling the robot's communication (a communication program), receives various data from the input unit 905 and the communication unit 904, displays computation results on the display unit 906, and stores data in the RAM 902.
  • The communication program may be provided stored on a computer-readable storage medium such as the RAM 902 or the ROM 903, or may be provided via a communication network connected through the communication unit 904. The processor 901 executes the communication program to realize the various operations for controlling the robot 30. The robot 30 may include an LSI (Large-Scale Integration) chip in which the processor 901 and the RAM 902 or ROM 903 are integrated.
  • The drive unit 907 includes a remotely operable actuator and includes a moving unit such as wheels, a manipulator, or the like. When the robot 30 is a mobile type, the drive unit 907 includes at least a moving unit such as wheels and may also include a manipulator. When the robot 30 is a wearable type, the drive unit 907 includes at least a manipulator.
  • The camera 908 includes an image sensor that captures still images or moving images, and transmits the captured still images or moving images to external devices via the communication unit 904.
  • The sensor 909 comprises various sensors that sense the surrounding environment, and includes, for example, at least some of an ultrasonic sensor, a laser sensor, a radar, an optical sensor, a sound sensor (microphone), a pressure sensor, an acceleration sensor, an angular-velocity sensor, an altitude sensor, and a geomagnetic sensor.
  • The program for implementing the system 1 (or the server device 10, the terminal device 20, or the robot 30) in the present embodiment can be recorded on various recording media, such as an optical disk (for example, a CD-ROM), a magnetic disk, or a semiconductor memory. The program can also be installed or loaded onto a computer from such a recording medium or via a communication network or the like.
  • The present invention is not limited to the embodiment described above and can be carried out in various other forms without departing from its gist. The above embodiment is merely an example in all respects and should not be construed restrictively.


Abstract

Provided is a technology related to robots that is applicable to multiple uses. This communication system comprises one or a plurality of robots and a server device. Each robot comprises: a positioning unit that acquires positional information of the robot; a drive unit that can move the robot according to a remote operation via a terminal device; a peripheral data acquisition unit that acquires peripheral data of the robot; a first processing unit that generates first data based on the peripheral data and second data based on the peripheral data and the positional information; and a communication unit that transmits the first data to the terminal device and the second data to the server device. The server device comprises: a second processing unit that generates map data based on the second data received from the robot; and a storage unit that stores the map data.

Description

Communication system, robot, and storage medium
The present invention relates to a communication system, a robot, and a storage medium.
In recent years, video conference systems using the Internet have become widespread. Beyond talking while seeing each other's faces, telepresence robots are known that, in response to operations by a user at a remote location, can change the direction of their camera and move their position under the control of a drive unit.
Patent Document 1 describes a moving body with a camera that provides captured images to a user in a remote place.
Patent Document 1: JP-A-2019-062308
According to the present invention, it is possible to provide a technique related to a robot that can be used for multiple purposes.
FIG. 1 illustrates the configuration of a system according to an embodiment. FIG. 2 is a conceptual diagram for explaining a terminal device, a robot, and a server device according to an embodiment. FIG. 3 is a block diagram showing the functional configuration of a robot according to an embodiment. FIG. 4 is a block diagram showing the functional configuration of a server device according to an embodiment. FIG. 5 illustrates an exemplary processing flow of the system according to an embodiment. FIG. 6 is a conceptual diagram illustrating the hardware configuration of a computer used by the system according to an embodiment. FIG. 7 is a conceptual diagram illustrating the hardware configuration of a robot according to an embodiment.
An embodiment of the present invention will be described below. The following embodiment is an illustration for explaining the present invention, and the present invention is not intended to be limited to this embodiment alone. The present invention can be modified in various ways without departing from its gist. Furthermore, those skilled in the art can adopt embodiments in which each element described below is replaced by an equivalent, and such embodiments are also included in the scope of the present invention.
 <システムの構成>
 図1及び図2を参照して、一実施形態に係るシステム1の例示的な構成について説明する。本実施形態において、システム1は、ユーザが端末装置を介して1つ又は複数のロボットを遠隔操作することを可能にするコミュニケーションシステムである。複数のロボットは、それぞれ異なる場所に置かれている。
<System configuration>
An exemplary configuration of the system 1 according to an embodiment will be described with reference to FIGS. 1 and 2. In the present embodiment, the system 1 is a communication system that enables a user to remotely control one or more robots via a terminal device. Multiple robots are placed in different places.
 図1に示すように、システム1は、サーバ装置10、端末装置20a,20b,20c、及びロボット30a,30b,30cを備える。各装置又はロボットは、他の装置又はロボットと、無線若しくは有線により(又はその両者により)通信可能に構成されている。図1に示す例において、システム1は、3つの端末装置を備えるが、端末装置の数は任意に設定され、2つ以下でも4つ以上であってもよい。端末装置20a,20b,20cは、それぞれ同様の構成を有してもよいし、異なる構成を有してもよい。本実施形態において、端末装置20a,20b,20cが互いに区別せずに参照される場合は、総称して端末装置20と称される。また、システム1は、3つのロボットを備えるが、ロボットの数は任意に設定され、2つ以下でも4つ以上であってもよい。ロボット30a,30b,30cは、それぞれ同様の構成を有してもよいし、異なる構成を有してもよい。本実施形態において、ロボット30a,30b,30cを互いに区別せずに参照する場合は、総称してロボット30と称される。以下に、それぞれの装置及びロボットの概要を説明する。 As shown in FIG. 1, the system 1 includes a server device 10, terminal devices 20a, 20b, 20c, and robots 30a, 30b, 30c. Each device or robot is configured to be able to communicate with other devices or robots wirelessly or by wire (or both). In the example shown in FIG. 1, the system 1 includes three terminal devices, but the number of terminal devices may be arbitrarily set and may be two or less or four or more. The terminal devices 20a, 20b, and 20c may have similar configurations or different configurations. In the present embodiment, when the terminal devices 20a, 20b, and 20c are referred to without distinguishing from each other, they are collectively referred to as the terminal device 20. Further, the system 1 includes three robots, but the number of robots is arbitrarily set and may be two or less or four or more. The robots 30a, 30b, and 30c may have similar configurations or different configurations. In the present embodiment, when the robots 30a, 30b, and 30c are referred to without being distinguished from each other, they are collectively referred to as the robot 30. The outline of each device and robot will be described below.
 サーバ装置10は、端末装置20から複数のロボット30を遠隔操作することに関し、各種の処理を実行する装置である。サーバ装置10はさらに、利用可能なロボット30の検索処理や、ロボット30の操作のための予約登録の管理等を行う。サーバ装置10は、サーバコンピュータなどの情報処理装置により構成される。サーバ装置10は、1つの情報処理装置により構成されてもよいし、複数の情報処理装置(例えば、クラウドコンピューティング又はエッヂコンピューティング)により構成されてもよい。 The server device 10 is a device that executes various processes related to remote control of a plurality of robots 30 from the terminal device 20. The server device 10 further performs a search process for the available robot 30, management of reservation registration for the operation of the robot 30, and the like. The server device 10 is composed of an information processing device such as a server computer. The server device 10 may be configured by one information processing device or may be configured by a plurality of information processing devices (for example, cloud computing or edge computing).
 端末装置20は、ロボット30の操作や、当該操作をユーザが行うことを予約するために、ユーザにより使用される情報処理装置である。端末装置20は、例えば、スマートフォン、タブレット端末、PDA(Personal Digital Assistants)、パーソナルコンピュータ、ヘッドマウントディスプレイ、特定用途の操作系などの汎用又は専用の情報処理装置である。ロボット30の操作の予約のために使用される端末装置20は、当該操作のために使用される端末装置20と異なる装置であってもよいし、同じ装置であってもよい。 The terminal device 20 is an information processing device used by the user to operate the robot 30 and to reserve the user to perform the operation. The terminal device 20 is a general-purpose or dedicated information processing device such as a smartphone, a tablet terminal, a PDA (Personal Digital Assistants), a personal computer, a head-mounted display, and an operation system for a specific purpose. The terminal device 20 used for reserving the operation of the robot 30 may be a device different from the terminal device 20 used for the operation, or may be the same device.
 ロボット30は、固定されていないロボットである。ロボット30が固定されていないとは、ロボット30が車輪等による移動のための駆動部を有する移動型である場合と、人が装着でき、マニピュレータ等の動作のための駆動部を有する装着型である場合とを含む。 The robot 30 is a non-fixed robot. The robot 30 is not fixed in the case where the robot 30 is a mobile type having a drive unit for movement by wheels or the like, and the case where the robot 30 is a wearable type which can be worn by a person and has a drive unit for operation of a manipulator or the like. Including some cases.
 移動型のロボットは、例えば特許文献1に示されている。移動型ロボットの移動部は、一輪、二輪又は多輪により走行するもの、キャタピラにより走行するもの、レールの上を走行するもの、飛び跳ねて移動するもの、二足歩行、四足歩行又は多足歩行するもの、スクリューにより水上又は水中を航行するもの及びプロペラ等により飛行するものを含む。装着型のロボットは、例えばMHD Yamen Saraiji, Tomoya Sasaki, Reo Matsumura, Kouta Minamizawa and Masahiko Inami, "Fusion: full body surrogacy for collaborative communication," Proceeding SIGGRAPH '18 ACM SIGGRAPH 2018 Emerging Technologies Article No. 7.にて公開されている。さらに、ロボット30は、自動走行又は半自動走行可能な車両や重機であったり、ドローンや飛行機であったりを含む。また、ロボット30は、スポーツスタジアム等に設置され、レールの上を移動可能なカメラを備えたロボットを含む。また、ロボット30は、宇宙空間に打ち上げられる衛星型ロボットであって、姿勢制御やカメラの撮影方向の制御が可能なロボットを含む。また、ロボット30は、いわゆるテレプレゼンスロボットやアバターロボットであってよい。 The mobile robot is shown in, for example, Patent Document 1. The moving part of the mobile robot is one that travels by one wheel, two wheels or multiple wheels, one that travels by a caterpillar, one that travels on a rail, one that jumps and moves, bipedal walking, four-legged walking or multi-legged walking. Includes those that use a screw, those that navigate on or under water with a screw, and those that fly with a propeller or the like. Wearable robots are, for example, MHD Yaman Saraiji, Tomoya Sasaki, Reo Matsumura, Kouta Minamizawa and Masahiko Inami, "Fusion: full body surrogacy for collaborative communication," Proceeding SIGGRAPH '18 ACM It has been published. Further, the robot 30 includes a vehicle or a heavy machine capable of automatic or semi-automatic traveling, or a drone or an airplane. Further, the robot 30 includes a robot installed in a sports stadium or the like and equipped with a camera that can move on rails. Further, the robot 30 is a satellite type robot launched into outer space, and includes a robot capable of controlling the attitude and the shooting direction of the camera. Further, the robot 30 may be a so-called telepresence robot or an avatar robot.
 図2に示すように、ユーザは端末装置20を介してロボット30の遠隔操作(例えば、ロボット30の移動やロボット30に搭載されたカメラの操作)を行うことができる。端末装置20とロボット30の間の操作信号又はデータの送受信のための通信アーキテクチャは、任意のアーキテクチャを採用可能であり、例えば、P2P(Peer-to-Peer)又はクライアントサーバである。ロボット30は、端末装置20から受信した信号に応じて動作し、ロボット30に搭載されたカメラ、センサ及びその他の装置を通じて取得された画像データ及び音声データなど、ロボット30がいる場所に関してロボット30が取得又は検知等したデータを端末装置20に送信する。これにより、ユーザは、端末装置20及びロボット30を介して、ロボット30がいる場所に自分もいるかのようなエクスペリエンスを体感することができる。 As shown in FIG. 2, the user can remotely control the robot 30 (for example, move the robot 30 or operate the camera mounted on the robot 30) via the terminal device 20. The communication architecture for transmitting and receiving operation signals or data between the terminal device 20 and the robot 30 can adopt any architecture, for example, P2P (Peer-to-Peer) or a client server. The robot 30 operates in response to a signal received from the terminal device 20, and the robot 30 relates to a location where the robot 30 is located, such as image data and voice data acquired through a camera, a sensor, and other devices mounted on the robot 30. The acquired or detected data is transmitted to the terminal device 20. As a result, the user can experience the experience as if he / she is in the place where the robot 30 is located through the terminal device 20 and the robot 30.
 さらに、ロボット30搭載されたカメラ又はセンサ等により取得されたロボット30の周辺データは、マップデータの作成のために使用される。詳細には、まず、ロボット30は、ロボット30のカメラ又はセンサにより取得された周辺データ(又は周辺データに基づくデータ。以下の説明においても同様。)をサーバ装置10に送信する。サーバ装置10は、ロボット30から受信した周辺データを記憶する。マップデータの作成のために、ロボット30による送信前、又はサーバ装置10への記憶前に、周辺データに対して所定の処理が行われる。このように、ロボット30(すなわち、ロボット30のカメラ又はセンサにより取得された周辺データ)は、ユーザにエクスペリエンスを体感させる以外の目的にも使用される。 Further, the peripheral data of the robot 30 acquired by the camera or sensor mounted on the robot 30 is used for creating the map data. Specifically, first, the robot 30 transmits peripheral data (or data based on the peripheral data; the same applies in the following description) acquired by the camera or sensor of the robot 30 to the server device 10. The server device 10 stores peripheral data received from the robot 30. In order to create the map data, a predetermined process is performed on the peripheral data before the transmission by the robot 30 or before the storage in the server device 10. In this way, the robot 30 (that is, peripheral data acquired by the camera or sensor of the robot 30) is also used for purposes other than allowing the user to experience the experience.
 <機能構成>
 図3を参照して、ロボット30が備えるコンピュータが有する主な機能の構成を説明する。これらの機能は、ロボット30が有する制御部(プロセッサ)が、記憶部に記憶されたコンピュータプログラムを読み込み、実行することにより実現される。ロボット30のハードウェア構成については、後述する。
<Functional configuration>
With reference to FIG. 3, the configuration of the main functions of the computer included in the robot 30 will be described. These functions are realized by the control unit (processor) of the robot 30 reading and executing the computer program stored in the storage unit. The hardware configuration of the robot 30 will be described later.
 図3に示すように、ロボット30は、主な機能の構成として、測位部31、駆動制御部32、データ処理部33、通信制御部34、データベース35、及び周辺データ取得部36を有する。ロボット30が有する機能は、これらに限定されず、ロボットが一般的に有する機能など、他の機能を有してもよい。例えば、ロボット30は、ロボット30に搭載されたカメラの動作を制御するカメラ制御部をさらに有する。 As shown in FIG. 3, the robot 30 has a positioning unit 31, a drive control unit 32, a data processing unit 33, a communication control unit 34, a database 35, and a peripheral data acquisition unit 36 as main functional configurations. The functions of the robot 30 are not limited to these, and may have other functions such as the functions generally possessed by the robot. For example, the robot 30 further has a camera control unit that controls the operation of the camera mounted on the robot 30.
 測位部31は、ロボット30の位置情報を取得する。当該位置情報は、外部装置から取得されてもよいし、測位部31による測位処理により算出されてもよい。測位処理は、GNSS(Global Navigation Satellite System)により実施されてもよいし、基地局又は通信機器(例えば、ルータ)などを使用して、又はその他の方法により実施されてもよい。また、ロボット30の位置情報は、例えば、ロボット30の通信に使用するロボット30の近隣の装置(例えば、ルーター)の通信用のアドレス情報(例えば、IPアドレス)に基づいて取得された位置の情報であってもよい。さらに、ロボット30の位置情報は、ロボット30が備える駆動部により移動した距離及び方向に基づいて算出されて取得されてもよい。 The positioning unit 31 acquires the position information of the robot 30. The position information may be acquired from an external device or may be calculated by a positioning process by the positioning unit 31. The positioning process may be carried out by GNSS (Global Navigation Satellite System), or may be carried out by using a base station or a communication device (for example, a router), or by another method. Further, the position information of the robot 30 is, for example, position information acquired based on the communication address information (for example, IP address) of a device (for example, a router) in the vicinity of the robot 30 used for the communication of the robot 30. May be. Further, the position information of the robot 30 may be calculated and acquired based on the distance and the direction moved by the drive unit included in the robot 30.
 駆動制御部32は、ロボット30が備える駆動部を制御することにより、ロボット30の移動、停止及びその他の動作を制御する。駆動制御部32は、端末装置20からロボット30が受信した遠隔操作のための信号に応じてロボットが移動するように駆動部を制御する。また、駆動制御部32は、予め設定されたプログラムに応じて自動的に移動するように駆動部を制御可能である。 The drive control unit 32 controls the movement, stop, and other operations of the robot 30 by controlling the drive unit included in the robot 30. The drive control unit 32 controls the drive unit so that the robot moves in response to a signal for remote control received by the robot 30 from the terminal device 20. Further, the drive control unit 32 can control the drive unit so as to automatically move according to a preset program.
 周辺データ取得部36は、ロボット30の周辺データを取得する。周辺データは、例えば、ロボット30に搭載されたカメラにより取得された周辺環境の画像データを含む。さらに、周辺データは、ロボット30に搭載されたセンサ(例えば、超音波センサ、レーザセンサ、レーダー、又はマイク等)によりセンシングされた周辺環境の2次元若しくは3次元データ、又は音声データ等を含む。周辺環境の2次元若しくは3次元データは、例えば、ロボット30から周辺環境に存在するオブジェクトまでの距離、及び当該オブジェクトの形状に関するデータを含む。 Peripheral data acquisition unit 36 acquires peripheral data of the robot 30. Peripheral data includes, for example, image data of the surrounding environment acquired by a camera mounted on the robot 30. Further, the peripheral data includes two-dimensional or three-dimensional data of the surrounding environment sensed by a sensor mounted on the robot 30 (for example, an ultrasonic sensor, a laser sensor, a radar, a microphone, or the like), audio data, or the like. The two-dimensional or three-dimensional data of the surrounding environment includes, for example, data regarding the distance from the robot 30 to an object existing in the surrounding environment and the shape of the object.
 データ処理部33は、周辺データ取得部36により取得された周辺データに対して各種の処理を行う。データ処理部33による処理は、例えば、周辺データに含まれる画像データに対する補正処理(例えば、色補正)、加工処理(例えば、ノイズ除去)、及び圧縮処理を含む。データ処理部33は、画像データの用途又は送信先に応じて、異なる処理を行うことが可能である。例えば、上記の補正処理、加工処理、及び圧縮処理を実施して生成したデータを端末装置20の表示部に表示させるためのデータとすることが可能である。 The data processing unit 33 performs various processes on the peripheral data acquired by the peripheral data acquisition unit 36. The processing by the data processing unit 33 includes, for example, a correction processing (for example, color correction), a processing processing (for example, noise reduction), and a compression processing for the image data included in the peripheral data. The data processing unit 33 can perform different processing depending on the use or destination of the image data. For example, the data generated by performing the above correction processing, processing processing, and compression processing can be used as data for displaying on the display unit of the terminal device 20.
 また、他の用途のために、データ処理部33は、上記の処理に加えて、又は上記の処理に代えて、周辺データに対して他の処理を実施してデータを生成可能である。例えば、マップデータの作成のために、データ処理部33は、周辺データ取得部36により取得された周辺データと、測位部31により取得された位置情報とに基づくデータを生成する。例えば、データ処理部33は、現実世界に従うように、ロボット30のカメラにより取得された周辺データが示す空間における座標と、測位部31により取得された位置情報とを対応付けたデータを生成する。このとき、データ処理部33は、当該データの生成のために、周辺データ取得部36により取得された周辺環境の3次元データ(例えば、超音波センサ、レーザセンサ、又はレーダーによりセンシングされたデータ)と、SLAM(Simultaneous Localization and Mapping)技術とが使用されてもよい。また、データ処理部33は、周辺データ取得部36により取得された周辺データに基づいて、ポイントクラウドを生成してもよい。さらに、データ処理部33は、周辺データに含まれる画像データに対して画像認識処理を行い、画像に含まれる所定のオブジェクト(例えば、人、車、自転車など、マップに含めるべき空間情報に関係しないオブジェクト)の画像に対して、マスキング処理(例えば、所定のオブジェクトを除去する処理)を実施することが可能である。データ処理部33は、画像に含まれていたマスキングされたオブジェクト及びその他のオブジェクトに対して、画像認識処理を行い、当該オブジェクトの属性情報をメタデータとして周辺データの画像データに付加することができる。オブジェクトの属性情報は、オブジェクトの種別(人、車、自転車、動物など)、人の年齢、及び性別などを含む。データ処理部33は、マップデータの作成のために、周辺データ取得部36により取得された周辺環境の3次元データに応じて、LiDAR(light detection and ranging)技術を使用して、画像データに対して処理を実施してもよい。 Further, for other purposes, the data processing unit 33 can generate data by performing other processing on peripheral data in addition to or in place of the above processing. For example, in order to create map data, the data processing unit 33 generates data based on the peripheral data acquired by the peripheral data acquisition unit 36 and the position information acquired by the positioning unit 31. For example, the data processing unit 33 generates data in which the coordinates in the space indicated by the peripheral data acquired by the camera of the robot 30 and the position information acquired by the positioning unit 31 are associated with each other so as to follow the real world. At this time, the data processing unit 33 has three-dimensional data of the surrounding environment acquired by the peripheral data acquisition unit 36 (for example, data sensed by an ultrasonic sensor, a laser sensor, or a radar) for the purpose of generating the data. And SLAM (Simultaneous Localization and Mapping) technology may be used. Further, the data processing unit 33 may generate a point cloud based on the peripheral data acquired by the peripheral data acquisition unit 36. Further, the data processing unit 33 performs image recognition processing on the image data included in the peripheral data, and is not related to the spatial information to be included in the map, such as a predetermined object (for example, a person, a car, a bicycle, etc.) included in the image. It is possible to perform masking processing (for example, processing for removing a predetermined object) on an image of an object). The data processing unit 33 can perform image recognition processing on the masked object and other objects included in the image, and add the attribute information of the object to the image data of the peripheral data as metadata. .. The attribute information of the object includes the type of the object (person, car, bicycle, animal, etc.), the age of the person, the gender, and the like. In order to create map data, the data processing unit 33 uses LiDAR (light detection and ranging) technology in response to the three-dimensional data of the surrounding environment acquired by the peripheral data acquisition unit 36 to the image data. You may carry out the processing.
 The communication control unit 34 controls various communications with external devices such as the server device 10 and the terminal device 20 via the communication unit mounted on the robot 30. The communication control unit 34 performs control so that data based on the peripheral data acquired by the peripheral data acquisition unit 36 (for example, data processed by the data processing unit 33) is transmitted to the server device 10 and the terminal device 20. That is, the communication control unit 34 controls the transmission, to external devices, of data to which the data processing unit 33 has applied processing that differs depending on the use or the destination. For example, the communication control unit 34 performs control so that first data based on the peripheral data of the robot 30 acquired by the peripheral data acquisition unit 36 is transmitted to the terminal device 20, and second data, different from the first data and also based on that peripheral data, is transmitted to the server device 10. The first data is, for example, data obtained by the data processing unit 33 applying, to the image data of the peripheral data, processing for display on the display unit of the terminal device 20. The second data is, for example, data generated for the creation of map data based on the peripheral data acquired by the peripheral data acquisition unit 36 (including, for example, data sensed by an ultrasonic sensor, a laser sensor, a radar, or a microphone) and the position information acquired by the positioning unit 31.
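 The two-destination routing can be pictured as below; this is a sketch under the assumption that each link is modeled as a local queue, with `route` standing in for the communication control unit's dispatch logic (the names and payload format are illustrative, not from the specification):

```python
import json
import queue

# Stand-ins for the two links handled by the communication unit.
to_terminal: queue.Queue = queue.Queue()   # link to the terminal device 20
to_server: queue.Queue = queue.Queue()     # link to the server device 10

def route(first_data: bytes, scan: list, position: tuple) -> None:
    # First data: display-ready stream for the terminal device 20.
    to_terminal.put(first_data)
    # Second data: map-making payload pairing sensed geometry with position.
    second_data = json.dumps({"position": position, "scan": scan}).encode()
    to_server.put(second_data)
```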
 The database 35 stores various data, such as data necessary for the processing executed by the robot 30 and data generated or set by that processing. The database 35 stores, for example, the peripheral data acquired by the peripheral data acquisition unit 36 and data obtained by processing that peripheral data.
 Next, the configuration of the main functions of the server device 10 will be described with reference to FIG. 4. These functions are realized by the control unit (processor) of the server device 10 reading and executing a computer program stored in the storage unit. The hardware configuration of the server device 10 will be described later.
 As shown in FIG. 4, the server device 10 has, as its main functional configuration, a processing unit 11, a communication control unit 12, and a database 13. The functions of the server device 10 are not limited to these; it may have other functions, such as functions that server devices generally have.
 The processing unit 11 carries out various kinds of processing in the server device 10. For example, based on the data received from the robot 30, the processing unit 11 generates map data and stores it in the database 13, or updates map data by referring to the database 13. The map data based on the data received from the robot 30 is data generated or updated from the data acquired by the robot 30 (for example, the peripheral data and position information described above), and is map data for the vicinity of the places the robot 30 has passed through or stayed at. The processing unit 11 may also carry out, in place of the robot 30, part of the processing performed by the data processing unit 33 of the robot 30 described above (for example, at least one of point cloud generation, masking processing, and metadata addition).
 When the server device 10 receives data from a plurality of robots 30, the processing unit 11 may generate or update map data based on the data received from the plurality of robots 30. For example, the processing unit 11 may integrate data received from a plurality of robots 30 located at different positions to generate or update the map data. This makes it possible to generate and update map data covering a wider area in a shorter period than when map data is generated from the data received from a single robot 30.
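 One simple way to integrate observations from multiple robots is to accumulate them in a shared grid; the following sketch assumes each robot's second data has already been registered into a common world frame using its position information (the `MapStore` class and the count-per-cell scheme are illustrative assumptions, not the patented method):

```python
from collections import defaultdict

class MapStore:
    """Accumulates observations from many robots into a shared 2-D
    occupancy-style grid keyed by quantized world coordinates."""

    def __init__(self, cell_size: float = 0.5):
        self.cell_size = cell_size
        self.hits: dict = defaultdict(int)   # (i, j) cell -> observation count

    def integrate(self, points: list) -> None:
        # points: [(x, y), ...] expressed in the shared world frame.
        for x, y in points:
            cell = (int(x // self.cell_size), int(y // self.cell_size))
            self.hits[cell] += 1

store = MapStore()
store.integrate([(1.2, 3.4), (1.3, 3.5)])   # second data from one robot
store.integrate([(1.2, 3.4), (8.0, 0.1)])   # second data from another robot
```

Because every robot feeds the same store, coverage grows with the number of operated robots rather than with dedicated survey runs.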
 The communication control unit 12 controls various communications with external devices such as the terminal device 20 and the robot 30 via the communication unit included in the server device 10.
 The database 13 stores various data, such as data necessary for the processing executed by the server device 10, data generated or set by that processing, and data acquired from external devices. The database 13 stores, for example, the map data generated or updated by the processing unit 11.
 According to the present embodiment, as described above, the robot 30 is remotely operated via the terminal device 20 and moves in order to provide the user with an experience of the place where the robot 30 is located. The peripheral data acquired during that movement by the camera, sensors, and the like of the robot 30 is then used to generate map data. Accordingly, the more the robot 30 is remotely operated to provide experiences to users, the more frequently and the more abundantly peripheral data for map data generation is acquired by the robot 30. By contrast, data for generating map data has conventionally been collected only about once every few years. In other words, according to the present embodiment, because the collection of peripheral data by the robot 30 is not carried out solely for the purpose of generating map data, peripheral data can be collected at high frequency, and as a result it becomes possible to generate map data that better reflects the actual spatial information.
<Processing flow>
 An example of the processing flow in the system 1 will be described with reference to FIG. 5. This processing is realized in the server device 10, the terminal device 20, and the robot 30 by each processor reading and executing a computer program stored in the corresponding storage unit. For processing steps whose details have already been described above, detailed description is omitted here.
 In step S501, the terminal device 20 starts transmitting control signals to the robot 30 in order to remotely operate the robot 30 in response to user operations. The remote operation includes, for example, operation of the camera mounted on the robot 30 and operations to move the robot 30.
 In step S502, the robot 30 starts operating in response to the remote operation start control signal received in step S501. This operation includes shooting with the camera of the robot 30, sensing with its sensors, and acquisition of position information on the current position of the robot 30.
 In step S503, the robot 30 performs predetermined processing on the peripheral data of the robot 30 acquired by its camera and sensors. For example, the robot 30 performs processing to generate image data and audio data to be reproduced by the output unit (for example, a display device or a speaker) of the terminal device 20, thereby generating data (for example, the first data described above). The robot 30 also performs processing for the creation of map data, thereby generating data (for example, the second data described above).
 In step S504, the robot 30 starts transmitting, to the terminal device 20, the data generated in step S503 that is to be reproduced by the output unit of the terminal device 20. Based on the data received from the robot 30, the terminal device 20 starts displaying images on its display unit or playing audio through its speaker. The transmission of data from the robot 30 to the terminal device 20 continues, for example, until the end of the image display on the display unit of the terminal device 20 is instructed via the terminal device 20.
 In step S505, the robot 30 starts moving in response to the remote operation control signals received from the terminal device 20. The movement includes indoor or outdoor movement for shopping, sightseeing, or walking using the robot 30, as instructed by the user via the terminal device 20. The robot 30 acquires, with its camera and sensors, peripheral data for the vicinity of the places it passes through or stays at during the movement.
 In step S506, the robot 30 starts transmitting to the server device 10, at a predetermined timing, the data generated in step S503 that has undergone the processing for map data creation. The data transmitted to the server device 10 in step S506 includes data generated based on the peripheral data acquired by the camera and sensors and on the position information of the robot 30.
 The predetermined timing for starting transmission to the server device 10 includes, for example, the timing at which the robot 30 finishes the generation processing in step S503 for the data to be transmitted to the terminal device 20, or the timing at which the data transmission to the terminal device 20 started in step S504 ends. The predetermined timing also includes a timing at which the robot 30 is not being remotely operated from the terminal device 20, a timing at which the robot 30 has stopped moving, or a timing at which it is stopped at a predetermined place (for example, the charging dock of the robot 30). By transmitting data to the server device 10 at such a timing, the robot 30 can send data to the server device 10 at a timing different from the communication timing at which it sends data to the terminal device 20. As a result, the communication bandwidth the robot 30 uses at any one moment can be reduced. In addition, because the timing at which the robot 30 executes the processing required for communication can be spread out, the processing load of the robot 30 can be distributed. To obtain this load-distribution effect, part of the image processing for map data creation performed in step S503 (such as the masking processing and the attribute information addition processing described above) may also be carried out at the predetermined timing.
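 A minimal sketch of such deferred transmission follows, assuming the robot exposes a simple idle/busy signal; the `DeferredUploader` class, the `send_fn` callback, and the one-second poll interval are illustrative assumptions rather than the specified implementation:

```python
import queue
import threading

class DeferredUploader:
    """Queues second data while the robot is busy and flushes it to the
    server only during idle periods (not remotely operated, stopped,
    or docked at the charging station)."""

    def __init__(self, send_fn):
        self.pending: queue.Queue = queue.Queue()
        self.idle = threading.Event()
        self.send_fn = send_fn           # e.g. an upload call to the server

    def enqueue(self, payload: bytes) -> None:
        self.pending.put(payload)        # called whenever second data is ready

    def set_idle(self, is_idle: bool) -> None:
        if is_idle:
            self.idle.set()
        else:
            self.idle.clear()

    def run(self) -> None:
        while True:
            self.idle.wait()             # block until an idle period begins
            try:
                payload = self.pending.get(timeout=1.0)
            except queue.Empty:
                continue
            self.send_fn(payload)        # upload one chunk, then re-check state
```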
 Note that in step S506, instead of starting transmission of the data that has undergone the image processing for map data creation to the server device 10 at a predetermined timing, the robot 30 may transmit data to the server device 10 sequentially as the processing in step S503 is completed.
 In step S507, based on the data received from the robot 30, the server device 10 generates map data and stores it in its storage unit, or updates the map data by referring to the storage unit.
<Computer hardware configuration>
 Next, with reference to FIG. 6, an exemplary hardware configuration of a computer (information processing device) for implementing the server device 10 and the terminal device 20 in the present embodiment will be described.
 As shown in FIG. 6, the computer 700 includes, as its main components, a processor 701, a memory 703, a storage device 705, an operation unit 707, an input unit 709, a communication unit 711, and an output unit 713. The computer 700 need not include at least some of these components. In addition to these components, the computer 700 may include other components that general-purpose or dedicated computers commonly have. The computer 700 also need not include some of the components shown in FIG. 6.
 The processor 701 is a control unit that controls various kinds of processing in the computer 700 by executing programs stored in the memory 703. The processor 701 therefore realizes the functions of each device described in the above embodiment, and controls execution of the above processing, through the cooperation of the programs with the other components of the computer 700.
 The memory 703 is a storage medium such as a RAM (Random Access Memory). In the memory 703, the program code of the programs executed by the processor 701 and the data required to execute those programs are temporarily read from the storage device 705 or the like, or stored in advance.
 The storage device 705 is a non-volatile storage medium such as a hard disk drive (HDD). The storage device 705 stores the operating system, the various programs for realizing the above components, the data of the processing results described above, and the like.
 The operation unit 707 is a device for receiving input from the user. Specific examples of the operation unit 707 include a keyboard, a mouse, a touch panel, a joystick, various sensors, and wearable devices. The operation unit 707 may be detachably connected to the computer 700 via an interface such as USB (Universal Serial Bus).
 The input unit 709 is a device for inputting data from outside the computer 700. Specific examples of the input unit 709 include drive devices for reading data stored on various storage media. The input unit 709 also includes a microphone that picks up surrounding sound and converts it into audio data for input. The input unit 709 may be detachably connected to the computer 700; in that case, the input unit 709 is connected to the computer 700 via an interface such as USB.
 The communication unit 711 is a device for performing wired or wireless data communication over a network with devices outside the computer 700. The communication unit 711 may be detachably connected to the computer 700; in that case, the communication unit 711 is connected to the computer 700 via an interface such as USB.
 The output unit 713 is a device that outputs various data. The output unit 713 is, for example, a display device for displaying data or a speaker for outputting audio. Specific examples of the display device include liquid crystal displays, organic EL displays, and displays of wearable devices. The output unit 713 may be detachably connected to the outside of the computer 700; in that case, the display device serving as the output unit 713 is connected to the computer 700 via, for example, a display cable. When a touch panel is adopted as the operation unit 707, the output unit 713 can be configured integrally with the operation unit 707.
 Next, with reference to FIG. 7, an exemplary hardware configuration of the computer (information processing device) mounted on the robot 30, and of its other main components, in the present embodiment will be described. The robot 30 includes a processor 901, a RAM (Random Access Memory) 902, a ROM (Read Only Memory) 903, a communication unit 904, an input unit 905, a display unit 906, a drive unit 907, a camera 908, and a sensor 909. The configuration shown in FIG. 7 is an example; the robot 30 may have components other than these, or may lack some of them. For example, the robot 30 may further include a speaker, and may include a unit for identifying its own position.
 The processor 901 is the arithmetic unit of the robot 30, for example, a CPU (Central Processing Unit). The RAM 902 and the ROM 903 are storage units that store the data required for various kinds of processing and the data of processing results. In addition to the RAM 902 and the ROM 903, the robot 30 may include large-capacity storage such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive). The communication unit 904 is a device that communicates with external devices and includes, for example, an antenna or a NIC (Network Interface Card). The input unit 905 is a device for inputting data from outside the robot 30. The display unit 906 is a device for displaying various information.
 The processor 901 is a control unit that controls the execution of the programs stored in the RAM 902 or the ROM 903 and performs data calculation and processing. The processor 901 executes a program for controlling communication via the robot (a communication program). The processor 901 receives various data from the input unit 905 and the communication unit 904, displays calculation results on the display unit 906, and stores them in the RAM 902.
 The input unit 905 accepts data input from the user and may include, for example, a keyboard and a touch panel. The input unit 905 may also include a microphone for voice input.
 The display unit 906 visually displays the calculation results of the processor 901 and may be configured by, for example, an LCD (Liquid Crystal Display). The display unit 906 may display images captured by the camera 908 of the robot 30.
 The communication program may be provided stored in a computer-readable storage medium such as the RAM 902 or the ROM 903, or may be provided via a communication network connected through the communication unit 904. In the robot 30, the processor 901 executes the communication program, thereby realizing the various operations for controlling the robot 30. Note that these physical configurations are examples and need not be independent components. For example, the computer 900 may include an LSI (Large-Scale Integration) chip in which the processor 901 and the RAM 902 or the ROM 903 are integrated.
 The drive unit 907 includes remotely operable actuators, such as moving parts (for example, wheels) and manipulators. When the robot 30 is a mobile robot, the drive unit 907 includes at least moving parts such as wheels, and may also include a manipulator. When the robot 30 is a wearable robot, the drive unit 907 includes at least a manipulator.
 The camera 908 includes an image sensor that captures still images or moving images, and transmits the captured still images or moving images to external devices via the communication unit 904.
 The sensor 909 comprises various sensors that sense the surrounding environment. The sensor 909 includes, for example, at least some of an ultrasonic sensor, a laser sensor, a radar, an optical sensor, a sound sensor (microphone), a pressure sensor, an acceleration sensor, an angular velocity sensor, an altitude sensor, and a geomagnetic sensor.
<Modification example>
 The program for implementing the system 1 (or the server device 10, the terminal device 20, or the robot 30) in the present embodiment can be recorded on various recording media, such as optical disks (for example, CD-ROMs), magnetic disks, and semiconductor memories. The program can also be installed or loaded onto a computer by downloading it from such a recording medium or via a communication network.
 The present invention is not limited to the embodiment described above and can be carried out in various other forms without departing from the gist of the present invention. The above embodiment is merely an example in all respects and is not to be construed restrictively.
1 System
10 Server device
20 Terminal device
30 Robot

Claims (9)

  1.  A communication system comprising one or more robots and a server device, wherein
     the robot comprises:
     a positioning unit that acquires position information of the robot;
     a drive unit capable of moving the robot in response to remote operation via a terminal device;
     a peripheral data acquisition unit that acquires peripheral data of the robot;
     a first processing unit that generates first data based on the peripheral data and second data based on the peripheral data and the position information; and
     a communication unit that transmits the first data to the terminal device and transmits the second data to the server device, and
     the server device comprises:
     a second processing unit that generates map data based on the second data received from the robot; and
     a storage unit that stores the map data.
  2.  The communication system according to claim 1, wherein the communication unit transmits the second data to the server device when the robot is not being remotely operated from the terminal device.
  3.  The communication system according to claim 1 or 2, wherein the communication unit transmits the second data to the server device when the robot is stopped at a predetermined place.
  4.  The communication system according to any one of claims 1 to 3, wherein the second data includes data in which masking processing has been applied to an image of a predetermined object included in the peripheral data.
  5.  The communication system according to claim 4, wherein the second data includes metadata about the predetermined object.
  6.  The communication system according to any one of claims 1 to 5, wherein the server device receives the second data from a plurality of the robots, and the second processing unit generates the map data based on the second data received from the plurality of the robots.
  7.  A robot comprising:
     a positioning unit that acquires position information of the robot itself;
     a drive unit capable of moving in response to remote operation via a terminal device;
     a peripheral data acquisition unit that acquires peripheral data;
     a first processing unit that generates first data based on the peripheral data and second data based on the peripheral data and the position information; and
     a communication unit that transmits the first data to the terminal device and transmits the second data to a server device.
  8.  A method of controlling a robot, comprising:
     acquiring position information of the robot with a positioning unit;
     controlling movement of the robot in response to remote operation via a terminal device;
     acquiring peripheral data with a camera or a sensor;
     generating first data based on the peripheral data and second data based on the peripheral data and the position information; and
     transmitting the first data to the terminal device and transmitting the second data to a server device.
  9.  A storage medium storing a program for causing a computer to execute a method of controlling a robot, the method comprising:
     acquiring position information of the robot with a positioning unit;
     controlling movement of the robot in response to remote operation via a terminal device;
     acquiring peripheral data with a camera or a sensor;
     generating first data based on the peripheral data and second data based on the peripheral data and the position information; and
     transmitting the first data to the terminal device and transmitting the second data to a server device.
PCT/JP2021/037934 2020-10-29 2021-10-13 Communication system, robot, and storage medium WO2022091787A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020181495A JP2022072184A (en) 2020-10-29 2020-10-29 Communication system, robot, and storage medium
JP2020-181495 2020-10-29

Publications (1)

Publication Number Publication Date
WO2022091787A1 true WO2022091787A1 (en) 2022-05-05

Family

ID=81383803

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/037934 WO2022091787A1 (en) 2020-10-29 2021-10-13 Communication system, robot, and storage medium

Country Status (2)

Country Link
JP (1) JP2022072184A (en)
WO (1) WO2022091787A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116016621A (en) * 2023-03-28 2023-04-25 麦岩智能科技(北京)有限公司 Operation and maintenance method, device, electronic equipment and system for cleaning robot

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017047519A (en) * 2015-09-04 2017-03-09 Rapyuta Robotics株式会社 Cloud robotics system, information processor, program, and method for controlling or supporting robot in cloud robotics system
KR20170071212A (en) * 2015-12-15 2017-06-23 주식회사 엘지유플러스 Movable Home-Robot Apparatus Monitoring Home Devices And Method of Threof
JP2019062308A (en) * 2017-09-25 2019-04-18 富士ゼロックス株式会社 Mobile object with camera, mobile object with camera control system, and program
JP2019139692A (en) * 2018-02-15 2019-08-22 トヨタ自動車株式会社 Mobile shop vehicle and mobile shop system
US20200331148A1 (en) * 2018-01-24 2020-10-22 Qfeeltech (Beijing) Co., Ltd. Cleaning robot


Also Published As

Publication number Publication date
JP2022072184A (en) 2022-05-17

Similar Documents

Publication Publication Date Title
US11644832B2 (en) User interaction paradigms for a flying digital assistant
US11573562B2 (en) Magic wand interface and other user interaction paradigms for a flying digital assistant
CN108449945B (en) Information processing apparatus, information processing method, and program
KR20140031316A (en) Tracking and following of moving objects by a mobile robot
JP7156305B2 (en) CONTROL DEVICE, CONTROL METHOD, PROGRAM, AND MOVING OBJECT
US10908609B2 (en) Apparatus and method for autonomous driving
CN111373347B (en) Apparatus, method and computer program for providing virtual reality content
WO2016168722A1 (en) Magic wand interface and other user interaction paradigms for a flying digital assistant
JP4348468B2 (en) Image generation method
WO2022091787A1 (en) Communication system, robot, and storage medium
KR20200128486A Artificial intelligence device for determining user's location and method thereof
KR20200020295A (en) AUGMENTED REALITY SERVICE PROVIDING APPARATUS INTERACTING WITH ROBOT and METHOD OF THEREOF
CN114333404A (en) Vehicle searching method and device for parking lot, vehicle and storage medium
US20230280742A1 (en) Magic Wand Interface And Other User Interaction Paradigms For A Flying Digital Assistant
EP4234181A1 (en) Information processing device, information processing method, and storage medium
WO2022004422A1 (en) Information processing device, information processing method, and recording medium
US20240118703A1 (en) Display apparatus, communication system, display control method, and recording medium
US20230205198A1 (en) Information processing apparatus, route generation system, route generating method, and non-transitory recording medium
JP7186484B1 (en) Program, information processing device, method and system
EP4207100A1 (en) Method and system for providing user interface for map target creation
JP6945149B2 (en) Call system, call method
US20240053746A1 (en) Display system, communications system, display control method, and program
WO2021140916A1 (en) Moving body, information processing device, information processing method, and program
JP2023083072A (en) Method, system and program
JP2023131258A (en) Information processing device and control method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21885906

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21885906

Country of ref document: EP

Kind code of ref document: A1