WO2022196080A1 - Information processing device, information processing method, program, moving device, and information processing system - Google Patents

Information processing device, information processing method, program, moving device, and information processing system

Info

Publication number
WO2022196080A1
Authority
WO
WIPO (PCT)
Prior art keywords
absolute position
information processing
self
absolute
orientation
Prior art date
Application number
PCT/JP2022/001718
Other languages
French (fr)
Japanese (ja)
Inventor
Eiichiro Morinaga
Junichiro Misawa
Tatsuya Ishikawa
Original Assignee
Sony Group Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corporation
Publication of WO2022196080A1 publication Critical patent/WO2022196080A1/en

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/26Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B21/00Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C15/00Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87Combinations of systems using electromagnetic waves other than radio waves
    • G01S17/875Combinations of systems using electromagnetic waves other than radio waves for determining attitude
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/08Control of attitude, i.e. control of roll, pitch, or yaw

Definitions

  • The present disclosure relates to an information processing device, an information processing method, a program, a mobile device, and an information processing system, and in particular to an information processing device, an information processing method, a program, a mobile device, and an information processing system that make it possible to estimate the self-position and orientation with less computational load and higher accuracy.
  • As a technology related to self-position estimation, there is, for example, the technology disclosed in Patent Document 1.
  • In Patent Document 1, predicted omnidirectional images are created in advance by synthesis from an omnidirectional image acquired with the robot placed at an initial position, assuming the robot has moved from that position; the position and orientation of the robot are then detected by matching these predicted images against an omnidirectional image newly acquired after the robot actually moves.
  • The present disclosure has been made in view of such circumstances and is intended to enable estimation of the self-position and orientation with less computational load and higher accuracy.
  • An information processing apparatus according to one aspect of the present disclosure includes at least two absolute position sensors and a self-position/orientation calculation unit that calculates a self-position and orientation, and the self-position/orientation calculation unit calculates the self-position and orientation based on the absolute positions acquired from the absolute position sensors and the relative positions of the absolute position sensors with respect to a reference position of the mobile device.
  • An information processing method and program according to one aspect of the present disclosure are an information processing method and program corresponding to the information processing apparatus according to one aspect of the present disclosure described above.
  • A mobile device according to one aspect of the present disclosure includes at least two absolute position sensors and a control unit that calculates a self-position and orientation based on the absolute positions acquired from the absolute position sensors and the relative positions of the absolute position sensors with respect to the device's own reference position.
  • In the mobile device according to one aspect of the present disclosure, at least two absolute position sensors are provided, and the self-position and orientation are calculated based on the absolute positions acquired from the absolute position sensors and the relative positions of the absolute position sensors with respect to the device's own reference position.
  • An information processing system according to one aspect of the present disclosure includes a mobile device and an absolute position detection mat on which a predetermined pattern indicating an absolute position is formed. The mobile device has at least two absolute position sensors and a control unit that calculates a self-position and orientation based on the absolute positions acquired from the absolute position sensors and the relative positions of the absolute position sensors with respect to the device's own reference position.
  • When the mobile device is present on the absolute position detection mat, the information processing system identifies the absolute position by recognizing the pattern with the absolute position sensors.
  • In the information processing system according to one aspect of the present disclosure, the absolute position is identified by recognizing the pattern with the at least two absolute position sensors, and the self-position and orientation are calculated based on the identified absolute positions and the relative positions of the absolute position sensors with respect to the device's own reference position.
  • the information processing device or mobile device may be an independent device, or may be an internal block forming one device.
  • FIG. 1 is a diagram showing a first example of the external configuration of a robot device to which the present disclosure is applied.
  • FIG. 2 is a diagram showing a second example of the external configuration of a robot device to which the present disclosure is applied.
  • FIG. 3 is a diagram showing an example of the components of a robot device to which the present disclosure is applied.
  • FIG. 4 is a diagram showing an example of a movable screen in a robot device to which the present disclosure is applied.
  • FIG. 5 is a diagram showing a first example of cooperative motion by a plurality of robot devices.
  • FIG. 6 is a diagram showing a second example of cooperative motion by a plurality of robot devices.
  • FIG. 7 is a diagram showing a third example of cooperative motion by a plurality of robot devices.
  • FIG. 8 is a diagram showing a fourth example of cooperative motion by a plurality of robot devices.
  • FIG. 9 is a diagram showing an example of position measurement using a plurality of optical absolute position sensors in a robot device to which the present disclosure is applied.
  • FIG. 10 is a block diagram showing an example of the functional configuration of a robot device to which the present disclosure is applied.
  • FIG. 11 is a diagram showing an example of the arrangement of a plurality of optical absolute position sensors.
  • FIG. 12 is a diagram showing another example of the arrangement of a plurality of optical absolute position sensors.
  • FIG. 13 is a diagram showing an example of a method of calculating the self-position and orientation using measurement results from a plurality of optical absolute position sensors.
  • FIG. 14 is a diagram showing a usage example of a robot device to which the present disclosure is applied.
  • FIG. 15 is a block diagram showing an example of the configuration of a computer.
  • FIGS. 1 and 2 show examples of the external configuration of a robot device to which the present disclosure is applied.
  • FIG. 1 shows a top view, a front view, and a side view of a robot device to which the present disclosure is applied.
  • FIG. 2 shows a state in which the display in the robot device to which the present disclosure is applied is moved.
  • The robot device 10 is an autonomous robot. It is also a mobile robot (autonomous mobile robot) having a movement mechanism such as wheels, and can move freely in space.
  • The robot device 10 has a substantially rectangular parallelepiped shape and has, on its upper surface, a display capable of presenting display information such as video. The display (screen) on the upper surface is movable: it can be adjusted to a desired angle with respect to a plane (a moving surface such as a floor or the ground) and fixed in that posture.
  • FIG. 3 shows an example of components of a robot device to which the present disclosure is applied.
  • In FIG. 3, the robot apparatus 10 includes a control unit 101 that controls the operation of each part, an image display unit 102 including a display for presenting video, and a screen elevating unit 103 including a mechanism for raising and lowering the image display unit 102.
  • the physical structural portion of the robot device 10 is also referred to as the body.
  • In FIG. 3, the thin plate-like image display unit 102 provided on the upper surface of the housing of the robot device 10 is moved by the screen elevating unit 103 and fixed in a desired posture. The image display unit 102 pivots about its lower end, and when it opens upward, the inside of the housing is exposed to the outside.
  • the robot device 10 has a left motor encoder 104-1 and a left motor 105-1, and a right motor encoder 104-2 and a right motor 105-2.
  • the robot apparatus 10 employs a differential two-wheel drive type, and the left and right motors 105-1 and 105-2 operate to enable movement by the left and right wheels.
  • the left motor encoder 104-1 and the right motor encoder 104-2 detect the amount of rotational movement of the left motor 105-1 and the right motor 105-2.
  • the robot device 10 has various sensors such as sensors 106-1 to 106-3.
  • the sensors 106-1 to 106-3 include IMU (Inertial Measurement Unit), LiDAR (Light Detection and Ranging), position sensors, cameras, and the like.
  • the robot apparatus 10 operates as an autonomous mobile robot using sensor signals detected by various sensors.
  • the battery unit 107 supplies electric power to each part of the robot device 10 .
  • FIG. 4 shows an example in which a movable display (movable screen) in the robot device 10 is moved.
  • the robot device 10 is used on a court for playing sports.
  • The robot device 10 can adjust the posture of (the display of) the image display unit 102 to a desired posture according to the situation, such as its own position and posture and the positions of the target spectators (spectators watching the game on the court).
  • For example, when there are first-floor seats and second-floor seats, the robot device 10 changes the posture of (the display of) the image display unit 102 depending on whether it is presenting video to the first-floor spectators or to the second-floor spectators.
  • Specifically, since the second-floor spectators are spatially higher than the first-floor spectators, the tilt of the image display unit 102 is adjusted to be gentler to match their line of sight. Conversely, for the first-floor spectators, the tilt of the image display unit 102 may be adjusted to be steeper. When targeting all spectators around the court (360-degree spectators), the robot device 10 keeps the image display unit 102 level (with the display screen parallel to the moving surface, such as the floor or the ground) so that the eyes of all surrounding spectators are directed to the screen.
  • FIGS. 5 to 8 show an example in which a plurality of robot devices 10 cooperate to form a pseudo screen having a predetermined shape.
  • In FIG. 5, eight robot devices 10-1 to 10-8 operate cooperatively and line up in two rows and four columns (2 × 4), combining the displays of the robot devices 10 into one pseudo screen (large screen) of a predetermined shape. Using this large screen (2 × 4 screen), one large image can be presented to the audience.
  • In FIG. 6, eight robot devices 10-1 to 10-8 operate cooperatively and line up in one row and eight columns (1 × 8), combining the displays of the robot devices 10 into one pseudo screen (landscape screen). Using this large screen (1 × 8 screen), one horizontally long image can be presented to the audience.
  • At this time, as shown in FIG. 7, the robot devices 10-1 to 10-8 can adjust the posture of (the display of) the image display unit 102 to a desired posture according to the situation, such as each device's position and posture and the position of the target audience. As shown in FIG. 7, all of the image display units 102 may be tilted at the same angle, or, as shown in FIG. 8, each robot device 10 may tilt its image display unit 102 at a different angle.
  • the robot apparatus 10 estimates its own position and orientation based on absolute positions measured by a plurality of optical absolute position sensors during autonomous movement.
  • FIG. 9 shows an example of measurement using four optical absolute position sensors 161-1 to 161-4 as the plurality of optical absolute position sensors in the robot apparatus 10.
  • An absolute position sensor is a sensor that can measure an absolute position. That is, rather than accumulating the distance from one point to the next, it can measure the position of a point directly from a single measurement at that point.
  • An optical absolute position sensor measures an absolute position by detecting light, such as visible light or infrared rays, from an object. For example, an optical absolute position sensor identifies a position (measures an absolute position) within a predetermined area, such as a court, by reading unique information from a predetermined pattern; one hypothetical encoding is sketched below.
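As a concrete illustration of this idea, the following Python sketch shows one hypothetical encoding in which each pattern cell carries a unique ID that maps directly to coordinates; the cell pitch, grid layout, and function names are assumptions for illustration, not details taken from the disclosure.

```python
# Hypothetical pattern decoding: each cell of the printed pattern encodes a
# unique integer ID, and the absolute position follows from the cell's index
# in a known grid. Cell pitch and grid width are assumed values.
CELL_PITCH_M = 0.01   # assumed pattern pitch: one cell per centimetre
GRID_WIDTH = 1000     # assumed number of cells per row

def decode_absolute_position(cell_id: int) -> tuple[float, float]:
    """Map a decoded pattern-cell ID to absolute XY coordinates in metres."""
    row, col = divmod(cell_id, GRID_WIDTH)
    return col * CELL_PITCH_M, row * CELL_PITCH_M

# Example: cell 2005 sits in row 2, column 5 -> (0.05 m, 0.02 m).
```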
  • Since the accuracy of an absolute position sensor depends on the fineness of the predetermined pattern, the accuracy of orientation estimation by a single absolute position sensor is extremely low compared with the accuracy required for the robot apparatus 10 to travel over a wide area such as a court, and it is difficult for the robot device 10 to travel over a wide area using a single absolute position sensor.
  • the robot device 10 can measure the absolute position by reading a predetermined pattern formed on the absolute position detection mat 20 by each of the plurality of optical absolute position sensors 161 installed on the bottom surface of the robot device 10 .
  • Each optical absolute position sensor 161 is installed at a known position relative to a reference position on the bottom surface of the robot apparatus 10, where it can recognize the predetermined pattern formed on the moving surface facing the bottom surface.
  • By using a plurality of optical absolute position sensors 161, the robot apparatus 10 can calculate its own position and posture with higher accuracy.
  • the absolute position detection mat 20 is installed on a moving surface such as a floor on which the robot device 10 can move.
  • a predetermined pattern indicating an absolute position is formed on the surface of the mat 20 for absolute position detection, and the absolute position can be designated using XY coordinates or the like.
  • a predetermined pattern indicating the absolute position can be printed on the surface of the mat 20 for absolute position detection.
  • The absolute position detection mat 20 only needs to be large enough, vertically and horizontally, to cover most of the body when the robot device 10 is placed on it; the absolute position can then be measured whatever posture the robot device 10 assumes on the mat.
  • The absolute position detection mat 20 needs to be installed in alignment with the area of the court. Further, by using, as the absolute position detection mat 20, a transparent mat that reflects infrared rays and carries a pattern (infrared pattern) indicating the absolute position, the presence of the absolute position detection mat 20 can be kept unnoticeable to spectators when it is attached to a moving surface such as a floor.
  • Since the optical absolute position sensor can directly measure the absolute position, the calculation load is smaller than when using other sensors, and the calculation load remains low even when multiple sensors are installed. Therefore, a low-cost computer can be used in the robot apparatus 10 for the calculation processing.
  • An optical absolute position sensor used alone has a small computational load but low accuracy, which makes it difficult to use in applications where the robot device must operate with high accuracy.
  • That is, when an optical absolute position sensor is used alone, the accuracy of estimating the robot apparatus's self-position is lower than that of a camera-based system.
  • In contrast, in the robot device 10, the predetermined pattern indicating the absolute position formed on the absolute position detection mat 20 is read by a plurality of optical absolute position sensors 161, each installed at a known position relative to the reference position, so that the absolute position is measured.
  • Since the computational load per sensor is small, a plurality of sensors can be installed; by doing so, the self-position and orientation can be estimated with high accuracy while keeping the computational load small.
  • FIG. 10 shows an example of a functional configuration of a robot device to which the present disclosure is applied.
  • the robot device 10 has a main CPU (Central Processing Unit) 151 and a sensor 152 .
  • the main CPU 151 is included in the control unit 101 of FIG. 3, for example.
  • the main CPU 151 has a self-position/orientation calculator 171 and a controller 172 .
  • the sensor 152 has optical absolute position sensors 161-1 to 161-N (N is an integer equal to or greater than 2).
  • the optical absolute position sensor 161-1 is installed at a predetermined position on the bottom surface of the robot device 10.
  • the optical absolute position sensor 161-1 identifies the absolute position indicated by the recognized pattern by recognizing a predetermined pattern formed on the moving surface while the robot device 10 is running or stopped.
  • the optical absolute position sensor 161 - 1 supplies information about the identified absolute position to the self-position/orientation calculator 171 .
  • the optical absolute position sensors 161-2 to 161-N are configured in the same manner as the optical absolute position sensor 161-1, and are installed at predetermined positions on the bottom surface of the robot device 10, respectively.
  • Each of the optical absolute position sensors 161-2 to 161-N, while the robot device 10 is running or stopped, supplies information about the absolute position identified by recognizing the predetermined pattern formed on the moving surface to the self-position/orientation calculation unit 171.
  • the self-position/orientation calculation unit 171 acquires information about the absolute position supplied from each of the optical absolute position sensors 161-1 to 161-N.
  • The information about the absolute position can be coordinates represented in an XY coordinate system, an ID (position ID) identifying the position, or the like; a minimal sketch of normalizing both representations follows below.
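To make the two representations concrete, here is a minimal sketch of how a reading given either as coordinates or as a position ID could be normalized before the pose calculation; the table contents and names are illustrative assumptions only.

```python
# Illustrative only: per the text, a reading is either XY coordinates or a
# position ID. A table held in advance can map IDs to coordinates; the
# entries below are placeholders, not values from the disclosure.
POSITION_TABLE: dict[int, tuple[float, float]] = {
    1001: (0.00, 0.00),
    1002: (0.01, 0.00),
}

def to_coordinates(reading: int | tuple[float, float]) -> tuple[float, float]:
    """Normalize a sensor reading to absolute XY coordinates."""
    if isinstance(reading, int):          # position ID case
        return POSITION_TABLE[reading]
    return reading                        # already (x, y) coordinates
```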
  • The self-position/orientation calculation unit 171 holds in advance, for each of the optical absolute position sensors 161-1 to 161-N installed on the bottom surface of the robot device 10, information on its relative position with respect to the reference position on the bottom surface.
  • The self-position/orientation calculator 171 calculates the self-position and orientation of the robot device 10 based on the absolute positions obtained from the plurality of optical absolute position sensors 161 and the relative positions of the optical absolute position sensors 161 with respect to the reference position on the bottom surface of the robot apparatus 10, and supplies information about the calculated self-position and orientation to the control unit 172.
  • the control section 172 controls the operation of each section of the robot device 10 .
  • the control unit 172 acquires information about the self-position and orientation supplied from the self-position/posture calculation unit 171 .
  • the control unit 172 performs control to autonomously move the robot apparatus 10 based on its own position and orientation.
  • As described above, in the robot apparatus 10, the self-position/orientation calculator 171 calculates the self-position and orientation based on the absolute positions obtained from at least two optical absolute position sensors 161 and the relative positions of the optical absolute position sensors 161 with respect to the reference position on the bottom surface. Since the absolute positions measured by the optical absolute position sensors 161 and the relative positions stored in advance are used in this calculation, there is no need for processing with a high computational load such as image processing; processing with a lower computational load (for example, arithmetic such as addition and multiplication) suffices. Moreover, by measuring the absolute position with a plurality of optical absolute position sensors 161, the accuracy of the self-position and orientation can be improved. The robot apparatus 10 to which the present disclosure is applied can therefore estimate its own position and orientation with less computational load and higher accuracy, as the sketch below illustrates.
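The following is a minimal Python sketch of this kind of low-load pose calculation, using the standard closed-form least-squares fit of a 2D rigid transform; the data layout and function names are assumptions, and the disclosure's own equations may differ in detail.

```python
import math

def estimate_pose(mount_xy, measured_xy):
    """Least-squares fit of the body pose (x, y, theta) from N >= 2 sensors.

    mount_xy    : [(x_i, y_i)] sensor offsets, measured in advance relative
                  to the bottom-surface reference position.
    measured_xy : [(x_mi, y_mi)] absolute positions reported by the sensors,
                  in the same order as mount_xy.
    """
    n = len(mount_xy)
    # Centroids of the mounting offsets and of the measured positions.
    pcx = sum(p[0] for p in mount_xy) / n
    pcy = sum(p[1] for p in mount_xy) / n
    qcx = sum(q[0] for q in measured_xy) / n
    qcy = sum(q[1] for q in measured_xy) / n
    # Two sums determine the rotation that best aligns the two point sets.
    a = b = 0.0
    for (px, py), (qx, qy) in zip(mount_xy, measured_xy):
        px, py = px - pcx, py - pcy
        qx, qy = qx - qcx, qy - qcy
        a += px * qx + py * qy
        b += px * qy - py * qx
    theta = math.atan2(b, a)
    # Translation places the rotated reference position at the centroid.
    c, s = math.cos(theta), math.sin(theta)
    x = qcx - (c * pcx - s * pcy)
    y = qcy - (s * pcx + c * pcy)
    return x, y, theta

# Example: four collinear sensors, body rotated 90 degrees at (1, 2).
mounts = [(0.1, 0.0), (0.2, 0.0), (0.3, 0.0), (0.4, 0.0)]
measured = [(1.0, 2.1), (1.0, 2.2), (1.0, 2.3), (1.0, 2.4)]
print(estimate_pose(mounts, measured))  # ~ (1.0, 2.0, pi/2)
```

Only sums, multiplications, and a single atan2 are involved, which is consistent with the text's point that no image processing is needed.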
  • FIG. 11 shows an example of the arrangement of the plurality of optical absolute position sensors 161. A of FIG. 11 and B of FIG. 11 show a side view and a bottom view of the robot device 10, respectively. In practice, a movement mechanism such as wheels is provided on the bottom surface of the robot device 10, but it is omitted from the illustration to make the sensor arrangement easier to see.
  • optical absolute position sensors 161-1 to 161-4 are arranged side by side in the longitudinal direction of the bottom surface with respect to the reference position.
  • Since the position of the robot device 10 on the court is defined as the center (center of gravity) of the bottom surface of the body, the center of the bottom surface of the body serves as the reference position.
  • The reference position is not limited to the center of the bottom surface of the body, and may be another position such as the upper left of the bottom surface.
  • Although each of the optical absolute position sensors 161-1 to 161-4 has lower accuracy than a camera-based system, each sensor outputs an absolute position directly as its measurement result, so even when multiple sensors are installed the computational load does not increase enough to require a high-performance computer. Furthermore, when the optical absolute position sensors 161-1 to 161-4 are arranged in a line in the longitudinal direction with respect to the reference position on the bottom surface as shown in FIG. 11, the value in the longitudinal direction (y direction) can be treated as fixed, which further reduces the amount of calculation.
  • The optical absolute position sensors 161-1 to 161-4 are arranged at known positions relative to the reference position on the bottom surface of the robot device 10, and information about tilt that each optical absolute position sensor 161 could measure is not used. By measuring and holding the relative positions of the optical absolute position sensors 161-1 to 161-4 in advance, the self-position and orientation can be calculated from the measured absolute positions.
  • When the robot apparatus 10 uses infrared absolute position sensors as the optical absolute position sensors 161, the separate illumination and the calibration that a camera-based system would require become unnecessary.
  • FIG. 11B is an example of the arrangement of the plurality of optical absolute position sensors 161.
  • The number and arrangement of the sensors are arbitrary, and other arrangements may be adopted. The greater the number of optical absolute position sensors 161, the more widely they can be distributed over the bottom surface of the robot device 10, and the more the error can be reduced.
  • FIG. 12 shows another example of the arrangement of the plurality of optical absolute position sensors 161.
  • Like B of FIG. 11, A to C of FIG. 12 show bottom views of the robot apparatus 10.
  • In A of FIG. 12, the optical absolute position sensors 161-1 to 161-4 are arranged side by side in the longitudinal direction with respect to the reference position on the bottom surface (for example, the center of the bottom surface), as in B of FIG. 11, but with different spacing between the sensors. Arranging the optical absolute position sensors 161-1 to 161-4 in a straight line reduces the calculation load.
  • In B of FIG. 12, the optical absolute position sensors 161-1 and 161-2 are arranged at positions relative to the reference position on the bottom surface (for example, the center of the bottom surface). The number of optical absolute position sensors 161 need only be at least two, and the greater the distance between the sensors, the higher the accuracy of the posture (angle) of the robot device 10.
  • In C of FIG. 12, the optical absolute position sensors 161-1 to 161-8 are arranged in two rows and four columns (2 × 4) with respect to the reference position on the bottom surface (for example, the center of the bottom surface). Increasing the number of optical absolute position sensors 161 improves the accuracy of both the position and the orientation (angle) of the robot device 10.
  • FIG. 13 shows the relationship between the absolute positions (positions P_N) measured by the optical absolute position sensors 161 and the self-position (position P_T) calculated by the self-position/orientation calculation unit 171 when the self-position and orientation are calculated using the measurement results of the plurality of optical absolute position sensors 161.
  • Since the position of the robot device 10 is defined as the center of the bottom surface of the body, the center of the bottom surface of the body serves as the reference position.
  • The self-position/orientation calculation unit 171 applies the least squares method to the positions P_N measured by the optical absolute position sensors 161 and the position P_T(x, y, θ) representing the true position and orientation of the robot device 10, performing an optimization calculation that estimates the values of x, y, and θ that minimize the error.
  • In Equation (1), the values of x_1 and y_1 are known values measured in advance.
  • The three expressions shown in Expression (4) can be rewritten as the following Expressions (5), (6), and (7), respectively.
  • Expanding and rearranging Formula (8) yields the following Formula (9).
  • By substituting the actually measured values (x_mi, y_mi) into Equation (9), the values of x, y, and θ can be obtained; this combination of values minimizes the error between the measured and estimated values.
  • Equation (9) is a simple system of three linear simultaneous equations involving only addition and multiplication, so a solution can be obtained with a small computational load. Furthermore, even if the number of optical absolute position sensors 161 installed in the robot device 10 is increased, Equation (9) can be applied in the same way as long as the positions on the bottom surface of the body where the optical absolute position sensors 161 are installed are known.
  • Although the measured values (x_mi, y_mi) contain only information about the absolute position (and no information about tilt), not only the self-position (x, y) but also the orientation (θ) can be estimated.
  • The case of using the least squares method has been described here as an example; if the least squares method is not used, nonlinear optimization calculations become necessary, resulting in high calculation costs. A hedged reconstruction of this formulation is given below.
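The numbered equations do not survive in this text. As a hedged reconstruction consistent with the surrounding description (sensor i mounted at a known offset (x_i, y_i) from the reference position and measured at absolute position (x_mi, y_mi)), the measurement model and least-squares cost can be written as follows; the disclosure's actual Equations (1) through (9) may differ in form.

```latex
% Measurement model: the pose (x, y, \theta) maps each mounting offset
% (x_i, y_i) to a predicted absolute position.
\hat{x}_{mi} = x + x_i \cos\theta - y_i \sin\theta, \qquad
\hat{y}_{mi} = y + x_i \sin\theta + y_i \cos\theta

% Least-squares cost over the N sensors:
E(x, y, \theta) = \sum_{i=1}^{N}
  \left[ (\hat{x}_{mi} - x_{mi})^2 + (\hat{y}_{mi} - y_{mi})^2 \right]

% Setting \partial E / \partial x = \partial E / \partial y
%       = \partial E / \partial \theta = 0 and linearizing with
% \cos\theta \approx 1, \sin\theta \approx \theta yields three linear
% equations in (x, y, \theta) that involve only additions and
% multiplications, consistent with the description of Equation (9).
% With the sensors collinear along the body's x axis (y_i = 0), the
% system simplifies further, matching the remark about the collinear
% arrangement in FIG. 11.
```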
  • FIG. 14 shows a usage example of the robot device 10 to which the present disclosure is applied.
  • In FIG. 14, a scene is assumed in which a plurality of robot devices 10 operate in synchronization on a basketball court 30 and perform a performance through coordinated actions such as running and lining up.
  • Each robot device 10 has an autonomous movement function and is equipped with internal sensors such as wheel speed sensors and an IMU, but has no means of recognizing its absolute position with respect to the court 30.
  • An absolute position detection mat 20 is fixed to a specific area (such as the starting point) within the court 30 by aligning it with the court 30 and attaching it.
  • Before starting operation, the robot devices 10-1 to 10-8 placed outside the court 30 are placed on the absolute position detection mat 20 in order, starting from the robot device 10-1, and an operation by the user U, such as pressing a switch, is used as the trigger to start measuring the initial position and orientation.
  • The robot device 10-1 calculates its self-position and orientation (x, y, θ) by applying the self-position and orientation calculation method described above to the measurement values obtained when the plurality of optical absolute position sensors 161 read the predetermined pattern printed on the absolute position detection mat 20, and records them in memory as the initial position and orientation of the body with respect to the court 30. The robot device 10-1 then generates route information for a route toward a predetermined position within the court 30 (such as the starting point of the performance) based on the recorded initial position and orientation, and travels toward that position based on the generated route information.
  • Thereafter, the robot devices 10-2 to 10-8 are likewise placed on the absolute position detection mat 20 in order; each calculates the initial position and orientation of its body from the measurement values of the plurality of optical absolute position sensors 161, generates route information, and travels toward a predetermined position (such as the starting point of the performance) within the court 30, as sketched below.
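The start-up sequence described in the preceding bullets might look like the following sketch; estimate_pose is the least-squares fit sketched earlier, while plan_route and follow_route stand in for the route generation and travel control, whose details the text does not specify.

```python
# Hedged sketch of the start-up flow on the absolute position detection mat.
# plan_route and follow_route are hypothetical placeholders.
def start_from_mat(mount_xy, mat_readings, performance_start):
    # 1. Trigger (e.g. the user pressing a switch): measure the initial
    #    pose from the sensor readings taken on the mat.
    initial_pose = estimate_pose(mount_xy, mat_readings)  # (x, y, theta)
    # 2. Record it as the body's initial pose with respect to the court,
    #    then generate route information toward the performance start point.
    route = plan_route(initial_pose, performance_start)
    # 3. Travel toward the predetermined position based on the route.
    follow_route(route)
    return initial_pose
```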
  • In this way, the robot devices 10-1 to 10-8 sequentially travel toward a predetermined area within the court 30 and line up automatically. Since the initial positions and postures of the robot devices 10-1 to 10-8 are consistent, they can act in coordination to give performances (for example, the coordinated motions shown in FIGS. 5 to 8).
  • In this way, each robot device 10 can recognize the position and posture of its own body on the court 30. Even if the absolute position detection mat 20 is fixed to the court 30 with some misalignment, all the robot devices 10 share the same error, so there is no relative error between them and the synchronized predetermined actions (such as performances by coordinated actions) are unaffected.
  • Conventionally, the initial position and posture of a robot device have been adjusted manually or by using a jig (positioning jig) capable of constraining the position of the robot device; the method described above makes such adjustment unnecessary.
  • the differential two-wheel drive type was exemplified as the drive type of the robot device 10, but other drive types such as an omnidirectional movement type may be used.
  • Display information to be displayed on the display is not limited to video, and may be information such as images and text.
  • the robot device 10 to which the present disclosure is applied can be regarded as an autonomous mobile device having a control unit such as the control unit 101.
  • the control unit is configured as an information processing device having a processor such as a CPU, and may be provided inside the robot device 10 or may be configured as an external device. Further, it may be considered that an information processing system is configured by the robot device 10 to which the present disclosure is applied and the absolute position detection mat 20 .
  • the robot device 10 to which the present disclosure is applied can be regarded as a system (autonomous movement system) in which multiple devices such as a control device, a sensor device, a display device, a communication device, and a movement mechanism are combined.
  • a system means a set of a plurality of constituent elements (devices, modules (parts), etc.), and it does not matter whether or not all the constituent elements are in the same housing. Therefore, a plurality of devices housed in separate enclosures and connected via a network, and a single device housing a plurality of modules within a single enclosure, are both systems.
  • the robot device 10 to which the present disclosure is applied can further include an attachment for cleaning.
  • This cleaning attachment is, for example, mop-shaped and is attached to the front, rear, side, bottom, or the like of the robot device 10, so that the robot device 10 can clean its travel route by traveling autonomously.
  • The location to be cleaned may be given in advance as a travel route, or the robot may clean an area by recognizing, through gesture recognition, an instruction such as "clean this area" from an instructor.
  • In this gesture recognition, the gesture of the instructor is recognized by detecting the posture, movement, and the like of the instructor based on sensor signals from the sensors 106 (a camera or the like).
  • the cleaning operation and the image display may be performed in cooperation.
  • For example, a video indicating that cleaning is in progress may be displayed, or an advertisement or other video may be displayed during cleaning.
  • At that time, the posture of (the display of) the image display unit 102 may also be controlled.
  • the cleaning attachment is not limited to the illustrated mop-shaped one, and includes other types such as a dustpan-shaped one.
  • The series of processes in the self-position and orientation estimation described above can be executed by hardware or by software. When the series of processes is executed by software, a program constituting the software is installed in the computer of each device.
  • FIG. 15 is a block diagram showing an example of the hardware configuration of a computer that executes a series of processes in estimating the self-position and orientation described above by means of a program.
  • In the computer 1000, a CPU 1001, a ROM (Read Only Memory) 1002, and a RAM (Random Access Memory) 1003 are interconnected by a bus 1004.
  • An input/output interface 1005 is further connected to the bus 1004 .
  • An input unit 1006 , an output unit 1007 , a recording unit 1008 , a communication unit 1009 and a drive 1010 are connected to the input/output interface 1005 .
  • the input unit 1006 consists of various sensors, microphones, switches, and the like.
  • the output unit 1007 includes a speaker, a display, and the like.
  • the recording unit 1008 includes an auxiliary storage device such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive).
  • the communication unit 1009 includes a communication module compatible with communication methods such as wireless LAN (Local Area Network), cellular communication (for example, 5G, etc.), Bluetooth (registered trademark), and the like.
  • a drive 1010 drives a removable medium 1011 such as a magnetic disk, optical disk, magneto-optical disk, or semiconductor memory.
  • In the computer 1000 configured as described above, the CPU 1001 loads a program recorded in the ROM 1002 or the recording unit 1008 into the RAM 1003 via the input/output interface 1005 and the bus 1004 and executes it, whereby the series of processes described above is performed.
  • the program executed by the computer 1000 can be provided by being recorded on removable media 1011 such as package media, for example. Also, the program can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
  • the program can be installed in the recording unit 1008 via the input/output interface 1005 by loading the removable medium 1011 into the drive 1010 . Also, the program can be received by the communication unit 1009 via a wired or wireless transmission medium and installed in the recording unit 1008 . In addition, programs can be installed in the ROM 1002 and the recording unit 1008 in advance.
  • processing performed by a computer according to a program also includes processing that is executed in parallel or individually (for example, parallel processing or processing by objects).
  • the program may be processed by one computer (processor) or may be processed by a plurality of computers in a distributed manner.
  • each step of the series of processes in estimating the self-position and orientation described above can be performed by one device or shared by a plurality of devices.
  • the plurality of processes included in the one step can be executed by one device or shared by a plurality of devices.
  • The self-position and orientation calculation unit calculates the self-position and orientation based on the absolute positions obtained from the absolute position sensors and the relative positions of the absolute position sensors with respect to a reference position of the mobile device.
  • the absolute position sensor is an optical absolute position sensor installed so as to be able to recognize a predetermined pattern formed on a moving surface on which the moving device can move.
  • The information processing apparatus according to (2), wherein the absolute position includes at least the coordinates of the optical absolute position sensor, and the optical absolute position sensor identifies the coordinates by recognizing the predetermined pattern.
  • the predetermined pattern includes a predetermined pattern indicating an absolute position formed on a mat or a floor surface.
  • the reference position is the center of the bottom surface of the moving device;
  • the absolute position sensor is arranged with reference to the center of the bottom surface of the mobile device.
  • An information processing method comprising: calculating, by an information processing device, the self-position and orientation of a mobile device based on absolute positions obtained from at least two absolute position sensors and the relative positions of the absolute position sensors with respect to a reference position of the mobile device.
  • A program that causes a computer to function as a self-position/orientation calculation unit that calculates the self-position and orientation of a mobile device based on absolute positions obtained from at least two absolute position sensors and the relative positions of the absolute position sensors with respect to a reference position of the mobile device.
  • (9) A mobile device comprising: at least two absolute position sensors; and a control unit that calculates the device's own position and orientation based on the absolute positions acquired from the absolute position sensors and the relative positions of the absolute position sensors with respect to the device's own reference position.
  • An information processing system comprising: a mobile device; and an absolute position detection mat on which a predetermined pattern indicating an absolute position is formed, wherein the mobile device has at least two absolute position sensors and a control unit that calculates the self-position and orientation based on the absolute positions acquired from the absolute position sensors and the relative positions of the absolute position sensors with respect to the device's own reference position, and the absolute position is identified by recognizing the pattern with the absolute position sensors when the mobile device is present on the absolute position detection mat.
  • the absolute position detection mat has an infrared pattern indicating the absolute position.
  • 10 Robot device, 20 Absolute position detection mat, 30 Court, 101 Control unit, 102 Image display unit, 103 Screen elevating unit, 104-1 Left motor encoder, 104-2 Right motor encoder, 105-1 Left motor, 105-2 Right motor, 106-1 to 106-3, 106 Sensor, 107 Battery unit, 151 Main CPU, 152 Sensor, 161, 161-1 to 161-N Optical absolute position sensor, 171 Self-position/orientation calculation unit, 172 Control unit, 1000 Computer, 1001 CPU

Abstract

The present disclosure pertains to an information processing device, an information processing method, a program, a mobile device, and an information processing system that make it possible to estimate the self-position and orientation with high accuracy and a smaller calculation load. An information processing device is provided that comprises at least two absolute position sensors and a self-position/orientation calculation unit that calculates the self-position and orientation. The self-position/orientation calculation unit calculates the self-position and orientation on the basis of the absolute positions acquired from the absolute position sensors and the relative positions of the absolute position sensors with respect to a reference position of the mobile device. The present disclosure can be applied, for example, to an autonomous mobile robot device.

Description

Information processing device, information processing method, program, mobile device, and information processing system

TECHNICAL FIELD: The present disclosure relates to an information processing device, an information processing method, a program, a mobile device, and an information processing system, and in particular to an information processing device, an information processing method, a program, a mobile device, and an information processing system that make it possible to estimate the self-position and orientation with less computational load and higher accuracy.

In recent years, research and development of robots having autonomous movement functions has been actively conducted. For this type of autonomous mobile robot, a function of estimating its own position, which indicates where it currently is, is essential.

As a technology related to self-position estimation, there is, for example, the technology disclosed in Patent Document 1. Patent Document 1 discloses a technique in which predicted omnidirectional images are created in advance by synthesis from an omnidirectional image acquired with the robot placed at an initial position, assuming the robot has moved from that position, and the position and orientation of the robot are detected by matching these predicted images against an omnidirectional image newly acquired after the robot actually moves.

Patent Document 1: JP-A-2006-220521

When an autonomous mobile robot estimates its own position, it is required to estimate the self-position and posture with high accuracy and a small computational load. The technique disclosed in Patent Document 1 uses images captured by a camera when estimating the robot's self-position, so the computational load of the image processing is large.

The present disclosure has been made in view of such circumstances and is intended to enable estimation of the self-position and orientation with less computational load and higher accuracy.

An information processing apparatus according to one aspect of the present disclosure includes at least two absolute position sensors and a self-position/orientation calculation unit that calculates a self-position and orientation, and the self-position/orientation calculation unit calculates the self-position and orientation based on the absolute positions acquired from the absolute position sensors and the relative positions of the absolute position sensors with respect to a reference position of a mobile device.

An information processing method and a program according to one aspect of the present disclosure are an information processing method and a program corresponding to the information processing apparatus according to the aspect of the present disclosure described above.

In the information processing device, information processing method, and program according to one aspect of the present disclosure, the self-position and orientation are calculated based on the absolute positions obtained from at least two absolute position sensors and the relative positions of the absolute position sensors with respect to a reference position of the mobile device.

A mobile device according to one aspect of the present disclosure includes at least two absolute position sensors and a control unit that calculates the device's own position and orientation based on the absolute positions acquired from the absolute position sensors and the relative positions of the absolute position sensors with respect to the device's own reference position.

In the mobile device according to one aspect of the present disclosure, at least two absolute position sensors are provided, and the self-position and orientation are calculated based on the absolute positions acquired from the absolute position sensors and the relative positions of the absolute position sensors with respect to the device's own reference position.

An information processing system according to one aspect of the present disclosure includes a mobile device and an absolute position detection mat on which a predetermined pattern indicating an absolute position is formed. The mobile device has at least two absolute position sensors and a control unit that calculates the self-position and orientation based on the absolute positions acquired from the absolute position sensors and the relative positions of the absolute position sensors with respect to the device's own reference position; when the mobile device is present on the absolute position detection mat, the absolute position is identified by recognizing the pattern with the absolute position sensors.

In the information processing system according to one aspect of the present disclosure, when the mobile device is present on the absolute position detection mat, the absolute position is identified by recognizing the pattern with the at least two absolute position sensors, and the self-position and orientation are calculated based on the identified absolute positions and the relative positions of the absolute position sensors with respect to the device's own reference position.

The information processing device or the mobile device according to one aspect of the present disclosure may be an independent device or an internal block constituting one device.
FIG. 1 is a diagram showing a first example of the external configuration of a robot device to which the present disclosure is applied.
FIG. 2 is a diagram showing a second example of the external configuration of a robot device to which the present disclosure is applied.
FIG. 3 is a diagram showing an example of the components of a robot device to which the present disclosure is applied.
FIG. 4 is a diagram showing an example of a movable screen in a robot device to which the present disclosure is applied.
FIG. 5 is a diagram showing a first example of cooperative motion by a plurality of robot devices.
FIG. 6 is a diagram showing a second example of cooperative motion by a plurality of robot devices.
FIG. 7 is a diagram showing a third example of cooperative motion by a plurality of robot devices.
FIG. 8 is a diagram showing a fourth example of cooperative motion by a plurality of robot devices.
FIG. 9 is a diagram showing an example of position measurement using a plurality of optical absolute position sensors in a robot device to which the present disclosure is applied.
FIG. 10 is a block diagram showing an example of the functional configuration of a robot device to which the present disclosure is applied.
FIG. 11 is a diagram showing an example of the arrangement of a plurality of optical absolute position sensors.
FIG. 12 is a diagram showing another example of the arrangement of a plurality of optical absolute position sensors.
FIG. 13 is a diagram showing an example of a method of calculating the self-position and orientation using measurement results from a plurality of optical absolute position sensors.
FIG. 14 is a diagram showing a usage example of a robot device to which the present disclosure is applied.
FIG. 15 is a block diagram showing an example of the configuration of a computer.
<1. Embodiment of the Present Technology>
(External configuration)
FIGS. 1 and 2 show examples of the external configuration of a robot device to which the present disclosure is applied. FIG. 1 shows a top view, a front view, and a side view of the robot device. FIG. 2 shows a state in which the display of the robot device is moved.
The robot device 10 is an autonomous robot. It is also a mobile robot (autonomous mobile robot) having a movement mechanism such as wheels, and can move freely in space.
The robot device 10 has a substantially rectangular parallelepiped shape and has, on its upper surface, a display capable of presenting display information such as video. In the robot device 10, the display (screen) on the upper surface is movable: it can be adjusted to a desired angle with respect to a plane (a moving surface such as a floor or the ground) and fixed in that posture.
(Components)
FIG. 3 shows an example of the components of a robot device to which the present disclosure is applied. In FIG. 3, the robot apparatus 10 includes a control unit 101 that controls the operation of each part, an image display unit 102 including a display for presenting video, and a screen elevating unit 103 including a mechanism for raising and lowering the image display unit 102. In the following description, the physical structural portion of the robot device 10 is also referred to as the body.
In FIG. 3, the thin plate-like image display unit 102 provided on the upper surface of the housing of the robot device 10 is moved by the screen elevating unit 103 and fixed in a desired posture. In this way, in the robot apparatus 10, the image display unit 102 pivots about its lower end, and when it opens upward, the inside of the housing is exposed to the outside.
The robot device 10 has a left motor encoder 104-1 and a left motor 105-1, and a right motor encoder 104-2 and a right motor 105-2. The robot apparatus 10 employs a differential two-wheel drive scheme; the left motor 105-1 and the right motor 105-2 operate to enable movement by the left and right wheels. The left motor encoder 104-1 and the right motor encoder 104-2 detect the amount of rotational movement of the left motor 105-1 and the right motor 105-2.
The robot device 10 has various sensors such as sensors 106-1 to 106-3. The sensors 106-1 to 106-3 include an IMU (Inertial Measurement Unit), a LiDAR (Light Detection and Ranging), position sensors, cameras, and the like. The robot apparatus 10 operates as an autonomous mobile robot using the sensor signals detected by these various sensors. The battery unit 107 supplies electric power to each part of the robot device 10.
(可動式画面の可動)
 図4は、ロボット装置10における可動式のディスプレイ(可動式画面)を可動させた場合の例を示している。図4では、ロボット装置10が、スポーツを行うためのコート上で用いられる場合を想定する。この場合において、ロボット装置10は、自己位置や姿勢、対象となる観客(コート上の競技を観戦している観客)の位置などの状況に応じて、映像表示ユニット102(のディスプレイ)の姿勢を、所望の姿勢に調整することができる。
(Movable screen movable)
FIG. 4 shows an example in which a movable display (movable screen) in the robot device 10 is moved. In FIG. 4, it is assumed that the robot device 10 is used on a court for playing sports. In this case, the robot device 10 changes the posture of (the display of) the image display unit 102 according to the situation such as its own position and posture, and the position of the target spectators (spectators watching the game on the court). , can be adjusted to the desired posture.
 例えば、コート上の競技を観戦するための観客席として、1階席と2階席がある場合に、ロボット装置10は、1階席の観客に対して映像を提示するときと、2階席の観客に対して映像を提示するときとで、映像表示ユニット102(のディスプレイ)の姿勢を変化させる。 For example, if there are 1st floor seats and 2nd floor seats as spectator seats for watching a game on the court, the robot device 10 presents an image to the spectators on the 1st floor and 2nd floor seats. The posture of (the display of) the image display unit 102 is changed when the image is presented to the audience.
Specifically, since the spectators on the second floor are spatially higher than those on the first floor, the tilt of the image display unit 102 is adjusted to be gentler to match their line of sight. Conversely, for spectators on the first floor, the tilt of the image display unit 102 is adjusted to be steeper. When targeting all spectators around the court (360-degree spectators), the robot device 10 leaves the image display unit 102 untilted (with the display screen parallel to the moving surface such as the floor or ground) so that the screen faces the line of sight of all surrounding spectators.
(Coordinated operation of multiple robots)
FIGS. 5 to 8 show examples in which a plurality of robot devices 10 operate in coordination to form a pseudo screen of a predetermined shape.
In FIG. 5, eight robot devices 10-1 to 10-8 operate in coordination and line up in two rows and four columns (2×4), combining their displays (a 2×4 array of screens) into one pseudo screen (large screen) of a predetermined shape. Using this large screen (the 2×4 screen), one large image can be presented to the audience.
In FIG. 6, eight robot devices 10-1 to 10-8 operate in coordination and line up in one row and eight columns (1×8), combining their displays (a 1×8 array of screens) into one pseudo screen (landscape screen). Using this large screen (the 1×8 screen), one horizontally long image can be presented to the audience.
At this time, as shown in FIG. 7, the robot devices 10-1 to 10-8 can adjust the posture of (the display of) the image display unit 102 to a desired posture according to conditions such as their own positions and postures and the position of the target spectators. As shown in FIG. 7, the robot devices 10-1 to 10-8 may all tilt their image display units 102 at the same angle; alternatively, as shown in FIG. 8, each robot device 10 may tilt its image display unit 102 at a different angle.
(Measurement using multiple optical absolute position sensors)
When moving autonomously, the robot device 10 estimates its own position and orientation based on absolute positions measured by a plurality of optical absolute position sensors. FIG. 9 shows an example of measurement in the robot device 10 using four optical absolute position sensors 161-1 to 161-4 as the plurality of optical absolute position sensors.
Here, an absolute position sensor is a sensor capable of measuring an absolute position. That is, rather than measuring the distance from one point to the next, it can determine the position of a point from a single measurement at that point. An optical absolute position sensor is an absolute position sensor that measures the absolute position by detecting light from an object, such as visible light or infrared rays. For example, an optical absolute position sensor reads unique information from a predetermined pattern to identify its position (measure its absolute position) within a predetermined area such as a court.
Since the accuracy of an absolute position sensor depends on the fineness of the predetermined pattern, the orientation estimation accuracy obtainable from a single absolute position sensor falls far below the accuracy the robot device 10 needs to travel over a wide area such as a court, and it is difficult for the robot device 10 to travel over a wide area using a single absolute position sensor.
In the robot device 10, each of the plurality of optical absolute position sensors 161 installed on its bottom surface can measure the absolute position by reading a predetermined pattern formed on the absolute position detection mat 20. Each optical absolute position sensor 161 is installed at a known position relative to a reference position on the bottom surface of the robot device 10, where it can recognize the predetermined pattern formed on the moving surface facing that bottom surface.
Based on the absolute positions acquired from the plurality of optical absolute position sensors 161 and the relative position of each optical absolute position sensor 161 with respect to the reference position on the bottom surface, the robot device 10 can calculate its own position and orientation with higher accuracy.
The absolute position detection mat 20 is installed on a moving surface, such as a floor, on which the robot device 10 can move. A predetermined pattern indicating absolute position is formed on the surface of the absolute position detection mat 20, and the absolute position can be specified using XY coordinates or the like. For example, the pattern indicating absolute position can be printed on the surface of the absolute position detection mat 20. The mat only needs to be large enough that most of the body fits on it when the robot device 10 is placed there; as long as the robot device 10 is on the mat, it can measure its absolute position regardless of its position or posture.
For example, when the robot device 10 is used on a court, the absolute position detection mat 20 needs to be installed in alignment with the area of the court. Furthermore, by using as the absolute position detection mat 20 a transparent mat that reflects infrared rays and carries a pattern indicating absolute position (an infrared pattern), spectators can be kept from noticing the presence of the mat when it is attached to a moving surface such as a floor.
Here, the advantages of using the plurality of optical absolute position sensors 161 in the robot device 10, compared with using other sensors, are as follows.
Since an optical absolute position sensor can measure the absolute position directly, the computational load is smaller than with other sensors, and the computation remains light even when multiple sensors are installed. Therefore, the robot device 10 can use a low-cost computer for its computation.
In contrast, when a camera is used to detect the absolute position, detection with high accuracy is possible, but the computational load is high; a small robot device generally requires a high-performance computer for the image processing, making it difficult to install multiple cameras. Moreover, with a single camera, any camera mounting error translates directly into a self-position error (a deviation in the initial position and orientation), so low mounting accuracy is reflected directly in the accuracy of the calculated self-position of the body.
When a robot device estimates its own position and large-scale equipment cannot be installed on the facility side, such as on or around the court, internal sensors such as wheel speed sensors and IMUs are mainly used; however, the absolute position generally drifts due to, for example, the accumulated error of the IMU. Therefore, a commonly used approach is to correct the self-position deviation caused by internal sensors by occasionally having a camera recognize a two-dimensional code or the like placed in the environment and measuring the position of the body from it.
When the position of the robot device deviates greatly from the position of the two-dimensional code, the code cannot be captured unless the camera's angle of view is sufficiently wide, and the self-position cannot be corrected. On the other hand, widening the angle of view lowers the accuracy, and increasing the pixel count to compensate for the lost accuracy increases the computational load.
Also, when a camera is installed on the bottom of the robot device to photograph the moving surface, it is difficult to read the two-dimensional code without lighting that provides appropriate brightness. Furthermore, unless the distortion of the camera lens is properly corrected, errors arise in the self-position. High-accuracy position estimation requires calibration of the camera, and this kind of calibration takes time and effort.
Thus, a system that uses a camera to detect the absolute position suffers from the camera's image processing load and installation accuracy, and from the lighting and calibration issues described above, so the robot device 10 to which the present disclosure is applied uses optical absolute position sensors instead.
However, when an optical absolute position sensor that measures the absolute position from a predetermined pattern formed on a moving surface such as a floor is used on its own, the computational load is small but the accuracy is low, making it difficult to use in a robot device that must operate with high accuracy. In other words, with a single optical absolute position sensor, the self-position estimation accuracy of the robot device is lower than that of a camera-based system.
Therefore, in the robot device 10 to which the present disclosure is applied, the absolute position is measured by a plurality of optical absolute position sensors 161, each installed at a known position relative to the reference position, reading the predetermined pattern indicating absolute position formed on the absolute position detection mat 20. Furthermore, by applying appropriate computation to the low-accuracy absolute position information obtained from the plurality of optical absolute position sensors 161, the self-position and orientation can be estimated with high accuracy while keeping the computational load small, even though each individual optical absolute position sensor 161 has low accuracy on its own.
(Functional configuration)
FIG. 10 shows an example of the functional configuration of a robot device to which the present disclosure is applied.
The robot device 10 includes a main CPU (Central Processing Unit) 151 and a sensor 152. The main CPU 151 is included in, for example, the control unit 101 of FIG. 3 and has a self-position/orientation calculator 171 and a controller 172. The sensor 152 has optical absolute position sensors 161-1 to 161-N (N is an integer of 2 or more).
The optical absolute position sensor 161-1 is installed at a predetermined position on the bottom surface of the robot device 10. While the robot device 10 is traveling or stopped, the optical absolute position sensor 161-1 recognizes the predetermined pattern formed on the moving surface and thereby identifies the absolute position indicated by the recognized pattern. The optical absolute position sensor 161-1 supplies information on the identified absolute position to the self-position/orientation calculator 171.
The optical absolute position sensors 161-2 to 161-N are configured in the same manner as the optical absolute position sensor 161-1 and are each installed at a predetermined position on the bottom surface of the robot device 10. While the robot device 10 is traveling or stopped, each of the optical absolute position sensors 161-2 to 161-N supplies the self-position/orientation calculator 171 with information on the absolute position identified by recognizing the predetermined pattern formed on the moving surface.
The self-position/orientation calculator 171 acquires the absolute position information supplied from each of the optical absolute position sensors 161-1 to 161-N. The absolute position information can be, for example, coordinates expressed in an XY coordinate system or an ID identifying the target position (a position ID). The self-position/orientation calculator 171 also holds, in advance, information on the position of each of the optical absolute position sensors 161-1 to 161-N installed on the bottom surface of the robot device 10 relative to the reference position on the bottom surface of the body.
The self-position/orientation calculator 171 calculates the self-position and orientation of the robot device 10 based on the absolute positions acquired from the plurality of optical absolute position sensors 161 and the relative position of each optical absolute position sensor 161 with respect to the reference position on the bottom surface of the robot device 10. The self-position/orientation calculator 171 supplies information on the calculated self-position and orientation to the controller 172.
The controller 172 controls the operation of each part of the robot device 10. The controller 172 acquires the self-position and orientation information supplied from the self-position/orientation calculator 171 and, based on the self-position and orientation, controls the robot device 10 to move autonomously.
In the robot device 10 configured as described above, the self-position/orientation calculator 171 calculates the self-position and orientation based on the absolute positions acquired from at least two or more optical absolute position sensors 161 and the relative positions of the optical absolute position sensors 161 with respect to the reference position on the bottom surface. Because the calculation uses the absolute positions measured by the optical absolute position sensors 161 and the relative positions held in advance, there is no need for computationally heavy processing such as image processing; lighter processing (for example, arithmetic operations such as addition and multiplication) suffices. Moreover, measuring the absolute position with a plurality of optical absolute position sensors 161 improves the accuracy of the self-position and orientation. Therefore, the robot device 10 to which the present disclosure is applied can estimate its own position and orientation with a smaller computational load and with higher accuracy.
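As a minimal sketch of this data flow, the following Python code models the functional configuration of FIG. 10 under stated assumptions: the class names, the stubbed sensor read, and the use of NumPy are illustrative choices, not part of the present disclosure, and solve_pose refers to the least-squares routine sketched after the derivation below.

```python
import numpy as np

# Hypothetical sketch of FIG. 10: each sensor reports an absolute position on
# the mat; the calculator combines the readings with the sensor offsets held
# in advance (relative to the bottom-surface reference position) into one
# (x, y, theta) estimate, which the controller consumes for motion control.

class OpticalAbsolutePositionSensor:
    """One sensor 161-n; offset is its known position (x_n, y_n) on the bottom."""
    def __init__(self, offset_xy):
        self.offset = np.asarray(offset_xy, dtype=float)  # measured in advance

    def read_absolute_position(self):
        # On the real device this would decode the mat pattern; stubbed here.
        raise NotImplementedError("replace with a real pattern read")

class SelfPoseCalculator:
    """Illustrative counterpart of the self-position/orientation calculator 171."""
    def __init__(self, sensors):
        assert len(sensors) >= 2, "at least two absolute position sensors"
        self.sensors = sensors

    def estimate(self):
        measured = np.array([s.read_absolute_position() for s in self.sensors])
        offsets = np.array([s.offset for s in self.sensors])
        return solve_pose(measured, offsets)  # defined in the sketch below
```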
(Sensor arrangement examples)
FIG. 11 shows an example of the arrangement of the plurality of optical absolute position sensors 161. A and B of FIG. 11 show a side view and a bottom view of the robot device 10. In practice, a movement mechanism such as wheels is provided on the bottom surface of the robot device 10, but it is omitted from the drawing to make the sensor arrangement easier to see.
As shown in B of FIG. 11, four sensors, the optical absolute position sensors 161-1 to 161-4, are arranged on the bottom surface of the robot device 10, lined up in the longitudinal direction of the bottom surface with respect to the reference position. For example, if the position of the robot device 10 on the court is defined as the center (center of gravity) of the bottom surface of the body, the center of the bottom surface of the body is the reference position. The reference position is not limited to the center of the bottom surface; it may be another position, such as the upper left of the bottom surface of the body.
Although each of the optical absolute position sensors 161-1 to 161-4 is less accurate than a camera-based system, each optical absolute position sensor 161 outputs an absolute position as its measurement result, so even when multiple sensors are installed, the computational load does not grow to the point of requiring a high-performance computer. As shown in FIG. 11, when the optical absolute position sensors 161-1 to 161-4 are lined up in the longitudinal direction with respect to the reference position on the bottom surface, the value in the longitudinal direction (the y direction) can be treated as a fixed value, which further reduces the amount of computation.
Furthermore, since the optical absolute position sensors 161-1 to 161-4 are placed at positions relative to the reference position on the bottom surface of the robot device 10, and the tilt information that each optical absolute position sensor 161 could measure is not used, measuring and storing the relative positions of the optical absolute position sensors 161-1 to 161-4 in advance makes it possible to calculate the self-position and orientation from the measured absolute positions. Since the robot device 10 uses infrared absolute position sensors as the optical absolute position sensors 161, there is no need, unlike a camera-based system, to provide separate lighting or to perform calibration.
B of FIG. 11 is only one example of the arrangement of the plurality of optical absolute position sensors 161; the number and arrangement of the sensors are arbitrary, and other arrangements may be adopted. The larger the number of optical absolute position sensors 161, the more widely they can be distributed over the bottom surface of the robot device 10, which reduces the error.
FIG. 12 shows other examples of the arrangement of the plurality of optical absolute position sensors 161. A to C of FIG. 12 show bottom views of the robot device 10, as in B of FIG. 11.
In A of FIG. 12, the optical absolute position sensors 161-1 to 161-4 are lined up in the longitudinal direction with respect to the reference position on the bottom surface (for example, its center), as in B of FIG. 11, but with different spacing between the sensors. Arranging the optical absolute position sensors 161-1 to 161-4 on a straight line reduces the computational load.
In B of FIG. 12, the optical absolute position sensors 161-1 and 161-2 are placed at positions symmetric about the reference position on the bottom surface (for example, its center). At least two optical absolute position sensors 161 are required, and the greater the distance between the sensors, the higher the accuracy of the orientation (angle) of the robot device 10.
In C of FIG. 12, the optical absolute position sensors 161-1 to 161-8 are arranged in two rows and four columns (2×4) with respect to the reference position on the bottom surface (for example, its center). Increasing the number of optical absolute position sensors 161 improves the accuracy of both the position and the orientation (angle) of the robot device 10.
(Example of calculating the self-position and orientation)
Next, an example of how the self-position/orientation calculator 171 calculates the self-position and orientation will be described. FIG. 13 shows the relationship between the absolute positions (positions P_N) measured by the optical absolute position sensors 161 and the self-position (position P_T) calculated by the self-position/orientation calculator 171 when the self-position and orientation are calculated from the measurement results of the plurality of optical absolute position sensors 161. In this example, the position of the robot device 10 is taken to be the center of the bottom surface of the body, so the center of the bottom surface of the body is the reference position.
 自己位置姿勢算出部171では、ロボット装置10の真の位置と姿勢を、位置P(x, y, θ)としたとき、各光学式絶対位置センサ161が計測した位置Pとの最小二乗法を適用することで、最も誤差が小さくなる x, y, θの値を推定する最適化計算が行われる。 The self-position/orientation calculation unit 171 calculates the minimum distance between the position P N measured by each optical absolute position sensor 161 and the position P T (x, y, θ) as the true position and orientation of the robot device 10 . By applying multiplication, an optimization calculation is performed to estimate the values of x, y, and θ that minimize the error.
Here, focusing on the first of the plurality of optical absolute position sensors 161, its position P_1(x_g1, y_g1), obtained from the pose P_T(x, y, θ) and its offset (x_1, y_1) from the center of the bottom surface of the body, is expressed by the following equation (1), where the values x_1 and y_1 are known values measured in advance.
$$\begin{aligned} x_{g1} &= x + x_1\cos\theta - y_1\sin\theta\\ y_{g1} &= y + x_1\sin\theta + y_1\cos\theta \end{aligned} \tag{1}$$
When the position P_N measured by the N-th optical absolute position sensor 161 is (x_mn, y_mn), the error between the measured value and the estimated value is expressed by the following equation (2). Here, x_mn and y_mn are the measured values representing the actually measured absolute position, and x_gn and y_gn are the sensor position calculated theoretically from the pose P_T that is to be estimated.
$$e_n = \left(x_{mn} - x_{gn}\right)^2 + \left(y_{mn} - y_{gn}\right)^2 \tag{2}$$
Taking the sum of equation (2) over the number of optical absolute position sensors 161 installed on the bottom surface of the robot device 10 gives the following equation (3).
$$E = \sum_{n=1}^{N}\left[\left(x_{mn} - x_{gn}\right)^2 + \left(y_{mn} - y_{gn}\right)^2\right] \tag{3}$$
The values of x, y, and θ that minimize the error between the measured values and the estimated values are obtained by finding the combination of x, y, and θ that minimizes equation (3), that is, by solving the conditions of the following equation (4).
$$\frac{\partial E}{\partial x} = 0, \qquad \frac{\partial E}{\partial y} = 0, \qquad \frac{\partial E}{\partial \theta} = 0 \tag{4}$$
Here, it can be assumed that the error in the initial installation of the robot device 10 and the yaw-angle deviation due to the drift of the gyro sensor mounted on the robot device 10 are very small. Therefore, linearizing with the approximations cos θ ≈ 1 and sin θ ≈ θ makes it possible to apply the least squares method.
That is, substituting the linearized predictions x_gn ≈ x + x_n − y_nθ and y_gn ≈ y + y_n + x_nθ, the three expressions shown in equation (4) can be written as the following equations (5), (6), and (7), respectively.
$$\frac{\partial E}{\partial x} = -2\sum_{n=1}^{N}\left(x_{mn} - x - x_n + y_n\theta\right) = 0 \tag{5}$$
$$\frac{\partial E}{\partial y} = -2\sum_{n=1}^{N}\left(y_{mn} - y - y_n - x_n\theta\right) = 0 \tag{6}$$
$$\frac{\partial E}{\partial \theta} = 2\sum_{n=1}^{N}\left[y_n\left(x_{mn} - x - x_n + y_n\theta\right) - x_n\left(y_{mn} - y - y_n - x_n\theta\right)\right] = 0 \tag{7}$$
Summarizing the final results of equations (5), (6), and (7) gives the following equation (8).
$$\begin{aligned} N x - \left(\sum_{n=1}^{N} y_n\right)\theta &= \sum_{n=1}^{N}\left(x_{mn} - x_n\right)\\ N y + \left(\sum_{n=1}^{N} x_n\right)\theta &= \sum_{n=1}^{N}\left(y_{mn} - y_n\right)\\ -\left(\sum_{n=1}^{N} y_n\right)x + \left(\sum_{n=1}^{N} x_n\right)y + \left(\sum_{n=1}^{N}\left(x_n^2 + y_n^2\right)\right)\theta &= \sum_{n=1}^{N}\left(x_n y_{mn} - y_n x_{mn}\right) \end{aligned} \tag{8}$$
Expanding and rearranging equation (8) yields the following equation (9).
$$\begin{pmatrix} N & 0 & -\sum_{n=1}^{N} y_n \\ 0 & N & \sum_{n=1}^{N} x_n \\ -\sum_{n=1}^{N} y_n & \sum_{n=1}^{N} x_n & \sum_{n=1}^{N}\left(x_n^2 + y_n^2\right) \end{pmatrix} \begin{pmatrix} x \\ y \\ \theta \end{pmatrix} = \begin{pmatrix} \sum_{n=1}^{N}\left(x_{mn} - x_n\right) \\ \sum_{n=1}^{N}\left(y_{mn} - y_n\right) \\ \sum_{n=1}^{N}\left(x_n y_{mn} - y_n x_{mn}\right) \end{pmatrix} \tag{9}$$
By substituting the actually measured values (x_mn, y_mn) into equation (9), the values of x, y, and θ can be obtained. This combination of values minimizes the error between the measured values and the estimated values; that is, it is the position (x, y) and orientation (θ) estimated as the self-position (position P_T).
Equation (9) is a simple system of three linear equations in three unknowns, involving only additions and multiplications, so it can be solved with a small computational load. Furthermore, even if the number of optical absolute position sensors 161 installed in the robot device 10 is increased, equation (9) applies in the same way as long as the positions of the sensors on the bottom surface of the body are known.
Also, although the actually measured values (x_mn, y_mn) contain only absolute position information (no tilt information), not only the position (x, y) but also the orientation (θ) can be estimated as the self-position. The above description uses the least squares method; without it, a nonlinear optimization would have to be performed, raising the computational cost.
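To make the linear solve concrete, the following is a minimal Python sketch, assuming NumPy and hypothetical names (solve_pose and the synthetic sensor layout values are illustrative, not from the present disclosure): it builds the 3×3 linear system of equation (9) from N measured absolute positions and the pre-measured sensor offsets, solves it, and checks the result on synthetic data generated per equation (1).

```python
import numpy as np

def solve_pose(measured, offsets):
    """Solve equation (9): linear least-squares estimate of (x, y, theta).

    measured -- (N, 2) absolute positions (x_mn, y_mn) read from the mat
    offsets  -- (N, 2) sensor positions (x_n, y_n) relative to the reference
                position on the bottom surface, measured in advance
    Assumes the small-angle linearization cos(theta) ~ 1, sin(theta) ~ theta.
    """
    xm, ym = measured[:, 0], measured[:, 1]
    xn, yn = offsets[:, 0], offsets[:, 1]
    n = float(len(measured))
    A = np.array([
        [n,         0.0,       -yn.sum()],
        [0.0,       n,          xn.sum()],
        [-yn.sum(), xn.sum(),   (xn**2 + yn**2).sum()],
    ])
    b = np.array([
        (xm - xn).sum(),
        (ym - yn).sum(),
        (xn * ym - yn * xm).sum(),
    ])
    x, y, theta = np.linalg.solve(A, b)
    return x, y, theta

if __name__ == "__main__":
    # Four sensors in a line on the long axis, as in B of FIG. 11 (values assumed).
    offsets = np.array([[-0.3, 0.0], [-0.1, 0.0], [0.1, 0.0], [0.3, 0.0]])
    true_x, true_y, true_th = 1.20, 0.80, 0.02   # small yaw, per the assumption
    c, s = np.cos(true_th), np.sin(true_th)
    world = np.column_stack([
        true_x + offsets[:, 0] * c - offsets[:, 1] * s,   # equation (1) per sensor
        true_y + offsets[:, 0] * s + offsets[:, 1] * c,
    ])
    rng = np.random.default_rng(0)
    measured = world + rng.normal(scale=1e-3, size=world.shape)  # sensor noise
    print(solve_pose(measured, offsets))   # approx. (1.20, 0.80, 0.02)
```

Because the system stays 3×3 regardless of N, adding sensors only increases the cost of the sums, which is consistent with the low computational load noted above.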
(Example of use of the robot device)
FIG. 14 shows an example of use of the robot device 10 to which the present disclosure is applied.
FIG. 14 assumes a scene in which a plurality of robot devices 10 perform, in synchronization, a routine consisting of coordinated actions such as traveling and lining up within the area of a basketball court 30. Each robot device 10 has an autonomous movement function and is equipped with internal sensors such as wheel speed sensors and an IMU, but it has no means of recognizing its absolute position with respect to the court 30.
The absolute position detection mat 20 is fixed in a specific area of the court 30 (such as the start point), for example by aligning it with the court 30 and then sticking it down. Before starting operation, the robot devices 10-1 to 10-8 placed outside the court 30 are placed on the absolute position detection mat 20 one by one, starting with the robot device 10-1, and an operation by the user U, such as pressing a switch, triggers the measurement of the initial position and orientation.
The robot device 10-1 calculates its self-position and orientation (x, y, θ) by integrating, with the calculation method described above, the measured values obtained by the plurality of optical absolute position sensors 161 reading the predetermined pattern printed on the absolute position detection mat 20, and records the result in memory as the initial position and orientation of the body with respect to the court 30. Having recorded its initial position and orientation, the robot device 10-1 then generates route information for a route toward a predetermined position in the court 30 (such as the point where the performance starts) based on the initial position and orientation at that time, and travels toward the predetermined position in the court 30 based on the generated route information.
After the robot device 10-1, the robot devices 10-2 to 10-8 are placed on the absolute position detection mat 20 in turn. Like the robot device 10-1, each of the robot devices 10-2 to 10-8 calculates the initial position and orientation of its body from the measured values of its plurality of optical absolute position sensors 161, generates route information, and travels toward a predetermined position in the court 30 (such as the point where the performance starts).
As a result, the robot devices 10-1 to 10-8 travel one after another toward the predetermined area in the court 30 and automatically line up. Since their initial positions and orientations are aligned, the robot devices 10-1 to 10-8 can operate in coordination and perform a routine (for example, the coordinated actions shown in FIGS. 5 to 8).
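The start-up sequence just described can be sketched as follows. This is an illustrative assumption rather than the route-generation method of the present disclosure: plan_route is a hypothetical straight-line planner, and calculator stands for the multi-sensor pose estimator sketched earlier.

```python
import numpy as np

def plan_route(start_pose, goal_xy, step=0.05):
    """Assumed straight-line planner: waypoints from the measured initial
    position to the assigned start position on the court."""
    start = np.asarray(start_pose[:2], dtype=float)
    goal = np.asarray(goal_xy, dtype=float)
    n = max(int(np.linalg.norm(goal - start) / step), 1)
    return [tuple(start + (goal - start) * t) for t in np.linspace(0.0, 1.0, n + 1)]

def initialize_on_mat(calculator, goal_xy):
    """Triggered e.g. by the user's switch press while the robot is on the mat:
    measure the pose, record it as the initial pose, and plan the route."""
    x, y, theta = calculator.estimate()   # multi-sensor pose from the mat pattern
    initial_pose = (x, y, theta)          # recorded in memory, relative to the court
    return initial_pose, plan_route(initial_pose, goal_xy)
```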
In this way, simply by placing the plurality of robot devices 10 on the absolute position detection mat 20 in turn, in roughly the right place, and having the user U press a switch, each robot device 10 can recognize the position and orientation of its own body within the court 30. Even if the absolute position detection mat 20 were fixed with some offset relative to the court 30, all the robot devices 10 would carry the same error, so no relative error arises between them and the synchronized predetermined actions (such as a routine of coordinated actions) are unaffected.
The above description gives an example of aligning the performances of the plurality of robot devices 10 by aligning their initial positions and orientations. Alternatively, for example, the absolute position detection mat 20 may be fixed in a predetermined area within the court 30 so that the robot devices 10 run over the mat during the performance, allowing deviations in their coordinated actions to be corrected mid-performance.
When the initial position and orientation of a robot device are determined by measurement using a camera and a two-dimensional code to detect the absolute position, the image processing of the captured images imposes a heavy load, so such a method cannot be used in a robot device without abundant processing capacity. For this reason, the initial position and orientation of a robot device have conventionally been set by hand or by using a jig capable of constraining the position of the robot device (a positioning jig).
However, manual alignment involves large errors, and when multiple robot devices are moved simultaneously, it is difficult for all of them to achieve neatly aligned motion. When a jig is used, the jig becomes large-scale, and a differential two-wheel drive robot device cannot move sideways, so positioning the robot device against the jig takes time and effort.
There has therefore been a demand for a method by which a robot device, even if placed only roughly, autonomously recognizes the initial position and orientation of its body and estimates its absolute position within the court; the present disclosure proposes the calculation method for the initial position and orientation of the body described above.
<2. Modifications>
In the above description, the differential two-wheel drive type was given as an example of the drive system of the robot device 10, but other drive systems, such as an omnidirectional movement type, may be used.
In the above description, the posture of the image display unit 102 including the display is changed by driving about one axis, but the drive is not limited to one axis and may use two or more axes. The display information shown on the display is not limited to video and may be information such as images and text.
The robot device 10 to which the present disclosure is applied can be regarded as an autonomous mobile device having a control section such as the control unit 101. This control section is configured as an information processing device having a processor such as a CPU, and may be provided inside the robot device 10 or configured as an external device. The robot device 10 to which the present disclosure is applied and the absolute position detection mat 20 may also be regarded as constituting an information processing system.
The robot device 10 to which the present disclosure is applied can also be regarded as a system (autonomous movement system) combining a plurality of devices such as a control device, a sensor device, a display device, a communication device, and a movement mechanism. Here, a system means a set of a plurality of components (devices, modules (parts), etc.), and it does not matter whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
The robot device 10 to which the present disclosure is applied may further include a cleaning attachment. This cleaning attachment is mop-like and, when attached to the front, rear, side, or bottom of the robot device 10, allows the robot device 10 to clean its travel route while traveling autonomously. The locations to be cleaned may be given in advance as a travel route, or cleaning may be performed by recognizing, through gesture recognition, an instruction from an instructor such as "clean here". For this gesture recognition, the gesture of the target is recognized by processing the posture, movement, and so on of the target instructor based on sensor signals from the sensor 106 (a camera or the like).
Furthermore, the cleaning operation and the image display may be coordinated. In this case, an image to that effect may be displayed when cleaning starts, during cleaning, or when cleaning is completed, and advertisements or other images may be displayed during cleaning. The posture of (the display of) the image display unit 102 may also be controlled at the same time. The cleaning attachment is not limited to the illustrated mop-like one and includes others, such as a dustpan-shaped one.
<3. Computer Configuration>
The series of processes in the self-position and orientation estimation described above can be executed by hardware or by software. When the series of processes is executed by software, a program constituting the software is installed in the computer of each device.
FIG. 15 is a block diagram showing an example of the hardware configuration of a computer that executes, by means of a program, the series of processes in the self-position and orientation estimation described above.
In the computer 1000, a CPU 1001, a ROM (Read Only Memory) 1002, and a RAM (Random Access Memory) 1003 are interconnected by a bus 1004. An input/output interface 1005 is further connected to the bus 1004. An input unit 1006, an output unit 1007, a recording unit 1008, a communication unit 1009, and a drive 1010 are connected to the input/output interface 1005.
The input unit 1006 includes various sensors, a microphone, switches, and the like. The output unit 1007 includes a speaker, a display, and the like. The recording unit 1008 includes an auxiliary storage device such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive). The communication unit 1009 includes a communication module compatible with communication schemes such as wireless LAN (Local Area Network), cellular communication (for example, 5G), and Bluetooth (registered trademark). The drive 1010 drives a removable medium 1011 such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory.
In the computer 1000 configured as described above, the CPU 1001 loads a program recorded in the ROM 1002 or the recording unit 1008 into the RAM 1003 via the input/output interface 1005 and the bus 1004 and executes it, whereby the series of processes described above is performed.
The program executed by the computer 1000 (the CPU 1001) can be provided, for example, recorded on the removable medium 1011 as package media or the like. The program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
In the computer 1000, the program can be installed in the recording unit 1008 via the input/output interface 1005 by loading the removable medium 1011 into the drive 1010. The program can also be received by the communication unit 1009 via a wired or wireless transmission medium and installed in the recording unit 1008. Alternatively, the program can be installed in the ROM 1002 or the recording unit 1008 in advance.
Here, in this specification, the processing performed by a computer according to a program includes processing executed in parallel or individually (for example, parallel processing or object-based processing). The program may be processed by one computer (processor) or processed in a distributed manner by a plurality of computers.
Each step of the series of processes in the self-position and orientation estimation described above can be executed by one device or shared among a plurality of devices. Furthermore, when one step includes a plurality of processes, the plurality of processes included in that step can be executed by one device or shared among a plurality of devices.
The embodiments of the present technology are not limited to the embodiments described above, and various modifications are possible without departing from the gist of the present disclosure. The effects described in this specification are merely examples and are not limiting; other effects may also be obtained.
The present disclosure can also be configured as follows.
(1)
An information processing device including:
at least two or more absolute position sensors; and
a self-position/orientation calculator that calculates a self-position and orientation,
wherein the self-position/orientation calculator calculates the self-position and orientation based on absolute positions acquired from the absolute position sensors and relative positions of the absolute position sensors with respect to a reference position of a mobile device.
(2)
The information processing device according to (1), wherein the absolute position sensor is an optical absolute position sensor installed so as to be able to recognize a predetermined pattern formed on a moving surface on which the mobile device can move.
(3)
The information processing device according to (2), wherein the absolute position includes at least coordinates of the optical absolute position sensor, and the optical absolute position sensor identifies the coordinates by recognizing the predetermined pattern.
(4)
The information processing device according to (2) or (3), wherein the predetermined pattern includes a pattern indicating an absolute position formed on a mat or a floor surface.
(5)
The information processing device according to any one of (1) to (4), wherein the reference position is the center of a bottom surface of the mobile device, and the absolute position sensors are arranged with reference to the center of the bottom surface of the mobile device.
(6)
The information processing device according to any one of (1) to (5), wherein the mobile device operates in synchronization with one or more other mobile devices.
(7)
An information processing method including: calculating, by an information processing device, a self-position and orientation of a mobile device based on absolute positions acquired from at least two or more absolute position sensors and relative positions of the absolute position sensors with respect to a reference position of the mobile device.
(8)
A program for causing a computer to function as a self-position/orientation calculator that calculates a self-position and orientation of a mobile device based on absolute positions acquired from at least two or more absolute position sensors and relative positions of the absolute position sensors with respect to a reference position of the mobile device.
(9)
A mobile device including:
at least two or more absolute position sensors; and
a control section that calculates a self-position and orientation based on absolute positions acquired from the absolute position sensors and relative positions of the absolute position sensors with respect to a reference position of the mobile device itself.
(10)
An information processing system including:
a mobile device; and
an absolute position detection mat on which a predetermined pattern indicating an absolute position is formed,
wherein the mobile device includes
at least two or more absolute position sensors, and
a control section that calculates a self-position and orientation based on absolute positions acquired from the absolute position sensors and relative positions of the absolute position sensors with respect to a reference position of the mobile device itself, and
wherein, when the mobile device is on the absolute position detection mat, the absolute position is identified by the absolute position sensors recognizing the pattern.
(11)
The information processing system according to (10), wherein the absolute position detection mat has an infrared pattern indicating the absolute position.
10 robot device, 20 absolute position detection mat, 30 court, 101 control unit, 102 image display unit, 103 screen lifting unit, 104-1 left motor encoder, 104-2 right motor encoder, 105-1 left motor, 105-2 right motor, 106-1 to 106-3, 106 sensors, 107 battery unit, 151 main CPU, 152 sensor, 161, 161-1 to 161-N optical absolute position sensors, 171 self-position/orientation calculator, 172 controller, 1000 computer, 1001 CPU

Claims (11)

  1.  少なくとも2以上の絶対位置センサと、
     自己位置及び姿勢を算出する自己位置姿勢算出部と
     を備え、
     前記自己位置姿勢算出部は、前記絶対位置センサから取得した絶対位置と、移動装置の基準位置に対する前記絶対位置センサの相対位置とに基づいて、前記自己位置及び姿勢を算出する
     情報処理装置。
    at least two absolute position sensors;
    a self-position/orientation calculator that calculates the self-position and orientation,
    The information processing apparatus, wherein the self position and orientation calculation unit calculates the self position and orientation based on the absolute position obtained from the absolute position sensor and the relative position of the absolute position sensor with respect to a reference position of the mobile device.
  2.  前記絶対位置センサは、前記移動装置が移動可能な移動面に形成された所定のパターンを認識可能に設置された光学式絶対位置センサである
     請求項1に記載の情報処理装置。
    The information processing apparatus according to claim 1, wherein the absolute position sensor is an optical absolute position sensor installed so as to be able to recognize a predetermined pattern formed on a moving surface on which the moving device can move.
  3.  前記絶対位置は、前記光学式絶対位置センサの座標を少なくとも含み、
     前記光学式絶対位置センサは、前記所定のパターンを認識することで、前記座標を特定する
     請求項2に記載の情報処理装置。
    the absolute position includes at least the coordinates of the optical absolute position sensor;
    The information processing apparatus according to claim 2, wherein the optical absolute position sensor identifies the coordinates by recognizing the predetermined pattern.
  4.  前記所定のパターンは、マット又は床面に形成された絶対位置を示すパターンを含む
     請求項2に記載の情報処理装置。
    3. The information processing apparatus according to claim 2, wherein the predetermined pattern includes a pattern indicating an absolute position formed on a mat or a floor surface.
  5.  前記基準位置は、前記移動装置の底面の中心であり、
     前記絶対位置センサは、前記移動装置の底面の中心を基準にして配置される
     請求項1に記載の情報処理装置。
    the reference position is the center of the bottom surface of the moving device;
    The information processing device according to claim 1, wherein the absolute position sensor is arranged with reference to the center of the bottom surface of the mobile device.
  6.  前記移動装置は、1又は複数の他の移動装置と同期して動作する
     請求項1に記載の情報処理装置。
    2. The information processing apparatus of claim 1, wherein the mobile device operates synchronously with one or more other mobile devices.
  7.  情報処理装置が、
     少なくとも2以上の絶対位置センサから取得した絶対位置と、移動装置の基準位置に対する前記絶対位置センサの相対位置とに基づいて、前記移動装置の自己位置及び姿勢を算出する
     情報処理方法。
    The information processing device
    An information processing method, comprising: calculating a self position and orientation of the mobile device based on absolute positions obtained from at least two absolute position sensors and relative positions of the absolute position sensors with respect to a reference position of the mobile device.
  8.  コンピュータを、
     少なくとも2以上の絶対位置センサから取得した絶対位置と、移動装置の基準位置に対する前記絶対位置センサの相対位置とに基づいて、前記移動装置の自己位置及び姿勢を算出する自己位置姿勢算出部として機能させる
     プログラム。
    the computer,
    Functioning as a self-position/orientation calculator that calculates the self-position and orientation of the mobile device based on the absolute positions obtained from at least two or more absolute position sensors and the relative positions of the absolute position sensors with respect to the reference position of the mobile device. A program that makes
  9.  A mobile device comprising:
     at least two absolute position sensors; and
     a control unit that calculates a self-position and an orientation based on absolute positions acquired from the absolute position sensors and relative positions of the absolute position sensors with respect to a reference position of the mobile device itself.
  10.  An information processing system comprising:
     a mobile device; and
     an absolute position detection mat on which a predetermined pattern indicating an absolute position is formed,
     wherein the mobile device includes
     at least two absolute position sensors, and
     a control unit that calculates a self-position and an orientation based on the absolute position acquired from the absolute position sensors and relative positions of the absolute position sensors with respect to a reference position of the mobile device itself, and
     wherein, when the mobile device is present on the absolute position detection mat, the absolute position is identified by recognizing the pattern with the absolute position sensors.
  11.  The information processing system according to claim 10, wherein the absolute position detection mat has an infrared pattern indicating the absolute position.
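
The pose calculation recited in claims 1 and 7 to 9 amounts to recovering a rigid 2D transform from two point correspondences. The following Python snippet is a minimal sketch, not the patented implementation: it assumes planar motion, and the function name estimate_pose and the 0.2 m sensor offsets in the example are hypothetical.

```python
import numpy as np

def estimate_pose(p1_abs, p2_abs, r1_body, r2_body):
    """Recover the device's reference position and heading from two
    absolute sensor readings (world frame) and the sensors' known
    offsets from the reference position (body frame)."""
    p1, p2 = np.asarray(p1_abs, float), np.asarray(p2_abs, float)
    r1, r2 = np.asarray(r1_body, float), np.asarray(r2_body, float)

    # The heading is the rotation that maps the body-frame baseline
    # between the two sensors onto the measured world-frame baseline.
    world, body = p2 - p1, r2 - r1
    theta = np.arctan2(world[1], world[0]) - np.arctan2(body[1], body[0])
    theta = (theta + np.pi) % (2.0 * np.pi) - np.pi  # wrap to (-pi, pi]

    # With the heading fixed, each sensor gives an estimate of the
    # reference position; averaging the two reduces sensor noise.
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    origin = ((p1 - R @ r1) + (p2 - R @ r2)) / 2.0
    return origin, theta

# Example: sensors mounted 0.2 m ahead of and behind the reference
# point; readings consistent with the device at (1.0, 1.0), facing +y.
origin, theta = estimate_pose((1.0, 1.2), (1.0, 0.8), (0.2, 0.0), (-0.2, 0.0))
# origin ≈ [1.0, 1.0], theta ≈ +pi/2
```

The heading comes from aligning the sensor baseline measured in the world frame with the same baseline known in the body frame; once the heading is fixed, subtracting each rotated offset from its absolute reading yields two estimates of the reference position, which are averaged.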
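
Likewise, claims 2 to 4, 10 and 11 recite that the optical absolute position sensor identifies its coordinates by recognizing a pattern on the mat or floor. The publication does not disclose the encoding, so the following toy decoder merely assumes, hypothetically, a mat tiled with uniquely numbered cells laid out row-major on a fixed pitch.

```python
# Toy decoder for a hypothetical mat encoding: assume the mat is tiled
# with uniquely numbered cells laid out row-major on a fixed-pitch grid.
CELL_PITCH_M = 0.05   # assumed spacing between pattern cells
GRID_COLS = 100       # assumed number of cells per mat row

def marker_id_to_absolute_xy(marker_id: int) -> tuple[float, float]:
    """Map a decoded cell ID to the cell-centre mat coordinates."""
    col, row = marker_id % GRID_COLS, marker_id // GRID_COLS
    return ((col + 0.5) * CELL_PITCH_M, (row + 0.5) * CELL_PITCH_M)
```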
PCT/JP2022/001718 2021-03-16 2022-01-19 Information processing device, information processing method, program, moving device, and information processing system WO2022196080A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021041981 2021-03-16
JP2021-041981 2021-03-16

Publications (1)

Publication Number Publication Date
WO2022196080A1 (en)

Family

ID=83320140

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/001718 WO2022196080A1 (en) 2021-03-16 2022-01-19 Information processing device, information processing method, program, moving device, and information processing system

Country Status (1)

Country Link
WO (1) WO2022196080A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110170118A1 (en) * 2007-07-13 2011-07-14 Thorsten Mika Device and Method for Determining a Position and Orientation
JP2020087459A (en) * 2018-11-15 2020-06-04 グレイ オレンジ ピーティーイー. リミテッド System and method for handling items using movable-bots

Similar Documents

Publication Publication Date Title
US20200389595A1 (en) Systems and methods for rolling shutter correction
US10290154B1 (en) Stereo-based calibration apparatus
KR101988083B1 (en) Systems and methods for tracking location of movable target object
CN111156998B (en) Mobile robot positioning method based on RGB-D camera and IMU information fusion
Tournier et al. Estimation and control of a quadrotor vehicle using monocular vision and moire patterns
EP2697697B1 (en) Object tracking with projected reference patterns
US9161019B2 (en) Multi-dimensional data capture of an environment using plural devices
US8731720B1 (en) Remotely controlled self-balancing robot including kinematic image stabilization
TWI397671B (en) System and method for locating carrier, estimating carrier posture and building map
CN106525074B (en) A kind of compensation method, device, holder and the unmanned plane of holder drift
CN111465886A (en) Selective tracking of head mounted displays
US20120287232A1 (en) Surround View System Camera Automatic Calibration
EP3744484B1 (en) Information processing device, information processing method, and information processing system
JP6259233B2 (en) Mobile robot, mobile robot control system, and program
CN108886573A (en) Increase steady system and method for digital video
CN108344401A (en) Localization method, device and computer readable storage medium
US9749535B1 (en) Stabilization of captured images for a robot
RU2758036C1 (en) Method and system for optical-inertial tracking of a mobile object
WO2020062089A1 (en) Magnetic sensor calibration method and movable platform
WO2022196080A1 (en) Information processing device, information processing method, program, moving device, and information processing system
Zhu et al. Wii remote–based low-cost motion capture for automated assembly simulation
WO2023141963A1 (en) Pose estimation method for movable platform, movable platform, and storage medium
US20210201011A1 (en) Data processing method for multi-sensor fusion, positioning apparatus and virtual reality device
KR20180106178A (en) Unmanned aerial vehicle, electronic device and control method thereof
CN110415329B (en) Three-dimensional modeling device and calibration method applied to same

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22770842

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 22770842

Country of ref document: EP

Kind code of ref document: A1