WO2020137685A1 - Control device, control method, program, and control system - Google Patents

Control device, control method, program, and control system Download PDF

Info

Publication number
WO2020137685A1
Authority
WO
WIPO (PCT)
Prior art keywords
moving body
unit
lost
self
information
Prior art date
Application number
PCT/JP2019/049367
Other languages
French (fr)
Japanese (ja)
Inventor
Ryo Watanabe
Masaki Toyoura
Original Assignee
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2020137685A1 publication Critical patent/WO2020137685A1/en

Links

Images

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions

Definitions

  • the present disclosure relates to a control device, a control method, a program, and a control system.
  • a technique for estimating the self-position based on information measured by a sensor or the like provided in the moving body is widespread.
  • the moving body may lose its position due to, for example, a system error or an external factor. Therefore, a technique for restoring the self-position when the mobile body loses the self-position is being developed.
  • Patent Document 1 discloses a technique for restoring the self-position by means of an RFID (Radio Frequency Identifier) system provided in the external environment of the mobile body. Specifically, the mobile body acquires the position information of an RFID tag based on information read from the RFID tag provided in its external environment, and identifies its own position from the acquired position information.
  • However, the technology described in Patent Document 1 is based on the premise that an RFID system is provided in the external environment of the mobile body. Therefore, when the mobile body loses its own position in an environment where no RFID system is provided, it is difficult for the mobile body to restore its own position.
  • In view of this, the present disclosure proposes a new and improved control device, control method, program, and control system capable of restoring the self-position by causing the moving body to approach an approach target when the moving body loses its own position.
  • According to the present disclosure, there is provided a control device including: an estimation unit that estimates a self-position; a determination unit that determines whether or not a moving body has lost the self-position; and a processing control unit that performs processing for approaching an approach target according to a determination result by the determination unit.
  • Further, according to the present disclosure, there is provided a control method executed by a processor, the method including: estimating a self-position; determining whether or not a moving body has lost the self-position; and performing processing for approaching an approach target according to the result of the determination.
  • Further, according to the present disclosure, there is provided a program that causes a computer to function as: an estimation unit that estimates a self-position; a determination unit that determines whether or not a moving body has lost the self-position; and a processing control unit that performs processing for approaching an approach target according to a determination result by the determination unit.
  • Further, according to the present disclosure, there is provided a control system including a first control device and a second control device. The first control device includes: a first estimation unit that estimates a self-position; a first determination unit that determines whether or not a moving body has lost the self-position; and a first processing control unit that performs processing for approaching an approach target according to a determination result by the first determination unit. The second control device includes: a second estimation unit that estimates the self-positions of the moving body and the approach target; and a second processing control unit that performs processing for approaching the moving body according to the determination result by the first determination unit. The first control device transmits request information regarding restoration of the self-position to the second control device via communication, and the second control device causes the approach target to approach the moving body based on the request information received from the first control device via communication and causes the approach target to estimate the self-position of the moving body.
  • the present disclosure relates to a technique of returning a self-position when a moving body loses the self-position.
  • the mobile body may be a device that can move autonomously.
  • the moving body may be a robot capable of autonomously moving (for example, walking).
  • autonomously movable robots include humanoid robots, pet robots, self-driving cars, and drones.
  • the present embodiment is not limited to this example.
  • the moving body may be a vehicle (e.g., an automobile, a ship, or an aircraft), various industrial machines, or another type of device such as a toy.
  • the moving body is a robot used at home (hereinafter, also referred to as “home robot”).
  • the domestic robot is, for example, a pet robot.
  • the term "lost" is used below in relation to the loss of the moving body's self-position.
  • Here, "lost" covers both the process of the moving body losing its own position and the state of having lost it.
  • a moving body that has lost its own position is also called a lost moving body.
  • the mobile body may have difficulty performing the expected operation. For example, if the loss continues, the moving body may keep moving around in the same place. Therefore, when the moving body is lost, it is desirable to quickly return the moving body from the lost state.
  • the mobile body acquires RFID position information from an RFID tag provided in the external environment, estimates its own position based on the acquired position information, and returns from the lost state.
  • the technique is based on the premise that the RFID system provided in the external environment of the mobile body is used. Therefore, for example, when the mobile body loses its own position in an environment where the RFID system is not provided, it is difficult for the mobile body to return to its own position.
  • the embodiment of the present disclosure is conceived in view of the above points, and is a technique capable of returning the self-position by approaching the approach target when the mobile unit loses its own position.
  • the approach target is a target that approaches the lost moving body.
  • Examples of the approach target include a person and another moving body.
  • the route along which the moving body can move is also referred to as a “moving route” below.
  • the route for the moving body and the approaching object to approach each other is also referred to as an “approaching route” below.
  • the present embodiment will be sequentially described in detail.
  • FIG. 1 is a block diagram showing a functional configuration example of a moving body according to an embodiment of the present disclosure.
  • the moving body 10 according to the embodiment of the present disclosure includes a sensor unit 100, a control unit 110, a storage unit 120, a drive unit 130, an output unit 140, and a communication unit 150.
  • the sensor unit 100 has a function of sensing information used for processing in the control unit 110.
  • the sensor unit 100 may include various sensor devices.
  • the sensor unit 100 may include an external sensor, an internal sensor, a camera, a microphone, and an optical sensor.
  • the sensor unit 100 performs sensing using the sensor described above. Then, the sensor unit 100 outputs the sensing information acquired by the various sensors by sensing to the control unit 110.
  • the sensor unit 100 acquires information used by the control unit 110 to estimate the self-position of the moving body 10 by using the external sensor and the internal sensor.
  • The external sensor is a device that senses information outside the moving body 10.
  • the external sensor may include a camera, a distance measuring sensor, a depth sensor, a GPS (Global Positioning System) sensor, a magnetic sensor, a communication device, and the like.
  • the internal sensor is a device that senses information inside the moving body 10.
  • the internal sensor may include an acceleration sensor, a gyro sensor, an encoder, etc.
  • the sensor unit 100 uses a camera, a microphone, and an optical sensor to acquire information used by the control unit 110 to estimate the position of a person.
  • A camera is an imaging device, such as an RGB camera, that has a lens system, a drive system, and an imaging element, and captures images (still images or moving images).
  • The camera is provided so as to be able to image the surroundings of the moving body 10, and can thus capture images of the outside of the moving body 10.
  • the sensor unit 100 can acquire a captured image of the periphery of the moving body 10 with the camera.
  • the distance measuring sensor is, for example, a device such as a ToF (Time of Flight) sensor that acquires distance information.
  • The depth sensor is a device that acquires depth information, such as an infrared distance measuring device, an ultrasonic distance measuring device, a LiDAR (Laser Imaging Detection and Ranging) sensor, or a stereo camera.
  • the sensor unit 100 can acquire the distance information to the target around the moving body 10 by the distance measuring sensor or the depth sensor.
  • the GPS sensor is a device that measures the positional information including the latitude, longitude, and altitude of the mobile unit 10 by receiving GPS signals from GPS satellites.
  • the sensor unit 100 can acquire the position information of the moving body 10 by using the GPS sensor.
  • a magnetic sensor is a device that measures the magnitude and direction of a magnetic field.
  • the sensor unit 100 can acquire information on the magnetic field at the position of the moving body 10 by using the magnetic sensor.
  • the communication device is a device that communicates with another mobile unit 10.
  • the communication device performs communication by, for example, Bluetooth (registered trademark), Wi-Fi (registered trademark), or the like.
  • the sensor unit 100 can acquire communication information when the mobile unit 10 communicates with another mobile unit 10 by the communication device.
  • the communication device may be realized as the communication unit 150 instead of the sensor unit 100.
  • An acceleration sensor is a device that acquires the acceleration of an object.
  • the acceleration sensor measures acceleration, which is the amount of change in speed when the moving body 10 moves.
  • the gyro sensor is a device having a function of acquiring the angular velocity of an object.
  • the gyro sensor measures the angular velocity, which is the amount of change in the posture of the moving body 10.
  • An encoder is a device that acquires the rotation angle of an object. The encoder is provided, for example, on a wheel or joint of the moving body 10 and measures a rotation angle that is a change amount of the angle when the wheel or joint rotates.
  • the information acquired by the acceleration sensor, the gyro sensor, and the encoder is also referred to as “inertia information” below.
  • the sensor unit 100 can acquire inertial information of the moving body 10 by using the acceleration sensor, the gyro sensor, and the encoder.
  • A microphone is a device that detects surrounding sounds.
  • the microphone collects ambient sound and outputs audio data converted into a digital signal via an amplifier and an ADC (Analog Digital Converter).
  • the sensor unit 100 can acquire voice information around the moving body 10 by using the microphone.
  • The number of microphones is not limited to one; a plurality of microphones may be used, or a so-called microphone array may be configured. When the direction of a voice is detected based on the voice information, the detection accuracy improves as the number of microphones increases, so a larger number of microphones is desirable.
  • the optical sensor is a device that detects ambient light.
  • the sensor unit 100 can acquire light information around the moving body 10 by the optical sensor.
  • Control unit 110 is a control device having a function of controlling the operation of the entire moving body 10. In order to realize the function, the control unit 110 has an estimation unit 112, a determination unit 114, a detection unit 116, and a processing control unit 118, as shown in FIG.
  • The estimation unit 112 has a function of estimating the self-position of the moving body 10. For example, the estimation unit 112 estimates the self-position of the moving body 10 based on the sensing information input from the sensor unit 100. Methods by which the estimation unit 112 estimates the self-position of the moving body 10 include star reckoning and dead reckoning, although the methods are not limited to these two.
  • Star reckoning is a method of estimating the absolute self-position of the moving body 10 based on sensing information from an external sensor.
  • the external sensor is the above-described camera, distance measuring sensor, depth sensor, GPS sensor, magnetic sensor, communication device, or the like.
  • the sensing information by the external sensor may include a physical quantity such as the position and orientation of the moving body 10, for example.
  • The external sensor can acquire the absolute position of the moving body 10 as sensing information. However, the rate at which the external sensor acquires sensing information is lower than the rate at which the internal sensor used in dead reckoning acquires sensing information.
  • the estimation unit 112 can directly calculate the absolute position of the moving body 10 from the physical quantity acquired by the external sensor by star reckoning, and estimate the calculated absolute position as the self-position of the moving body 10.
  • the estimating unit 112 may estimate the absolute position as the self-position of the moving body 10.
  • Dead reckoning is a method of estimating the relative self-position of the moving body 10 based on the sensing information from the internal sensor.
  • the internal sensor is the above-described acceleration sensor, gyro sensor, encoder, or the like. Sensing information by the internal sensor may include physical quantities such as the velocity, acceleration, relative position, and angular velocity of the moving body 10, for example.
  • the internal sensor does not acquire the absolute position of the moving body 10 as sensing information. However, the rate at which the internal sensor acquires sensing information is higher than the rate at which the external sensor used in star reckoning acquires sensing information.
  • The estimation unit 112 can calculate the relative position and posture of the moving body 10 by integrating the physical quantities acquired by the internal sensor through dead reckoning, and can estimate the self-position of the moving body 10 from the relative position and posture.
  • the internal sensor acquires sensing information at a higher rate than the external sensor. Therefore, when estimating the self-position by dead reckoning, the estimation unit 112 can estimate the self-position at a higher rate than the star reckoning.
  • the internal sensor can acquire sensing information without interruption at regular intervals. Therefore, the estimation unit 112 can perform continuous self-position estimation at regular intervals without interruption.
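As a minimal sketch of the dead-reckoning update described above (illustrative only; the function name and the simple unicycle motion model are assumptions, not taken from this publication), a 2-D pose can be advanced by integrating internal-sensor readings at a fixed rate:

```python
import math

def dead_reckon(pose, v, omega, dt):
    """Advance a 2-D pose (x, y, heading) one step by integrating
    internal-sensor readings: linear velocity v and angular velocity omega."""
    x, y, theta = pose
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return (x, y, theta)

# Driving straight along the x-axis at 1 m/s for 1 s in 100 small steps:
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = dead_reckon(pose, v=1.0, omega=0.0, dt=0.01)
# pose is now approximately (1.0, 0.0, 0.0)
```

Because each step integrates a measured quantity, any sensor bias accumulates over time, which is exactly the cumulative-error problem that motivates correcting dead reckoning with an absolute estimate.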
  • the self-position of the moving body 10 may be estimated based on the information acquired by odometry.
  • Odometry is, for example, a method of calculating the movement amount of the moving body 10 by forward kinematics calculation using the amount of rotation of the wheels of the moving body 10, joint angles, information on the geometric shape, and the like.
  • the estimation unit 112 may use a method called wheel odometry that calculates the amount of movement from the amount of rotation of the wheels.
  • the estimation unit 112 may use a method called visual odometry that calculates the movement amount from the temporal change amount of the feature amount in the captured image captured by the camera. Then, the estimation unit 112 estimates the self-position of the moving body 10 from the calculated movement amount.
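The wheel-odometry variant above can be sketched as follows for a differential-drive base (a hypothetical example; the publication does not specify the drive type or these names):

```python
import math

def wheel_odometry(pose, d_left, d_right, track_width):
    """Update a 2-D pose from encoder increments of a differential-drive
    robot; d_left / d_right are the wheel travel distances for one step."""
    x, y, theta = pose
    d_center = (d_left + d_right) / 2.0          # forward travel
    d_theta = (d_right - d_left) / track_width   # heading change
    # Advance along the mean heading of the step for better accuracy:
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return (x, y, theta + d_theta)

# Equal wheel travel -> straight motion by 0.1 m:
pose = wheel_odometry((0.0, 0.0, 0.0), 0.10, 0.10, track_width=0.3)
```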
  • When the estimation unit 112 estimates the self-position of the moving body 10 using only one of star reckoning and dead reckoning, the estimated self-position may contain an error peculiar to each method.
  • For example, even when the external sensor is a sensor that continuously receives position information at a constant cycle, the reception of position information may become unstable due to deteriorating radio wave conditions, and an error then occurs in the self-position estimated by the estimation unit 112 by star reckoning.
  • Further, when the processing in the estimation unit 112 becomes high load and the processing efficiency decreases, an error may likewise occur in the self-position estimated by star reckoning.
  • the relative position from the reference point and the posture of the moving body are estimated by the integration process for the physical quantity acquired by the internal sensor. Therefore, for example, if the physical quantity includes an error, the error is accumulated by the integration process, and a cumulative error may occur. Then, due to the accumulated error, an error may occur in the self-position estimated by the estimation unit 112 by dead reckoning.
  • the estimation unit 112 estimates the self-position of the moving body 10 using both star reckoning and dead reckoning. For example, the estimation unit 112 corrects the relative self-position estimated by dead reckoning with the absolute self-position estimated by star reckoning. As a result, the estimation unit 112 can reset the accumulated error accumulated by dead reckoning based on the absolute self-position.
  • the estimation unit 112 can estimate the self-position with higher accuracy than the estimation of the self-position using only one of the star reckoning and the dead reckoning.
  • the estimating unit 112 can also ensure the continuity of the estimated self-position.
  • the process of correcting the self-position estimated by dead reckoning with the self-position estimated by star reckoning is also referred to as "integration process" below.
  • the integration process is performed using, for example, a Kalman filter or a particle filter.
  • the estimation unit 112 can perform the integration process at the timing when the self position is estimated by the star reckoning, that is, at the timing when the data for correcting the self position estimated by the dead reckoning is acquired. If the self-position has not been estimated by star reckoning, that is, if the data for correcting the self-position estimated by dead reckoning has not been acquired, the estimation unit 112 may skip the integration process.
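The integration process can be illustrated with a one-dimensional Kalman-style correction step. This is a simplified sketch under assumed Gaussian errors, not the actual filter of this publication; the names and variance model are assumptions:

```python
def integrate(dr_pos, dr_var, star_pos=None, star_var=1.0):
    """One correction step: fuse the dead-reckoning estimate (dr_pos, with
    accumulated variance dr_var) with an absolute star-reckoning fix.
    When no fix is available (star_pos is None), the step is skipped,
    mirroring the behaviour described above."""
    if star_pos is None:
        return dr_pos, dr_var                # no correction data: skip
    k = dr_var / (dr_var + star_var)         # Kalman gain
    fused = dr_pos + k * (star_pos - dr_pos)
    return fused, (1.0 - k) * dr_var         # correction shrinks the error

# Equal confidence in both estimates -> the fused position is the midpoint:
pos, var = integrate(10.0, 4.0, star_pos=12.0, star_var=4.0)
# pos == 11.0, var == 2.0
```

The shrinking variance after each fix is what "resets" the error accumulated by dead reckoning.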
  • the estimation unit 112 has a function of estimating the position of the approach target. For example, the estimation unit 112 estimates the position of the approach target based on the information about the approach target (hereinafter, also referred to as “approach target information”). For example, when the detection unit 116 detects a person as the approach target, the estimation unit 112 estimates the position of the approach target based on the approach target information input from the detection unit 116. Specifically, the estimation unit 112 estimates the position of the person shown in the captured image (approach target information) input from the detection unit 116 as the position of the approach target.
  • Similarly, the estimation unit 112 estimates the position of the approach target based on the approach target information input from the detection unit 116 for other target types. For example, when the detection unit 116 detects a voice as the approach target, information indicating the position of the sound source of the voice (hereinafter also referred to as "sound source position information") is input to the estimation unit 112 as the approach target information. The estimation unit 112 then estimates the position of the sound source as the position of the approach target based on the input sound source position information.
  • Likewise, when the detection unit 116 detects light as the approach target, information indicating the position of the light source of the light (hereinafter also referred to as "light source position information") is input to the estimation unit 112 as the approach target information. The estimation unit 112 then estimates the position of the light source as the position of the approach target based on the input light source position information.
  • the estimation unit 112 also has a function of estimating the self-position of another lost mobile body 10.
  • For example, the estimation unit 112 of a non-lost moving body 10 estimates the self-position of a lost moving body 10 from the relative positional relationship between the two. Specifically, first, the non-lost moving body 10 approaches the lost moving body 10, and the camera of its sensor unit 100 images the lost moving body 10. Next, the estimation unit 112 of the non-lost moving body 10 calculates the relative position between the non-lost moving body 10 and the lost moving body 10 based on its own position and the position of the lost moving body 10 in the captured image. Then, the estimation unit 112 of the non-lost moving body 10 estimates the calculated position as the self-position of the lost moving body 10.
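The relative-position calculation described above amounts to a frame transform; a sketch (coordinate conventions, names, and the (forward, left) observation format are assumptions for illustration):

```python
import math

def estimate_lost_position(helper_pose, rel_obs):
    """Transform an observation of the lost robot, expressed in the helper
    robot's body frame as (forward, left), into world coordinates using
    the helper's own estimated pose (x, y, heading in radians)."""
    x, y, theta = helper_pose
    fwd, left = rel_obs
    wx = x + fwd * math.cos(theta) - left * math.sin(theta)
    wy = y + fwd * math.sin(theta) + left * math.cos(theta)
    return (wx, wy)

# Helper at (1, 1) facing +90 degrees sees the lost robot 2 m straight ahead:
lost = estimate_lost_position((1.0, 1.0, math.pi / 2), (2.0, 0.0))
# lost is approximately (1.0, 3.0)
```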
  • The determination unit 114 has a function of determining whether or not the moving body 10 is lost. For example, the determination unit 114 determines whether or not the moving body 10 is lost based on the estimation result of the self-position of the moving body 10 input from the estimation unit 112 (hereinafter, this determination is also referred to as "lost determination"). When the determination unit 114 determines that the moving body 10 is lost, it outputs the determination result to the estimation unit 112 and the detection unit 116. In that case, the determination unit 114 also outputs the last self-position of the moving body 10 acquired before the loss (hereinafter also referred to as the "lost position") to the storage unit 120, which stores it.
  • The sensing result of the external sensor may change greatly due to external factors; that is, the self-position estimation result by star reckoning may change greatly due to external factors. When the sensing result of the external sensor changes significantly, the self-position estimated by star reckoning may be an unexpected result, and the moving body 10 may become lost. For example, suppose that, while the self-position is being estimated by matching feature points in images captured by the camera, the brightness changes greatly because the room lights are turned off or sunlight streams in.
  • In that case, the self-position estimated by star reckoning becomes an unexpected result, and the moving body 10 is lost. Similarly, when the moving body 10 is surrounded by moving objects such as people during point cloud matching by LiDAR or the like, the moving body 10 is lost.
  • In such cases, the determination unit 114 determines that the moving body 10 is lost. For example, when the self-position estimated by star reckoning is separated from the self-position estimated by dead reckoning by a predetermined distance or more, the determination unit 114 judges that the star-reckoning estimation result for the moving body 10 is an unexpected result, that is, that the moving body 10 has become lost.
  • An arbitrary distance may be set as the predetermined distance; for example, the user may set an arbitrary value, or a statistically derived distance may be set. The method of determining whether or not the moving body 10 is lost is not limited to the above example.
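The lost determination based on the divergence of the two estimates can be sketched as a simple distance-threshold test (the threshold value here is an arbitrary example, as the text itself notes):

```python
import math

LOST_THRESHOLD_M = 0.5  # arbitrary example value; the text leaves it open

def is_lost(star_pos, dr_pos, threshold=LOST_THRESHOLD_M):
    """Declare the moving body lost when the absolute (star-reckoning)
    estimate and the relative (dead-reckoning) estimate disagree by a
    predetermined distance or more."""
    dist = math.hypot(star_pos[0] - dr_pos[0], star_pos[1] - dr_pos[1])
    return dist >= threshold

near = is_lost((0.0, 0.0), (0.1, 0.1))  # False: the two estimates agree
far = is_lost((0.0, 0.0), (2.0, 2.0))   # True: the estimates have diverged
```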
  • the detection unit 116 has a function of detecting information about the periphery of the moving body 10. For example, the detection unit 116 detects the approach target from the sensing information of the sensor device included in the sensor unit 100. The process in which the detection unit 116 detects an approaching object is also referred to as a “detection process” below.
  • the detection unit 116 detects the approach target information from the sensing information of the sensor device included in the sensor unit 100. Then, the detection unit 116 outputs the detected approach target information to the estimation unit 112.
  • the approach target information is information for estimating the position of the approach target when the moving body 10 is lost. Therefore, the detection unit 116 performs a process of detecting the approach target information when the determination result indicating that the moving body 10 has been lost is input from the determination unit 114.
  • the detection unit 116 detects the approach target and the approach target information using the sensing information input from at least one of a camera (imaging device), a microphone, and an optical sensor.
  • the detection unit 116 detects the approach target based on the captured image acquired by the camera.
  • the detection unit 116 detects an imaged person as an approach target by performing image recognition processing on the captured image.
  • the detection unit 116 may perform image recognition processing by machine learning (for example, deep learning) when performing the image recognition processing. As a result, the detection unit 116 can detect a person existing around the moving body 10.
  • When the detection unit 116 detects a person existing around the moving body 10 as the approach target based on the captured image, the detection unit 116 outputs the captured image to the estimation unit 112 as the approach target information.
  • the detection unit 116 may also acquire the position information of the person when performing the image recognition process, and output the acquired position information to the estimation unit 112 as the approach target information.
  • When a plurality of persons are detected as candidates, the detection unit 116 detects, from among the candidates, the person who has communicated with the moving body 10 the largest number of times (hereinafter also referred to as the "communication count") as the approach target.
  • For example, each time the moving body 10 communicates with a person, the detection unit 116 detects the communication partner and stores the partner in the storage unit 120 in association with the communication count.
  • Specifically, the detection unit 116 detects feature information indicating features of the partner (for example, a face image) from the captured image by image recognition processing, and outputs the detected feature information and the communication count to the storage unit 120.
  • Then, the detection unit 116 acquires the communication counts of the candidates from the storage unit 120 and compares them, thereby detecting the person with the largest communication count as the approach target. Accordingly, the detection unit 116 can detect a person who is closer to the moving body 10 as the approach target. The person closest to the moving body 10 is, for example, the owner who takes care of the moving body 10 the most.
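Selecting the approach target by communication count reduces to picking the maximum over the stored counts; a sketch in which placeholder name keys stand in for the stored feature information (e.g. face images):

```python
def select_approach_target(communication_counts):
    """Return the key with the largest stored communication count.
    The keys here are placeholder names standing in for the feature
    information kept in the storage unit."""
    if not communication_counts:
        return None
    return max(communication_counts, key=communication_counts.get)

counts = {"person_a": 3, "person_b": 12, "person_c": 7}
target = select_approach_target(counts)  # -> "person_b"
```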
  • Examples of communication include strokes (detection of contact with a person), voice conversation (detection of a person's voice), eye contact (detection of a person's line of sight), and the like.
  • A person who is close to the moving body 10 is more likely to know how to cope when the moving body 10 is lost than a person who is not. Since the detection unit 116 detects a person closer to the moving body 10 as the approach target, the moving body 10 is more likely to receive an appropriate countermeasure sooner when lost. That is, by having a close person take appropriate measures, the moving body 10 can recover from the lost state faster than when a person who is not close takes measures.
  • the detection unit 116 detects the position of the sound source as the approach target information based on the voice information acquired by the microphone. For example, the detection unit 116 detects the direction of voice from the voice information. When the direction of the sound source is detected from the voice information, the detection unit 116 detects the sound source existing in the direction of the voice as the approach target. Then, the detection unit 116 outputs the sound source position information indicating the detected position of the sound source to the estimation unit 112 as the approach target information.
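As one illustrative way to detect the direction of a voice (the publication does not specify an algorithm), a two-microphone sketch can estimate a bearing from the inter-channel delay that maximizes the cross-correlation; all names and the sign convention are assumptions:

```python
import math

def sound_direction(left, right, mic_spacing, fs, c=343.0):
    """Estimate a bearing (degrees from broadside) for a two-microphone
    array: find the inter-channel sample delay with the highest
    cross-correlation and convert it to an angle. Under this convention,
    a negative angle means the source is on the left-microphone side."""
    n = len(left)
    max_lag = int(mic_spacing / c * fs)  # only physically possible delays
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = sum(left[i] * right[i - lag]
                    for i in range(max(0, lag), min(n, n + lag)))
        if score > best_score:
            best_lag, best_score = lag, score
    return math.degrees(math.asin(best_lag / fs * c / mic_spacing))

# Impulse reaching the right microphone 2 samples after the left one:
fs = 16000
left = [0.0] * 64
right = [0.0] * 64
left[10] = 1.0
right[12] = 1.0
angle = sound_direction(left, right, mic_spacing=0.2, fs=fs)
# angle is negative: the source is on the left-microphone side
```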
  • the detection unit 116 detects the position of the light source as approach target information based on the optical information acquired by the optical sensor. For example, the detection unit 116 detects the direction of light from the light information. When the light direction is detected from the light information, the detection unit 116 detects a light source existing in the light direction as an approach target. Then, the detection unit 116 outputs the light source position information indicating the detected position of the light source to the estimation unit 112 as the approach target information.
  • An example of the light information is a light intensity gradient that indicates the rate of change of light intensity. Specifically, when the light intensity gradient increases in the bright direction and the light intensity gradient decreases in the dark direction, the light source may be present in the light intensity gradient increasing direction. Therefore, the detection unit 116 can detect the position of the light source by detecting the direction of light based on the light intensity gradient.
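The light-intensity-gradient idea can be sketched by sampling intensity at several headings and returning the heading of steepest increase (a toy illustration; the sampling scheme and names are assumptions):

```python
def light_source_direction(samples):
    """Given light-intensity samples keyed by heading (degrees), return
    the heading toward which intensity increases fastest, i.e. where the
    light intensity gradient is largest and a light source is presumed."""
    headings = sorted(samples)
    best_heading, best_gradient = None, float("-inf")
    for a, b in zip(headings, headings[1:]):
        gradient = (samples[b] - samples[a]) / (b - a)
        if gradient > best_gradient:
            best_heading, best_gradient = b, gradient
    return best_heading

# Intensity rises most steeply between headings 0 and 90 degrees:
samples = {0: 1.0, 90: 3.0, 180: 2.0, 270: 0.5}
direction = light_source_direction(samples)  # -> 90
```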
  • the detection unit 116 may set the priority when detecting the approaching object.
  • the moving body 10 can easily return to its own position by approaching the person and requesting help. Specifically, the moving body 10 can have the person move to a position where the own position can be restored.
• On the other hand, the approach target may be a sound source or a light source.
• Therefore, the priorities used when the detection unit 116 detects the approach target are preferably set so that a person is detected with the highest priority. As a result, the moving body 10 can increase the possibility of restoring its self-position when it approaches the approach target.
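• The priority-ordered detection described above (person first, then sound source, then light source, consistent with the assumption stated later for FIG. 6) can be sketched as follows; the dictionary-of-detections interface is an assumption for illustration, not part of the disclosure.

```python
def select_approach_target(detections):
    """Pick the approach target with the highest priority.

    detections: dict mapping a target kind to its detected position,
    or None when that detector found nothing (assumed interface).
    A person is most likely to help the lost body restore its
    self-position, so it outranks sound and light sources.
    """
    priority = ["person", "sound_source", "light_source"]  # highest first
    for kind in priority:
        position = detections.get(kind)
        if position is not None:
            return kind, position
    return None  # nothing detected; the caller may pause and retry

# No person is visible, so the sound source wins over the light source.
print(select_approach_target({"person": None,
                              "sound_source": (2.0, 1.5),
                              "light_source": (0.5, 4.0)}))
```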
• (Processing control unit 118) The processing control unit 118 has a function of controlling the processing in the control unit 110. For example, the processing control unit 118 performs processing according to whether or not the moving body 10 has been lost.
  • the moving body 10 that is not determined to have been lost is also referred to as a “normal moving body 10” below. Further, the moving body 10 determined to have been lost is also referred to as “lost moving body 10” below.
  • the processing relating to the return of the self-position is also referred to as “return processing” below.
  • the request for returning the self-position is also referred to as a "return request" below.
  • the processing control unit 118 of the normal moving body 10 plans the moving route of the normal moving body 10 and moves the normal moving body 10 along the moving route.
• The movement route is planned by the processing control unit 118 of the normal moving body 10 based on, for example, the self-position of the normal moving body 10 and its destination.
• Specifically, the processing control unit 118 of the normal moving body 10 plans, as the movement route, a route from the self-position of the normal moving body 10 input from the estimation unit 112 to the destination of the normal moving body 10 acquired from the storage unit 120.
• The normal moving body 10 can also perform processing for the lost moving body 10. For example, the normal moving body 10 performs processing for restoring the self-position of the lost moving body 10. At this time, the processing control unit 118 of the normal moving body 10 controls the operation of the normal moving body 10 according to whether or not request information concerning a return request has been received from the approach target.
  • the approaching object in this case is the lost moving body 10.
• The return request from the approach target here is, for example, a request to restore the self-position of the lost moving body 10.
• The request information mentioned here includes at least one of information indicating that the self-position has been lost, information indicating the position where the self-position was lost, and information indicating the trajectory traveled until the self-position was lost.
• The information indicating that the self-position has been lost is also referred to as "lost information" below.
  • the information indicating the lost position is also referred to as “lost position information” below.
  • the information indicating the locus until lost is also referred to as “trajectory information” below.
• Specifically, the processing control unit 118 of the normal moving body 10 causes the normal moving body 10 to approach the approach target based on the request information, and causes the normal moving body 10 to estimate the self-position of the approach target.
• For example, the processing control unit 118 of the normal moving body 10 plans an approach route to the lost moving body 10 based on the self-position of the normal moving body 10 estimated by the estimation unit 112 and the lost position information of the lost moving body 10 included in the request information.
  • the processing control unit 118 of the normal moving body 10 moves the normal moving body 10 along the planned approach route, and brings the normal moving body 10 and the lost moving body 10 close to each other.
  • the process control unit 118 of the normal moving body 10 causes the normal moving body 10 to estimate the self position of the lost moving body 10.
  • the normal moving body 10 estimates the self-position of the lost moving body 10 by calculating the relative positional relationship between the normal moving body 10 and the lost moving body 10.
  • the normal moving body 10 can estimate the self-position of the lost moving body 10 by approaching the lost moving body 10.
  • the lost moving body 10 can restore its own position based on the self position of the lost moving body 10 estimated by the normal moving body 10.
• Specifically, the processing control unit 118 of the normal moving body 10 transmits the estimated self-position to the lost moving body 10. The lost moving body 10 then identifies the self-position received from the normal moving body 10 as its own self-position. As a result, the lost moving body 10 can restore its self-position.
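• As an illustrative sketch of how the normal moving body 10 might compute the lost body's self-position from the relative positional relationship described above: the range-and-bearing observation model below is an assumption for illustration; the disclosure does not specify the sensing model.

```python
import math

def estimate_lost_pose(helper_pose, relative_obs):
    """Estimate a lost body's position from a helper's own pose.

    helper_pose: (x, y, theta) of the normal (non-lost) moving body in
    the map frame. relative_obs: (range, bearing) of the lost body as
    measured by the helper, with the bearing relative to the helper's
    heading (assumed observation model). Returns the lost body's (x, y)
    in the map frame; the helper would transmit this so the lost body
    can adopt it as its self-position.
    """
    x, y, theta = helper_pose
    r, bearing = relative_obs
    return (x + r * math.cos(theta + bearing),
            y + r * math.sin(theta + bearing))

# Helper at (2, 3) facing +x sees the lost body 1 m away, dead ahead.
print(estimate_lost_pose((2.0, 3.0, 0.0), (1.0, 0.0)))
```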
• The processing control unit 118 of the lost moving body 10 performs processing for causing the lost moving body 10 to approach the approach target (hereinafter also referred to as "approach processing"). For example, the processing control unit 118 of the lost moving body 10 performs the approach processing based on the position of the approach target. Specifically, the processing control unit 118 of the lost moving body 10 plans an approach route from the self-position of the lost moving body 10 and the position of the approach target, and moves the lost moving body 10 along the approach route as the approach processing. The processing control unit 118 of the lost moving body 10 further causes the lost moving body 10 to make a return request to the approach target. As a result, the lost moving body 10 can restore its self-position by issuing the return request to the approach target. The contents of the approach processing and the return request differ depending on the approach target.
• When the approach target is a person, the lost moving body 10 performs the approach processing toward the person and makes the return request to the person. Specifically, the processing control unit 118 of the lost moving body 10 first plans an approach route for the lost moving body 10 to approach the person based on the position of the person. After the planning, the processing control unit 118 of the lost moving body 10 moves the lost moving body 10 along the planned approach route, bringing the lost moving body 10 and the person closer. After the approach, the processing control unit 118 of the lost moving body 10 causes the lost moving body 10 to make the return request to the person. The processing control unit 118 of the lost moving body 10 may also cause the lost moving body 10 to make the return request while the lost moving body 10 is approaching the person. As a result, the lost moving body 10 can restore its self-position with the help of a person. Furthermore, the lost moving body 10 can restore its self-position without depending on any other external system.
  • the return request here is, for example, to move the lost moving body 10 to a position where the own position can be returned.
  • the method by which the lost mobile unit 10 makes the return request is not particularly limited.
  • the process control unit 118 causes the lost moving body 10 to make a return request expressed by at least one of gesture, voice output, and display output.
• For example, when making the return request by a gesture, the processing control unit 118 of the lost moving body 10 drives the arm or leg of the lost moving body 10 based on information indicating the content of the return request (hereinafter also referred to as "return request information").
  • the processing control unit 118 of the lost mobile unit 10 may cause the mobile unit 10 to make a return request using the output unit 140.
  • the processing control unit 118 of the lost mobile unit 10 generates output information according to the output device that realizes the function of the output unit 140, and causes the output unit 140 to output the generated output information.
• For example, when the function of the output unit 140 is realized by a voice output device such as a speaker, the processing control unit 118 of the lost moving body 10 generates a voice indicating the return request information as the output information, and causes the voice output device to output the generated voice. As a result, the lost moving body 10 can make the return request using the voice output device.
• When the function of the output unit 140 is realized by a display device such as a display, the processing control unit 118 of the lost moving body 10 generates an image, text, or the like indicating the return request information as the output information, and causes the display device to display it. As a result, the lost moving body 10 can make the return request using the display device.
  • the position where the self position can be returned here is, for example, the position of the charger of the lost moving body 10.
• The moving body 10, such as a home robot, may estimate its self-position based on the position of the charger. Therefore, the lost moving body 10 can acquire information on the base point from the charger when, for example, a person moves it to the position of the charger. The lost moving body 10 can then estimate its self-position based on the acquired base point information and restore the self-position.
• The information on the base point may be transmitted to the lost moving body 10 when the lost moving body 10 is connected to the charger, or may be transmitted to the lost moving body 10 by wireless communication or the like when the lost moving body 10 moves to the vicinity of the charger.
  • the processing control unit 118 of the lost mobile unit 10 causes the lost mobile unit 10 to transmit request information regarding the return request to the approach target.
• Specifically, the processing control unit 118 of the lost moving body 10 causes the lost moving body 10 to transmit the request information to the normal moving body 10. Accordingly, even if the lost moving body 10 cannot detect a person, a sound source, a light source, or the like as the approach target, it can restore its self-position by being detected by the normal moving body 10.
  • the request information mentioned here is information including at least one of lost information, lost position information, and trajectory information.
• By receiving the lost information as the request information, the normal moving body 10 can detect that the transmitting moving body 10 has been lost. By receiving the lost position information as the request information, the normal moving body 10 can detect the position where the transmitting moving body 10 was lost. Furthermore, by receiving the trajectory information of the lost moving body 10 as the request information, the normal moving body 10 can plan an approach route that does not include the route along which the transmitting moving body 10 was lost (hereinafter also referred to as the "lost route").
  • the normal mobile unit 10 can move to the position of the lost mobile unit 10 and estimate the self-position of the lost mobile unit 10. Then, the lost moving body 10 can restore its own position by identifying the self position received from the normal moving body 10 as the self position of the lost moving body 10.
  • the processing control unit 118 has a route planning unit 1180 and an operation control unit 1182 as shown in FIG.
• (Route planning unit 1180) The route planning unit 1180 has a function of planning routes. For example, the route planning unit 1180 performs the above-described movement route planning as part of the processing performed by the processing control unit 118. The route planning unit 1180 then outputs information indicating the planned movement route to the operation control unit 1182.
• At this time, the route planning unit 1180 may plan the movement route so that the moving body 10 does not pass through positions where it was lost in the past. Thereby, the route planning unit 1180 can prevent the moving body 10 from being lost again.
• As a method of planning such a movement route, the route planning unit 1180 may set priorities for the candidate movement routes along which the moving body 10 can move, and determine the candidate with the highest priority as the movement route of the moving body 10. For example, the route planning unit 1180 sets the priority of a candidate movement route that includes a lost position lower than the priority of a candidate movement route that does not. Accordingly, when the candidates include both a movement route that does not include the lost position and a movement route that does, the route planning unit 1180 can select the movement route that does not include the lost position.
• The route planning unit 1180 may also set the priority according to the number of lost positions included in each candidate movement route. Specifically, the route planning unit 1180 may set a lower priority for a candidate movement route that includes more lost positions. As a result, when the candidates include both a movement route that does not include a lost position and one that does, the route planning unit 1180 can select the movement route that does not include the lost position. In addition, when the only candidates are movement routes that include lost positions, the route planning unit 1180 can select the movement route with the fewest lost positions.
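• The priority rule above amounts to choosing the candidate route containing the fewest lost positions. A minimal sketch, assuming routes are represented as lists of grid waypoints (an illustrative representation not specified in the disclosure):

```python
def plan_route(candidates, lost_positions):
    """Choose the travel route that crosses the fewest lost positions.

    candidates: list of candidate routes, each a list of (x, y) grid
    waypoints (assumed representation). lost_positions: set of (x, y)
    cells where a moving body was lost before. A route containing fewer
    lost positions has a higher priority; ties keep the earlier
    candidate.
    """
    def lost_count(route):
        return sum(1 for waypoint in route if waypoint in lost_positions)
    return min(candidates, key=lost_count)

lost = {(1, 1)}
route_a = [(0, 0), (1, 1), (2, 2)]          # passes the lost point
route_b = [(0, 0), (0, 1), (1, 2), (2, 2)]  # avoids it
print(plan_route([route_a, route_b], lost))
```

Because `min` compares the lost-position counts, a route avoiding all lost positions is always preferred, and among routes that all include lost positions, the one with the fewest is selected, mirroring the two cases described above.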
• The lost positions described above may include both positions where the moving body 10 itself was lost and positions where other moving bodies 10 were lost. The lost positions used are the data stored in the storage unit 120 each time a moving body 10 is lost.
  • FIG. 2 is a diagram illustrating an example of a travel route plan that does not pass through a lost position according to the embodiment of the present disclosure.
  • FIG. 3 is a diagram illustrating an example of an approach route plan that does not pass through a lost route according to the embodiment of the present disclosure.
  • FIG. 2 shows an example in which the moving body 10 plans a moving route 70 that does not include the lost point 50 in the passage 22 surrounded by the wall 20.
  • the moving body 10 shown in FIG. 2 has started moving along the moving route 60 from the starting point 30 toward the destination 40 in the past, and has been lost at the lost point 50. Therefore, when the moving body 10 passes through the moving route 60 again, there is a possibility that the moving body 10 is lost again at the lost point 50.
• Therefore, when the moving body 10 re-plans the movement route from the starting point 30 to the destination 40, it plans a route that does not pass through the lost point 50. Specifically, the moving body 10 plans the movement route 70 shown in FIG. 2. As shown in FIG. 2, the movement route 70 is a route that does not include the lost point 50. Therefore, by moving from the starting point 30 along the movement route 70, the moving body 10 can reach the destination 40 without passing through the lost point 50. In addition, by moving along the movement route 70, the moving body 10 can reduce the possibility of being lost compared with moving along a movement route that includes the lost point 50.
• FIG. 3 shows an example in which the moving body 10b, in the passage 22 surrounded by the wall 20, plans an approach route 90 to the lost moving body 10a that does not include the movement route 60, which is the lost route.
• The moving body 10a shown in FIG. 3 started moving along the movement route 60 from the starting point 30a toward the destination 40 and was lost at the lost point 50. Therefore, if the moving body 10b plans an approach route 80 that is partially common with the movement route 60 and approaches the moving body 10a along it, the moving body 10b risks being lost on the common part of the route.
• Therefore, when the moving body 10b plans the approach route to the moving body 10a, it plans a route that does not include the movement route 60 of the moving body 10a.
• Specifically, the moving body 10b plans the approach route 90 shown in FIG. 3, which is a route that does not include the movement route 60.
• More specifically, based on the lost position information and the trajectory information included in the request information received from the moving body 10a, the moving body 10b plans the approach route 90 so that it does not pass through the route along which the lost moving body 10a moved.
  • the moving body 10b moves from the starting point 30b along the approach route 90, and thus can approach the moving body 10a lost at the lost point 50 without passing through the moving route 60.
  • the moving body 10b moves along the approaching route 90, so that the possibility of being lost can be reduced as compared with the case where the moving body 10b moves along the approaching route including the moving route 60.
• (Operation control unit 1182) The operation control unit 1182 has a function of controlling the operation of the moving body 10.
  • the operation control unit 1182 controls the operation of the moving body 10 by controlling the driving of the driving unit 130.
  • the operation control unit 1182 moves the moving body 10 along the moving route based on the information indicating the moving route input from the route planning unit 1180 (hereinafter, also referred to as “moving route information”).
• Specifically, the operation control unit 1182 generates drive information for driving the drive unit 130 when moving the moving body 10. The operation control unit 1182 then outputs the generated drive information to the drive unit 130.
  • the operation control unit 1182 also controls the drive of the drive unit 130 to control the operation when the moving body 10 makes a return request. Specifically, the operation control unit 1182 generates drive information for driving the drive unit 130 when the moving body 10 makes a return request based on the return request information. Then, the operation control unit 1182 outputs the generated drive information to the drive unit 130.
  • the operation control unit 1182 controls the output of the output unit 140 to control the operation when the mobile unit 10 makes a return request. Specifically, the operation control unit 1182 generates output information suitable for the output device according to the output device that realizes the function of the output unit 140. Then, the operation control unit 1182 outputs the generated output information to the output unit 140.
• (Storage unit 120) The storage unit 120 has a function of storing data concerning the processing in the moving body 10.
  • the storage unit 120 stores information used when the moving body 10 plans a moving route.
  • the storage unit 120 stores information indicating destinations/route points when the moving body 10 moves (hereinafter, also referred to as “destination/route point information”) in the destination/route point information DB.
  • the storage unit 120 also stores information used when the moving body 10 plans a moving route that does not pass through the lost position. Specifically, the storage unit 120 stores the lost position information of the moving body 10 in the lost position DB.
  • the data stored in the storage unit 120 is not limited to the above example.
  • the storage unit 120 may store the sensing information of the sensor unit 100.
  • the storage unit 120 may store information acquired via the communication unit 150.
  • the storage unit 120 may also store programs such as various applications.
  • the drive unit 130 has a function of operating the moving body 10.
  • the driving unit 130 operates the moving body 10 by driving based on the driving information input from the control unit 110.
  • the drive unit 130 can be realized by, for example, an actuator.
  • the actuator may be provided at the joint of the arm. Therefore, the drive unit 130 can operate the arm of the moving body 10 by driving based on the drive information input from the control unit 110.
  • the actuator may be provided at the joint of the leg. Therefore, as in the case of the arm, the drive unit 130 can drive the leg of the moving body 10 by driving based on the drive information.
  • the moving body 10 may include wheels, and the driving unit 130 may be configured to be able to drive the wheels.
• (Output unit 140) The output unit 140 has a function of producing output according to the input from the control unit 110.
  • the function is implemented by the output device included in the moving body 10.
  • the function of the output unit 140 can be realized by an audio output device such as a speaker.
  • the output unit 140 outputs, for example, a sound indicating the content of the return request input from the control unit 110.
  • the function of the output unit 140 can be realized by a display device such as a display.
  • the output unit 140 displays information such as an image and text indicating the content of the return request on the display device.
  • the function of the output unit 140 may be realized by one output device or may be realized by a plurality of output devices.
  • the function of the output unit 140 may be realized by either the audio output device or the display device, or may be realized by both the audio output device and the display device.
  • the communication unit 150 has a function of communicating with an external device.
  • the communication unit 150 outputs the information received from the external device to the control unit 110 in the communication with the external device, for example.
  • the communication unit 150 receives the request information from the lost moving body 10 and outputs the request information to the control unit 110.
  • the communication unit 150 receives the determination result from the lost moving body 10 and outputs the determination result to the control unit 110.
  • the communication unit 150 receives its own position from the normal moving body 10 and outputs it to the control unit 110.
  • the communication unit 150 transmits information input from the control unit 110 to an external device in communication with the external device, for example. Specifically, the communication unit 150 transmits the request information input from the control unit 110 to the normal mobile body 10. The communication unit 150 also transmits the determination result input from the control unit 110 to the normal moving body 10. In addition, the communication unit 150 transmits the self position input from the control unit 110 to the lost mobile body 10.
• <<Example of processing in the moving body>> The example of the functional configuration of the moving body 10 according to the embodiment of the present disclosure has been described above. Next, processing examples in the moving body 10 according to the embodiment of the present disclosure will be described.
  • FIG. 4 is a block diagram showing an example of processing blocks when the moving body 10 according to the embodiment of the present disclosure makes a normal movement.
  • the functions of the sensor unit 100, the control unit 110, the storage unit 120, and the drive unit 130 are used in the process when the mobile unit 10 normally moves. Specifically, in the sensor unit 100, the external sensor 102 and the internal sensor 104 are used.
  • the control unit 110 uses the estimation unit 112 and the processing control unit 118.
• In the storage unit 120, the destination/route point DB 122 is used.
  • the estimation unit 112 uses the star reckoning unit 1122, the dead reckoning unit 1124, and the integration unit 1126 of the self-position estimation unit 1120. Further, the processing control unit 118 uses the route planning unit 1180 and the operation control unit 1182.
  • sensing is performed by the external sensor 102 and the internal sensor 104.
  • the sensing information sensed by the external sensor 102 is output to the star reckoning unit 1122.
• The sensing information sensed by the internal sensor 104 is output to the dead reckoning unit 1124.
  • the star reckoning unit 1122 and the dead reckoning unit 1124 each perform self-position estimation of the moving body 10 based on the sensing information. After the self-position estimation, the self-position estimated by each of the star reckoning unit 1122 and the dead reckoning unit 1124 is output to the integration unit 1126, and the integration unit 1126 performs integration processing. Then, the self-position after the integration processing is output to the route planning unit 1180.
• The route planning unit 1180 plans the movement route of the moving body 10 based on the self-position after the integration processing and the destination/route point information acquired from the destination/route point DB 122. After the planning, the movement route information indicating the movement route is output to the operation control unit 1182.
  • the operation control unit 1182 controls the driving unit 130 to move the moving body 10 along the movement route.
  • FIG. 5 is a flowchart showing an example of the flow of processing when the moving body 10 according to the embodiment of the present disclosure makes a normal movement.
• First, the moving body 10 performs sensing with the internal sensor 104 (S100). Next, the moving body 10 performs self-position estimation by dead reckoning based on the sensing information of the internal sensor (S102). The moving body 10 also performs sensing with the external sensor 102 (S104). Next, the moving body 10 performs self-position estimation by star reckoning based on the sensing information of the external sensor (S106). After estimating the self-positions, the moving body 10 integrates the self-position estimated by dead reckoning and the self-position estimated by star reckoning (S108).
  • the mobile unit 10 acquires destination/route point information (S110). Next, the mobile unit 10 plans a travel route based on the integrated self-position and the acquired destination/route point information (S112). Then, the moving body 10 moves along the planned moving route (S114).
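• The flow of S100–S114 can be condensed into one processing cycle as sketched below. The one-dimensional state and the fixed complementary weight used to integrate the dead-reckoned and star-reckoned estimates are assumptions for illustration; the disclosure does not specify the integration method.

```python
def normal_movement_step(internal_reading, external_reading, destination, alpha=0.8):
    """One cycle of the normal-movement flow (S100-S114), 1-D for brevity.

    Dead reckoning accumulates the internal sensor's odometry delta onto
    the last pose (S100-S102); star reckoning takes an absolute fix from
    the external sensor (S104-S106); the two estimates are fused with a
    fixed complementary weight `alpha` (an assumed fusion rule standing
    in for the integration unit, S108). The planned "route" is simply
    the signed offset from the fused pose to the destination (S110-S114).
    """
    dead_reckoned = internal_reading["last_pose"] + internal_reading["odometry_delta"]
    star_reckoned = external_reading["absolute_fix"]
    fused = alpha * star_reckoned + (1 - alpha) * dead_reckoned
    move_command = destination - fused
    return fused, move_command

pose, cmd = normal_movement_step({"last_pose": 4.0, "odometry_delta": 1.0},
                                 {"absolute_fix": 5.5}, destination=10.0)
print(pose, cmd)
```

The complementary weight illustrates why both estimators are kept: dead reckoning is always available but drifts, while the absolute fix anchors the fused estimate to the map.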
  • FIG. 6 is a block diagram illustrating an example of processing blocks when the lost moving body 10 according to the embodiment of the present disclosure requests a person to return.
  • the functions of the sensor unit 100, the control unit 110, the drive unit 130, and the output unit 140 are used in the process when the lost moving body 10 requests a person to return.
• In the sensor unit 100, the external sensor 102, the internal sensor 104, the camera 106, the microphone 107, and the optical sensor 108 are used.
  • the control unit 110 uses the estimation unit 112, the determination unit 114, the detection unit 116, and the processing control unit 118.
  • the estimation unit 112 uses the star reckoning unit 1122, the dead reckoning unit 1124, the integration unit 1126, and the human position estimation unit 1128 of the self-position estimation unit 1120.
  • the determination unit 114 uses the lost determination unit 1140.
  • the detection unit 116 uses a human detection unit 1160, a sound source localization unit 1162, and a light source localization unit 1164.
• In the processing control unit 118, the route planning unit 1180 and the operation control unit 1182 are used. It is assumed that, when the detection unit 116 detects the approach target, the person has the highest priority and the light source has the lowest priority.
  • sensing is performed by the external sensor 102 and the internal sensor 104.
  • the sensing information sensed by the external sensor 102 is output to the star reckoning unit 1122.
• The sensing information sensed by the internal sensor 104 is output to the dead reckoning unit 1124.
  • the star reckoning unit 1122 and the dead reckoning unit 1124 each perform self-position estimation of the moving body 10 based on the sensing information. After the self-position estimation, the self-position estimated by each of the star reckoning unit 1122 and the dead reckoning unit 1124 is output to the integration unit 1126, and the integration unit 1126 performs integration processing.
  • the self position estimated by each of the star reckoning unit 1122 and the dead reckoning unit 1124 is output to the lost determination unit 1140.
  • the lost determination unit 1140 makes a lost determination based on the input self-position.
• When it is determined that the moving body 10 has been lost, the lost determination unit 1140 outputs a determination result indicating that the moving body 10 has been lost to the detection unit 116 and the person position estimation unit 1128.
• While the moving body 10 is lost, the self-position estimated by the star reckoning unit 1122 and the self-position after the integration processing by the integration unit 1126 are not used in the subsequent processing. Therefore, the processing by the star reckoning unit 1122 and the integration unit 1126 may be skipped while the moving body 10 is lost.
  • the detection unit 116 performs detection processing.
  • the human detection unit 1160 performs detection processing based on the captured image captured by the camera 106.
  • the person detection unit 1160 outputs the captured image as the approach target information to the person position estimation unit 1128.
  • the sound source localization unit 1162 performs detection processing based on the voice information acquired by the microphone 107.
  • the sound source localization unit 1162 outputs the sound source position information to the person position estimating unit 1128 as approach target information.
• The light source localization unit 1164 performs detection processing based on the light information acquired by the optical sensor 108.
  • the light source localization unit 1164 outputs the light source position information to the person position estimation unit 1128 as approach target information.
  • the person position estimation unit 1128 estimates the position of the approach target based on the approach target information input from the detection unit 116. After the estimation, the person position estimating unit 1128 outputs the estimated position of the approaching object to the route planning unit 1180.
• In the route planning unit 1180, the approach route of the moving body 10 is planned based on the estimated position of the approach target.
  • the information indicating the approach route (hereinafter, also referred to as “approach route information”) is output to the operation control unit 1182.
  • the operation control unit 1182 controls the driving unit 130 to move the moving body 10 along the approach route. Then, after the moving body 10 approaches the approaching object, the operation control unit 1182 controls the operation of the driving unit 130 or the output unit 140 to cause the moving body 10 to make a return request.
  • the operation control unit 1182 may cause the moving body 10 to make a return request while the moving body 10 is approaching the approach target.
  • FIG. 7 is a flowchart showing an example of the flow of processing in the mobile unit 10 when the lost mobile unit 10 according to the embodiment of the present disclosure requests a person to return.
  • the moving body 10 moves (S200).
  • the moving body 10 makes a lost decision while moving (S202).
• When it is not determined that the moving body 10 has been lost (S202/NO), the moving body 10 continues moving.
  • the mobile unit 10 performs a person detection process (S204).
• When a person is detected (S204/YES), the moving body 10 estimates the position of the person (S206).
• When no person is detected (S204/NO), the moving body 10 performs a voice detection process (S208). When a voice is detected (S208/YES), the moving body 10 performs localization of the sound source (S210). After the localization of the sound source, the moving body 10 estimates the position of the person based on the position of the sound source (S206).
• When no voice is detected (S208/NO), the moving body 10 performs a light intensity gradient detection process (S212).
  • the moving body 10 positions the light source (S214). After the position of the light source is localized, the moving body 10 estimates the position of the person based on the position of the light source (S206).
  • when no light intensity gradient is detected (S212/NO), the moving body 10 suspends its operation for a predetermined time (S218). After the lapse of the predetermined time, the moving body 10 restarts the process from S204.
  • after estimating the position of the person (S206), the moving body 10 plans an approach route to the person (S216). After the planning, the moving body 10 starts moving to the position of the person along the planned approach route. When the movement to the position of the person or its surroundings is completed (S220/YES), the moving body 10 makes a return request to the person (S222). When the movement is not completed (S220/NO), the moving body 10 continues moving. At this time, the moving body 10 may replan the approach route to the position of the person.
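The S204–S218 fallback order (person first, then voice, then light intensity gradient, then a timed wait) can be sketched as follows. This is an illustrative sketch only, not part of the disclosure; the detector callables and their return conventions are hypothetical.

```python
def find_approach_target(detect_person, detect_voice, detect_light_gradient):
    """Return an estimated person position, or None if every cue fails.

    Mirrors S204-S214: each detector is a callable returning a position
    (e.g. an (x, y) tuple) or None. All names here are illustrative.
    """
    person = detect_person()                 # S204
    if person is not None:
        return person                        # S206: position of the person
    voice = detect_voice()                   # S208
    if voice is not None:
        return voice                         # S210 -> S206: sound-source position
    light = detect_light_gradient()          # S212
    if light is not None:
        return light                         # S214 -> S206: light-source position
    return None                              # S218: caller suspends and retries
```

When every detector returns `None`, the caller would suspend for a predetermined time and retry, as in S218.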
  • FIG. 8 is a block diagram showing an example of a processing block when the lost moving body 10 according to the embodiment of the present disclosure makes a return request to another moving body 10.
  • the processing block of the mobile unit 10a in FIG. 8 shows the processing block of the lost mobile unit 10.
  • the processing block of the moving body 10b in FIG. 8 is a processing block of another moving body 10 detected by the lost moving body 10.
  • the processing blocks illustrated in FIG. 8 also indicate the processing blocks of the control system 1000 including the moving body 10a (a moving body having the first control device) and the moving body 10b (a moving body having the second control device).
  • the processing block of the mobile unit 10a uses the functions of the sensor unit 100a, the control unit 110a, and the communication unit 150a. Specifically, in the sensor unit 100a, the external sensor 102a and the internal sensor 104a are used. The estimation unit 112a and the determination unit 114a are used in the control unit 110a. More specifically, the estimation unit 112a uses the star reckoning unit 1122a, the dead reckoning unit 1124a, and the integration unit 1126a of the self-position estimation unit 1120a. The determination unit 114a uses the lost determination unit 1140a.
  • the processing block of the moving body 10b uses the functions of the sensor unit 100b, the control unit 110b, the drive unit 130b, and the communication unit 150b.
  • the sensor unit 100b uses the external sensor 102b and the internal sensor 104b.
  • the estimation unit 112b and the processing control unit 118b are used in the control unit 110b. More specifically, the estimation unit 112b uses the self-position estimation unit 1120b.
  • the process control unit 118b uses the route planning unit 1180b and the operation control unit 1182b.
  • sensing is performed by the external sensor 102a and the internal sensor 104a.
  • the sensing information sensed by the external sensor 102a is output to the star reckoning unit 1122a. Further, the sensing information sensed by the internal sensor 104a is output to the dead reckoning unit 1124a.
  • the star reckoning unit 1122a and the dead reckoning unit 1124a each perform self-position estimation of the moving body 10a based on the sensing information. After the self position is estimated, the self position estimated by each of the star reckoning unit 1122a and the dead reckoning unit 1124a is output to the integration unit 1126a, and the integration unit 1126a performs integration processing.
  • the self position estimated by each of the star reckoning unit 1122a and the dead reckoning unit 1124a is output to the lost determination unit 1140a.
  • the lost determination unit 1140a makes a lost determination based on the input self-position.
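The text does not state the lost-determination criterion used by the lost determination unit. One plausible sketch (an assumption for illustration, not the patent's stated method) flags a loss when the self-positions estimated by star reckoning and dead reckoning diverge beyond a threshold:

```python
import math

def is_lost(star_pose, dead_pose, threshold_m=0.5):
    """Illustrative lost determination: flag a loss when the (x, y)
    self-positions from star reckoning and dead reckoning diverge by
    more than a threshold. Criterion and threshold are assumptions.
    """
    dx = star_pose[0] - dead_pose[0]
    dy = star_pose[1] - dead_pose[1]
    return math.hypot(dx, dy) > threshold_m
```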
  • the mobile unit 10a transmits a determination result indicating that the mobile unit 10a has been lost to the mobile unit 10b via the communication unit 150a.
  • the mobile unit 10a also transmits request information to the mobile unit 10b via the communication unit 150a.
  • the mobile unit 10b that has received the determination result and the request information via the communication unit 150b performs processing for restoring the self-position of the mobile unit 10a.
  • the moving body 10b plans the approach route to the moving body 10a by the route planning unit 1180b based on the received request information.
  • the approach route information is output to the operation control unit 1182b.
  • the operation control unit 1182b controls the driving unit 130b to move the moving body 10b along the approach path.
  • the moving body 10b detects the moving body 10a by the external sensor 102b.
  • the moving body 10b calculates the relative positional relationship between the moving body 10b and the moving body 10a based on the sensing information obtained by sensing the moving body 10a with the external sensor 102b, thereby estimating the self-position of the moving body 10a.
  • the mobile unit 10b transmits the estimated self position to the mobile unit 10a via the communication unit 150b.
  • the mobile unit 10a that has received the self-position via the communication unit 150a identifies, in the integration unit 1126a, the received self-position as the self-position of the mobile unit 10a, and thereby restores its self-position.
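The relative-position calculation above can be illustrated with a standard SE(2) pose composition: the observer's known world pose combined with the observed relative pose yields the lost body's world pose. Function and parameter names are hypothetical; this is a generic sketch, not the patent's specific computation.

```python
import math

def estimate_peer_pose(observer_pose, relative_obs):
    """Compose the observer's world pose (x, y, theta) with a relative
    observation (dx, dy, dtheta) of the lost body, expressed in the
    observer's frame, to recover the lost body's world pose.
    """
    x, y, th = observer_pose
    dx, dy, dth = relative_obs
    wx = x + dx * math.cos(th) - dy * math.sin(th)
    wy = y + dx * math.sin(th) + dy * math.cos(th)
    wth = (th + dth + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi)
    return (wx, wy, wth)
```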
  • FIG. 9 is a flowchart showing an example of the flow of processing when the lost moving body 10 according to the embodiment of the present disclosure issues a return request to another moving body 10.
  • the moving body 10a moves (S300). Similarly, the moving body 10b is also moving (S302).
  • the moving body 10a performs a lost determination while moving (S304). When it is not determined that the moving body 10a is lost (S304/NO), the moving body 10a continues moving. When it is determined that the moving body 10a is lost (S304/YES), the moving body 10a transmits request information including the lost information, the lost position information, and the trajectory information of the moving body 10a to the moving body 10b (S306).
  • the mobile unit 10b that has received the request information plans an approach route to the mobile unit 10a based on the lost position information and the trajectory information of the mobile unit 10a included in the request information (S308).
  • the moving body 10b can plan the approaching route 90 not including the lost route, instead of the approaching route 80 including the lost route shown in FIG. 3, from the lost position information and the trajectory information of the moving body 10a.
  • the moving body 10b moves to the position of the moving body 10a along the planned approach route (S310).
  • the moving body 10b confirms whether or not the moving body 10a is detected while moving (S312).
  • when the moving body 10a is detected (S312/YES), the moving body 10b calculates the relative positional relationship between the moving body 10b and the moving body 10a based on the sensing information, and thereby estimates the self-position of the moving body 10a (S314).
  • when the moving body 10a is not detected (S312/NO), the moving body 10b continues moving. At this time, the moving body 10b may replan the approach route to the position of the moving body 10a.
  • after estimating the self-position of the moving body 10a, the moving body 10b transmits the estimated self-position to the moving body 10a (S316). Then, the moving body 10b restarts moving (S318).
  • the mobile unit 10a that has received the self-position from the mobile unit 10b identifies the received self-position as the self-position of the mobile unit 10a and restores the self-position. Then, the moving body 10a restarts moving (S322).
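For illustration, the request information transmitted in S306 (lost information, lost position information, and trajectory information) could be carried in a structure like the following. All field and method names are assumptions of this sketch, not identifiers from the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ReturnRequest:
    """Illustrative payload for S306: the lost moving body 10a sends its
    lost flag, the position where it was lost, and the trajectory it
    followed until the loss. Field names are assumptions of this sketch."""
    is_lost: bool
    lost_position: Tuple[float, float]
    trajectory: List[Tuple[float, float]] = field(default_factory=list)

    def last_known_point(self):
        """Last trajectory point before the loss, if any (illustrative helper)."""
        return self.trajectory[-1] if self.trajectory else None
```

The receiving body 10b could use `lost_position` and `trajectory` to plan an approach route avoiding the lost route, as described for S308.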
  • FIG. 10 is a block diagram illustrating an example of processing blocks when the moving body 10 according to the embodiment of the present disclosure plans a moving route that does not pass through the lost position.
  • the functions of the sensor unit 100, the control unit 110, the storage unit 120, and the drive unit 130 are used in the process when the moving body 10 plans a moving route that does not pass through the lost position.
  • the external sensor 102 and the internal sensor 104 are used in the sensor unit 100.
  • the control unit 110 uses the estimation unit 112 and the processing control unit 118.
  • the storage unit 120 uses a destination/route point DB 122 and a lost position DB 124.
  • the estimation unit 112 uses the star reckoning unit 1122, the dead reckoning unit 1124, and the integration unit 1126 of the self-position estimation unit 1120. Further, the processing control unit 118 uses the route planning unit 1180 and the operation control unit 1182.
  • sensing is performed by the external sensor 102 and the internal sensor 104.
  • the sensing information sensed by the external sensor 102 is output to the star reckoning unit 1122.
  • the sensing information sensed by the internal sensor 104 is output to the dead reckoning unit 1124.
  • the star reckoning unit 1122 and the dead reckoning unit 1124 each perform self-position estimation of the moving body 10 based on the sensing information. After the self-position estimation, the self-position estimated by each of the star reckoning unit 1122 and the dead reckoning unit 1124 is output to the integration unit 1126, and the integration unit 1126 performs integration processing. Then, the self-position after the integration processing is output to the route planning unit 1180.
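The integration processing above merges the two self-position estimates into one. As a minimal sketch (the actual fusion method is not specified in the text; a fixed-weight blend of the exteroceptive and proprioceptive estimates is purely an assumption):

```python
def integrate_poses(star_pose, dead_pose, star_weight=0.7):
    """Illustrative integration processing: a fixed-weight blend of the
    star-reckoning (external-sensor) and dead-reckoning (internal-sensor)
    position estimates. The weighting is an assumption of this sketch;
    a real system might instead use, e.g., a Kalman filter.
    """
    w = star_weight
    return tuple(w * s + (1.0 - w) * d for s, d in zip(star_pose, dead_pose))
```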
  • the route planning unit 1180 plans a movement route that does not pass through the lost position, based on the self-position after the integration processing, the destination/route point information acquired from the destination/route point DB 122, and the lost position information acquired from the lost position DB 124. After the planning, the movement route information is output to the operation control unit 1182.
  • the operation control unit 1182 controls the driving unit 130 to move the moving body 10 along the movement route.
  • FIG. 11 is a flowchart showing an example of the flow of processing when the moving body 10 according to the embodiment of the present disclosure plans a moving route that does not pass through the lost position.
  • the mobile unit 10 estimates its own position (S400).
  • the mobile unit 10 acquires the destination/route point information from the destination/route point DB 122, and plans a travel route candidate (S402). For example, the moving body 10 plans the moving route 70 shown in FIG. 2 and the moving route moving through the moving route 60 to the destination 40 as candidates.
  • the moving body 10 acquires the lost position information from the lost position DB 124 and excludes the movement route including the lost position from the candidates (S404). For example, the moving body 10 excludes a moving route that moves to the destination 40 through the moving route 60 shown in FIG. 2 from the candidates.
  • the mobile unit 10 selects a travel route from the remaining candidates (S406).
  • the moving body 10 may narrow down the moving route to one, for example, under arbitrary conditions. Specifically, the moving body 10 may select a travel route having a shorter distance to the destination than other travel routes. Then, the moving body 10 moves along the selected moving route (S408).
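Steps S402–S406 can be sketched as candidate filtering followed by shortest-route selection. The shortest-distance criterion mirrors the example given in the text; the waypoint data shapes and function names are assumptions of this sketch.

```python
def select_route(candidates, lost_positions):
    """Illustrative S402-S406: drop candidate routes that pass through any
    recorded lost position, then pick the shortest of the remainder.
    Routes are lists of (x, y) waypoints; names are assumptions.
    """
    def length(route):
        return sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
                   for (x1, y1), (x2, y2) in zip(route, route[1:]))
    remaining = [r for r in candidates
                 if not any(p in lost_positions for p in r)]   # S404
    return min(remaining, key=length) if remaining else None   # S406
```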
  • the moving body 10 may be a robot used in a factory (hereinafter, also referred to as “in-factory robot”).
  • the in-factory robot can estimate its self-position, for example, by recognizing a marker (for example, an Alvar marker) attached in the factory. Therefore, the lost in-factory robot issues a return request to be moved to a position where it can recognize a marker. For example, when the return request is made to a person, the lost in-factory robot has the person move it to a position where a marker is recognizable. As a result, the lost in-factory robot can restore its self-position.
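As a hedged illustration of marker-based localization (the text only says the robot recognizes markers; the geometry below is a generic sketch with hypothetical names, not the patent's method): given a marker's known world pose and the marker's pose observed in the robot frame, the robot's world pose follows by inverting the relative transform.

```python
import math

def pose_from_marker(marker_world_pose, observed_rel):
    """Recover the robot's world pose (x, y, theta) from a marker's known
    world pose and the marker's pose (dx, dy, dtheta) observed in the
    robot frame. A standard SE(2) inversion; purely illustrative.
    """
    mx, my, mth = marker_world_pose
    dx, dy, dth = observed_rel
    rth = mth - dth                                   # robot heading in world frame
    rx = mx - (dx * math.cos(rth) - dy * math.sin(rth))
    ry = my - (dx * math.sin(rth) + dy * math.cos(rth))
    return (rx, ry, rth)
```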
  • FIG. 12 is a block diagram showing a hardware configuration example of the control device according to the present embodiment.
  • the control device 900 shown in FIG. 12 can realize the moving body 10 shown in FIG. 1, for example.
  • the control processing by the mobile unit 10 according to the present embodiment is realized by cooperation of software and hardware described below.
  • the control device 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, and a RAM (Random Access Memory) 903.
  • the control device 900 also includes a host bus 904, a bridge 905, an external bus 906, an interface 907, an input device 908, an output device 909, a storage device 910, a drive 911, a connection port 912, and a communication device 913.
  • the hardware configuration shown here is an example, and some of the components may be omitted.
  • the hardware configuration may further include components other than the components shown here.
  • the CPU 901 functions as, for example, an arithmetic processing device or a control device, and controls the overall operation of each component or a part thereof based on various programs recorded in the ROM 902, the RAM 903, or the storage device 910.
  • the ROM 902 is a means for storing programs read by the CPU 901, data used for calculation, and the like.
  • the RAM 903 temporarily or permanently stores, for example, a program read by the CPU 901 and various parameters that appropriately change when the program is executed. These are connected to each other by a host bus 904 including a CPU bus and the like.
  • the CPU 901, the ROM 902, and the RAM 903 can realize the function of the control unit 110 described with reference to FIG. 1 in cooperation with software, for example.
  • the CPU 901, the ROM 902, and the RAM 903 are mutually connected, for example, via a host bus 904 capable of high-speed data transmission.
  • the host bus 904 is connected to the external bus 906 having a relatively low data transmission rate, for example, via the bridge 905.
  • the external bus 906 is also connected to various components via the interface 907.
  • the input device 908 is realized by a device, such as a mouse, a keyboard, a touch panel, a button, a microphone, a switch and a lever, to which information is input by the user. Further, the input device 908 may be, for example, a remote control device that uses infrared rays or other radio waves, or may be an externally connected device such as a mobile phone or PDA that corresponds to the operation of the control device 900. Further, the input device 908 may include, for example, an input control circuit that generates an input signal based on the information input by the user using the above-described input unit and outputs the input signal to the CPU 901. By operating the input device 908, the user of the control device 900 can input various data to the control device 900 and instruct a processing operation.
  • the input device 908 may be formed by a device that detects information about the user.
  • the input device 908 may include, for example, an image sensor (for example, a camera), a depth sensor (for example, a stereo camera), an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, a sound sensor, a distance measuring sensor (for example, a ToF (Time of Flight) sensor), a force sensor, and the like.
  • the input device 908 may acquire information about the state of the control device 900 itself, such as the posture and moving speed of the control device 900, and information about the surrounding environment of the control device 900, such as the brightness and noise around the control device 900.
  • the input device 908 may include a GNSS module that receives a GNSS signal from a GNSS (Global Navigation Satellite System) satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite) and measures position information including the latitude, longitude, and altitude of the device. Regarding the position information, the input device 908 may detect the position by transmission/reception with Wi-Fi (registered trademark), a mobile phone, a PHS, a smartphone, or the like, or by short-distance communication or the like. The input device 908 can realize the function of the sensor unit 100 described with reference to FIG. 1, for example.
  • the output device 909 is formed of a device capable of visually or auditorily notifying the user of the acquired information.
  • such devices include display devices such as CRT display devices, liquid crystal display devices, plasma display devices, EL display devices, laser projectors, LED projectors, and lamps; audio output devices such as speakers and headphones; and printer devices.
  • the output device 909 outputs the results obtained by various processes performed by the control device 900, for example.
  • the display device visually displays the results obtained by the various processes performed by the control device 900 in various formats such as text, images, tables, and graphs.
  • the audio output device converts an audio signal composed of reproduced audio data, acoustic data, etc. into an analog signal and outputs it audibly.
  • the output device 909 can realize the function of the output unit 140 described with reference to FIG. 1, for example.
  • the storage device 910 is a device for data storage formed as an example of a storage unit of the control device 900.
  • the storage device 910 is realized by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • the storage device 910 may include a storage medium, a recording device that records data in the storage medium, a reading device that reads data from the storage medium, and a deletion device that deletes data recorded in the storage medium.
  • the storage device 910 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like.
  • the storage device 910 can realize the function of the storage unit 120 described with reference to FIG. 1, for example.
  • the drive 911 is a reader/writer for a storage medium, and is built in or externally attached to the control device 900.
  • the drive 911 reads out information recorded on a removable storage medium such as a mounted magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs it to the RAM 903.
  • the drive 911 can also write information to a removable storage medium.
  • the connection port 912 is, for example, a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, an RS-232C port, or a port for connecting an external device such as an optical audio terminal.
  • the communication device 913 is, for example, a communication interface formed of a communication device or the like for connecting to the network 920.
  • the communication device 913 is, for example, a communication card for wired or wireless LAN (Local Area Network), LTE (Long Term Evolution), Bluetooth (registered trademark), or WUSB (Wireless USB).
  • the communication device 913 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various kinds of communication, or the like.
  • the communication device 913 can send and receive signals and the like to and from the Internet and other communication devices, for example, according to a predetermined protocol such as TCP/IP.
  • the communication device 913 can realize the function of the communication unit 150 described with reference to FIG. 1, for example.
  • the network 920 is a wired or wireless transmission path for information transmitted from a device connected to the network 920.
  • the network 920 may include a public line network such as the Internet, a telephone line network, or a satellite communication network, various LANs (Local Area Networks) including Ethernet (registered trademark), a WAN (Wide Area Network), and the like.
  • the network 920 may also include a dedicated line network such as an IP-VPN (Internet Protocol-Virtual Private Network).
  • the control device estimates the self position of the moving body 10a. Further, the control device determines whether or not the self-position of the moving body 10a has been lost. Then, the control device performs processing for approaching the approaching object according to the determination result of whether or not the moving body 10a has lost its own position. When it is determined that the moving body 10a has lost its own position, the control device brings the moving body 10a and the approaching object close to each other. As a result, the control device can cause the approaching object to perform an operation for returning the self-position of the moving body 10a.
  • each device described in this specification may be realized as a single device, or part or all may be realized as separate devices.
  • the control unit 110 included in the moving body 10 illustrated in FIG. 1 may be realized as a single device.
  • the control unit 110 may be realized as an independent device such as a server device and connected to the mobile unit 10 via a network or the like.
  • the series of processes performed by each device described in this specification may be realized using any of software, hardware, and a combination of software and hardware.
  • the programs constituting the software are stored in advance, for example, in a recording medium (non-transitory medium: non-transmission media) provided inside or outside each device. Then, each program is read into the RAM when it is executed by a computer, and executed by a processor such as a CPU.
  • the effects described in the present specification are merely explanatory or exemplifying ones, and are not limiting. That is, the technique according to the present disclosure may have other effects that are apparent to those skilled in the art from the description of the present specification, in addition to or instead of the above effects.
  • A control device including: an estimation unit that estimates a self-position; a determination unit that determines whether or not a moving body has lost the self-position; and a processing control unit that performs processing for approaching an approach target in accordance with a determination result by the determination unit.
  • The control device according to (2), wherein the approach target is a person, and the processing control unit plans a route for the moving body to approach the person based on the position of the person, causes the moving body to approach the person, and causes the moving body to make the request to the person.
  • the control device according to (4), wherein the position where the self-position can be returned is the position of the charger of the moving body.
  • The control device according to any one of (3) to (5), wherein the processing control unit causes the moving body to perform the request represented by at least one of a gesture, a voice output, and a display output.
  • The control device according to any one of (1) to (7), wherein the estimation unit estimates the position of the approach target based on the approach target information, and the processing control unit performs processing for approaching the approach target based on the position of the approach target.
  • The control device according to (8), wherein, when the approach target is a person and there are a plurality of candidates for the approach target, the detection unit detects, from the candidates, a person who has frequently communicated with the moving body as the approach target.
  • The control device according to (8) or (9), wherein the sensor device is an imaging device, and the detection unit detects the approach target based on a captured image acquired by the imaging device.
  • The control device according to any one of (8) to (10), wherein the sensor device is a microphone, and the detection unit detects a position of a sound source as the approach target information based on voice information acquired by the microphone.
  • The control device according to any one of (8) to (11), wherein the sensor device is an optical sensor, and the detection unit detects a position of a light source as the approach target information based on light information acquired by the optical sensor.
  • The control device according to any one of (1) to (12), wherein the processing control unit controls the operation of the moving body according to whether or not request information relating to a request from the approach target is received.
  • The control device according to (13), wherein, when the request information is received from the approach target, the processing control unit causes the moving body to approach the approach target based on the request information, and causes the moving body to estimate the position of the approach target.
  • The control device according to (14), wherein the request information includes at least one of information indicating the loss of the self-position, information indicating the position where the self-position was lost, and information indicating a trajectory until the loss of the self-position.
  • The control device according to any one of (1) to (15), wherein the processing control unit sets a priority for candidates of a moving route along which the moving body can move, and determines a candidate with a higher priority as the moving route of the moving body.
  • The control device according to (16), wherein the processing control unit sets the priority of a moving route candidate including the position where the self-position was lost lower than the priority of a moving route candidate that does not include the position where the self-position was lost.
  • A control system including: a first control device including a first estimation unit that estimates a self-position, a first determination unit that determines whether or not a moving body has lost the self-position, and a first processing control unit that performs processing for approaching an approach target in accordance with a determination result by the first determination unit; and a second control device including a second estimation unit that estimates the self-positions of the moving body and the approach target, and a second processing control unit that performs processing for approaching the moving body in accordance with the determination result by the first determination unit, wherein, when it is determined that the moving body has lost the self-position, the first control device transmits request information regarding return of the self-position to the second control device via communication, and the second control device causes the approach target to approach the moving body based on the request information received from the first control device via communication, and causes the approach target to estimate the self-position of the moving body.

Abstract

This control device comprises: an estimation unit that estimates a self-position; a determination unit that determines whether or not a mobile body has lost the self-position; and a processing control unit that, in accordance with a determination result of the determination unit, performs processing for approaching an approach target.

Description

Control device, control method, program, and control system
The present disclosure relates to a control device, a control method, a program, and a control system.
Currently, in moving bodies that move autonomously, techniques for estimating the self-position based on information measured by a sensor or the like provided in the moving body are widespread. Such a moving body may lose its self-position due to, for example, a system error or an external factor. Therefore, techniques for restoring the self-position when the moving body has lost it are being developed.
In relation to techniques for a moving body to restore its self-position, for example, Patent Document 1 below discloses a technique for restoring the self-position by means of an RFID (Radio Frequency Identifier) system provided in the external environment of the moving body. Specifically, the moving body acquires the position information of an RFID tag based on information read from the RFID tag provided in its external environment, and identifies its self-position from the acquired position information.
Patent Document 1: JP 2007-249735 A
However, the technique described in Patent Document 1 presupposes that an RFID system is provided in the external environment of the moving body. Therefore, when the moving body loses its self-position in an environment where no RFID system is provided, it is difficult for the moving body to restore its self-position.
Therefore, the present disclosure proposes a new and improved control device, control method, program, and control system that enable a moving body to restore its self-position by approaching an approach target when the moving body has lost its self-position.
According to the present disclosure, there is provided a control device including: an estimation unit that estimates a self-position; a determination unit that determines whether or not a moving body has lost the self-position; and a processing control unit that performs processing for approaching an approach target in accordance with a determination result by the determination unit.
Further, according to the present disclosure, there is provided a control method executed by a processor, the method including: estimating a self-position; determining whether or not a moving body has lost the self-position; and performing processing for approaching an approach target in accordance with a result of the determining.
Further, according to the present disclosure, there is provided a program for causing a computer to function as: an estimation unit that estimates a self-position; a determination unit that determines whether or not a moving body has lost the self-position; and a processing control unit that performs processing for approaching an approach target in accordance with a determination result by the determination unit.
Further, according to the present disclosure, there is provided a control system including: a first control device including a first estimation unit that estimates a self-position, a first determination unit that determines whether or not a moving body has lost the self-position, and a first processing control unit that performs processing for approaching an approach target in accordance with a determination result by the first determination unit; and a second control device including a second estimation unit that estimates the self-positions of the moving body and the approach target, and a second processing control unit that performs processing for approaching the moving body in accordance with the determination result by the first determination unit, wherein, when it is determined that the moving body has lost the self-position, the first control device transmits request information regarding return of the self-position to the second control device via communication, and the second control device causes the approach target to approach the moving body based on the request information received from the first control device via communication, and causes the approach target to estimate the self-position of the moving body.
A block diagram showing a functional configuration example of a moving body according to an embodiment of the present disclosure.
A diagram showing an example of planning a movement route that avoids the lost position according to the embodiment.
A diagram showing an example of planning an approach route that avoids the lost route according to the embodiment.
A block diagram showing example processing blocks when the moving body according to the embodiment performs normal movement.
A flowchart showing an example processing flow when the moving body according to the embodiment performs normal movement.
A block diagram showing example processing blocks when a lost moving body according to the embodiment issues a recovery request to a person.
A flowchart showing an example processing flow when a lost moving body according to the embodiment issues a recovery request to a person.
A block diagram showing example processing blocks when a lost moving body according to the embodiment issues a recovery request to another moving body.
A flowchart showing an example processing flow when a lost moving body according to the embodiment issues a recovery request to another moving body.
A block diagram showing example processing blocks when the moving body according to the embodiment plans a movement route that avoids the lost position.
A flowchart showing an example processing flow when the moving body according to the embodiment plans a movement route that avoids the lost position.
A block diagram showing a hardware configuration example of a control device according to the embodiment.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In the present specification and the drawings, structural elements having substantially the same functional configuration are denoted by the same reference numerals, and redundant description is omitted.
The description will be given in the following order.
1. Overview
2. Functional configuration example of the moving body
3. Processing examples in the moving body
 3-1. Processing when the moving body performs normal movement
 3-2. Processing when a lost moving body issues a recovery request to a person
 3-3. Processing when a lost moving body issues a recovery request to another moving body
 3-4. Processing when the moving body plans a movement route that avoids the lost position
4. Modifications
5. Hardware configuration example
6. Summary
<<1. Overview>>
The present disclosure relates to a technique for recovering the self-position when a moving body loses its self-position. The moving body may be a device capable of moving autonomously. For example, the moving body may be a robot capable of autonomous movement (for example, walking). Examples of autonomously movable robots include humanoid robots, pet robots, self-driving cars, and drones. However, the present embodiment is not limited to these examples. For example, the moving body may be another type of device such as a vehicle (for example, an automobile, a ship, or an aircraft), various industrial machines, or a toy. In the following, an example will be described in which the moving body is a robot used in the home (hereinafter also referred to as a "domestic robot"). The domestic robot is, for example, a pet robot.
Hereinafter, the term "lost" may be used to refer to a moving body losing its self-position; it covers both the event of losing the self-position and the state in which the self-position has been lost. For example, a moving body that has lost its self-position is also referred to as a lost moving body. When lost, the moving body may have difficulty performing its intended operation. For example, if the lost state continues, the moving body may keep moving around in the same place. Therefore, when the moving body becomes lost, it is desirable to recover it from the lost state as quickly as possible.
As a technique for recovering a moving body from a lost state, for example, a technique using an RFID system has been disclosed. In that technique, the moving body acquires position information from an RFID tag provided in the external environment, estimates its self-position on the basis of the acquired position information, and thereby recovers from the lost state. However, the technique presupposes an RFID system installed in the external environment of the moving body. Therefore, when the moving body loses its self-position in an environment where no RFID system is installed, it is difficult for the moving body to recover its self-position.
The embodiments of the present disclosure were conceived in view of the above points, and propose a technique that enables a moving body that has lost its self-position to recover the self-position by approaching an approach target. The approach target is an object that comes close to the lost moving body; examples include a person and another moving body.
A route along which the moving body can move is hereinafter also referred to as a "movement route". Among movement routes, a route along which the moving body and the approach target approach each other is hereinafter also referred to as an "approach route". Hereinafter, the present embodiment will be described in detail in order.
<<2. Functional configuration example of the moving body>>
First, a functional configuration example of a moving body according to an embodiment of the present disclosure will be described. FIG. 1 is a block diagram showing a functional configuration example of the moving body according to the embodiment of the present disclosure. As shown in FIG. 1, the moving body 10 according to the embodiment of the present disclosure includes a sensor unit 100, a control unit 110, a storage unit 120, a drive unit 130, an output unit 140, and a communication unit 150.
(1) Sensor unit 100
The sensor unit 100 has a function of sensing information used for processing in the control unit 110. The sensor unit 100 may include a variety of sensor devices, for example, external sensors, internal sensors, a camera, a microphone, and an optical sensor. The sensor unit 100 performs sensing using these sensors and outputs the sensing information acquired by the various sensors to the control unit 110.
The sensor unit 100 uses the external sensors and the internal sensors to acquire information that the control unit 110 uses to estimate the self-position of the moving body 10. An external sensor is a device that senses information outside the moving body 10; examples include a camera, a ranging sensor, a depth sensor, a GPS (Global Positioning System) sensor, a magnetic sensor, and a communication device. An internal sensor is a device that senses information inside the moving body 10; examples include an acceleration sensor, a gyro sensor, and an encoder. The sensor unit 100 also uses the camera, the microphone, and the optical sensor to acquire information that the control unit 110 uses to estimate the position of a person.
The camera is an imaging device, such as an RGB camera, that has a lens system, a drive system, and an imaging element and captures images (still images or moving images). By being installed so as to be able to image the outside of the moving body 10, the camera can capture images of the surroundings of the moving body 10. The sensor unit 100 can thus acquire captured images of the surroundings of the moving body 10.
The ranging sensor is a device that acquires distance information, for example a ToF (Time of Flight) sensor. The depth sensor is a device that acquires depth information, for example an infrared ranging device, an ultrasonic ranging device, a LiDAR (Laser Imaging Detection and Ranging), or a stereo camera. With the ranging sensor or the depth sensor, the sensor unit 100 can acquire distance information to objects around the moving body 10.
The GPS sensor is a device that measures position information, including the latitude, longitude, and altitude of the moving body 10, by receiving GPS signals from GPS satellites. With the GPS sensor, the sensor unit 100 can acquire the position information of the moving body 10. The magnetic sensor is a device that measures the magnitude and direction of a magnetic field. With the magnetic sensor, the sensor unit 100 can acquire information on the magnetic field at the position of the moving body 10.
The communication device is a device that communicates with other moving bodies 10, for example via Bluetooth (registered trademark) or Wi-Fi (registered trademark). With the communication device, the sensor unit 100 can acquire communication information obtained when the moving body 10 communicates with another moving body 10. Note that the communication device may be realized as the communication unit 150 instead of as part of the sensor unit 100.
The acceleration sensor is a device that acquires the acceleration of an object. For example, the acceleration sensor measures the acceleration, that is, the amount of change in speed, as the moving body 10 moves. The gyro sensor is a device that acquires the angular velocity of an object. For example, the gyro sensor measures the angular velocity, that is, the amount of change in the attitude of the moving body 10. The encoder is a device that acquires the rotation angle of an object. The encoder is provided, for example, on a wheel or a joint of the moving body 10 and measures the rotation angle, that is, the amount of change in angle as the wheel or joint rotates. The information acquired by the acceleration sensor, the gyro sensor, and the encoder is hereinafter also referred to as "inertial information". With these sensors, the sensor unit 100 can acquire the inertial information of the moving body 10.
The microphone is a device that detects surrounding sound. The microphone picks up ambient sound and outputs audio data converted into a digital signal via an amplifier and an ADC (Analog Digital Converter). With the microphone, the sensor unit 100 can acquire audio information around the moving body 10. The number of microphones is not limited to one; a plurality of microphones may be used, and they may form a so-called microphone array. When the direction of a sound is detected on the basis of the audio information, the detection accuracy improves as the number of microphones increases, so a larger number of microphones is desirable.
The optical sensor is a device that detects ambient light. With the optical sensor, the sensor unit 100 can acquire light information around the moving body 10.
(2) Control unit 110
The control unit 110 is a control device having a function of controlling the overall operation of the moving body 10. To realize this function, the control unit 110 includes an estimation unit 112, a determination unit 114, a detection unit 116, and a processing control unit 118, as shown in FIG. 1.
(2-1) Estimation unit 112
The estimation unit 112 has a function of estimating the self-position of the moving body 10. For example, the estimation unit 112 estimates the self-position of the moving body 10 on the basis of the sensing information input from the sensor unit 100. Examples of methods by which the estimation unit 112 estimates the self-position of the moving body 10 include star reckoning and dead reckoning, although the method is not limited to these two.
Star reckoning is a method of estimating the absolute self-position of the moving body 10 on the basis of sensing information from the external sensors, that is, the camera, ranging sensor, depth sensor, GPS sensor, magnetic sensor, communication device, and the like described above. The sensing information from the external sensors may include physical quantities such as the position and attitude of the moving body 10, and the external sensors can acquire the absolute position of the moving body 10 as sensing information. However, the rate at which the external sensors acquire sensing information is low compared with the rate at which the internal sensors used for dead reckoning acquire sensing information.
By star reckoning, the estimation unit 112 can directly calculate the absolute position of the moving body 10 from the physical quantities acquired by the external sensors and estimate the calculated absolute position as the self-position of the moving body 10. When an external sensor itself acquires the absolute position of the moving body 10, the estimation unit 112 may estimate that absolute position as the self-position of the moving body 10.
Dead reckoning is a method of estimating the relative self-position of the moving body 10 on the basis of sensing information from the internal sensors, that is, the acceleration sensor, gyro sensor, encoder, and the like described above. The sensing information from the internal sensors may include physical quantities such as the velocity, acceleration, relative position, and angular velocity of the moving body 10. The internal sensors do not acquire the absolute position of the moving body 10 as sensing information. However, the rate at which the internal sensors acquire sensing information is high compared with the rate at which the external sensors used for star reckoning acquire sensing information.
By dead reckoning, the estimation unit 112 can calculate the relative position and attitude of the moving body 10 by integrating the physical quantities acquired by the internal sensors, and can estimate the self-position of the moving body 10 from that relative position and attitude. Because the internal sensors acquire sensing information at a higher rate than the external sensors, the estimation unit 112 can estimate the self-position at a higher rate by dead reckoning than by star reckoning. Moreover, the internal sensors can acquire sensing information continuously at a fixed period without interruption, so the estimation unit 112 can perform continuous self-position estimation at a fixed period without interruption.
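As a rough illustration of the dead reckoning described here, the sketch below (Python; the planar 2-D motion model and all names are chosen for illustration and are not taken from the disclosure) integrates internal-sensor samples of linear velocity and yaw rate into a pose estimate:

```python
import math

def dead_reckoning_step(pose, v, omega, dt):
    """Advance pose = (x, y, theta) by one internal-sensor sample:
    linear velocity v [m/s] and yaw rate omega [rad/s] over dt [s]."""
    x, y, theta = pose
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return (x, y, theta)

# Ten 100 ms samples while driving straight at 1 m/s.
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = dead_reckoning_step(pose, v=1.0, omega=0.0, dt=0.1)
print(pose)  # roughly (1.0, 0.0, 0.0)
```

Because each step only adds increments, a small bias in v or omega is carried into every later pose, which is exactly how the cumulative error of dead reckoning arises.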
Note that in star reckoning, the self-position of the moving body 10 may be estimated on the basis of information acquired by odometry. Odometry is a method of calculating the amount of movement of the moving body 10 by, for example, forward dynamics computation using the rotation amounts of the wheels of the moving body 10, the joint angles, geometric information, and the like. Specifically, when the moving body 10 has wheels, the estimation unit 112 may use a method called wheel odometry, which calculates the amount of movement from the rotation amounts of the wheels. The estimation unit 112 may also use a method called visual odometry, which calculates the amount of movement from the temporal change in feature amounts in images captured by the camera. The estimation unit 112 then estimates the self-position of the moving body 10 from the calculated amount of movement.
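The wheel odometry mentioned above can be sketched for a differential-drive base as follows (the differential-drive geometry and the track width are illustrative assumptions, not values from the disclosure):

```python
import math

def wheel_odometry_step(pose, d_left, d_right, track_width):
    """Update pose = (x, y, theta) from the distances [m] travelled by
    the left and right wheels of a differential-drive base."""
    d_center = (d_left + d_right) / 2.0          # forward travel
    d_theta = (d_right - d_left) / track_width   # heading change
    x, y, theta = pose
    # Midpoint-heading approximation of the arc travelled this step.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return (x, y, theta + d_theta)

# Both wheels advance 0.5 m on a 0.3 m track: pure forward motion.
pose = wheel_odometry_step((0.0, 0.0, 0.0), 0.5, 0.5, 0.3)
print(pose)  # (0.5, 0.0, 0.0)
```

Per-wheel distances would in practice come from the encoder rotation angles multiplied by the wheel radius.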
When the estimation unit 112 estimates the self-position of the moving body 10 using only one of star reckoning and dead reckoning, the estimated self-position may contain errors specific to each method. For example, when the external sensor is one that continuously receives position information at a fixed period, reception of the position information may become unstable due to deteriorating radio conditions, causing errors in the self-position estimated by star reckoning. Also, for example, when the data acquired by the external sensor, such as images or point cloud data, is large in volume, the processing load on the estimation unit 112 becomes high and the processing efficiency decreases, which may likewise cause errors in the self-position estimated by star reckoning.
In dead reckoning, the relative position from a reference point and the attitude of the moving body are estimated by integrating the physical quantities acquired by the internal sensors. Therefore, if the physical quantities contain errors, those errors accumulate through the integration, producing a cumulative error. This cumulative error may in turn cause errors in the self-position estimated by dead reckoning.
The estimation unit 112 according to the present embodiment therefore estimates the self-position of the moving body 10 using both star reckoning and dead reckoning. For example, the estimation unit 112 corrects the relative self-position estimated by dead reckoning with the absolute self-position estimated by star reckoning. The estimation unit 112 can thereby use the absolute self-position to reset the cumulative error accumulated by dead reckoning.
Accordingly, the estimation unit 112 can estimate the self-position with higher accuracy than when using only one of star reckoning and dead reckoning, and can also ensure the continuity of the estimated self-position.
The process of correcting the self-position estimated by dead reckoning with the self-position estimated by star reckoning is hereinafter also referred to as the "integration process". The integration process is performed using, for example, a Kalman filter or a particle filter. The estimation unit 112 can perform the integration process at the timing when the self-position is estimated by star reckoning, that is, at the timing when data for correcting the self-position estimated by dead reckoning is acquired. When no self-position has been estimated by star reckoning, that is, when no data for correcting the self-position estimated by dead reckoning has been acquired, the estimation unit 112 may skip the integration process.
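A minimal one-dimensional sketch of such an integration process, using a single Kalman-filter correction step (the actual filter, state, and noise models are not specified in the disclosure; all values here are illustrative):

```python
def integration_step(pred_pos, pred_var, meas_pos=None, meas_var=1.0):
    """Correct a dead-reckoning prediction (pred_pos, pred_var) with a
    star-reckoning fix (meas_pos, meas_var). When no fix is available
    the step is skipped, mirroring the behavior described in the text."""
    if meas_pos is None:
        return pred_pos, pred_var
    gain = pred_var / (pred_var + meas_var)        # Kalman gain
    pos = pred_pos + gain * (meas_pos - pred_pos)  # pull toward the fix
    var = (1.0 - gain) * pred_var                  # accumulated drift shrinks
    return pos, var

# Dead reckoning has drifted to 10.4 with variance 4.0; a star-reckoning
# fix at 10.0 (variance 1.0) pulls the estimate back and resets the drift.
pos, var = integration_step(10.4, 4.0, meas_pos=10.0, meas_var=1.0)
print(round(pos, 2), round(var, 2))  # 10.08 0.8
```

A particle filter would serve the same role here; the Kalman form is shown only because it makes the "correction plus reset of accumulated error" step easiest to see.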
The estimation unit 112 also has a function of estimating the position of the approach target. For example, the estimation unit 112 estimates the position of the approach target on the basis of information about the approach target (hereinafter also referred to as "approach target information"). For example, when the detection unit 116 detects a person as the approach target, the estimation unit 112 estimates the position of the approach target on the basis of the approach target information input from the detection unit 116. Specifically, the estimation unit 112 estimates the position of the person shown in the captured image (the approach target information) input from the detection unit 116 as the position of the approach target.
Also, when the detection unit 116 detects an approach target near which a person is likely to be present, the estimation unit 112 estimates the position of the approach target on the basis of the approach target information input from the detection unit 116. For example, when the detection unit 116 detects a voice as the approach target, information indicating the position of the sound source of the voice (hereinafter also referred to as "sound source position information") is input to the estimation unit 112 as the approach target information. The estimation unit 112 then estimates the position of the sound source as the position of the approach target on the basis of the input sound source position information.
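Sound source position information of this kind typically starts from a direction estimate. A far-field sketch for a single microphone pair (the pair spacing and the use of only two microphones are illustrative assumptions; as noted earlier, more microphones improve accuracy):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def bearing_from_delay(delay_s, mic_spacing_m):
    """Bearing of a distant sound source relative to the broadside of a
    two-microphone pair, from the inter-microphone arrival delay."""
    s = SPEED_OF_SOUND * delay_s / mic_spacing_m
    s = max(-1.0, min(1.0, s))   # clamp numerical noise at the end-fire limit
    return math.asin(s)          # radians; 0 means straight ahead

# No delay: the source is straight ahead of the pair.
print(bearing_from_delay(0.0, 0.2))  # 0.0
# Maximum delay for a 0.2 m pair: the source lies on the microphone axis.
print(math.degrees(bearing_from_delay(0.2 / SPEED_OF_SOUND, 0.2)))  # close to 90
```

The delay itself would be obtained by cross-correlating the two microphone signals; fusing bearings from several pairs then yields a source position rather than just a direction.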
Also, when the detection unit 116 detects light as the approach target, information indicating the position of the light source of the light (hereinafter also referred to as "light source position information") is input to the estimation unit 112 as the approach target information. The estimation unit 112 then estimates the position of the light source as the position of the approach target on the basis of the input light source position information.
The estimation unit 112 also has a function of estimating the self-position of another, lost moving body 10. For example, the estimation unit 112 of a non-lost moving body 10 estimates the self-position of the lost moving body 10 from the relative positional relationship between the two. Specifically, the non-lost moving body 10 first approaches the lost moving body 10 and images it with the camera of the sensor unit 100. Next, the estimation unit 112 of the non-lost moving body 10 calculates the relative positional relationship between the two moving bodies from its own self-position and the position of the lost moving body 10 in the captured image. The estimation unit 112 of the non-lost moving body 10 then estimates the position obtained from this relationship as the self-position of the lost moving body 10.
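The relative-position calculation described here amounts to transforming the camera observation from the observer's body frame into the world frame using the observer's own pose. A planar sketch (the frame conventions are illustrative assumptions):

```python
import math

def locate_lost_robot(own_pose, observation):
    """Convert an observation of the lost robot, given in the observer's
    body frame as (forward, left) [m], into world coordinates using the
    observer's own estimated pose (x, y, theta)."""
    x, y, theta = own_pose
    fwd, left = observation
    wx = x + fwd * math.cos(theta) - left * math.sin(theta)
    wy = y + fwd * math.sin(theta) + left * math.cos(theta)
    return (wx, wy)

# Observer at (2, 1) facing +y sees the lost robot 1 m straight ahead.
lost_pos = locate_lost_robot((2.0, 1.0, math.pi / 2), (1.0, 0.0))
print(lost_pos)  # roughly (2.0, 2.0)
```

The accuracy of the recovered position is bounded by the observer's own self-position estimate, which is why only a non-lost moving body can play this role.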
(2-2) Determination unit 114
The determination unit 114 has a function of determining whether the moving body 10 has become lost. For example, the determination unit 114 determines whether the moving body 10 has become lost on the basis of the estimation result of the self-position of the moving body 10 input from the estimation unit 112 (hereinafter also referred to as the "lost determination"). When the determination unit 114 determines that the moving body 10 has become lost, it outputs the determination result to the estimation unit 112 and the detection unit 116. In addition, when the determination unit 114 determines that the moving body 10 has become lost, it outputs the last self-position of the moving body 10 acquired before the loss (hereinafter also referred to as the "lost position") to the storage unit 120, which stores it.
One specific example of the determination method is to compare the self-position estimated by dead reckoning with the self-position estimated by star reckoning. The sensing results of the external sensors can change greatly due to external factors, and consequently so can the self-position estimated by star reckoning. Therefore, when the sensing results of the external sensors change greatly, the star-reckoning estimate of the self-position may become an unexpected result, and the moving body 10 may become lost. For example, suppose the self-position is being estimated by matching feature points in images captured by the camera, and the brightness changes greatly because the room lights are turned off or sunlight streams in. In this case, the star-reckoning estimate of the self-position becomes an unexpected result, and the moving body 10 becomes lost. Likewise, the moving body 10 becomes lost when, for example, point cloud matching with LiDAR or the like is being performed and the moving body 10 becomes surrounded by moving objects such as people.
Accordingly, when the star-reckoning estimate of the self-position of the moving body 10 is an unexpected result, the determination unit 114 determines that the moving body 10 has become lost. For example, when the self-position estimated by star reckoning is separated from the self-position estimated by dead reckoning by a predetermined distance or more, the determination unit 114 determines that the star-reckoning estimate is an unexpected result, that is, that the moving body 10 has become lost. Any distance may be set as the predetermined distance; for example, an arbitrary value may be set by the user, or a distance calculated from general statistics may be set. The method of determining whether the moving body 10 has become lost is not limited to the above example.
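The threshold test described here can be sketched directly (the 0.5 m default is an arbitrary illustrative value, in line with the text's note that any distance may be set):

```python
import math

def is_lost(star_pos, dead_pos, threshold_m=0.5):
    """Lost determination: flag a lost state when the star-reckoning
    estimate is at least threshold_m away from the dead-reckoning one."""
    dx = star_pos[0] - dead_pos[0]
    dy = star_pos[1] - dead_pos[1]
    return math.hypot(dx, dy) >= threshold_m

print(is_lost((1.0, 1.0), (1.1, 1.0)))  # False: the two estimates agree
print(is_lost((5.0, 1.0), (1.1, 1.0)))  # True: an unexpected jump
```

In practice the comparison would run each time a star-reckoning fix arrives, i.e. at the same timing as the integration process.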
 (2-3) Detection unit 116
 The detection unit 116 has a function of detecting information about the surroundings of the moving body 10. For example, the detection unit 116 detects an approach target from the sensing information of a sensor device included in the sensor unit 100. The process in which the detection unit 116 detects an approach target is also referred to below as the "detection process".
 The detection unit 116 also detects approach target information from the sensing information of a sensor device included in the sensor unit 100, and outputs the detected approach target information to the estimation unit 112. The approach target information is information for estimating the position of the approach target when the moving body 10 is lost. Therefore, the detection unit 116 performs the process of detecting approach target information when a determination result indicating that the moving body 10 is lost is input from the determination unit 114.
 The detection unit 116 detects the approach target and the approach target information using sensing information input from at least one of a camera (imaging device), a microphone, and an optical sensor. When the sensor device is a camera, the detection unit 116 detects the approach target based on a captured image acquired by the camera. For example, the detection unit 116 performs image recognition processing on the captured image to detect a person appearing in the image as the approach target. When performing the image recognition processing, the detection unit 116 may use machine learning (for example, deep learning). This allows the detection unit 116 to detect people present around the moving body 10.
 When a person present around the moving body 10 is detected as the approach target based on the captured image, the detection unit 116 outputs the captured image to the estimation unit 112 as the approach target information. The detection unit 116 may also acquire the position information of the person during the image recognition processing and output the acquired position information to the estimation unit 112 as the approach target information.
 When the approach target is a person and there are multiple candidates, the detection unit 116 detects, from among the candidates, the person who has communicated with the moving body 10 the greatest number of times (hereinafter also referred to as the communication count) as the approach target. For example, each time the moving body 10 communicates with a person, the detection unit 116 detects the communication partner and stores the partner in the storage unit 120 in association with the communication count. Specifically, the detection unit 116 detects feature information indicating the partner's characteristics (for example, a face image) from the captured image by image recognition processing, and outputs the detected feature information and the communication count to the storage unit 120. Then, when multiple candidates are detected, the detection unit 116 acquires their communication counts from the storage unit 120 and compares them, thereby detecting the person with the highest communication count as the approach target. This allows the detection unit 116 to detect a person familiar with the moving body 10 as the approach target. A person familiar with the moving body 10 is, for example, the owner who takes care of it most. Examples of communication include being stroked (detection of contact with a person), voice dialogue (detection of a person's voice), and eye contact (detection of a person's line of sight).
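 The count-and-compare selection described above can be sketched as follows. This is purely illustrative: the dict standing in for the storage unit 120 and the identifier-based candidate representation are assumptions.

```python
# Stands in for the storage unit 120, keyed by a feature identifier
# (e.g. derived from a face image) for each known partner.
communication_counts = {}

def record_communication(person_id):
    """Increment the stored communication count for a detected partner."""
    communication_counts[person_id] = communication_counts.get(person_id, 0) + 1

def select_approach_target(candidate_ids):
    """Among the detected candidates, pick the one with the highest count."""
    return max(candidate_ids,
               key=lambda pid: communication_counts.get(pid, 0))
```

A candidate never seen before simply scores zero, so a known partner is always preferred over a stranger.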
 A person familiar with the moving body 10 is more likely to know how to handle a lost moving body 10 than a person who is not. Therefore, when the detection unit 116 detects a familiar person as the approach target, the moving body 10 is more likely to receive appropriate assistance promptly when lost, and can thus recover from the lost state faster than when assisted by an unfamiliar person.
 When the sensor device is a microphone, the detection unit 116 detects the position of a sound source as the approach target information based on audio information acquired by the microphone. For example, the detection unit 116 detects the direction of the sound from the audio information. When the direction of the sound source is detected, the detection unit 116 detects the sound source present in that direction as the approach target, and outputs sound source position information indicating the detected position to the estimation unit 112 as the approach target information.
 When the sensor device is an optical sensor, the detection unit 116 detects the position of a light source as the approach target information based on light information acquired by the optical sensor. For example, the detection unit 116 detects the direction of light from the light information. When the direction of light is detected, the detection unit 116 detects the light source present in that direction as the approach target, and outputs light source position information indicating the detected position to the estimation unit 112 as the approach target information. An example of light information is a light intensity gradient, which indicates the rate of change of light intensity. Specifically, when it is brighter in the direction in which the intensity gradient increases and darker in the direction in which it decreases, a light source may be present in the direction of increase. Therefore, the detection unit 116 can detect the position of the light source by detecting the direction of light based on the intensity gradient.
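 One way to realize the gradient-based direction estimate above can be sketched as follows. This is a rough assumption about the sensor interface: intensity readings sampled in a few candidate directions are compared against the reading at the current position, and the direction of the largest positive change is taken as the light-source direction.

```python
def light_source_direction(current_intensity, directional_readings):
    """directional_readings: mapping of direction name -> measured intensity.
    Returns the direction in which the intensity gradient increases most,
    or None if no sampled direction is brighter than the current position."""
    gradients = {d: v - current_intensity
                 for d, v in directional_readings.items()}
    best = max(gradients, key=gradients.get)
    return best if gradients[best] > 0 else None
```

Returning `None` models the case where no increasing gradient, and hence no light source, can be found in the sampled directions.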
 Note that the detection unit 116 may set priorities when detecting the approach target. For example, when the approach target is a person, the moving body 10 can easily restore its self-position by approaching the person and asking for help; specifically, the person can carry the moving body 10 to a position where the self-position can be restored. On the other hand, when the approach target is a sound source or a light source, restoring the self-position can be difficult if no person is present around the sound source or light source even after the moving body 10 approaches it. Therefore, the priorities used by the detection unit 116 when detecting the approach target are preferably set so that a person is detected with the highest priority. This increases the likelihood that the moving body 10 can restore its self-position when it approaches the approach target.
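 The prioritization described above can be sketched as a simple ranking over detected candidates. The priority table and the (type, position) tuple representation are assumptions for illustration only; the disclosure specifies only that a person is given the highest priority.

```python
# Lower number = higher priority; a person is always preferred.
TARGET_PRIORITY = {"person": 0, "sound_source": 1, "light_source": 2}

def pick_approach_target(detected_targets):
    """detected_targets: list of (target_type, position) tuples.
    Returns the candidate whose type has the highest priority."""
    return min(detected_targets,
               key=lambda t: TARGET_PRIORITY.get(t[0], len(TARGET_PRIORITY)))
```

Unknown target types fall to the lowest priority, so they are chosen only when nothing better is detected.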
 (2-4) Processing control unit 118
 The processing control unit 118 has a function of controlling the processing in the control unit 110. For example, the processing control unit 118 performs processing according to whether the moving body 10 is lost. A moving body 10 that has not been determined to be lost is also referred to below as a "normal moving body 10", and a moving body 10 determined to be lost as a "lost moving body 10". Processing related to restoring the self-position is also referred to below as "return processing", and a request related to restoring the self-position as a "return request".
 (When the moving body 10 is not determined to be lost)
 The processing control unit 118 of a normal moving body 10 plans the movement route of the normal moving body 10 and moves it along the route. The movement route is planned by the processing control unit 118 based on, for example, the self-position and destination of the normal moving body 10. Specifically, the processing control unit 118 of the normal moving body 10 plans, as the movement route, a route from the self-position of the normal moving body 10 input from the estimation unit 112 to the destination of the normal moving body 10 acquired from the storage unit 120.
 Note that a normal moving body 10 can also perform processing for a lost moving body 10. For example, the normal moving body 10 performs return processing for the lost moving body 10. In this case, the processing control unit 118 of the normal moving body 10 controls the operation of the normal moving body 10 according to whether request information concerning a return request from an approach target has been received. The approach target in this case is the lost moving body 10.
 The return request from the approach target here is, for example, a request to restore the self-position of the lost moving body 10. The request information here is information including at least one of information indicating that the moving body has been lost, information indicating the lost position, and information indicating the trajectory up to the point of being lost. These are also referred to below as "lost information", "lost position information", and "trajectory information", respectively.
 When request information is received from the approach target, the processing control unit 118 of the normal moving body 10 causes the normal moving body 10 to approach the approach target based on the request information and to estimate the position of the approach target. Specifically, the processing control unit 118 of the normal moving body 10 first plans an approach route to the lost moving body 10 from the self-position of the normal moving body 10 estimated by the estimation unit 112 and the lost position information of the lost moving body 10 included in the request information. Next, the processing control unit 118 moves the normal moving body 10 along the planned approach route, bringing the normal moving body 10 close to the lost moving body 10. After the approach, the processing control unit 118 causes the normal moving body 10 to estimate the self-position of the lost moving body 10. For example, the normal moving body 10 estimates the self-position of the lost moving body 10 by calculating the relative positional relationship between the normal moving body 10 and the lost moving body 10. In this way, the normal moving body 10 can estimate the self-position of the lost moving body 10 by approaching it, and the lost moving body 10 can restore its self-position from the estimate made by the normal moving body 10.
 After the estimation, the processing control unit 118 of the normal moving body 10 causes the estimated self-position to be transmitted to the lost moving body 10. The lost moving body 10 then identifies the self-position received from the normal moving body 10 as its own self-position. As a result, the lost moving body 10 can restore its self-position.
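 The relative-position calculation above can be sketched as a simple pose composition. This is an illustrative assumption: the disclosure does not specify the pose format, so a 2D pose (x, y, heading in radians) and a range/bearing observation are assumed here.

```python
import math

def estimate_lost_position(normal_pose, relative_range, relative_bearing):
    """Compose the normal moving body's own pose with its range/bearing
    observation of the lost moving body, yielding the lost moving body's
    position in the map frame."""
    x, y, theta = normal_pose
    angle = theta + relative_bearing  # bearing measured from the body's heading
    return (x + relative_range * math.cos(angle),
            y + relative_range * math.sin(angle))
```

The result would then be transmitted to the lost moving body 10, which adopts it as its own self-position.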
 (When the moving body 10 is determined to be lost)
 The processing control unit 118 of a lost moving body 10 performs processing for bringing the lost moving body 10 close to an approach target (hereinafter also referred to as "approach processing"). For example, the processing control unit 118 of the lost moving body 10 performs the approach processing based on the position of the approach target. Specifically, it plans an approach route from the self-position of the lost moving body 10 and the position of the approach target, and moves the lost moving body 10 along that route as the approach processing. Furthermore, the processing control unit 118 of the lost moving body 10 causes the lost moving body 10 to issue a return request to the approach target. This allows the lost moving body 10 to issue a return request to the approach target and restore its self-position. The contents of the approach processing and the return request differ depending on the approach target.
 When the approach target is a person, the lost moving body 10 performs approach processing toward the person and issues a return request to that person. Specifically, the processing control unit 118 of the lost moving body 10 first plans an approach route to the person based on the person's position. After planning, it moves the lost moving body 10 along the planned approach route, bringing the lost moving body 10 close to the person. After the approach, the processing control unit 118 causes the lost moving body 10 to issue a return request to the person. The processing control unit 118 may also cause the lost moving body 10 to issue the return request while it is still approaching the person. In this way, the lost moving body 10 can restore its self-position with the person's help, without depending on any other external system.
 The return request here is, for example, a request to move the lost moving body 10 to a position where its self-position can be restored. The method by which the lost moving body 10 issues the return request is not particularly limited. For example, the processing control unit 118 causes the lost moving body 10 to issue a return request expressed by at least one of a gesture, audio output, and display output.
 Specifically, when the lost moving body 10 has arms or legs, the processing control unit 118 of the lost moving body 10 drives the arms or legs according to information indicating the content of the return request (hereinafter also referred to as "return request information").
 The processing control unit 118 of the lost moving body 10 may also cause the moving body 10 to issue a return request using the output unit 140. For example, the processing control unit 118 generates output information corresponding to the output device that implements the function of the output unit 140, and causes the output unit 140 to output the generated information.
 Specifically, when the function of the output unit 140 is implemented by an audio output device such as a speaker, the processing control unit 118 of the lost moving body 10 generates, as the output information, audio indicating the return request information, and causes the audio output device to output the generated audio. As a result, the lost moving body 10 can issue a return request using the audio output device.
 When the function of the output unit 140 is implemented by a display device such as a display, the processing control unit 118 of the lost moving body 10 generates, as the output information, an image, text, or the like indicating the return request information, and causes the display device to display it. As a result, the lost moving body 10 can issue a return request using the display device.
 The position where the self-position can be restored here is, for example, the position of the charger of the lost moving body 10. A moving body 10 such as a household robot may estimate its self-position with the position of its charger as the base point. Therefore, when the lost moving body 10 is moved to the position of the charger, for example by a person, it can acquire the base point information from the charger. The lost moving body 10 can then estimate its self-position based on the acquired base point information and thereby restore it. The base point information may be transmitted to the lost moving body 10 when it is connected to the charger, or by wireless communication or the like when it moves into the vicinity of the charger.
 When the approach target is not a person, the processing control unit 118 of the lost moving body 10 causes the lost moving body 10 to transmit request information concerning the return request to the approach target. For example, when the specific approach target is a normal moving body 10, the processing control unit 118 of the lost moving body 10 causes the lost moving body 10 to transmit the request information to the normal moving body 10. In this way, even if the lost moving body 10 cannot detect a person, a sound source, a light source, or the like as an approach target, it can restore its self-position by detecting a normal moving body 10.
 The request information here is information including at least one of lost information, lost position information, and trajectory information.
 By receiving lost information as the request information, a normal moving body 10 can detect that the transmitting moving body 10 has been lost. By receiving lost position information, it can detect the lost position of the transmitting moving body 10. And by receiving the trajectory information of the lost moving body 10, it can plan an approach route that does not include the route on which the transmitting moving body 10 was lost (hereinafter also referred to as the "lost route").
 By receiving the above request information, the normal moving body 10 can move to the position of the lost moving body 10 and estimate its self-position. The lost moving body 10 can then restore its self-position by identifying the self-position received from the normal moving body 10 as its own.
 To realize the above functions, the processing control unit 118 has a route planning unit 1180 and an operation control unit 1182, as shown in FIG. 1.
 (2-4-1) Route planning unit 1180
 The route planning unit 1180 has a function of planning routes. For example, the route planning unit 1180 performs the movement route planning described above as processing of the processing control unit 118. The route planning unit 1180 then outputs information indicating the planned movement route to the operation control unit 1182.
 Note that if the moving body 10 passes through a position where it was lost in the past, it may become lost again. Therefore, the route planning unit 1180 may plan the movement route so that the moving body 10 does not pass through past lost positions. This allows the route planning unit 1180 to prevent the moving body 10 from becoming lost again.
 As an example of a concrete planning method, the route planning unit 1180 may set priorities for candidate movement routes along which the moving body 10 can move, and select the candidate with the higher priority as the movement route. For example, the route planning unit 1180 sets the priority of a candidate route that includes a lost position lower than that of a candidate route that does not. As a result, when both a candidate route that includes a lost position and one that does not exist, the route planning unit 1180 can select the route that does not include the lost position.
 The route planning unit 1180 may also set the priority according to the number of lost positions included in a candidate route. Specifically, the route planning unit 1180 may set a lower priority for a candidate route that includes more lost positions. As a result, when both a candidate route that includes lost positions and one that does not exist, the route planning unit 1180 can select the route that does not include any lost positions; and when only candidate routes that include lost positions exist, it can select the route with the fewest lost positions.
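 This count-based prioritization can be sketched as follows. The waypoint-list route representation is an assumption for illustration; the disclosure does not specify how routes or lost positions are encoded.

```python
def select_route(candidate_routes, lost_positions):
    """candidate_routes: list of routes, each a list of (x, y) waypoints.
    lost_positions: known lost positions gathered from the storage unit.
    Returns the candidate that passes through the fewest lost positions."""
    lost = set(lost_positions)

    def lost_count(route):
        return sum(1 for waypoint in route if waypoint in lost)

    return min(candidate_routes, key=lost_count)
```

A route containing no lost positions is always preferred; when every candidate contains some, the one with the fewest wins, mirroring the priority rule above.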
 Note that the lost positions described above may include both the lost positions of the moving body 10 itself and those of other moving bodies 10. The data stored in the storage unit 120 each time a moving body 10 is lost is used as these lost positions.
 Here, a specific example of route planning by the route planning unit 1180 will be described. FIG. 2 is a diagram illustrating an example of planning a movement route that does not pass through a lost position according to the embodiment of the present disclosure. FIG. 3 is a diagram illustrating an example of planning an approach route that does not pass through a lost route according to the embodiment of the present disclosure.
 FIG. 2 shows an example in which, in a passage 22 surrounded by walls 20, the moving body 10 plans a movement route 70 that does not include a lost point 50. The moving body 10 shown in FIG. 2 previously started moving from a start point 30 toward a destination 40 along a movement route 60 and was lost at the lost point 50. Therefore, if the moving body 10 travels along the movement route 60 again, it may be lost again at the lost point 50.
 Therefore, when re-planning the movement route from the start point 30 to the destination 40, the moving body 10 plans a route that does not pass through the lost point 50. Specifically, the moving body 10 plans the movement route 70 shown in FIG. 2, which does not include the lost point 50. By moving from the start point 30 along the movement route 70, the moving body 10 can reach the destination 40 without passing through the lost point 50, and can reduce the possibility of becoming lost compared with moving along a route that includes the lost point 50.
 FIG. 3 shows an example in which, in the passage 22 surrounded by the walls 20, a moving body 10b plans an approach route 90 that does not include the movement route 60, which is the lost route of a lost moving body 10a. The moving body 10a shown in FIG. 3 started moving from a start point 30a toward the destination 40 along the movement route 60 and was lost at the lost point 50. Therefore, if the moving body 10b plans an approach route 80 that partially overlaps the movement route 60 and approaches the moving body 10a along it, the moving body 10b may become lost on the overlapping part of the route.
 Therefore, when planning the approach route to the moving body 10a, the moving body 10b plans a route that does not include the movement route 60 of the moving body 10a. For example, the moving body 10b plans the approach route 90 shown in FIG. 3, which does not include the movement route 60. Specifically, based on the lost position information and trajectory information included in the request information received from the moving body 10a, the moving body 10b plans the approach route 90 so that it avoids the route along which the lost moving body 10a traveled before being lost. By moving from a start point 30b along the approach route 90, the moving body 10b can approach the moving body 10a, lost at the lost point 50, without passing through the movement route 60, and can reduce the possibility of becoming lost compared with moving along an approach route that includes the movement route 60.
(2-4-2) Operation control unit 1182
 The operation control unit 1182 has the function of controlling the operation of the moving body 10. For example, the operation control unit 1182 controls the movement operation of the moving body 10 by controlling the driving of the drive unit 130. Specifically, based on information indicating the movement route input from the route planning unit 1180 (hereinafter also referred to as "movement route information"), the operation control unit 1182 generates drive information for driving the drive unit 130 so as to move the moving body 10 along the movement route. The operation control unit 1182 then outputs the generated drive information to the drive unit 130.
 The operation control unit 1182 also controls the operation performed when the moving body 10 makes a return request, by controlling the driving of the drive unit 130. Specifically, based on the return request information, the operation control unit 1182 generates drive information for driving the drive unit 130 when the moving body 10 makes a return request. The operation control unit 1182 then outputs the generated drive information to the drive unit 130.
 The operation control unit 1182 also controls the operation performed when the moving body 10 makes a return request, by controlling the output of the output unit 140. Specifically, the operation control unit 1182 generates output information suited to the output device that implements the function of the output unit 140. The operation control unit 1182 then outputs the generated output information to the output unit 140.
(3) Storage unit 120
 The storage unit 120 has the function of storing data related to processing in the moving body 10. For example, the storage unit 120 stores information used when the moving body 10 plans a movement route. Specifically, the storage unit 120 stores information indicating the destination and waypoints used when the moving body 10 moves (hereinafter also referred to as "destination/waypoint information") in the destination/waypoint information DB.
 The storage unit 120 also stores information used when the moving body 10 plans a movement route that does not pass through a lost position. Specifically, the storage unit 120 stores the lost position information of the moving body 10 in the lost position DB.
 The data stored in the storage unit 120 is not limited to the above examples. For example, the storage unit 120 may store sensing information from the sensor unit 100. The storage unit 120 may also store information acquired via the communication unit 150, and may store programs such as various applications.
(4) Drive unit 130
 The drive unit 130 has the function of operating the moving body 10. For example, the drive unit 130 operates the moving body 10 by driving based on the drive information input from the control unit 110. The drive unit 130 can be realized by, for example, an actuator. When the moving body 10 includes an arm having joints, an actuator may be provided at each joint of the arm. The drive unit 130 can then operate the arm of the moving body 10 by driving based on the drive information input from the control unit 110.
 Likewise, when the moving body 10 includes a leg having joints, an actuator may be provided at each joint of the leg, and the drive unit 130 can operate the leg of the moving body 10 by driving based on the drive information, as in the case of the arm. The moving body 10 may also include wheels, and the drive unit 130 may be configured to be able to drive the wheels.
(5) Output unit 140
 The output unit 140 has the function of producing output in accordance with input from the control unit 110. This function is realized by an output device included in the moving body 10. For example, the function of the output unit 140 can be realized by an audio output device such as a speaker. In that case, the output unit 140 outputs, for example, audio indicating the content of the return request input from the control unit 110.
 The function of the output unit 140 can also be realized by a display device such as a display. In that case, the output unit 140 displays information such as an image or text indicating the content of the return request on the display device.
 The function of the output unit 140 may be realized by a single output device or by a plurality of output devices. For example, it may be realized by either the audio output device or the display device, or by both.
(6) Communication unit 150
 The communication unit 150 has the function of communicating with external devices. For example, in communication with an external device, the communication unit 150 outputs information received from the external device to the control unit 110. Specifically, the communication unit 150 receives request information from a lost moving body 10 and outputs it to the control unit 110. The communication unit 150 also receives a determination result from a lost moving body 10 and outputs it to the control unit 110, and receives a self-position from a normal moving body 10 and outputs it to the control unit 110.
 In communication with an external device, the communication unit 150 also transmits information input from the control unit 110 to the external device. Specifically, the communication unit 150 transmits request information input from the control unit 110 to a normal moving body 10, transmits a determination result input from the control unit 110 to a normal moving body 10, and transmits a self-position input from the control unit 110 to a lost moving body 10.
<<3. Processing examples in the moving body>>
 The functional configuration example of the moving body 10 according to the embodiment of the present disclosure has been described above. Next, processing examples in the moving body 10 according to the embodiment of the present disclosure will be described.
<3-1. Processing when the moving body performs normal movement>
<3-1-1. Processing blocks>
 First, the processing performed when the moving body 10 performs normal movement will be described, starting with the processing blocks used. FIG. 4 is a block diagram showing an example of the processing blocks used when the moving body 10 according to the embodiment of the present disclosure performs normal movement.
 As shown in FIG. 4, the functions of the sensor unit 100, the control unit 110, the storage unit 120, and the drive unit 130 are used in the processing performed when the moving body 10 performs normal movement. Specifically, in the sensor unit 100, the external sensor 102 and the internal sensor 104 are used. In the control unit 110, the estimation unit 112 and the processing control unit 118 are used. In the storage unit 120, the destination/waypoint DB 122 is used.
 More specifically, in the estimation unit 112, the star reckoning unit 1122, the dead reckoning unit 1124, and the integration unit 1126 of the self-position estimation unit 1120 are used. In the processing control unit 118, the route planning unit 1180 and the operation control unit 1182 are used.
 First, in the sensor unit 100, sensing is performed by the external sensor 102 and the internal sensor 104. The sensing information sensed by the external sensor 102 is output to the star reckoning unit 1122, and the sensing information sensed by the internal sensor 104 is output to the dead reckoning unit 1124.
 Next, each of the star reckoning unit 1122 and the dead reckoning unit 1124 estimates the self-position of the moving body 10 based on the sensing information. The self-positions estimated by the star reckoning unit 1122 and the dead reckoning unit 1124 are then output to the integration unit 1126, which performs integration processing. The integrated self-position is output to the route planning unit 1180.
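The embodiment does not specify how the integration unit 1126 combines the two estimates. One common approach, shown here purely as an assumed sketch with illustrative names, is an inverse-variance weighted average of the two self-positions:

```python
def integrate_positions(star_pos, star_var, dead_pos, dead_var):
    """Fuse two (x, y) self-position estimates by inverse-variance weighting.

    star_pos/dead_pos are (x, y) tuples; star_var/dead_var are scalar
    variances expressing the confidence of each estimate. The weighting
    scheme is an assumption for illustration, not the disclosed method.
    """
    w_star, w_dead = 1.0 / star_var, 1.0 / dead_var
    total = w_star + w_dead
    return tuple((w_star * s + w_dead * d) / total
                 for s, d in zip(star_pos, dead_pos))
```

With equal variances this reduces to the midpoint of the two estimates; as one estimate's variance grows, the fused position moves toward the other estimate.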
 Next, the route planning unit 1180 plans the movement route of the moving body 10 based on the integrated self-position and the destination/waypoint information acquired from the destination/waypoint DB 122. After planning, movement route information indicating the movement route is output to the operation control unit 1182.
 The operation control unit 1182, receiving the movement route information, then controls the driving of the drive unit 130 to move the moving body 10 along the movement route.
<3-1-2. Processing flow>
 Next, the processing flow when the moving body 10 performs normal movement will be described. FIG. 5 is a flowchart showing an example of the flow of processing when the moving body 10 according to the embodiment of the present disclosure performs normal movement.
 As shown in FIG. 5, the moving body 10 first performs sensing with the internal sensor 104 (S100) and then performs self-position estimation by dead reckoning based on the sensing information of the internal sensor (S102). The moving body 10 also performs sensing with the external sensor 102 (S104) and performs self-position estimation by star reckoning based on the sensing information of the external sensor (S106). After the estimation, the moving body 10 integrates the self-position estimated by dead reckoning with the self-position estimated by star reckoning (S108).
 Next, the moving body 10 acquires the destination/waypoint information (S110) and plans a movement route based on the integrated self-position and the acquired destination/waypoint information (S112). The moving body 10 then moves along the planned movement route (S114).
<3-2. Processing when a lost moving body makes a return request to a person>
<3-2-1. Processing blocks>
 The processing performed when the moving body 10 performs normal movement has been described above. Next, the processing performed when a lost moving body 10 makes a return request to a person will be described, starting with the processing blocks used. FIG. 6 is a block diagram showing an example of the processing blocks used when a lost moving body 10 according to the embodiment of the present disclosure makes a return request to a person.
 As shown in FIG. 6, the functions of the sensor unit 100, the control unit 110, the drive unit 130, and the output unit 140 are used in the processing performed when a lost moving body 10 makes a return request to a person. Specifically, in the sensor unit 100, the external sensor 102, the internal sensor 104, the camera 106, the microphone 107, and the optical sensor 108 are used. In the control unit 110, the estimation unit 112, the determination unit 114, the detection unit 116, and the processing control unit 118 are used.
 More specifically, in the estimation unit 112, the star reckoning unit 1122, the dead reckoning unit 1124, and the integration unit 1126 of the self-position estimation unit 1120 are used, together with the person position estimation unit 1128. In the determination unit 114, the lost determination unit 1140 is used. In the detection unit 116, the person detection unit 1160, the sound source localization unit 1162, and the light source localization unit 1164 are used. In the processing control unit 118, the route planning unit 1180 and the operation control unit 1182 are used. It is assumed here that the priorities with which the detection unit 116 detects approach targets are set such that a person has the highest priority and a light source the lowest.
 First, in the sensor unit 100, sensing is performed by the external sensor 102 and the internal sensor 104. The sensing information sensed by the external sensor 102 is output to the star reckoning unit 1122, and the sensing information sensed by the internal sensor 104 is output to the dead reckoning unit 1124.
 Next, each of the star reckoning unit 1122 and the dead reckoning unit 1124 estimates the self-position of the moving body 10 based on the sensing information. The self-positions estimated by the star reckoning unit 1122 and the dead reckoning unit 1124 are then output to the integration unit 1126, which performs integration processing.
 Here, the self-positions estimated by the star reckoning unit 1122 and the dead reckoning unit 1124 are also output to the lost determination unit 1140, which performs lost determination based on the input self-positions. When the lost determination unit 1140 determines that the moving body 10 is lost, it outputs a determination result indicating that the moving body 10 is lost to the detection unit 116 and the person position estimation unit 1128. While the moving body 10 is lost, the self-position estimated by the star reckoning unit 1122 and the self-position integrated by the integration unit 1126 are not used in subsequent processing, so the processing by the star reckoning unit 1122 and the integration unit 1126 need not be performed during that time.
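The concrete criterion used by the lost determination unit 1140 is not spelled out here. As one hedged illustration, a moving body could be judged lost when star reckoning returns no estimate at all, or when the two estimates diverge beyond a threshold; the threshold value and the names below are assumptions, not the disclosed criterion.

```python
import math

# Hypothetical value; the embodiment does not disclose a concrete threshold.
LOST_DISTANCE_THRESHOLD = 1.0  # meters

def is_lost(star_pos, dead_pos, threshold=LOST_DISTANCE_THRESHOLD):
    """Example lost determination based on the two self-position estimates.

    Judges the moving body lost when star reckoning yields no (x, y)
    estimate, or when the star-reckoned and dead-reckoned positions
    diverge by more than the threshold. Assumed for illustration only.
    """
    if star_pos is None:  # star reckoning failed to produce a self-position
        return True
    return math.dist(star_pos, dead_pos) > threshold
```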
 Next, the detection unit 116, receiving the determination result, performs detection processing. First, the person detection unit 1160 performs detection processing based on the image captured by the camera 106. When a person is detected, the person detection unit 1160 outputs the captured image to the person position estimation unit 1128 as approach target information. When no person is detected, the sound source localization unit 1162 performs detection processing based on the audio information acquired by the microphone 107. When a sound source is detected, the sound source localization unit 1162 outputs sound source position information to the person position estimation unit 1128 as approach target information. When no sound source is detected, the light source localization unit 1164 performs detection processing based on the light information acquired by the optical sensor 108. When a light source is detected, the light source localization unit 1164 outputs light source position information to the person position estimation unit 1128 as approach target information.
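The priority ordering above (person highest, light source lowest) can be sketched as a simple detector cascade. The callable interface and the names below are assumptions introduced for this sketch:

```python
def detect_approach_target(detect_person, detect_sound_source, detect_light_source):
    """Run the three detectors in priority order and return the first hit.

    Each argument is a callable returning approach target information
    (e.g., a captured image or a localized position) or None when nothing
    is detected, mirroring the person > sound source > light source
    priority of the detection unit 116.
    """
    detectors = (("person", detect_person),
                 ("sound_source", detect_sound_source),
                 ("light_source", detect_light_source))
    for kind, detector in detectors:
        result = detector()
        if result is not None:
            return kind, result
    return None, None  # nothing detected; the caller pauses and retries (cf. S218)
```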
 Next, the person position estimation unit 1128, having received the determination result from the lost determination unit 1140, estimates the position of the approach target based on the approach target information input from the detection unit 116, and outputs the estimated position of the approach target to the route planning unit 1180.
 Next, the route planning unit 1180 plans the approach route of the moving body 10 based on the last self-position of the moving body 10 acquired before it became lost, input from the dead reckoning unit 1124, and the position of the approach target input from the person position estimation unit 1128. After planning, information indicating the approach route (hereinafter also referred to as "approach route information") is output to the operation control unit 1182.
 The operation control unit 1182, receiving the approach route information, controls the driving of the drive unit 130 to move the moving body 10 along the approach route. After the moving body 10 has approached the approach target, the operation control unit 1182 causes the moving body 10 to make a return request by controlling the operation of the drive unit 130 or the output unit 140. The operation control unit 1182 may also cause the moving body 10 to make the return request while the moving body 10 is still approaching the approach target.
<3-2-2. Processing flow>
 Next, the processing flow when a lost moving body 10 makes a return request to a person will be described. FIG. 7 is a flowchart showing an example of the flow of processing in the moving body 10 when a lost moving body 10 according to the embodiment of the present disclosure makes a return request to a person.
 As shown in FIG. 7, the moving body 10 first moves (S200) and performs lost determination while moving (S202). While the moving body 10 is not determined to be lost (S202/NO), it continues moving.
 When the moving body 10 is determined to be lost (S202/YES), it performs person detection processing (S204). When a person is detected (S204/YES), the moving body 10 estimates the position of the person (S206).
 When no person is detected (S204/NO), the moving body 10 performs audio detection processing (S208). When audio is detected (S208/YES), the moving body 10 localizes the position of the sound source (S210) and then estimates the position of the person based on the position of the sound source (S206).
 When no audio is detected (S208/NO), the moving body 10 performs light intensity gradient detection processing (S212). When a light intensity gradient is detected (S212/YES), the moving body 10 localizes the position of the light source (S214) and then estimates the position of the person based on the position of the light source (S206).
 When no light intensity gradient is detected (S212/NO), the moving body 10 pauses its operation for a predetermined time (S218). After the predetermined time has elapsed, the moving body 10 resumes the processing from S204.
 After estimating the position of the person, the moving body 10 plans an approach route to the person (S216) and starts moving to the position of the person along the planned approach route. When the movement to the position of the person or its vicinity is complete (S220/YES), the moving body 10 makes a return request to the person (S222). When the movement is not yet complete (S220/NO), the moving body 10 continues moving; at this point, the moving body 10 may also replan the approach route to the position of the person.
<3-3. Processing when a lost moving body makes a return request to another moving body>
<3-3-1. Processing blocks>
 The processing performed when a lost moving body 10 makes a return request to a person has been described above. Next, the processing performed when a lost moving body 10 makes a return request to another moving body 10 will be described, starting with the processing blocks used. FIG. 8 is a block diagram showing an example of the processing blocks used when a lost moving body 10 according to the embodiment of the present disclosure makes a return request to another moving body 10.
 The processing blocks of the moving body 10a in FIG. 8 are those of the lost moving body 10, and the processing blocks of the moving body 10b in FIG. 8 are those of another moving body 10 detected by the lost moving body 10. The processing blocks shown in FIG. 8 also represent the processing blocks of a control system 1000 including the moving body 10a (a moving body provided with a first control device) and the moving body 10b (a moving body provided with a second control device).
 When the moving body 10a makes a return request to the moving body 10b, as shown in FIG. 8, the functions of the sensor unit 100a, the control unit 110a, and the communication unit 150a are used in the processing blocks of the moving body 10a. Specifically, in the sensor unit 100a, the external sensor 102a and the internal sensor 104a are used. In the control unit 110a, the estimation unit 112a and the determination unit 114a are used. More specifically, in the estimation unit 112a, the star reckoning unit 1122a, the dead reckoning unit 1124a, and the integration unit 1126a of the self-position estimation unit 1120a are used, and in the determination unit 114a, the lost determination unit 1140a is used.
 Also as shown in FIG. 8, the functions of the sensor unit 100b, the control unit 110b, the drive unit 130b, and the communication unit 150b are used in the processing blocks of the moving body 10b. Specifically, in the sensor unit 100b, the external sensor 102b and the internal sensor 104b are used. In the control unit 110b, the estimation unit 112b and the processing control unit 118b are used. More specifically, in the estimation unit 112b, the self-position estimation unit 1120b is used, and in the processing control unit 118b, the route planning unit 1180b and the operation control unit 1182b are used.
 First, in the sensor unit 100a of the moving body 10a, sensing is performed by the external sensor 102a and the internal sensor 104a. The sensing information sensed by the external sensor 102a is output to the star reckoning unit 1122a, and the sensing information sensed by the internal sensor 104a is output to the dead reckoning unit 1124a.
 Next, each of the star reckoning unit 1122a and the dead reckoning unit 1124a estimates the self-position of the moving body 10a based on the sensing information. The self-positions estimated by the star reckoning unit 1122a and the dead reckoning unit 1124a are then output to the integration unit 1126a, which performs integration processing.
 Here, the self-positions estimated by the star reckoning unit 1122a and the dead reckoning unit 1124a are also output to the lost determination unit 1140a, which performs lost determination based on the input self-positions. When the lost determination unit 1140a determines that the moving body 10a is lost, the moving body 10a transmits a determination result indicating that it is lost to the moving body 10b via the communication unit 150a, and at the same time transmits request information to the moving body 10b via the communication unit 150a.
 The moving body 10b, having received the determination result and the request information via the communication unit 150b, performs return processing for the moving body 10a. First, based on the received request information, the route planning unit 1180b of the moving body 10b plans an approach route to the moving body 10a, and the approach route information is output to the operation control unit 1182b. The operation control unit 1182b, receiving the approach route information, controls the driving of the drive unit 130b to move the moving body 10b along the approach route. After moving along the approach route, the moving body 10b detects the moving body 10a with the external sensor 102b. After the detection, the moving body 10b estimates the self-position of the moving body 10a by calculating the relative positional relationship between the moving body 10b and the moving body 10a based on the sensing information obtained by sensing the moving body 10a with the external sensor 102b. After the estimation, the moving body 10b transmits the estimated self-position to the moving body 10a via the communication unit 150b.
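The calculation of the moving body 10a's self-position from the relative positional relationship can be sketched as a 2D frame transform: the moving body 10b composes its own pose with the offset of the moving body 10a sensed in its body frame. The 2D pose representation and all names below are assumptions for illustration; the embodiment states only that the relative positional relationship is used.

```python
import math

def estimate_lost_body_position(own_pose, relative_offset):
    """Map-frame position of the lost moving body 10a as observed by 10b.

    own_pose:        (x, y, theta) of the normal moving body 10b in the map
                     frame, theta in radians
    relative_offset: (dx, dy) of the lost moving body 10a measured in the
                     body frame of 10b by its external sensor 102b
    """
    x, y, theta = own_pose
    dx, dy = relative_offset
    # Rotate the body-frame offset into the map frame, then translate by 10b's position.
    wx = x + dx * math.cos(theta) - dy * math.sin(theta)
    wy = y + dx * math.sin(theta) + dy * math.cos(theta)
    return (wx, wy)
```

The returned position is what the moving body 10b would transmit back to the moving body 10a as its recovered self-position.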
 Having received the self-position via the communication unit 150a, the moving body 10a identifies the received self-position as its own self-position in the integration unit 1126a, and restores its self-position.
 <3-3-2. Processing flow>
 Next, the processing flow when a lost moving body 10 makes a return request to another moving body 10 will be described. FIG. 9 is a flowchart showing an example of the flow of processing when a lost moving body 10 according to the embodiment of the present disclosure makes a return request to another moving body 10.
 As shown in FIG. 9, first, the moving body 10a moves (S300). Similarly, the moving body 10b is also moving (S302). The moving body 10a performs the lost determination while moving (S304). When the moving body 10a is not determined to be lost (S304/NO), the moving body 10a continues moving. When the moving body 10a is determined to be lost (S304/YES), the moving body 10a transmits to the moving body 10b request information including the lost information, the lost position information, and the trajectory information of the moving body 10a (S306).
 Having received the request information, the moving body 10b plans an approach route to the moving body 10a based on the lost position information and the trajectory information of the moving body 10a included in the request information (S308). For example, from the lost position information and the trajectory information of the moving body 10a, the moving body 10b can plan the approach route 90 that does not include the lost route, instead of the approach route 80 shown in FIG. 3 that does include it. After planning, the moving body 10b moves to the position of the moving body 10a along the planned approach route (S310). While moving, the moving body 10b checks whether it has detected the moving body 10a (S312). When the moving body 10a is detected (S312/YES), the moving body 10b estimates the self-position of the moving body 10a by calculating the relative positional relationship between the moving body 10b and the moving body 10a based on the sensing information (S314). When the moving body 10a is not detected (S312/NO), the moving body 10b continues moving. At this time, the moving body 10b may plan the approach route to the position of the moving body 10a again.
 After estimating the self-position of the moving body 10a, the moving body 10b transmits the estimated self-position to the moving body 10a (S316). Then, the moving body 10b resumes moving (S318).
 Having received the self-position from the moving body 10b, the moving body 10a identifies the received self-position as its own self-position and restores its self-position. Then, the moving body 10a resumes moving (S322).
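The S300-S322 exchange above can be summarized in a few lines of code. This is a toy sketch, not the disclosed implementation: poses are simplified to 2D points, the approach and detection steps (S308-S312) are abstracted into a single assignment, and all names are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Robot:
    name: str
    pose: tuple            # (x, y); stale once the robot is lost
    lost: bool = False
    trajectory: list = field(default_factory=list)

def build_request(robot):
    """S306: request info = lost flag, lost position, trajectory."""
    return {"lost": True,
            "lost_position": robot.pose,
            "trajectory": list(robot.trajectory)}

def recover(lost_robot, helper, sensed_offset):
    """One pass of the exchange: the helper approaches, senses, and replies."""
    if not lost_robot.lost:                  # S304/NO: keep moving
        return
    request = build_request(lost_robot)      # S306
    helper.pose = request["lost_position"]   # S308-S310, approach abstracted
    hx, hy = helper.pose                     # S312: lost robot detected
    dx, dy = sensed_offset                   # relative sensing result
    lost_robot.pose = (hx + dx, hy + dy)     # S314: estimate, S316: transmit
    lost_robot.lost = False                  # identification; position restored
```

For example, a robot lost near (5, 5) whose helper senses it 1 m to the east has its pose restored to (6, 5) and its lost flag cleared, after which both robots resume moving (S318, S322).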
 <3-4. Processing when the moving body plans a movement route that does not pass through a lost position>
 <3-4-1. Processing blocks>
 The processing when a lost moving body 10 makes a return request to another moving body 10 has been described above. Next, the processing when the moving body 10 plans a movement route that does not pass through a lost position will be described. First, the processing blocks used in this planning will be described. FIG. 10 is a block diagram showing an example of the processing blocks used when the moving body 10 according to the embodiment of the present disclosure plans a movement route that does not pass through a lost position.
 In the processing in which the moving body 10 plans a movement route that does not pass through a lost position, the functions of the sensor unit 100, the control unit 110, the storage unit 120, and the drive unit 130 are used, as shown in FIG. 10. Specifically, in the sensor unit 100, the external sensor 102 and the internal sensor 104 are used. In the control unit 110, the estimation unit 112 and the processing control unit 118 are used. In the storage unit 120, the destination/waypoint DB 122 and the lost position DB 124 are used.
 More specifically, in the estimation unit 112, the star reckoning unit 1122, the dead reckoning unit 1124, and the integration unit 1126 of the self-position estimation unit 1120 are used. In the processing control unit 118, the route planning unit 1180 and the operation control unit 1182 are used.
 First, in the sensor unit 100, sensing is performed by the external sensor 102 and the internal sensor 104. The sensing information sensed by the external sensor 102 is output to the star reckoning unit 1122. The sensing information sensed by the internal sensor 104 is output to the dead reckoning unit 1124.
 Next, the star reckoning unit 1122 and the dead reckoning unit 1124 each estimate the self-position of the moving body 10 based on the sensing information. After the self-position estimation, the self-positions estimated by the star reckoning unit 1122 and the dead reckoning unit 1124 are output to the integration unit 1126, which performs integration processing. The self-position after the integration processing is then output to the route planning unit 1180.
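The integration processing itself is not detailed here; one common approach is to blend the absolute star-reckoning fix with the drift-prone dead-reckoning estimate. A fixed-weight blend is shown purely as an illustrative stand-in; a real integrator would more likely use a Kalman filter with per-sensor covariances:

```python
def integrate(star_pose, dead_pose, alpha=0.8):
    """Blend two pose estimates; alpha weights the star-reckoning fix.

    star_pose: absolute estimate from the external sensor (star reckoning).
    dead_pose: incremental estimate from the internal sensor (dead reckoning).
    Returns the element-wise weighted blend of the two tuples.
    """
    return tuple(alpha * s + (1.0 - alpha) * d
                 for s, d in zip(star_pose, dead_pose))
```

The weight alpha here is a hypothetical tuning parameter: raising it trusts the external-sensor fix more, lowering it trusts odometry more.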
 Next, the route planning unit 1180 plans a movement route that does not pass through a lost position, based on the self-position after the integration processing, the destination/waypoint information acquired from the destination/waypoint DB 122, and the lost position information acquired from the lost position DB 124. After planning, the movement route information is output to the operation control unit 1182.
 Then, given the movement route information, the operation control unit 1182 controls the drive unit 130 so as to move the moving body 10 along the movement route.
 <3-4-2. Processing flow>
 Next, the processing flow when the moving body 10 plans a movement route that does not pass through a lost position will be described. FIG. 11 is a flowchart showing an example of the flow of processing when the moving body 10 according to the embodiment of the present disclosure plans a movement route that does not pass through a lost position.
 First, the moving body 10 estimates its self-position (S400). Next, the moving body 10 acquires the destination/waypoint information from the destination/waypoint DB 122 and plans movement route candidates (S402). For example, the moving body 10 plans, as candidates, the movement route 70 shown in FIG. 2 and a movement route that passes through the movement route 60 to the destination 40. Next, the moving body 10 acquires the lost position information from the lost position DB 124 and excludes movement routes that include a lost position from the candidates (S404). For example, the moving body 10 excludes from the candidates the movement route that passes through the movement route 60 shown in FIG. 2 to the destination 40.
 Next, the moving body 10 selects a movement route from the remaining candidates (S406). When a plurality of candidates remain, the moving body 10 may narrow the candidates down to one movement route under an arbitrary condition; specifically, it may select the movement route whose distance to the destination is shorter than that of the other candidates. Then, the moving body 10 moves along the selected movement route (S408).
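Steps S402-S406 amount to filtering the route candidates against the lost position DB and then ranking the survivors. A compact sketch under assumed representations (routes as waypoint lists, lost positions as points, and a hypothetical clearance tolerance):

```python
import math

def plan_route(candidates, lost_positions, tolerance=0.5):
    """S404: drop routes passing near a lost position; S406: pick the shortest.

    candidates:     list of routes, each a list of (x, y) waypoints.
    lost_positions: list of (x, y) points from the lost position DB.
    tolerance:      assumed clearance radius around a lost position.
    """
    def passes_lost(route):
        return any(math.dist(wp, lp) <= tolerance
                   for wp in route for lp in lost_positions)

    def length(route):
        return sum(math.dist(a, b) for a, b in zip(route, route[1:]))

    safe = [r for r in candidates if not passes_lost(r)]   # S404
    return min(safe, key=length) if safe else None          # S406
```

Here the shortest-distance condition of S406 is used as the arbitrary selection condition mentioned above; any other criterion could be substituted.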
 <<4. Modifications>>
 The processing examples in the moving body according to the embodiment of the present disclosure have been described above. Next, modifications of the embodiment of the present disclosure will be described. Note that the modifications described below may be applied instead of the configurations described in the embodiment of the present disclosure, or may be applied in addition to them.
 In the above-described embodiment, an example in which the moving body 10 is a domestic robot has been described, but the moving body 10 may be a robot used in a factory (hereinafter also referred to as an "in-factory robot"). An in-factory robot can estimate its self-position by, for example, recognizing markers (such as Alvar markers) attached in the factory. Accordingly, a lost in-factory robot makes a return request asking to be moved to a position where it can recognize a marker. For example, when the return request is made to a person, the lost in-factory robot has the person move it to a position where a marker can be recognized. As a result, the lost in-factory robot can restore its self-position.
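The marker-based estimation mentioned above reduces to inverting one observation: given a marker's known pose in the factory map and its pose as seen from the robot, the robot's own pose follows. A 2D sketch follows; the pose conventions and names are assumptions, and a real Alvar pipeline would work with full 6-DoF camera transforms:

```python
import math

def pose_from_marker(marker_world, marker_observed):
    """Recover the robot's map-frame pose from a single recognized marker.

    marker_world:    (x, y, theta) of the marker in the map frame,
                     known from the factory layout.
    marker_observed: (x, y, theta) of the marker in the robot's frame,
                     as measured by the robot's camera.
    """
    mx, my, mth = marker_world
    ox, oy, oth = marker_observed
    rth = mth - oth                          # robot heading in the map frame
    # Subtract the observed offset, rotated into the map frame.
    return (mx - (ox * math.cos(rth) - oy * math.sin(rth)),
            my - (ox * math.sin(rth) + oy * math.cos(rth)),
            rth)
```

For example, a marker known to sit at (2, 0) facing the map's +x axis, seen 1 m straight ahead of the robot, places the robot at (1, 0) with the same heading.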
 <<5. Hardware configuration example>>
 Finally, an example of the hardware configuration of the control device according to the present embodiment will be described with reference to FIG. 12. FIG. 12 is a block diagram showing an example of the hardware configuration of the control device according to the present embodiment. The control device 900 shown in FIG. 12 can realize, for example, the moving body 10 shown in FIG. 1. The control processing by the moving body 10 according to the present embodiment is realized by cooperation between software and the hardware described below.
 As shown in FIG. 12, the control device 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, and a RAM (Random Access Memory) 903. The control device 900 also includes a host bus 904, a bridge 905, an external bus 906, an interface 907, an input device 908, an output device 909, a storage device 910, a drive 911, a connection port 912, and a communication device 913. The hardware configuration shown here is an example, and some of the components may be omitted. The hardware configuration may also include components other than those shown here.
 The CPU 901 functions as, for example, an arithmetic processing device or a control device, and controls all or part of the operation of each component based on various programs recorded in the ROM 902, the RAM 903, or the storage device 910. The ROM 902 is a means for storing programs read by the CPU 901, data used for computation, and the like. The RAM 903 temporarily or permanently stores, for example, a program read by the CPU 901 and various parameters that change as appropriate while the program is executed. These are connected to one another by the host bus 904, which includes a CPU bus and the like. The CPU 901, the ROM 902, and the RAM 903 can realize, for example, the function of the control unit 110 described with reference to FIG. 1 in cooperation with software.
 The CPU 901, the ROM 902, and the RAM 903 are connected to one another via, for example, the host bus 904, which is capable of high-speed data transmission. The host bus 904 is in turn connected, for example via the bridge 905, to the external bus 906, whose data transmission rate is comparatively low. The external bus 906 is also connected to various components via the interface 907.
 The input device 908 is realized by a device through which the user inputs information, such as a mouse, a keyboard, a touch panel, a button, a microphone, a switch, or a lever. The input device 908 may also be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or PDA that supports operation of the control device 900. Further, the input device 908 may include, for example, an input control circuit that generates an input signal based on the information input by the user using the above input means and outputs the input signal to the CPU 901. By operating the input device 908, the user of the control device 900 can input various data to the control device 900 and instruct it to perform processing operations.
 Alternatively, the input device 908 may be formed by a device that detects information about the user. For example, the input device 908 may include various sensors such as an image sensor (for example, a camera), a depth sensor (for example, a stereo camera), an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, a sound sensor, a distance-measuring sensor (for example, a ToF (Time of Flight) sensor), and a force sensor. The input device 908 may also acquire information about the state of the control device 900 itself, such as the posture and moving speed of the control device 900, and information about the surrounding environment of the control device 900, such as the brightness and noise around the control device 900. Further, the input device 908 may include a GNSS (Global Navigation Satellite System) module that receives a GNSS signal from a GNSS satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite) and measures position information including the latitude, longitude, and altitude of the device. As for the position information, the input device 908 may detect the position by transmission to and reception from Wi-Fi (registered trademark), a mobile phone, a PHS, a smartphone, or the like, or by short-range communication or the like. The input device 908 can realize, for example, the function of the sensor unit 100 described with reference to FIG. 1.
 The output device 909 is formed of a device capable of visually or audibly notifying the user of the acquired information. Such devices include display devices such as CRT display devices, liquid crystal display devices, plasma display devices, EL display devices, laser projectors, LED projectors, and lamps; audio output devices such as speakers and headphones; and printer devices. The output device 909 outputs, for example, results obtained by various processes performed by the control device 900. Specifically, the display device visually displays the results obtained by the various processes performed by the control device 900 in various formats such as text, images, tables, and graphs. The audio output device, on the other hand, converts an audio signal composed of reproduced audio data, acoustic data, or the like into an analog signal and outputs it audibly. The output device 909 can realize, for example, the function of the output unit 140 described with reference to FIG. 1.
 The storage device 910 is a device for data storage formed as an example of the storage unit of the control device 900. The storage device 910 is realized by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, or a magneto-optical storage device. The storage device 910 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like. The storage device 910 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like. The storage device 910 can realize, for example, the function of the storage unit 120 described with reference to FIG. 1.
 The drive 911 is a reader/writer for storage media, and is built into or externally attached to the control device 900. The drive 911 reads information recorded on a mounted removable storage medium such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory, and outputs the information to the RAM 903. The drive 911 can also write information to the removable storage medium.
 The connection port 912 is a port for connecting an externally connected device, such as a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, an RS-232C port, or an optical audio terminal.
 The communication device 913 is, for example, a communication interface formed of a communication device or the like for connecting to a network 920. The communication device 913 is, for example, a communication card for a wired or wireless LAN (Local Area Network), LTE (Long Term Evolution), Bluetooth (registered trademark), or WUSB (Wireless USB). The communication device 913 may also be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various kinds of communication, or the like. The communication device 913 can transmit and receive signals and the like to and from the Internet and other communication devices, for example, in accordance with a predetermined protocol such as TCP/IP. The communication device 913 can realize, for example, the function of the communication unit 150 described with reference to FIG. 1.
 The network 920 is a wired or wireless transmission path for information transmitted from devices connected to the network 920. For example, the network 920 may include public networks such as the Internet, a telephone network, and a satellite communication network, various LANs (Local Area Networks) including Ethernet (registered trademark), and WANs (Wide Area Networks). The network 920 may also include a dedicated line network such as an IP-VPN (Internet Protocol-Virtual Private Network).
 An example of a hardware configuration capable of realizing the functions of the control device 900 according to the present embodiment has been described above. Each of the above components may be realized using general-purpose members, or may be realized by hardware specialized for the function of each component. The hardware configuration to be used can therefore be changed as appropriate according to the technical level at the time the present embodiment is implemented.
 <<6. Summary>>
 As described above, the control device according to the embodiment of the present disclosure estimates the self-position of the moving body 10a. The control device also determines whether the moving body 10a has lost its self-position. The control device then performs processing for approaching an approach target according to the result of that determination. When it is determined that the moving body 10a has lost its self-position, the control device brings the moving body 10a and the approach target close to each other. The control device can thereby cause the approach target to perform an operation for restoring the self-position of the moving body 10a.
 It is thus possible to provide a new and improved control device, control method, program, and control system with which, when a moving body has lost its self-position, the moving body can restore its self-position by approaching an approach target.
 The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure can conceive of various changes or modifications within the scope of the technical ideas described in the claims, and it is understood that these naturally also belong to the technical scope of the present disclosure.
 For example, each device described in this specification may be realized as a single device, or some or all of them may be realized as separate devices. For example, the control unit 110 included in the moving body 10 shown in FIG. 1 may be realized as a single device. For example, the control unit 110 may be realized as an independent device such as a server device and connected to the moving body 10 via a network or the like.
 The series of processes performed by each device described in this specification may be realized using software, hardware, or a combination of software and hardware. The programs constituting the software are stored in advance, for example, on a recording medium (non-transitory medium) provided inside or outside each device. Each program is then read into the RAM when executed by a computer, for example, and executed by a processor such as a CPU.
 The processes described in this specification using flowcharts and sequence diagrams do not necessarily have to be executed in the order shown. Some processing steps may be executed in parallel. Additional processing steps may be adopted, and some processing steps may be omitted.
 The effects described in this specification are merely explanatory or illustrative, and are not limiting. That is, the technology according to the present disclosure may produce other effects that are apparent to those skilled in the art from the description in this specification, in addition to or instead of the above effects.
Note that the following configurations also belong to the technical scope of the present disclosure.
(1)
An estimation unit that estimates the self-position,
A determination unit that determines whether or not the mobile body has lost its own position,
In accordance with the determination result by the determination unit, a processing control unit that performs a process for approaching the approach target,
And a control device.
(2)
When it is determined that the moving body has lost the self-position,
The control device according to (1), wherein the processing control unit further causes the moving body to make a request for returning the self-position to the approaching target.
(3)
When the approach target is a person,
The processing control unit plans a route for the moving body to approach the person based on the position of the person, causes the moving body to approach the person, and makes the request for the person to the moving body. The control device according to (2) above.
(4)
The control device according to (3), wherein the request is to move the moving body to a position where the self position of the moving body can be returned.
(5)
The control device according to (4), wherein the position where the self-position can be returned is the position of the charger of the moving body.
(6)
The control device according to any one of (3) to (5), wherein the processing control unit causes the moving body to make the request expressed by at least one of a gesture, voice output, and display output.
(7)
The control device according to any one of (2) to (6), wherein, when the approach target is not a person, the processing control unit causes the moving body to transmit request information regarding the request to the approach target.
(8)
The control device according to any one of (1) to (7), further including a detection unit that detects, from sensing information of a sensor device, approach target information regarding the approach target,
wherein the estimation unit estimates a position of the approach target based on the approach target information, and
the processing control unit performs processing for approaching the approach target based on the position of the approach target.
(9)
The control device according to (8), wherein, when the approach target is a person and there are a plurality of candidates for the approach target, the detection unit detects, from among the candidates, a person who has communicated with the moving body a large number of times as the approach target.
(10)
The control device according to (8) or (9), wherein the sensor device is an imaging device, and the detection unit detects the approach target based on a captured image acquired by the imaging device.
(11)
The control device according to any one of (8) to (10), wherein the sensor device is a microphone, and the detection unit detects a position of a sound source as the approach target information based on audio information acquired by the microphone.
(12)
The control device according to any one of (8) to (11), wherein the sensor device is an optical sensor, and the detection unit detects a position of a light source as the approach target information based on light information acquired by the optical sensor.
(13)
The control device according to any one of (1) to (12), wherein, when it is not determined that the moving body has lost the self-position, the processing control unit controls an operation of the moving body according to whether or not request information regarding a request from the approach target has been received.
(14)
The control device according to (13), wherein, when the request information is received from the approach target, the processing control unit causes the moving body to approach the approach target based on the request information and causes the moving body to estimate a position of the approach target.
(15)
The control device according to (7), wherein the request information includes at least one of information indicating loss of the self-position, information indicating a position where the self-position was lost, and information indicating a trajectory up to the loss of the self-position.
(16)
The control device according to any one of (1) to (15), wherein the processing control unit sets priorities for candidates of a movement route along which the moving body can move, and determines a candidate with a higher priority as the movement route of the moving body.
(17)
The control device according to (16), wherein the processing control unit sets the priority of a candidate for the movement route that includes the position where the self-position was lost lower than the priority of a candidate for the movement route that does not include the position where the self-position was lost.
(18)
A control method executed by a processor, the method including:
estimating a self-position;
determining whether or not a moving body has lost the self-position; and
performing processing for approaching an approach target in accordance with a result of the determining.
(19)
A program for causing a computer to function as:
an estimation unit that estimates a self-position;
a determination unit that determines whether or not a moving body has lost the self-position; and
a processing control unit that performs processing for approaching an approach target in accordance with a determination result by the determination unit.
(20)
A control system including:
a first control device including a first estimation unit that estimates a self-position, a first determination unit that determines whether or not a moving body has lost the self-position, and a first processing control unit that performs processing for approaching an approach target in accordance with a determination result by the first determination unit; and
a second control device including a second estimation unit that estimates the self-positions of the moving body and the approach target, and a second processing control unit that performs processing for approaching the moving body in accordance with the determination result by the first determination unit,
wherein, when it is determined that the moving body has lost the self-position,
the first control device transmits request information regarding return of the self-position to the second control device via communication, and
the second control device causes the approach target to approach the moving body based on the request information received from the first control device via communication, and causes the approach target to estimate the self-position of the moving body.
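The core flow of configurations (1) and (2) — estimate a self-position, determine whether it has been lost, and if so approach a target and request return — can be sketched as follows. This sketch is illustrative only; the confidence-threshold loss test and all class, method, and parameter names are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Tuple


@dataclass
class Pose:
    x: float
    y: float
    confidence: float  # estimator's confidence in this pose


class ControlDevice:
    """Sketch of configurations (1)-(2): estimate, judge loss, approach, request."""

    LOST_THRESHOLD = 0.2  # confidence below this is treated as "self-position lost"

    def __init__(self, estimator, mover):
        self.estimator = estimator  # estimation unit
        self.mover = mover          # drive/output units of the moving body

    def estimate(self) -> Pose:
        # Estimation unit: estimate the self-position from sensing information.
        return self.estimator.estimate_pose()

    def is_lost(self, pose: Pose) -> bool:
        # Determination unit: judge whether the moving body lost its self-position.
        return pose.confidence < self.LOST_THRESHOLD

    def step(self, approach_target_position: Tuple[float, float]) -> str:
        # Processing control unit: act according to the determination result.
        pose = self.estimate()
        if self.is_lost(pose):
            self.mover.approach(approach_target_position)
            self.mover.request_return()  # e.g. gesture, voice, or display output
            return "requested_return"
        return "normal_operation"
```

Any estimator exposing `estimate_pose()` and any mover exposing `approach()`/`request_return()` can be plugged in; how loss is actually detected is left open by the text.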
 10 Moving body
 100 Sensor unit
 110 Control unit
 112 Estimation unit
 114 Determination unit
 116 Detection unit
 118 Processing control unit
 1180 Route planning unit
 1182 Operation control unit
 120 Storage unit
 130 Drive unit
 140 Output unit
 150 Communication unit
 1000 Control system

Claims (20)

  1.  A control device comprising:
      an estimation unit that estimates a self-position;
      a determination unit that determines whether or not a moving body has lost the self-position; and
      a processing control unit that performs processing for approaching an approach target in accordance with a determination result by the determination unit.
  2.  The control device according to claim 1, wherein, when it is determined that the moving body has lost the self-position, the processing control unit further causes the moving body to make a request to the approach target regarding return of the self-position.
  3.  The control device according to claim 2, wherein, when the approach target is a person, the processing control unit plans a route for the moving body to approach the person based on the position of the person, causes the moving body to approach the person, and causes the moving body to make the request to the person.
  4.  The control device according to claim 3, wherein the request is to move the moving body to a position where the self-position of the moving body can be returned.
  5.  The control device according to claim 4, wherein the position where the self-position can be returned is a position of a charger of the moving body.
  6.  The control device according to claim 3, wherein the processing control unit causes the moving body to make the request expressed by at least one of a gesture, voice output, and display output.
  7.  The control device according to claim 2, wherein, when the approach target is not a person, the processing control unit causes the moving body to transmit request information regarding the request to the approach target.
  8.  The control device according to claim 1, further comprising a detection unit that detects, from sensing information of a sensor device, approach target information regarding the approach target,
      wherein the estimation unit estimates a position of the approach target based on the approach target information, and
      the processing control unit performs processing for approaching the approach target based on the position of the approach target.
  9.  The control device according to claim 8, wherein, when the approach target is a person and there are a plurality of candidates for the approach target, the detection unit detects, from among the candidates, a person who has communicated with the moving body a large number of times as the approach target.
  10.  The control device according to claim 8, wherein the sensor device is an imaging device, and the detection unit detects the approach target based on a captured image acquired by the imaging device.
  11.  The control device according to claim 8, wherein the sensor device is a microphone, and the detection unit detects a position of a sound source as the approach target information based on audio information acquired by the microphone.
  12.  The control device according to claim 8, wherein the sensor device is an optical sensor, and the detection unit detects a position of a light source as the approach target information based on light information acquired by the optical sensor.
  13.  The control device according to claim 1, wherein, when it is not determined that the moving body has lost the self-position, the processing control unit controls an operation of the moving body according to whether or not request information regarding a request from the approach target has been received.
  14.  The control device according to claim 13, wherein, when the request information is received from the approach target, the processing control unit causes the moving body to approach the approach target based on the request information and causes the moving body to estimate a position of the approach target.
  15.  The control device according to claim 7, wherein the request information includes at least one of information indicating loss of the self-position, information indicating a position where the self-position was lost, and information indicating a trajectory up to the loss of the self-position.
  16.  The control device according to claim 1, wherein the processing control unit sets priorities for candidates of a movement route along which the moving body can move, and determines a candidate with a higher priority as the movement route of the moving body.
  17.  The control device according to claim 16, wherein the processing control unit sets the priority of a candidate for the movement route that includes the position where the self-position was lost lower than the priority of a candidate for the movement route that does not include the position where the self-position was lost.
  18.  A control method executed by a processor, the method comprising:
       estimating a self-position;
       determining whether or not a moving body has lost the self-position; and
       performing processing for approaching an approach target in accordance with a result of the determining.
  19.  A program for causing a computer to function as:
       an estimation unit that estimates a self-position;
       a determination unit that determines whether or not a moving body has lost the self-position; and
       a processing control unit that performs processing for approaching an approach target in accordance with a determination result by the determination unit.
  20.  A control system comprising:
       a first control device including a first estimation unit that estimates a self-position, a first determination unit that determines whether or not a moving body has lost the self-position, and a first processing control unit that performs processing for approaching an approach target in accordance with a determination result by the first determination unit; and
       a second control device including a second estimation unit that estimates the self-positions of the moving body and the approach target, and a second processing control unit that performs processing for approaching the moving body in accordance with the determination result by the first determination unit,
       wherein, when it is determined that the moving body has lost the self-position,
       the first control device transmits request information regarding return of the self-position to the second control device via communication, and
       the second control device causes the approach target to approach the moving body based on the request information received from the first control device via communication, and causes the approach target to estimate the self-position of the moving body.
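The route selection of claims 16 and 17 — scoring candidate movement routes and ranking ones that pass through a position where the self-position was previously lost below ones that avoid it — can be sketched as follows. The concrete scoring scheme (length-based base priority, fixed penalty, proximity radius) and all names are illustrative assumptions; the claims specify only that lost-position routes receive a lower priority.

```python
from typing import List, Tuple

Point = Tuple[float, float]


def choose_route(candidates: List[List[Point]],
                 lost_positions: List[Point],
                 radius: float = 1.0) -> List[Point]:
    """Pick the candidate route with the highest priority.

    A route passing within `radius` of any position where the self-position
    was previously lost is assigned a lower priority (claim 17); among the
    rest, shorter routes are preferred (one possible base priority).
    """
    def contains_lost(route: List[Point]) -> bool:
        # True if any waypoint lies near a previously lost position.
        return any(
            (px - lx) ** 2 + (py - ly) ** 2 <= radius ** 2
            for (px, py) in route
            for (lx, ly) in lost_positions
        )

    def priority(route: List[Point]) -> float:
        base = -float(len(route))               # shorter routes rank higher
        penalty = 100.0 if contains_lost(route) else 0.0
        return base - penalty                    # lost-position routes rank lower

    return max(candidates, key=priority)
```

With an empty `lost_positions` list the penalty never applies, so the function degenerates to the base priority alone, matching claim 16 without claim 17.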
PCT/JP2019/049367 2018-12-27 2019-12-17 Control device, control method, program, and control system WO2020137685A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-244895 2018-12-27
JP2018244895 2018-12-27

Publications (1)

Publication Number Publication Date
WO2020137685A1

Family

ID=71129755

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/049367 WO2020137685A1 (en) 2018-12-27 2019-12-17 Control device, control method, program, and control system

Country Status (1)

Country Link
WO (1) WO2020137685A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007249735A (en) * 2006-03-17 2007-09-27 Fujitsu Ltd Robot location controller and robot self-location restoration method
JP2009001425A (en) * 2007-05-21 2009-01-08 Panasonic Corp Automatic transfer method, transfer robot, and automatic transfer system
JP2009116634A (en) * 2007-11-07 2009-05-28 Nec Access Technica Ltd Charging control device, charging control system, and charging control method and program used therefor
WO2011005559A2 (en) * 2009-06-23 2011-01-13 Mark Costin Roser Human locomotion assisting shoe
JP2014186694A (en) * 2013-03-25 2014-10-02 Murata Mach Ltd Autonomously mobile unmanned carrier, and autonomously mobile unmanned carrying system
JP2017129908A (en) * 2016-01-18 2017-07-27 株式会社エクォス・リサーチ Moving body
JP2018163496A (en) * 2017-03-24 2018-10-18 カシオ計算機株式会社 Autonomous mobile apparatus, autonomous mobile method and program


Similar Documents

Publication Publication Date Title
JP6948325B2 (en) Information processing equipment, information processing methods, and programs
US9400503B2 (en) Mobile human interface robot
US9902069B2 (en) Mobile robot system
TWI827649B (en) Apparatuses, systems and methods for vslam scale estimation
CN106292657B (en) Mobile robot and patrol path setting method thereof
WO2021077941A1 (en) Method and device for robot positioning, smart robot, and storage medium
WO2016031105A1 (en) Information-processing device, information processing method, and program
WO2011146259A2 (en) Mobile human interface robot
EP2571660A2 (en) Mobile human interface robot
WO2019144827A1 (en) Parking path acquisition method, apparatus, computer device, and storage medium
EP3907974A2 (en) End device, three-party communication system comprising cloud server and edge server for controlling end device, and operation method therefor
US11801602B2 (en) Mobile robot and driving method thereof
KR102560462B1 (en) Moving robot
AU2017201879B2 (en) Mobile robot system
AU2013263851B2 (en) Mobile robot system
US11076264B2 (en) Localization of a mobile device based on image and radio words
Jiang et al. Robot-assisted smartphone localization for human indoor tracking
JP2008070208A (en) Light position detection system, light id tag device, position detection device, and light position detection method
WO2020137685A1 (en) Control device, control method, program, and control system
WO2020062255A1 (en) Photographing control method and unmanned aerial vehicle
US11842515B2 (en) Information processing device, information processing method, and image capturing apparatus for self-position-posture estimation
CN113014658B (en) Device control, device, electronic device, and storage medium
CN111324129B (en) Navigation method and device based on face recognition
WO2022000209A1 (en) Positioning method and positioning device
TWI788217B (en) Method for positioning in a three-dimensional space and positioning system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19901789

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19901789

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP