WO2022024412A1 - Information processing device, information processing system, information processing method, and program - Google Patents


Info

Publication number
WO2022024412A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
user
information processing
control unit
moving body
Prior art date
Application number
PCT/JP2020/047282
Other languages
French (fr)
Japanese (ja)
Inventor
健太朗 児玉
秀章 塩澤
善弘 新海
馨 豊口
知英 猪俣
Original Assignee
株式会社JVCケンウッド (JVCKENWOOD Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社JVCケンウッド (JVCKENWOOD Corporation)
Publication of WO2022024412A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • This disclosure relates to information processing devices, information processing systems, information processing methods, and programs.
  • Patent Document 1 describes a technology in which, by displaying an avatar based on position information detected by a robot, a plurality of users can share recognition of a three-dimensional position and obtain the feeling of being in the same space.
  • The purpose of this disclosure is to provide an information processing device, an information processing system, an information processing method, and a program that enable a user to have the experience of being at home while being at a remote location.
  • The information processing apparatus according to one aspect of the present disclosure includes: a video data acquisition unit that acquires video data of an image from the user's line of sight, captured by an imaging unit provided on a moving body that can move in a second place according to an operation by a user in a first place; a display control unit that displays the video data on a display unit; a posture information acquisition unit that acquires posture information regarding the posture of the user; and an operation control unit that changes the posture of the moving body according to the posture information acquired by the posture information acquisition unit.
  • The information processing system according to one aspect of the present disclosure includes the above information processing device and a terminal device capable of communicating with the information processing device. The terminal device includes: a video data acquisition unit that acquires video of the moving body captured by an image pickup unit; an avatar generation unit that generates an avatar of the user of the information processing device; and a display control unit that displays the video acquired by the video data acquisition unit on a display unit and superimposes the avatar of the user of the information processing device on the moving body displayed on the display unit.
  • The information processing method according to one aspect of the present disclosure includes: acquiring video data of an image from the user's line of sight, captured by an imaging unit provided on a moving body that can move in a second place according to an operation by a user in a first place; displaying the video data on a display unit; acquiring posture information regarding the posture of the user; and changing the posture of the moving body according to the acquired posture information.
  • The program according to one aspect of the present disclosure causes a computer to execute the same steps: acquiring video data of an image from the user's line of sight, captured by an imaging unit provided on a moving body that can move in a second place according to an operation by a user in a first place; displaying the video data; acquiring posture information regarding the posture of the user; and changing the posture of the moving body according to the acquired posture information.
  • FIG. 1 is a diagram for explaining a configuration example of an information processing system according to an embodiment.
  • FIG. 2 is a block diagram showing a configuration example of the information processing apparatus according to the embodiment.
  • FIG. 3 is a diagram showing a configuration example of the controller according to the embodiment.
  • FIG. 4 is a diagram for explaining how to use the information processing apparatus according to the embodiment and the controller.
  • FIG. 5 is a block diagram showing a configuration example of the moving body according to the embodiment.
  • FIG. 6 is a block diagram showing a configuration example of the terminal device according to the embodiment.
  • FIG. 7 is a diagram for explaining how to use the mobile body according to the embodiment and the terminal device.
  • FIG. 8 is a flowchart showing an example of the processing flow of the information processing system according to the embodiment.
  • FIG. 9 is a flowchart showing an example of the flow of processing for the moving body of the information processing apparatus according to the embodiment.
  • FIG. 10 is a flowchart showing an example of the flow of processing for the terminal device of the information processing device according to the embodiment.
  • FIG. 11 is a flowchart showing an example of the flow of processing for the moving body of the controller according to the embodiment.
  • FIG. 12 is a flowchart showing an example of the flow of processing of the moving body according to the embodiment.
  • FIG. 13 is a flowchart showing an example of the processing flow of the terminal device according to the embodiment.
  • FIG. 1 is a diagram showing a configuration example of an information processing system according to an embodiment.
  • the information processing system 1 includes an information processing device 10, a controller 20, a mobile body 30, and a terminal device 40.
  • the information processing device 10, the controller 20, the mobile body 30, and the terminal device 40 are communicably connected via the network N1.
  • The network N1 is, for example, the Internet.
  • the information processing apparatus 10 and the controller 20 are communicably connected to each other via the network N2.
  • the mobile body 30 and the terminal device 40 are communicably connected to each other via the network N3.
  • the network N2 and the network N3 are, for example, networks using Bluetooth (registered trademark) or Wi-Fi (registered trademark).
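  • As a minimal sketch of this two-tier topology (the device and network names follow FIG. 1; the data structure and helper function are illustrative assumptions, written here in Python):

```python
# Hypothetical sketch of the connectivity described above: N1 is a wide-area
# (Internet) link shared by all four devices, while N2 and N3 are short-range
# links (Bluetooth / Wi-Fi) pairing specific devices.
NETWORKS = {
    "N1": {"type": "internet",    "nodes": {"device10", "controller20", "mobile30", "terminal40"}},
    "N2": {"type": "short_range", "nodes": {"device10", "controller20"}},
    "N3": {"type": "short_range", "nodes": {"mobile30", "terminal40"}},
}

def can_communicate(a: str, b: str) -> list[str]:
    """Return the networks over which nodes a and b can talk directly."""
    return [name for name, net in NETWORKS.items() if {a, b} <= net["nodes"]]

print(can_communicate("device10", "controller20"))  # ['N1', 'N2']
```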
  • the information processing device 10 and the controller 20 are used in the first place.
  • Examples of the first place include, but are not limited to, hospitals and facilities.
  • the mobile body 30 and the terminal device 40 are used in a second place different from the first place.
  • the second place is exemplified by, but is not limited to, the home.
  • the user at the first place and the user at the second place can make a videophone call by using the information processing device 10 and the terminal device 40, respectively.
  • The information processing device 10 is exemplified by an HMD (Head Mounted Display) device that the user wears on the head, but the information processing device 10 is not limited to this.
  • the controller 20 is an operating device for the mobile body 30 that is gripped and used by the user.
  • the controller 20 has, for example, a stick-like shape, but is not limited thereto.
  • the user can control various operations including the movement of the moving body 30 by operating the controller 20 while checking the image displayed on the information processing device 10.
  • the mobile body 30 is a robot that can be operated by using the controller 20. Examples of robots include, but are not limited to, drones.
  • Examples of the terminal device 40 include smartphones, tablet terminals, AR (Augmented Reality) devices, and the like, but the terminal device 40 is not limited thereto.
  • The user in the second place makes a call with the user of the information processing apparatus 10 while photographing the moving body 30 and displaying it on the display unit.
  • the terminal device 40 superimposes the avatar image of the user of the information processing device 10 on the moving body 30 displayed on the display unit.
  • The user of the information processing apparatus 10 can thus experience a simulated temporary return home while, for example, hospitalized.
  • The user of the terminal device 40 can feel as if the user of the information processing device 10 were at home.
  • FIG. 2 is a block diagram showing a configuration example of the information processing apparatus according to the embodiment.
  • The information processing apparatus 10 includes a display unit 11, a microphone 12, a speaker 13, an operation unit 14, a sensor 15, a storage unit 16, a first communication unit 17, a second communication unit 18, and a control unit 19.
  • The display unit 11, the microphone 12, the speaker 13, the operation unit 14, the sensor 15, the storage unit 16, the first communication unit 17, the second communication unit 18, and the control unit 19 are each connected by a bus B1.
  • the display unit 11 displays various images.
  • The display unit 11 is, for example, a head-mounted display such as a VR (Virtual Reality) device or AR glasses.
  • the display unit 11 is arranged at a position that covers both eyes of the user while being worn on the head of the user.
  • the microphone 12 picks up the voice spoken by the user.
  • the microphone 12 is arranged, for example, in a housing constituting the information processing apparatus 10.
  • the microphone 12 outputs voice data related to the picked-up voice to the voice data acquisition unit 192.
  • the speaker 13 outputs sound.
  • the speaker 13 outputs voice based on voice data related to voice spoken by the user of the terminal device 40 acquired from the terminal device 40.
  • the speaker 13 is arranged in, for example, a housing constituting the information processing apparatus 10.
  • the operation unit 14 receives various operations on the information processing device 10.
  • the operation unit 14 accepts, for example, an operation for starting communication, an operation for ending communication, and the like. Examples of the operation unit 14 include buttons, switches, and the like, but the operation unit 14 is not limited thereto.
  • the operation unit 14 is arranged, for example, in a housing constituting the information processing apparatus 10.
  • the operation unit 14 outputs an operation signal related to the received operation to the operation control unit 195.
  • the sensor 15 detects the posture of the user who uses the information processing device 10.
  • the sensor 15 detects, for example, the direction of the face of the user who uses the information processing apparatus 10 and the angle of the face with respect to the horizontal plane.
  • the sensor 15 is realized by, for example, an acceleration sensor and a gyro sensor.
  • the sensor 15 outputs, for example, the posture information regarding the detected posture of the user to the posture information acquisition unit 193.
  • the sensor 15 is arranged, for example, in a housing constituting the information processing apparatus 10.
  • the sensor 15 may detect, for example, the state of the eyes of the user who uses the information processing device 10.
  • the state of the user's eyes is, for example, the direction of the user's line of sight, the viewpoint of the user, and the like.
  • the sensor 15 may be realized by, for example, a camera that captures the user's eyes.
  • the sensor 15 may be arranged in the housing of the information processing device 10 toward the user's face while the information processing device 10 is being used.
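  • The disclosure does not specify how the acceleration and gyro readings of the sensor 15 are combined; one common approach, shown below purely as an assumption, is a complementary filter that fuses the drift-free gravity direction from the accelerometer with the smooth but drifting integrated gyro rate to estimate the angle of the face with respect to the horizontal plane:

```python
import math

def estimate_head_pitch(pitch_prev: float, gyro_pitch_rate: float,
                        accel: tuple[float, float, float],
                        dt: float, alpha: float = 0.98) -> float:
    """Complementary filter: fuse gyro integration (smooth, but drifts) with
    the accelerometer's gravity direction (noisy, but drift-free) to estimate
    the face angle relative to the horizontal plane, in radians."""
    ax, ay, az = accel
    accel_pitch = math.atan2(-ax, math.hypot(ay, az))  # gravity-based pitch
    gyro_pitch = pitch_prev + gyro_pitch_rate * dt      # integrated gyro pitch
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch
```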
  • the storage unit 16 stores various information.
  • the storage unit 16 stores, for example, user information about a user who uses the information processing device 10.
  • the user information includes various physical information including the height of the user and the height of the line of sight.
  • the storage unit 16 can be realized by, for example, a semiconductor memory element such as a RAM (Random Access Memory) or a flash memory, or a storage device such as a hard disk or a solid state drive.
  • The first communication unit 17 is a long-distance wireless communication unit that performs long-distance wireless communication. Specifically, the first communication unit 17 is communicably connected to the mobile body 30 and the terminal device 40 via the network N1. That is, the information processing device 10 uses the first communication unit 17 to perform long-distance wireless communication with the mobile body 30 and the terminal device 40.
  • the second communication unit 18 is a short-range wireless communication unit that performs short-range wireless communication. Specifically, the second communication unit 18 is communicably connected to the controller 20 via the network N2. That is, the information processing apparatus 10 uses the second communication unit 18 to perform short-range wireless communication with the controller 20.
  • The control unit 19 is realized by, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or the like executing a program (for example, a program according to the present disclosure) stored in a storage unit (not shown) using a RAM or the like as a work area. The control unit 19 may also be realized by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array), or by a combination of software and hardware.
  • The control unit 19 includes a video data acquisition unit 191, an audio data acquisition unit 192, a posture information acquisition unit 193, a display control unit 194, an operation control unit 195, an output control unit 196, and a communication control unit 197.
  • The video data acquisition unit 191, the audio data acquisition unit 192, the posture information acquisition unit 193, the display control unit 194, the operation control unit 195, the output control unit 196, and the communication control unit 197 are each connected by a bus B2.
  • the video data acquisition unit 191 acquires video data.
  • The video data acquisition unit 191 acquires, via the first communication unit 17, video data related to the video captured by the image pickup unit provided on the moving body 30.
  • the voice data acquisition unit 192 acquires voice data.
  • the voice data acquisition unit 192 acquires voice data related to the voice spoken by the user from the microphone 12.
  • the voice data acquisition unit 192 acquires voice data related to the voice spoken by the user of the terminal device 40 from the terminal device 40 via the first communication unit 17.
  • the posture information acquisition unit 193 acquires posture information regarding the posture of the user who uses the information processing device 10.
  • the posture information acquisition unit 193 acquires the user's posture information based on the detection result of the sensor 15, for example.
  • the posture information acquisition unit 193 acquires information regarding, for example, the orientation of the user's head and the angle of the head with respect to the horizontal plane.
  • the posture information acquisition unit 193 may detect, for example, the orientation of the user's body.
  • the posture information acquisition unit 193 may acquire information on the state of the user's eyes as posture information based on the detection result by the sensor 15.
  • the posture information acquisition unit 193 may acquire information regarding the user's facial expression as posture information.
  • the display control unit 194 causes the display unit 11 to display an image.
  • the display control unit 194 causes the display unit 11 to display, for example, an image related to the image data acquired by the image data acquisition unit 191.
  • the display control unit 194 causes the display unit 11 to display, for example, an image that matches the scenery seen from the height of the line of sight when the user using the information processing apparatus 10 walks.
  • The image that matches the scenery seen from the line-of-sight height of a walking user is not limited to a perfect match and may be any image that falls within a predetermined range. Moreover, an image corresponding to the scenery seen by the user may be displayed on the display unit 11 not only when the user walks but also when the user moves in a wheelchair, is seated on the floor, is lying down, or takes any other predetermined posture or movement.
  • An image corresponding to the scenery seen by the user in these various cases is referred to as an image from the user's point of view.
  • The display control unit 194 superimposes, for example, operation icons for moving the moving body 30 on the video and displays them on the display unit 11.
  • The display control unit 194 may also superimpose an image of the user's hand together with the operation icons for moving the moving body 30 on the video, based on posture information regarding the position of the user's hand acquired from the controller 20, and display them on the display unit 11.
  • Alternatively, the user may use the operation unit 14 to select a predetermined posture or operation and instruct the moving body 30 to move.
  • the operation control unit 195 acquires the operation information related to the operation received by the operation unit 14.
  • the operation control unit 195 acquires, for example, operation information for starting and ending communication with the controller 20, the mobile body 30, and the terminal device 40.
  • the operation control unit 195 outputs a control signal corresponding to the acquired operation information to control the operation of the information processing apparatus 10.
  • the output control unit 196 outputs sound from the speaker 13.
  • the output control unit 196 outputs, for example, the voice spoken by the user of the terminal device 40 acquired from the terminal device 40 by the voice data acquisition unit 192 via the first communication unit 17.
  • the output control unit 196 outputs, for example, the voice data acquired from the microphone 12 by the voice data acquisition unit 192 to the terminal device 40 via the first communication unit 17.
  • the output control unit 196 outputs, for example, the user information stored in the storage unit 16 to the mobile body 30 via the first communication unit 17.
  • the output control unit 196 outputs, for example, the posture information acquired by the posture information acquisition unit 193 to the mobile body 30 via the first communication unit 17.
  • the communication control unit 197 controls communication between the information processing device 10 and an external device. Specifically, the communication control unit 197 controls the first communication unit 17 to control the communication between the information processing device 10 and the mobile body 30. The communication control unit 197 controls the first communication unit 17 to control the communication between the information processing device 10 and the terminal device 40. The communication control unit 197 controls the second communication unit 18 to control the communication between the information processing device 10 and the controller 20.
  • FIG. 3 is a diagram showing a configuration example of the controller according to the embodiment.
  • the controller 20 includes an operation unit 21, a sensor 22, a first communication unit 23, a second communication unit 24, and a control unit 25.
  • the operation unit 21, the sensor 22, the first communication unit 23, the second communication unit 24, and the control unit 25 are each connected by a bus B3.
  • the operation unit 21 receives various operations on the controller 20.
  • the operation unit 21 receives, for example, an operation for starting communication with the information processing device 10 and the mobile body 30, an operation for ending communication, and the like.
  • Examples of the operation unit 21 include buttons, switches, and the like, but the operation unit 21 is not limited thereto.
  • the operation unit 21 is arranged, for example, in a housing constituting the controller 20.
  • the operation unit 21 outputs an operation signal related to the received operation to the operation control unit 251.
  • the sensor 22 detects the posture of the user who uses the controller 20.
  • the sensor 22 detects, for example, the position and orientation of the hand.
  • the sensor 22 is realized by, for example, an acceleration sensor and a gyro sensor.
  • the sensor 22 outputs, for example, the posture information regarding the detected posture of the user to the posture information acquisition unit 252.
  • the sensor 22 is arranged, for example, in a housing constituting the controller 20.
  • the first communication unit 23 is a long-distance wireless communication unit that performs long-distance wireless communication. Specifically, the first communication unit 23 is communicably connected to the mobile body 30 via the network N1. That is, the controller 20 remotely controls the mobile body 30 via the first communication unit 23.
  • the second communication unit 24 is a short-range wireless communication unit that performs short-range wireless communication. Specifically, the second communication unit 24 is communicably connected to the information processing device 10 via the network N2. That is, the controller 20 uses the second communication unit 24 to perform short-range wireless communication with the information processing device 10.
  • the control unit 25 is realized by, for example, a CPU, an MPU, or the like executing a program stored in a storage unit (not shown) using a RAM or the like as a work area. Further, the control unit 25 may be realized by an integrated circuit such as an ASIC or FPGA. The control unit 25 may be realized by a combination of software and hardware.
  • the control unit 25 includes an operation control unit 251, a posture information acquisition unit 252, a mobile body control unit 253, and a communication control unit 254.
  • The operation control unit 251, the posture information acquisition unit 252, the mobile body control unit 253, and the communication control unit 254 are each connected by a bus B4.
  • the operation control unit 251 acquires the operation information related to the operation received by the operation unit 21.
  • the operation control unit 251 acquires, for example, operation information for starting and ending communication with the information processing device 10 and the mobile body 30.
  • the operation control unit 251 outputs a control signal corresponding to the acquired operation information to control the operation of the controller 20.
  • the posture information acquisition unit 252 acquires posture information regarding the posture of the user who uses the controller 20.
  • the posture information acquisition unit 252 acquires the user's posture information based on the detection result of the sensor 22, for example.
  • the posture information acquisition unit 252 acquires, for example, position information regarding the position and direction of the hand.
  • The posture information acquisition unit 252 acquires, for example, hand position information indicating the deviation of the hand from a predetermined reference position or reference direction.
  • The mobile body control unit 253 outputs mobile body control information for moving the mobile body to the mobile body 30 via the first communication unit 23.
  • the mobile body control unit 253 outputs, for example, the position information of the user's hand holding the controller 20 acquired by the posture information acquisition unit 252 to the mobile body 30 as the mobile body control information.
  • the communication control unit 254 controls communication between the controller 20 and an external device. Specifically, the communication control unit 254 controls the first communication unit 23 to control the communication between the controller 20 and the mobile body 30. The communication control unit 254 controls the second communication unit 24 to control the communication between the information processing device 10 and the controller 20.
  • FIG. 4 is a diagram for explaining how to use the information processing apparatus according to the embodiment and the controller.
  • the information processing apparatus 10 is attached to the head of the user U1.
  • The user U1 uses the information processing device 10 in, for example, the hospital or the facility where he or she is staying.
  • For example, one controller 20 is gripped in each of the left and right hands of the user U1.
  • On the display unit 11, the captured image IM1 captured by the imaging unit provided on the moving body 30 is displayed.
  • the user U1 can operate the moving body 30 by operating the controller 20 while visually recognizing the captured image IM1.
  • The captured image IM1 is, for example, an image taken at a place such as the home of the user U1.
  • the captured image IM1 includes a user U2 who is a family member of the user U1 and a chair C which is an obstacle.
  • the captured image IM1 includes an operation icon I1, an operation icon I2, and an operation icon I3.
  • the operation icon I1, the operation icon I2, and the operation icon I3 are icons used when operating the moving body 30.
  • When the operation icons I1, I2, and I3 are not distinguished from one another, they are referred to as operation icons I.
  • The operation icon I is, for example, an arrow icon, and the direction of the arrow indicates the traveling direction of the moving body 30.
  • It is preferable that only the operation icons I for directions in which movement is possible be displayed on the captured image IM1, based on the three-dimensional map generated by the map generation unit 396 of the places where the user U1 (that is, the moving body 30) can move.
  • The operation icons I are not limited to instructions for the moving direction and may include icons instructing a posture of the user U1, for example, standing, sitting, or lying down.
  • the captured video IM1 includes a left-hand icon LH and a right-hand icon RH.
  • The left-hand icon LH corresponds to the left hand of the user U1, and the right-hand icon RH corresponds to the right hand of the user U1.
  • the left hand icon LH in the captured image IM1 moves according to the movement of the left hand.
  • the right hand icon RH in the captured image IM1 moves according to the movement of the right hand. For example, if the left hand is moved upward while the controller 20 is held in the left hand, the left hand icon LH in the captured image IM1 moves upward.
  • The user U1 can move the moving body 30 by moving the left-hand icon LH or the right-hand icon RH and selecting an operation icon I in the captured video IM1. For example, when the operation icon I1 is selected, the moving body 30 moves forward; when the operation icon I2 is selected, it moves diagonally forward to the left; and when the operation icon I3 is selected, it moves diagonally forward to the right. That is, the user U1 can move the moving body 30 in a desired direction by moving the left or right hand holding the controller 20 and selecting the operation icon I corresponding to that direction.
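  • A minimal sketch of this selection logic, assuming normalized screen coordinates, a circular hit area, and illustrative icon positions (none of which are specified in the disclosure):

```python
# Hypothetical layout: icon positions, hit radius, and command names are
# assumptions for illustration only.
OPERATION_ICONS = {
    "I1": {"pos": (0.50, 0.30), "direction": "forward"},
    "I2": {"pos": (0.35, 0.35), "direction": "forward_left"},
    "I3": {"pos": (0.65, 0.35), "direction": "forward_right"},
}
HIT_RADIUS = 0.05  # normalized screen units

def selected_direction(hand_icon_pos: tuple[float, float]) -> str | None:
    """Return the traveling direction of the operation icon that the hand
    icon (LH or RH) overlaps, or None if no icon is selected."""
    hx, hy = hand_icon_pos
    for icon in OPERATION_ICONS.values():
        ix, iy = icon["pos"]
        if (hx - ix) ** 2 + (hy - iy) ** 2 <= HIT_RADIUS ** 2:
            return icon["direction"]
    return None
```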
  • FIG. 5 is a block diagram showing a configuration example of the moving body according to the embodiment.
  • The mobile body 30 includes an image pickup unit 31, a drive unit 32, an operation unit 33, a sensor 34, a storage unit 35, an AR signal transmission unit 36, a first communication unit 37, a second communication unit 38, and a control unit 39.
  • The image pickup unit 31, the drive unit 32, the operation unit 33, the sensor 34, the storage unit 35, the AR signal transmission unit 36, the first communication unit 37, the second communication unit 38, and the control unit 39 are each connected by a bus B5.
  • the mobile body 30 is, for example, a drone remotely controlled by the user U1. In the following, the mobile body 30 will be described as being a drone, but the mobile body 30 may be another robot.
  • the image pickup unit 31 captures an image around the moving body 30.
  • the image pickup unit 31 captures, for example, an image of the moving body 30 in the traveling direction.
  • the image pickup unit 31 includes an image pickup element (not shown), a circuit for generating video data based on the output of the image pickup element, and the like.
  • Examples of the image pickup element include, but are not limited to, a CMOS (Complementary Metal Oxide Semiconductor) image sensor and a CCD (Charge Coupled Device).
  • the image pickup unit 31 is provided in, for example, a housing constituting the moving body 30.
  • the image pickup unit 31 is provided, for example, to be rotationally driven with respect to the moving body 30 by an actuator.
  • the image pickup unit 31 is provided on the moving body 30 via, for example, a gimbal (not shown).
  • the drive unit 32 drives each part of the moving body 30.
  • the drive unit 32 includes various drive sources including a motor for moving the moving body 30 and an actuator for rotationally driving the image pickup unit 31 with respect to the moving body 30.
  • the operation unit 33 receives various operations on the moving body 30.
  • the operation unit 33 accepts, for example, an operation of turning the power of the moving body 30 on and off.
  • Examples of the operation unit 33 include buttons, switches, and the like, but the operation unit 33 is not limited thereto.
  • the operation unit 33 is arranged, for example, in a housing constituting the mobile body 30.
  • the sensor 34 includes a sensor that detects the distance between the moving body 30 and the ground. In the case of indoors, the ground refers to the floor surface.
  • The sensor 34 also detects the distance between the moving body 30 and surrounding obstacles. Examples of the sensor 34 include, but are not limited to, a laser radar (for example, LIDAR: Laser Imaging Detection and Ranging), an infrared sensor including an infrared irradiation unit and a light receiving sensor, and a ToF (Time of Flight) sensor.
  • the sensor 34 may include other sensors.
  • the sensor 34 includes a sensor that detects the inclination of the moving body 30.
  • Examples of such a sensor 34 include, but are not limited to, a gyro sensor and the like.
  • the storage unit 35 stores various information.
  • the storage unit 35 stores, for example, user information about a user who uses the information processing device 10.
  • the user information includes various physical information including the height of the user and the height of the line of sight.
  • the storage unit 35 can be realized by, for example, a semiconductor memory element such as a RAM or a flash memory, or a storage device such as a hard disk or a solid state drive.
  • The AR signal transmission unit 36 transmits to the terminal device 40 an AR signal for displaying an augmented reality image superimposed on an image of the real space.
  • the AR signal transmission unit 36 transmits, for example, an AR signal for displaying the avatar of the user U1 superimposed on the moving body 30 photographed and displayed by the terminal device 40 to the terminal device 40.
  • the AR signal may include information about the facial expression and posture of the user who uses the information processing device 10 detected by the information processing device 10.
  • The first communication unit 37 is a long-distance wireless communication unit that performs long-distance wireless communication. Specifically, the first communication unit 37 is communicably connected to the information processing device 10 and the controller 20 via the network N1.
  • the second communication unit 38 is a short-range wireless communication unit that performs short-range wireless communication. Specifically, the second communication unit 38 is communicably connected to the terminal device 40 via the network N3. That is, the mobile body 30 uses the second communication unit 38 to perform short-range wireless communication with the terminal device 40.
  • the control unit 39 is realized by, for example, a CPU, an MPU, or the like executing a program stored in a storage unit (not shown) using a RAM or the like as a work area. Further, the control unit 39 may be realized by an integrated circuit such as an ASIC or FPGA. The control unit 39 may be realized by a combination of software and hardware.
  • The control unit 39 includes an image pickup control unit 391, a video data acquisition unit 392, a drive control unit 393, an output control unit 394, a distance calculation unit 395, a map generation unit 396, an AR signal control unit 397, and a communication control unit 398.
  • The image pickup control unit 391, the video data acquisition unit 392, the drive control unit 393, the output control unit 394, the distance calculation unit 395, the map generation unit 396, the AR signal control unit 397, and the communication control unit 398 are each connected by a bus B6.
  • the image pickup control unit 391 controls the image pickup unit 31.
  • the image pickup control unit 391 sets the image pickup conditions by the image pickup unit 31 and causes the image pickup unit 31 to perform image pickup.
  • the video data acquisition unit 392 acquires video data.
  • the video data acquisition unit 392 acquires video data around the moving body 30 from the image pickup unit 31.
  • the video data acquisition unit 392 acquires, for example, video data in front of the moving body 30 from the image pickup unit 31.
  • the drive control unit 393 controls the drive of each unit of the moving body 30.
  • The drive control unit 393 drives the moving body 30 at a height such that the image captured by the image pickup unit 31 matches the scenery seen from the user's line of sight when walking, according to the user information stored in the storage unit 35. Specifically, when the moving body 30 is a drone, the drive control unit 393 flies the moving body 30 at an altitude at which the image captured by the image pickup unit 31 matches that scenery. In this case, the drive control unit 393 flies the moving body 30 so as to keep the altitude constant, based on the distance between the moving body 30 and the ground calculated by the distance calculation unit 395.
  • Here, "matching" the scenery seen from the user's line of sight when walking is not limited to a perfect match and may include falling within a predetermined range.
  • Likewise, a "constant" altitude is not limited to exactly the same altitude and may include altitudes within a predetermined range.
  • As with the display described above, the moving body 30 may be driven so that the captured image corresponds to the scenery seen by the user in postures other than walking.
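  • The disclosure only requires that the altitude stay within a predetermined range of the user's line-of-sight height; a minimal proportional altitude-hold loop, assuming a downward distance-sensor reading and a vertical-velocity interface (both assumptions), might look like this:

```python
def altitude_command(eye_height_m: float, measured_altitude_m: float,
                     tolerance_m: float = 0.05, gain: float = 1.0) -> float:
    """Return a vertical velocity command (m/s) that keeps the drone's camera
    near the user's line-of-sight height. Within the tolerance band the
    altitude is treated as 'constant' and no correction is issued."""
    error = eye_height_m - measured_altitude_m  # positive means climb
    if abs(error) <= tolerance_m:
        return 0.0
    return gain * error  # simple proportional correction
```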
  • the drive control unit 393 drives the drive unit 32 to move the moving body 30 according to the control signal from the information processing device 10.
  • the drive control unit 393 drives the drive unit 32 according to the control signal from the information processing device 10 to change the direction in which the image pickup unit 31 faces. For example, when the user's head faces the right rear direction, the drive control unit 393 drives the drive unit 32 to change the direction of the image pickup unit 31 to the right.
  • The drive control unit 393 may change the direction of the image pickup unit 31 by changing the direction of the moving body 30 itself, or by driving the gimbal on which the image pickup unit 31 is mounted.
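  • One plausible way to realize this choice (the gimbal range and the command interface below are assumptions, not taken from the disclosure) is to absorb small head-yaw deviations with the gimbal and rotate the drone body only for larger ones:

```python
def camera_yaw_command(head_yaw_deg: float, camera_yaw_deg: float,
                       gimbal_range_deg: float = 90.0) -> tuple[str, float]:
    """Decide how to follow the user's head: small deviations are handled by
    the gimbal; beyond its range the body of the drone is rotated instead.
    Returns (actuator, signed yaw correction in degrees)."""
    # Wrap the difference to [-180, 180) so the shortest turn is chosen.
    error = (head_yaw_deg - camera_yaw_deg + 180.0) % 360.0 - 180.0
    if abs(error) <= gimbal_range_deg:
        return ("gimbal", error)
    return ("rotate_body", error)
```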
  • the output control unit 394 outputs various information.
  • the output control unit 394 outputs, for example, the video data acquired by the video data acquisition unit 392 to the information processing apparatus 10 via the first communication unit 37.
  • the output control unit 394 stores, for example, the user information acquired via the first communication unit 37 in the storage unit 35.
  • the distance calculation unit 395 calculates the distance between the moving body 30 and various objects.
  • the distance calculation unit 395 calculates the distance between the moving body 30 and the ground, for example, based on the detection result by the sensor 34.
  • the distance calculation unit 395 calculates the distance between the moving body 30 and the obstacle based on the detection result by the sensor 34.
  • the map generation unit 396 generates a map of the space in which the moving body 30 moves.
  • For example, when the space in which the moving body 30 moves is a room, the map generation unit 396 generates a map containing the size and shape of the room and obstacles, such as pillars and large furniture, that hinder the movement of the moving body 30 or that would hinder the user walking around the room.
  • The map generation unit 396 creates a three-dimensional map of the room using photogrammetry, based on video data of the entire room taken by the image pickup unit 31. The three-dimensional map makes it possible to avoid collisions with obstacles and to avoid unnatural movements such as passing over a desk or other place through which the user would not actually move.
  • the map generator 396 generates a three-dimensional map that includes information about where the user can move.
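  • A minimal stand-in for such a map, assuming an occupancy grid rather than the photogrammetric mesh the disclosure describes, with a query that could feed the display of movable-direction operation icons:

```python
import numpy as np

class MovableMap:
    """Occupancy-grid sketch of the three-dimensional map. True marks a cell
    the moving body can occupy; cell size and this API are assumptions."""

    def __init__(self, free: np.ndarray, cell_m: float = 0.10):
        self.free = free      # bool array indexed as [x, y, z]
        self.cell_m = cell_m

    def movable_directions(self, pos: tuple[int, int, int]) -> list[str]:
        """Return the horizontal directions whose neighboring cell is free,
        e.g. to decide which operation icons I to display."""
        x, y, z = pos
        steps = {"forward": (0, 1), "back": (0, -1),
                 "left": (-1, 0), "right": (1, 0)}
        out = []
        for name, (dx, dy) in steps.items():
            nx, ny = x + dx, y + dy
            if (0 <= nx < self.free.shape[0] and 0 <= ny < self.free.shape[1]
                    and self.free[nx, ny, z]):
                out.append(name)
        return out
```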
  • the AR signal control unit 397 controls the operation of the AR signal transmission unit 36.
  • the AR signal control unit 397 controls the AR signal transmission unit 36 to transmit an AR signal to the terminal device 40.
  • the communication control unit 398 controls communication between the mobile body 30 and an external device. Specifically, the communication control unit 398 controls the first communication unit 37 to control the communication between the mobile body 30 and the information processing device 10. The communication control unit 398 controls the first communication unit 37 to control the communication between the mobile body 30 and the controller 20. The communication control unit 398 controls the second communication unit 38 to control the communication between the mobile body 30 and the terminal device 40.
  • FIG. 6 is a block diagram showing a configuration example of the terminal device according to the embodiment.
  • The terminal device 40 includes an image pickup unit 41, a display unit 42, a microphone 43, a speaker 44, an operation unit 45, a storage unit 46, an AR signal reception unit 47, a first communication unit 48, a second communication unit 49, and a control unit 50.
  • The image pickup unit 41, the display unit 42, the microphone 43, the speaker 44, the operation unit 45, the storage unit 46, the AR signal reception unit 47, the first communication unit 48, the second communication unit 49, and the control unit 50 are each connected by a bus B7.
  • the image pickup unit 41 captures an image around the terminal device 40.
  • the image pickup unit 41 includes an image pickup element (not shown), a circuit for generating video data based on the output of the image pickup element, and the like.
  • Examples of the image pickup element include, but are not limited to, a CMOS image sensor and a CCD.
  • the display unit 42 displays various images.
  • the display unit 42 displays, for example, video data captured by the image pickup unit 41.
  • the display unit 42 is a display including, for example, a liquid crystal display (LCD: Liquid Crystal Display) or an organic EL (Electro-Luminescence) display.
  • the microphone 43 collects the voice spoken by the user.
  • the microphone 43 is arranged, for example, in a housing constituting the terminal device 40.
  • the microphone 43 outputs voice data related to the picked-up voice to the voice data acquisition unit 503.
  • the speaker 44 outputs audio.
  • the speaker 44 outputs voice based on voice data related to voice spoken by the user of the information processing device 10 acquired from the information processing device 10.
  • the speaker 44 is arranged, for example, in a housing constituting the terminal device 40.
  • the operation unit 45 receives various operations on the terminal device 40.
  • the operation unit 45 accepts, for example, an operation for starting communication and an operation for ending communication. Examples of the operation unit 45 include buttons, switches, and a touch panel, but the operation unit 45 is not limited thereto.
  • the operation unit 45 is arranged, for example, in a housing constituting the terminal device 40.
  • the operation unit 45 outputs an operation signal related to the received operation to the operation control unit 507.
  • the storage unit 46 stores various types of information.
  • the storage unit 46 stores, for example, user information about a user who uses the information processing device 10.
  • the user information includes various physical information including the height of the user and the height of the line of sight.
  • the storage unit 46 can be realized by, for example, a semiconductor memory element such as a RAM or a flash memory, or a storage device such as a hard disk or a solid state drive.
  • The AR signal receiving unit 47 receives, from the moving body 30, an AR signal for displaying an augmented reality image superimposed on an image of the real space.
  • the AR signal receiving unit 47 receives, for example, an AR signal from the moving body 30 for displaying the avatar of the user U1 superimposed on the moving body 30 photographed and displayed by the terminal device 40.
  • The first communication unit 48 is a long-distance wireless communication unit that performs long-distance wireless communication. Specifically, the first communication unit 48 is communicably connected to the information processing apparatus 10 via the network N1.
  • the second communication unit 49 is a short-range wireless communication unit that performs short-range wireless communication. Specifically, the second communication unit 49 is communicably connected to the mobile body 30 via the network N3.
  • the control unit 50 is realized by, for example, a CPU, an MPU, or the like executing a program stored in a storage unit (not shown) using a RAM or the like as a work area. Further, the control unit 50 may be realized by an integrated circuit such as an ASIC or FPGA. The control unit 50 may be realized by a combination of software and hardware.
  • The control unit 50 includes an image pickup control unit 501, a video data acquisition unit 502, an audio data acquisition unit 503, a display control unit 504, a relative distance calculation unit 505, an avatar generation unit 506, an operation control unit 507, an output control unit 508, an AR signal control unit 509, and a communication control unit 510. These units are each connected by a bus B8.
  • the image pickup control unit 501 controls the image pickup unit 41.
  • the image pickup control unit 501 sets the image pickup conditions by the image pickup unit 41, and causes the image pickup unit 41 to perform image pickup.
  • the video data acquisition unit 502 acquires video data.
  • the video data acquisition unit 502 acquires video data around the terminal device 40 from the image pickup unit 41.
  • the voice data acquisition unit 503 acquires voice data.
  • the voice data acquisition unit 503 acquires voice data related to the voice spoken by the user from the microphone 43.
  • The voice data acquisition unit 503 acquires voice data related to the voice spoken by the user of the information processing device 10, from the information processing device 10 via the first communication unit 48.
  • the display control unit 504 causes the display unit 42 to display an image.
  • the display control unit 504 causes the display unit 42 to display, for example, an image related to the image data acquired by the image data acquisition unit 502.
  • the display control unit 504 superimposes and displays the user's avatar of the information processing device 10 generated by the avatar generation unit 506 on the moving body 30 included in the video displayed on the display unit 42.
  • the relative distance calculation unit 505 calculates the relative distance between the terminal device 40 and the moving body 30.
  • the relative distance calculation unit 505 calculates the relative distance between the terminal device 40 and the moving body 30 based on the ratio of the area occupied by the moving body 30 in the image displayed on the display unit 42, for example.
  • The relative distance calculation unit 505 may instead calculate the relative distance between the terminal device 40 and the mobile body 30 based on, for example, the radio wave intensity of the communication between the terminal device 40 and the mobile body 30.
  • When the terminal device 40 includes a distance measuring sensor (not shown) such as a ToF sensor, the relative distance calculation unit 505 may calculate the relative distance between the terminal device 40 and the moving body 30 based on the detection result of that distance measuring sensor.
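  • For the area-ratio method, a pinhole-camera sketch makes the relationship concrete; the drone width, focal length, and square-silhouette simplification below are illustrative assumptions:

```python
import math

def relative_distance_m(area_ratio: float, drone_width_m: float = 0.30,
                        image_width_px: int = 1920, image_height_px: int = 1080,
                        focal_px: float = 1500.0) -> float:
    """Estimate camera-to-drone distance from the fraction of the frame the
    drone occupies, under a pinhole-camera model: apparent width w = f * W / Z,
    so Z = f * W / w. A square silhouette of known width W is assumed."""
    area_px = area_ratio * image_width_px * image_height_px
    apparent_width_px = math.sqrt(area_px)  # side of the equivalent square
    return focal_px * drone_width_m / apparent_width_px
```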
  • the avatar generation unit 506 generates an avatar to be superimposed and displayed on the moving body 30 on the display unit 42.
  • the avatar generation unit 506 generates, for example, an avatar of a user who uses the information processing device 10 based on the user information stored in the storage unit 46.
  • the avatar generation unit 506 changes the size of the avatar according to the distance calculated by the relative distance calculation unit 505, for example.
  • the avatar generation unit 506 changes the facial expression and posture of the avatar, for example, based on the AR signal acquired by the AR signal control unit 509.
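  • A minimal sketch of both behaviors, assuming hypothetical AR-signal keys (the disclosure only says the signal may carry posture and facial-expression information):

```python
def update_avatar(base_height_px: float, base_distance_m: float,
                  distance_m: float, ar_signal: dict) -> dict:
    """Scale the avatar inversely with distance (a nearer drone gets a larger
    avatar) and copy pose and expression fields from the AR signal. The
    'posture' and 'expression' keys are assumptions for illustration."""
    scale = base_distance_m / max(distance_m, 0.1)  # avoid divide-by-zero
    return {
        "height_px": base_height_px * scale,
        "posture": ar_signal.get("posture", "standing"),
        "expression": ar_signal.get("expression", "neutral"),
    }
```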
  • the operation control unit 507 acquires the operation information related to the operation received by the operation unit 45.
  • the operation control unit 507 acquires, for example, operation information for starting and ending communication with the information processing device 10 and the mobile body 30.
  • the operation control unit 507 acquires, for example, operation information for generating an avatar of a user of the information processing apparatus 10.
  • the operation control unit 507 outputs a control signal corresponding to the acquired operation information to control the operation of the terminal device 40.
  • the output control unit 508 outputs sound from the speaker 44.
  • the output control unit 508 outputs, for example, the voice spoken by the user of the information processing device 10 acquired from the information processing device 10 by the voice data acquisition unit 503 via the first communication unit 48.
  • the output control unit 508 outputs, for example, the voice data acquired from the microphone 43 by the voice data acquisition unit 503 to the information processing apparatus 10 via the first communication unit 48.
  • the AR signal control unit 509 controls the AR signal reception unit 47.
  • The AR signal control unit 509 receives the AR signal from the mobile body 30 by using the AR signal reception unit 47.
  • the communication control unit 510 controls communication between the terminal device 40 and an external device. Specifically, the communication control unit 510 controls the first communication unit 48 to control the communication between the terminal device 40 and the information processing device 10. The communication control unit 510 controls the second communication unit 49 to control the communication between the terminal device 40 and the mobile body 30.
  • FIG. 7 is a diagram for explaining how to use the mobile body according to the embodiment and the terminal device.
  • the mobile body 30 is used by a user U2 such as a family member of the user U1.
  • the user U2 uses the mobile body 30 in, for example, a room at home.
  • The moving body 30 generates, in advance of use, a three-dimensional map showing the layout of the room. Specifically, the moving body 30 ascends while keeping level, and the imaging unit 31 photographs all horizontal directions over an elevation angle range of 0° to 90°.
  • The moving body 30 generates a three-dimensional map containing information about obstacles, such as the chair C, that hinder its movement.
  • The moving body 30 adjusts the height of the image pickup unit 31 above the ground so as to match the line-of-sight height of the user U1 when walking.
  • the moving body 30 may be driven so that the image corresponds to the scenery seen by the user.
  • the mobile body 30 moves or changes its orientation according to the operation of the user U1 who uses the information processing device 10.
  • the moving body 30 moves within the movable range shown in the map generated by the map generation unit 396 according to the operation of the controller 20 by the user U1.
  • the mobile body 30 changes its orientation based on the posture of the head of the user U1 who has the information processing device 10 attached to the head.
  • the mobile body 30 may change the orientation of the image pickup unit 31 based on the posture of the head of the user U1 who has the information processing device 10 attached to the head.
  • the user U2 can talk to the user U1 who uses the information processing device 10 by using the terminal device 40.
  • the mobile body 30 photographs the user U2 using the image pickup unit 31, and outputs the photographed data to the information processing apparatus 10.
  • The information processing apparatus 10 displays the image including the user U2. Therefore, the user U1 can have the experience of returning home and having a conversation with the user U2 while staying at a hospital or facility. Further, the mobile body 30 outputs an AR signal including the posture information of the user U1 to the terminal device 40.
  • the user U2 takes a picture of the mobile body 30 with the terminal device 40 when talking to the user U1.
  • The terminal device 40 generates an avatar A1, which is an avatar of the user U1, based on the AR signal, and displays the image with the avatar A1 superimposed on the moving body 30.
  • The user U2 can thus talk with the user U1, who is staying in a hospital or facility, as if they were talking at home.
  • FIG. 8 is a flowchart showing an example of the processing flow of the information processing system according to the embodiment.
  • the moving body 30 generates a map including obstacles existing in the room based on the video data obtained by photographing the room (step S10).
  • The information processing device 10 outputs the user information of the user wearing the information processing device 10 to the mobile body 30 (step S11). Based on the user information, the mobile body 30 adjusts its altitude so that the image captured by the image pickup unit 31 matches the view seen from the line-of-sight height of the user who uses the information processing device 10 (step S12).
  • the moving body 30 uses the imaging unit 31 to photograph the environment in the traveling direction (step S13).
  • the mobile body 30 outputs the video data obtained by shooting with the image pickup unit 31 to the information processing device 10 (step S14).
  • the information processing apparatus 10 displays the video related to the video data received from the mobile body 30 and the operation icon I superimposed on the video on the display unit 11 (step S15).
  • the information processing device 10 acquires posture information regarding the posture of the user who uses the information processing device 10 from the sensor 15 (step S16).
  • the information processing device 10 outputs the acquired posture information to the moving body 30 (step S17).
  • the mobile body 30 changes its posture according to the posture information received from the information processing apparatus 10 (step S18).
  • the controller 20 acquires posture information for moving the moving body 30 (step S19).
  • the controller 20 outputs the acquired posture information to the moving body 30 as the moving body control information (step S20).
  • the moving body 30 moves according to the moving body control information received from the controller 20 (step S21).
  • the terminal device 40 uses the image pickup unit 41 to capture an image including the moving body 30 (step S22).
  • the terminal device 40 displays the image obtained by shooting with the image pickup unit 41 on the display unit 42 (step S23).
  • the mobile body 30 outputs an AR signal to the terminal device 40 (step S24).
  • the terminal device 40 generates an avatar of a user who uses the information processing device 10 based on the AR signal received from the mobile body 30 (step S25).
  • the terminal device 40 superimposes and displays the avatar of the user of the information processing device 10 on the moving body 30 displayed on the display unit 42 (step S26).
  • the information processing device 10 and the terminal device 40 transmit and receive voice data (step S27). As a result, a call is started between the user of the information processing device 10 and the user of the terminal device 40.
  • FIG. 9 is a flowchart showing an example of the flow of processing performed by the information processing device for the moving body according to the embodiment.
  • The output control unit 196 outputs the user information about the user who uses the information processing device 10 to the moving body 30 via the first communication unit 17 (step S100).
  • The video data acquisition unit 191 acquires, via the first communication unit 17, video data from the moving body 30 whose altitude has been adjusted according to the user information (step S110).
  • The display control unit 194 causes the display unit 11 to display the video related to the video data acquired by the video data acquisition unit 191 (step S120).
  • The posture information acquisition unit 193 acquires, from the sensor 15, posture information regarding the posture of the user who uses the information processing device 10 (step S130).
  • The output control unit 196 outputs the posture information acquired by the posture information acquisition unit 193 to the moving body 30 via the first communication unit 17 (step S140).
  • The control unit 19 determines whether or not to end the process (step S150). When the control unit 19 receives an operation to end the process, an operation to turn off the power, or the like, it determines that the process is to be ended. If it is determined not to end the process (step S150; No), the process returns to step S110. If it is determined to end the process (step S150; Yes), the process of FIG. 9 ends.
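As a rough illustration of the FIG. 9 loop, the following minimal sketch shows the order of steps S100 to S150 on the information processing device 10 side. The comm, sensor, and display objects are assumed stand-ins for the first communication unit 17, the sensor 15, and the display unit 11; their method names are inventions for illustration.

```python
import time

def device10_moving_body_loop(comm, sensor, display, user_info, should_stop):
    """Sketch of the FIG. 9 loop on the information processing device 10."""
    comm.send_user_info(user_info)          # step S100
    while not should_stop():                # step S150
        video = comm.receive_video()        # step S110: altitude-adjusted video
        display.show(video)                 # step S120
        posture = sensor.read_posture()     # step S130
        comm.send_posture(posture)          # step S140
        time.sleep(1 / 30)                  # pacing to a frame rate (assumption)
```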
  • FIG. 10 is a flowchart showing an example of the flow of processing performed by the information processing device for the terminal device according to the embodiment.
  • The voice data acquisition unit 192 acquires, from the microphone 12, voice data related to the voice spoken by the user who uses the information processing device 10 (step S200).
  • The output control unit 196 outputs the voice data acquired from the microphone 12 by the voice data acquisition unit 192 to the terminal device 40 via the first communication unit 17 (step S210).
  • The voice data acquisition unit 192 acquires, from the terminal device 40 via the first communication unit 17, voice data related to the voice spoken by the user of the terminal device 40 (step S220).
  • The output control unit 196 outputs, from the speaker 13, the voice related to the voice data acquired from the terminal device 40 by the voice data acquisition unit 192 (step S230).
  • The control unit 19 determines whether or not to end the process (step S240). When the control unit 19 receives an operation to end the process, an operation to turn off the power, or the like, it determines that the process is to be ended. If it is determined not to end the process (step S240; No), the process returns to step S200. If it is determined to end the process (step S240; Yes), the process of FIG. 10 ends.
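The FIG. 10 flow is a symmetric voice exchange. A minimal sketch follows, assuming mic, speaker, and comm stand in for the microphone 12, the speaker 13, and the first communication unit 17; the interfaces are assumptions.

```python
def device10_voice_loop(mic, speaker, comm, should_stop):
    """Sketch of the FIG. 10 loop: bidirectional voice with the terminal device 40."""
    while not should_stop():                 # step S240
        comm.send_voice(mic.read_frame())    # steps S200 and S210
        speaker.play(comm.receive_voice())   # steps S220 and S230
```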
  • FIG. 11 is a flowchart showing an example of the flow of processing performed by the controller for the moving body according to the embodiment.
  • The posture information acquisition unit 252 acquires, from the sensor 22, the position information of the left hand or the right hand of the user holding the controller 20 (step S300).
  • The moving body control unit 253 outputs the position information acquired by the posture information acquisition unit 252 to the moving body 30 via the first communication unit 23 as the moving body control information for moving the moving body 30 (step S310).
  • The control unit 25 determines whether or not to end the process (step S320). When the control unit 25 receives an operation to end the process, an operation to turn off the power, or the like, it determines that the process is to be ended. If it is determined not to end the process (step S320; No), the process returns to step S300. If it is determined to end the process (step S320; Yes), the process of FIG. 11 ends.
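The FIG. 11 flow is the shortest loop in the system. A minimal sketch, assuming sensor stands in for the sensor 22 (hand position) and comm for the first communication unit 23, both being invented interfaces:

```python
def controller20_loop(sensor, comm, should_stop):
    """Sketch of the FIG. 11 loop on the controller 20."""
    while not should_stop():                # step S320
        hand = sensor.read_hand_position()  # step S300
        comm.send_control_info(hand)        # step S310: moving body control information
```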
  • FIG. 12 is a flowchart showing an example of the flow of processing of the moving body according to the embodiment.
  • The imaging control unit 391 uses the imaging unit 31 to photograph the environment around the moving body 30 (step S400).
  • The video data acquisition unit 392 acquires, from the imaging unit 31, video data related to the video around the moving body 30 (step S410).
  • The map generation unit 396 generates a map including information on obstacles based on the video data around the moving body 30 acquired by the video data acquisition unit 392 (step S420).
  • The drive control unit 393 acquires, via the first communication unit 37, user information about the user who uses the information processing device 10 (step S430). Based on the acquired user information, the drive control unit 393 adjusts the altitude so that the video captured by the imaging unit 31 matches the view seen from the user's line of sight when walking (step S440).
  • The imaging control unit 391 uses the imaging unit 31 to photograph the environment in the traveling direction of the moving body 30 (step S450).
  • The output control unit 394 outputs video data related to the video in the traveling direction of the moving body 30 to the information processing device 10 via the first communication unit 37 (step S460).
  • The drive control unit 393 determines whether or not posture information has been acquired from the information processing device 10 via the first communication unit 37 (step S470). When it is determined that the posture information has been acquired (step S470; Yes), the drive control unit 393 changes the posture of the moving body 30 according to the posture information (step S480). If it is determined that the posture information has not been acquired (step S470; No), the process proceeds to step S490.
  • The drive control unit 393 determines whether or not moving body control information has been acquired from the controller 20 via the first communication unit 37 (step S490). When it is determined that the moving body control information has been acquired (step S490; Yes), the drive control unit 393 moves the moving body 30 according to the moving body control information (step S500). If it is determined that the moving body control information has not been acquired (step S490; No), the process proceeds to step S510.
  • The AR signal control unit 397 transmits an AR signal to the terminal device 40 (step S510).
  • The control unit 39 determines whether or not to end the process (step S520). When the control unit 39 receives an operation to end the process, an operation to turn off the power, or the like, it determines that the process is to be ended. If it is determined not to end the process (step S520; No), the process returns to step S450. If it is determined to end the process (step S520; Yes), the process of FIG. 12 ends.
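The FIG. 12 flow combines a one-time setup (map generation and altitude adjustment) with a steady-state loop. A minimal sketch follows, assuming camera, drive, comm, and ar_tx stand in for the imaging unit 31, the drive unit 32, the first communication unit 37, and the AR signal transmission unit 36; all interfaces and the polling style are assumptions.

```python
def moving_body30_loop(camera, drive, comm, ar_tx, should_stop):
    """Sketch of the FIG. 12 loop on the moving body 30."""
    obstacle_map = build_map(camera.capture())      # steps S400 to S420
    user_info = comm.receive_user_info()            # step S430
    drive.set_altitude(user_info.eye_height_cm)     # step S440
    while not should_stop():                        # step S520
        comm.send_video(camera.capture())           # steps S450 and S460
        posture = comm.poll_posture()               # step S470
        if posture is not None:
            drive.apply_posture(posture)            # step S480
        control = comm.poll_control_info()          # step S490
        if control is not None:
            drive.move(control)                     # step S500
        ar_tx.send_ar_signal()                      # step S510

def build_map(frame):
    # Placeholder for the map generation unit 396; the map construction
    # method is not specified in this part of the publication.
    return {}
```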
  • FIG. 13 is a flowchart showing an example of the processing flow of the terminal device according to the embodiment.
  • The imaging control unit 501 photographs the moving body 30 using the imaging unit 41 (step S600).
  • The display control unit 504 displays, on the display unit 42, the video related to the video data obtained by photographing the moving body 30 (step S610).
  • The AR signal control unit 509 determines whether or not an AR signal has been received from the moving body 30 (step S620). When it is determined that an AR signal has been received (step S620; Yes), the avatar generation unit 506 generates an avatar of the user of the information processing device 10 based on the AR signal (step S630). If it is determined that an AR signal has not been received (step S620; No), the process proceeds to step S650. After step S630, the display control unit 504 superimposes and displays the avatar generated by the avatar generation unit 506 on the moving body 30 displayed on the display unit 42 (step S640).
  • The voice data acquisition unit 503 acquires, from the information processing device 10 via the first communication unit 48, voice data related to the voice spoken by the user of the information processing device 10 (step S650).
  • The output control unit 508 outputs, from the speaker 44, the voice related to the voice data acquired from the information processing device 10 by the voice data acquisition unit 503 (step S660).
  • The voice data acquisition unit 503 acquires, from the microphone 43, voice data related to the voice spoken by the user of the terminal device 40 (step S670).
  • The output control unit 508 outputs the voice data acquired from the microphone 43 by the voice data acquisition unit 503 to the information processing device 10 via the first communication unit 48 (step S680).
  • The control unit 50 determines whether or not to end the process (step S690). When the control unit 50 receives an operation to end the process, an operation to turn off the power, or the like, it determines that the process is to be ended. If it is determined not to end the process (step S690; No), the process returns to step S600. If it is determined to end the process (step S690; Yes), the process of FIG. 13 ends.
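The FIG. 13 flow interleaves camera display, conditional AR overlay, and voice exchange. A minimal sketch, assuming camera, display, ar_rx, mic, speaker, and comm stand in for the imaging unit 41, the display unit 42, the AR signal reception unit 47, the microphone 43, the speaker 44, and the first communication unit 48; the avatar and overlay helpers are placeholders.

```python
def terminal40_loop(camera, display, ar_rx, mic, speaker, comm, should_stop):
    """Sketch of the FIG. 13 loop on the terminal device 40."""
    while not should_stop():                        # step S690
        frame = camera.capture()                    # step S600
        display.show(frame)                         # step S610
        ar_signal = ar_rx.poll_ar_signal()          # step S620
        if ar_signal is not None:
            avatar = make_avatar(ar_signal)         # step S630
            display.show(overlay(frame, avatar))    # step S640
        speaker.play(comm.receive_voice())          # steps S650 and S660
        comm.send_voice(mic.read_frame())           # steps S670 and S680

def make_avatar(ar_signal):
    # Placeholder for the avatar generation unit 506.
    return object()

def overlay(frame, avatar):
    # Placeholder: superimpose the avatar on the moving body 30 in the frame.
    return frame
```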
  • As described above, the user who is admitted to the hospital or the facility operates the drone while viewing, on the head-mounted display, the video that is taken by the camera provided on the drone and that matches the user's line of sight when walking. As a result, the user who is admitted to the hospital or the facility can experience the sensation of returning home in a pseudo manner.
  • The present disclosure is not limited to this. The present embodiment may also be used outdoors, as long as the moving body 30 can generate in advance a map including information on obstacles.
  • The present disclosure is not limited by the contents of these embodiments.
  • The above-mentioned components include those that can be easily assumed by those skilled in the art and those that are substantially the same, that is, those within a so-called range of equivalents.
  • The components described above can be combined as appropriate. Further, various omissions, replacements, or changes of the components can be made without departing from the gist of the above-described embodiment.
  • The information processing device, information processing system, information processing method, and program of the present embodiment can be used, for example, as a communication device that communicates with a user at a remote location by using a wearable device and a moving body such as a robot.
Reference Signs List
1 Information processing system
10 Information processing device
11, 42 Display unit
12, 43 Microphone
13, 44 Speaker
14, 21, 33, 45 Operation unit
15, 22, 34 Sensor
16, 35, 46 Storage unit
17, 23, 37, 48 First communication unit
18, 24, 38, 49 Second communication unit
19, 25, 39, 50 Control unit
20 Controller
30 Moving body
31, 41 Imaging unit
32 Drive unit
36 AR signal transmission unit
40 Terminal device
47 AR signal reception unit
191, 392, 502 Video data acquisition unit
192, 503 Voice data acquisition unit
193, 252 Posture information acquisition unit
194, 504 Display control unit
195, 251, 507 Operation control unit
196, 394, 508 Output control unit
197, 254, 398, 510 Communication control unit
391, 501 Imaging control unit
393 Drive control unit
395 Distance calculation unit
396 Map generation unit
397, 509 AR signal control unit
505 Relative distance calculation unit
506 Avatar generation unit

Abstract

This information processing device is provided with: a video data acquisition unit (191) which, following a user's operations in a first location, acquires video data of video from the user's line of sight, captured by an imaging unit provided on a moving body that can move through a second location; a display control unit (194) which displays the video data on a display unit; a posture information acquisition unit (193) which acquires posture information about the user's posture; and an operation control unit (195) which changes the posture of the moving body depending on the posture information acquired by the posture information acquisition unit (193).

Description

Technology for communicating with a user at a remote location by using video captured by a robot is known.
Japanese Unexamined Patent Publication No. 2019-168971
It is assumed that communication takes place between a patient who has been hospitalized for a long period, or an elderly person residing in a nursing care facility, and his or her family at home. In such communication, it is required that the patient or the elderly person can feel as if he or she were actually at home.
An information processing device according to one aspect of the present disclosure includes: a video data acquisition unit that acquires video data related to video from the line of sight of a user, captured by an imaging unit provided on a moving body that can move in a second place according to an operation of the user in a first place; a display control unit that causes a display unit to display the video data; a posture information acquisition unit that acquires posture information regarding the posture of the user; and an operation control unit that changes the posture of the moving body according to the posture information acquired by the posture information acquisition unit.
An information processing system according to one aspect of the present disclosure includes the information processing device according to one aspect of the present disclosure and a terminal device capable of communicating with the information processing device. The terminal device includes: a video data acquisition unit that acquires video of the moving body captured by an imaging unit; an avatar generation unit that generates an avatar of the user of the information processing device; and a display control unit that causes a display unit to display the video acquired by the video data acquisition unit and superimposes the avatar of the user of the information processing device on the moving body on the display unit.
An information processing method according to one aspect of the present disclosure includes: a step of acquiring video data related to video from the line of sight of a user, captured by an imaging unit provided on a moving body that can move in a second place according to an operation of the user in a first place; a step of causing a display unit to display the video data; a step of acquiring posture information regarding the posture of the user; and a step of changing the posture of the moving body according to the acquired posture information.
A program according to one aspect of the present disclosure causes a computer to execute: a step of acquiring video data related to video from the line of sight of a user, captured by an imaging unit provided on a moving body that can move in a second place according to an operation of the user in a first place; a step of causing a display unit to display the video data; a step of acquiring posture information regarding the posture of the user; and a step of changing the posture of the moving body according to the acquired posture information.
According to the present disclosure, it is possible to experience the feeling of being at home while being at a remote location.
FIG. 1 is a diagram for explaining a configuration example of an information processing system according to an embodiment.
FIG. 2 is a block diagram showing a configuration example of the information processing device according to the embodiment.
FIG. 3 is a diagram showing a configuration example of the controller according to the embodiment.
FIG. 4 is a diagram for explaining how to use the information processing device and the controller according to the embodiment.
FIG. 5 is a block diagram showing a configuration example of the moving body according to the embodiment.
FIG. 6 is a block diagram showing a configuration example of the terminal device according to the embodiment.
FIG. 7 is a diagram for explaining how to use the moving body and the terminal device according to the embodiment.
FIG. 8 is a flowchart showing an example of the processing flow of the information processing system according to the embodiment.
FIG. 9 is a flowchart showing an example of the flow of processing performed by the information processing device for the moving body according to the embodiment.
FIG. 10 is a flowchart showing an example of the flow of processing performed by the information processing device for the terminal device according to the embodiment.
FIG. 11 is a flowchart showing an example of the flow of processing performed by the controller for the moving body according to the embodiment.
FIG. 12 is a flowchart showing an example of the flow of processing of the moving body according to the embodiment.
FIG. 13 is a flowchart showing an example of the processing flow of the terminal device according to the embodiment.
Hereinafter, embodiments according to the present disclosure will be described in detail with reference to the attached drawings. The present disclosure is not limited by the embodiments, and when there are a plurality of embodiments, combinations of the embodiments are also included. In the following embodiments, the same parts are denoted by the same reference numerals, and duplicate descriptions are omitted.
[Information processing system]
The information processing system according to the embodiment will be described with reference to FIG. 1. FIG. 1 is a diagram showing a configuration example of the information processing system according to the embodiment.
As shown in FIG. 1, the information processing system 1 includes an information processing device 10, a controller 20, a moving body 30, and a terminal device 40. The information processing device 10, the controller 20, the moving body 30, and the terminal device 40 are communicably connected via a network N1. The network N1 is, for example, the Internet. The information processing device 10 and the controller 20 are communicably connected via a network N2. The moving body 30 and the terminal device 40 are communicably connected via a network N3. The networks N2 and N3 are, for example, networks using Bluetooth (registered trademark) or Wi-Fi (registered trademark).
The information processing device 10 and the controller 20 are used in a first place. Examples of the first place include, but are not limited to, hospitals and care facilities. The moving body 30 and the terminal device 40 are used in a second place different from the first place. The second place is exemplified by, but not limited to, the home. The user at the first place and the user at the second place can make a video call by using the information processing device 10 and the terminal device 40, respectively.
The information processing device 10 is exemplified by, but not limited to, a configuration using an HMD (Head Mounted Display) device that the user wears on the head. The controller 20 is an operating device for the moving body 30 that is held and used by the user. The controller 20 has, for example, a stick-like shape, but is not limited thereto. The user can control various operations, including the movement of the moving body 30, by operating the controller 20 while checking the video displayed on the information processing device 10. The moving body 30 is a robot that can be operated using the controller 20. The robot is exemplified by, but not limited to, a drone. The terminal device 40 is exemplified by, but not limited to, a smartphone, a tablet terminal, or an AR (Augmented Reality) device. The user in the second place makes a call with the user of the information processing device 10 while photographing the moving body 30 and displaying it on the display unit. At this time, the terminal device 40 superimposes an avatar image of the user of the information processing device 10 on the moving body 30 displayed on the display unit. As a result, the user of the information processing device 10 can, for example, experience a pseudo temporary return home while hospitalized, and the user of the terminal device 40 can have an experience as if the user of the information processing device 10 were at home.
[Information processing device]
The configuration of the information processing device according to the embodiment will be described with reference to FIG. 2. FIG. 2 is a block diagram showing a configuration example of the information processing device according to the embodiment.
As shown in FIG. 2, the information processing device 10 includes a display unit 11, a microphone 12, a speaker 13, an operation unit 14, a sensor 15, a storage unit 16, a first communication unit 17, a second communication unit 18, and a control unit 19, which are connected to one another by a bus B1.
The display unit 11 displays various videos. The display unit 11 is, for example, a head-mounted display, such as a VR (Virtual Reality) device or AR glasses. The display unit 11 is arranged at a position covering both eyes of the user while the device is worn on the user's head.
The microphone 12 picks up the voice spoken by the user. The microphone 12 is arranged, for example, in a housing constituting the information processing device 10. The microphone 12 outputs voice data related to the picked-up voice to the voice data acquisition unit 192.
The speaker 13 outputs voice. The speaker 13 outputs voice based on the voice data, acquired from the terminal device 40, related to the voice spoken by the user of the terminal device 40. The speaker 13 is arranged, for example, in a housing constituting the information processing device 10.
The operation unit 14 receives various operations on the information processing device 10, for example, an operation for starting communication and an operation for ending communication. The operation unit 14 is exemplified by, but not limited to, buttons and switches. The operation unit 14 is arranged, for example, in a housing constituting the information processing device 10, and outputs an operation signal related to the received operation to the operation control unit 195.
The sensor 15 detects the posture of the user who uses the information processing device 10, for example, the direction of the user's face and the angle of the face with respect to the horizontal plane. The sensor 15 is realized by, for example, an acceleration sensor and a gyro sensor. The sensor 15 outputs posture information regarding the detected posture of the user to the posture information acquisition unit 193. The sensor 15 is arranged, for example, in a housing constituting the information processing device 10.
The sensor 15 may also detect, for example, the state of the eyes of the user who uses the information processing device 10. The state of the user's eyes is, for example, the direction of the user's line of sight or the user's viewpoint. In this case, the sensor 15 may be realized by, for example, a camera that captures the user's eyes, arranged in the housing of the information processing device 10 so as to face the user's face while the information processing device 10 is in use.
The storage unit 16 stores various information, for example, user information about the user who uses the information processing device 10. The user information includes various physical information, including the user's height and eye height. The storage unit 16 can be realized by, for example, a semiconductor memory element such as a RAM (Random Access Memory) or a flash memory, or a storage device such as a hard disk or a solid-state drive.
The first communication unit 17 is a long-distance wireless communication unit. Specifically, the first communication unit 17 is communicably connected to the moving body 30 and the terminal device 40 via the network N1. That is, the information processing device 10 uses the first communication unit 17 to perform long-distance wireless communication with the moving body 30 and the terminal device 40.
The second communication unit 18 is a short-range wireless communication unit. Specifically, the second communication unit 18 is communicably connected to the controller 20 via the network N2. That is, the information processing device 10 uses the second communication unit 18 to perform short-range wireless communication with the controller 20.
The control unit 19 is realized by, for example, a CPU (Central Processing Unit) or an MPU (Micro Processing Unit) executing a program (for example, a program according to the present disclosure) stored in a storage unit (not shown) using a RAM or the like as a work area. The control unit 19 may also be realized by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array), or by a combination of software and hardware.
The control unit 19 includes a video data acquisition unit 191, a voice data acquisition unit 192, a posture information acquisition unit 193, a display control unit 194, an operation control unit 195, an output control unit 196, and a communication control unit 197, which are connected to one another by a bus B2.
The video data acquisition unit 191 acquires video data. The video data acquisition unit 191 acquires, via the first communication unit 17, video data related to the video captured by the imaging unit provided on the moving body 30.
The voice data acquisition unit 192 acquires voice data. The voice data acquisition unit 192 acquires, from the microphone 12, voice data related to the voice spoken by the user. The voice data acquisition unit 192 also acquires, from the terminal device 40 via the first communication unit 17, voice data related to the voice spoken by the user of the terminal device 40.
The posture information acquisition unit 193 acquires posture information regarding the posture of the user who uses the information processing device 10, for example, based on the detection result of the sensor 15. The posture information acquisition unit 193 acquires, for example, information regarding the orientation of the user's head and the angle of the head with respect to the horizontal plane. The posture information acquisition unit 193 may also acquire, for example, the orientation of the user's body.
When the sensor 15 is a camera that captures the user's eyes, the posture information acquisition unit 193 may acquire information on the state of the user's eyes as posture information based on the detection result of the sensor 15. The posture information acquisition unit 193 may also acquire information regarding the user's facial expression as posture information.
The display control unit 194 causes the display unit 11 to display video, for example, the video related to the video data acquired by the video data acquisition unit 191. The display control unit 194 causes the display unit 11 to display, for example, video that matches the view seen from the eye height of the user of the information processing device 10 when walking. Video that matches the view seen from the user's eye height when walking is not limited to a perfect match and may include video that falls within a predetermined range. The display is not limited to when the user walks; video corresponding to the view the user would see when taking a predetermined posture or action, such as moving while seated in a wheelchair, sitting on the floor, or lying down, may also be displayed on the display unit 11. The video corresponding to the view the user sees in these various cases is referred to as video from the user's line of sight. The display control unit 194 superimposes, for example, operation icons for moving the moving body 30 on the video and displays them on the display unit 11. The display control unit 194 may also superimpose, on the video, an image of the user's hands and the operation icons for moving the moving body 30, based on the posture information regarding the positions of the user's hands acquired from the controller 20. The user may also select a predetermined posture or action using the operation unit 14 to instruct the moving body 30 to move.
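The "match" between the displayed video and the user's line of sight is explicitly allowed to be approximate. A minimal sketch of that idea follows; the posture modes come from the description above, but every numeric ratio and the tolerance value are assumptions, not values from the publication.

```python
# Hypothetical eye-height targets per posture mode; the publication
# gives no formulas, so these ratios are illustrative assumptions only.
EYE_HEIGHT_RATIO = {
    "walking": 1.00,          # eye height as stored in the user information
    "wheelchair": 0.75,
    "sitting_on_floor": 0.45,
    "lying_down": 0.15,
}

def target_camera_height(eye_height_cm: float, mode: str) -> float:
    """Camera height corresponding to the user's line of sight in the
    selected posture or movement mode."""
    return eye_height_cm * EYE_HEIGHT_RATIO[mode]

def matches_line_of_sight(camera_height_cm: float, eye_height_cm: float,
                          mode: str, tolerance_cm: float = 5.0) -> bool:
    """Matching is defined as falling within a predetermined range of the
    target; the 5 cm tolerance is an assumption."""
    return abs(camera_height_cm - target_camera_height(eye_height_cm, mode)) <= tolerance_cm
```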
The operation control unit 195 acquires operation information related to the operation received by the operation unit 14, for example, operation information for starting and ending communication with the controller 20, the moving body 30, and the terminal device 40. The operation control unit 195 outputs a control signal corresponding to the acquired operation information to control the operation of the information processing device 10.
The output control unit 196 outputs voice from the speaker 13, for example, the voice spoken by the user of the terminal device 40 that the voice data acquisition unit 192 acquired from the terminal device 40 via the first communication unit 17. The output control unit 196 also outputs, via the first communication unit 17, the voice data acquired from the microphone 12 by the voice data acquisition unit 192 to the terminal device 40, the user information stored in the storage unit 16 to the moving body 30, and the posture information acquired by the posture information acquisition unit 193 to the moving body 30.
The communication control unit 197 controls communication between the information processing device 10 and external devices. Specifically, the communication control unit 197 controls the first communication unit 17 to control communication between the information processing device 10 and the moving body 30 and between the information processing device 10 and the terminal device 40, and controls the second communication unit 18 to control communication between the information processing device 10 and the controller 20.
[Controller]
The configuration of the controller according to the embodiment will be described with reference to FIG. 3. FIG. 3 is a diagram showing a configuration example of the controller according to the embodiment.
The controller 20 includes an operation unit 21, a sensor 22, a first communication unit 23, a second communication unit 24, and a control unit 25, which are connected to one another by a bus B3.
The operation unit 21 receives various operations on the controller 20, for example, an operation for starting communication with the information processing device 10 and the moving body 30 and an operation for ending the communication. The operation unit 21 is exemplified by, but not limited to, buttons and switches. The operation unit 21 is arranged, for example, in a housing constituting the controller 20, and outputs an operation signal related to the received operation to the operation control unit 251.
The sensor 22 detects the posture of the user who uses the controller 20, for example, the position and orientation of the hand. The sensor 22 is realized by, for example, an acceleration sensor and a gyro sensor. The sensor 22 outputs posture information regarding the detected posture of the user to the posture information acquisition unit 252. The sensor 22 is arranged, for example, in a housing constituting the controller 20.
The first communication unit 23 is a long-distance wireless communication unit. Specifically, the first communication unit 23 is communicably connected to the moving body 30 via the network N1. That is, the controller 20 remotely operates the moving body 30 via the first communication unit 23.
The second communication unit 24 is a short-range wireless communication unit. Specifically, the second communication unit 24 is communicably connected to the information processing device 10 via the network N2. That is, the controller 20 uses the second communication unit 24 to perform short-range wireless communication with the information processing device 10.
The control unit 25 is realized by, for example, a CPU, an MPU, or the like executing a program stored in a storage unit (not shown) using a RAM or the like as a work area. The control unit 25 may also be realized by an integrated circuit such as an ASIC or an FPGA, or by a combination of software and hardware.
The control unit 25 includes an operation control unit 251, a posture information acquisition unit 252, a moving body control unit 253, and a communication control unit 254, which are connected to one another by a bus B4.
The operation control unit 251 acquires operation information related to the operation received by the operation unit 21, for example, operation information for starting and ending communication with the information processing device 10 and the moving body 30. The operation control unit 251 outputs a control signal corresponding to the acquired operation information to control the operation of the controller 20.
The posture information acquisition unit 252 acquires posture information regarding the posture of the user who uses the controller 20, for example, based on the detection result of the sensor 22. The posture information acquisition unit 252 acquires, for example, position information regarding the position and direction of the hand, such as the deviation of the hand from a predetermined reference position or reference direction.
The moving body control unit 253 outputs moving body control information for moving the moving body to the moving body 30 via the first communication unit 23. The moving body control unit 253 outputs, for example, the position information of the hand of the user holding the controller 20, acquired by the posture information acquisition unit 252, to the moving body 30 as the moving body control information.
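A minimal sketch of this step, under the assumption that the moving body control information is simply the displacement of the hand from its predetermined reference pose; the HandPose type and the zero-origin reference are illustrative inventions.

```python
from dataclasses import dataclass

@dataclass
class HandPose:
    x: float
    y: float
    z: float

# The hand's reference position is predetermined; using the origin here
# is an illustrative assumption.
REFERENCE = HandPose(0.0, 0.0, 0.0)

def control_info_from_hand(current: HandPose) -> tuple[float, float, float]:
    """Turn the deviation of the hand from its reference position into a
    displacement command serving as moving body control information."""
    return (current.x - REFERENCE.x,
            current.y - REFERENCE.y,
            current.z - REFERENCE.z)
```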
The communication control unit 254 controls communication between the controller 20 and external devices. Specifically, the communication control unit 254 controls the first communication unit 23 to control communication between the controller 20 and the moving body 30, and controls the second communication unit 24 to control communication between the information processing device 10 and the controller 20.
[How to use the information processing device and the controller]
How to use the information processing device and the controller according to the embodiment will be described with reference to FIG. 4. FIG. 4 is a diagram for explaining how to use the information processing device and the controller according to the embodiment.
As shown in FIG. 4, the information processing device 10 is worn on the head of the user U1. The user U1 uses the information processing device 10 in the hospital where he or she is hospitalized or in the facility where he or she resides. The controller 20 is held, for example, one in each of the left and right hands of the user U1. The display unit 11 of the information processing device 10 displays a captured video IM1 captured by the imaging unit provided on the moving body 30. The user U1 can operate the moving body 30 by operating the controller 20 while visually checking the captured video IM1.
The captured video IM1 is, for example, a video captured at the home of the user U1. The captured video IM1 includes a user U2, such as a family member of the user U1, and obstacles such as a chair C. The captured video IM1 also includes an operation icon I1, an operation icon I2, and an operation icon I3, which are used when operating the moving body 30. When there is no particular need to distinguish among them, they are referred to as operation icons I. An operation icon I is, for example, an arrow icon, and the direction of the arrow indicates the traveling direction of the moving body 30. It is preferable that, based on the three-dimensional map generated by the map generation unit 396, which includes information on places to which the user U1 can move, only the icons for directions in which movement is possible be displayed in the captured video IM1. The operation icons I are not limited to indicating the moving direction, and may include icons that indicate a posture of the user U1, for example, standing, sitting, or lying down.
The captured video IM1 includes a left-hand icon LH and a right-hand icon RH, corresponding to the left and right hands of the user U1, respectively. When the user U1 moves the left hand while holding the controller 20 in the left hand, the left-hand icon LH in the captured video IM1 moves in accordance with the movement of the left hand; the same applies to the right hand and the right-hand icon RH. For example, when the user moves the left hand upward while holding the controller 20 in the left hand, the left-hand icon LH in the captured video IM1 moves upward. The user U1 can move the moving body 30 by moving the left-hand icon LH or the right-hand icon RH in the captured video IM1 to select an operation icon I. For example, when the operation icon I1 is selected, the moving body 30 moves forward; when the operation icon I2 is selected, the moving body 30 moves diagonally forward to the left; and when the operation icon I3 is selected, the moving body 30 moves diagonally forward to the right. That is, the user U1 can move the moving body 30 in a desired direction by moving the hand holding the controller 20 and selecting the operation icon I corresponding to that direction.
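Selecting an operation icon with a hand icon amounts to a screen-space hit test. The following sketch illustrates one way this could work; the icon positions, hit radius, and command names are all assumptions, since the publication only defines the behavior, not the geometry.

```python
# Hypothetical screen-space layout (normalized coordinates) for the
# operation icons I1 to I3; positions and commands are assumptions.
OPERATION_ICONS = {
    "I1": ((0.50, 0.30), "forward"),
    "I2": ((0.35, 0.35), "forward_left"),
    "I3": ((0.65, 0.35), "forward_right"),
}

def select_operation_icon(hand_xy, radius=0.05):
    """Return the movement command whose icon the hand icon overlaps,
    or None if no operation icon is currently selected."""
    hx, hy = hand_xy
    for name, ((ix, iy), command) in OPERATION_ICONS.items():
        if (hx - ix) ** 2 + (hy - iy) ** 2 <= radius ** 2:
            return command
    return None
```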
[Moving body]
The configuration of the moving body according to the embodiment will be described with reference to FIG. 5. FIG. 5 is a block diagram showing a configuration example of the moving body according to the embodiment.
As shown in FIG. 5, the moving body 30 includes an imaging unit 31, a drive unit 32, an operation unit 33, a sensor 34, a storage unit 35, an AR signal transmission unit 36, a first communication unit 37, a second communication unit 38, and a control unit 39, which are connected to one another by a bus B5. The moving body 30 is, for example, a drone remotely operated by the user U1. In the following description, the moving body 30 is assumed to be a drone, but the moving body 30 may be another robot.
The imaging unit 31 captures video around the moving body 30, for example, video in the traveling direction of the moving body 30. The imaging unit 31 includes an imaging element (not shown) and a circuit that generates video data based on the output of the imaging element. The imaging element is exemplified by, but not limited to, a CMOS (Complementary Metal Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device).
The imaging unit 31 is provided, for example, in a housing constituting the moving body 30. The imaging unit 31 is provided, for example, so as to be rotationally drivable with respect to the moving body 30 by an actuator, for example via a gimbal (not shown).
The drive unit 32 drives each part of the moving body 30. The drive unit 32 includes various drive sources, including a motor for moving the moving body 30 and an actuator for rotationally driving the imaging unit 31 with respect to the moving body 30.
The operation unit 33 receives various operations on the moving body 30, for example, an operation of turning the power of the moving body 30 on and off. The operation unit 33 is exemplified by, but not limited to, buttons and switches. The operation unit 33 is arranged, for example, in a housing constituting the moving body 30.
The sensor 34 includes a sensor that detects the distance between the moving body 30 and the ground; indoors, the ground refers to the floor surface. The sensor 34 also detects the distance between the moving body 30 and surrounding obstacles. The sensor 34 is exemplified by, but not limited to, a laser radar (for example, LIDAR: Laser Imaging Detection and Ranging), an infrared sensor including an infrared irradiation unit and a light-receiving sensor, and a ToF (Time of Flight) sensor. The sensor 34 may include other sensors.
The sensor 34 also includes a sensor that detects the inclination of the moving body 30, exemplified by, but not limited to, a gyro sensor.
The storage unit 35 stores various information, for example, user information about the user who uses the information processing device 10. The user information includes various physical information, including the user's height and eye height. The storage unit 35 can be realized by, for example, a semiconductor memory element such as a RAM or a flash memory, or a storage device such as a hard disk or a solid-state drive.
 AR信号送信部36は、実視界の映像に対して拡張現実として虚像映像を重畳して表示させるためのAR信号を端末装置40に送信する。AR信号送信部36は、例えば、端末装置40で撮影され表示された移動体30に対して、ユーザU1のアバターを重畳して表示させるためのAR信号を端末装置40に送信する。AR信号には、情報処理装置10で検出された情報処理装置10を使用するユーザの、表情および姿勢に関する情報を含み得る。 The AR signal transmission unit 36 transmits an AR signal to the terminal device 40 for displaying an augmented reality image superimposed on the image in the real field. The AR signal transmission unit 36 transmits, for example, an AR signal for displaying the avatar of the user U1 superimposed on the moving body 30 photographed and displayed by the terminal device 40 to the terminal device 40. The AR signal may include information about the facial expression and posture of the user who uses the information processing device 10 detected by the information processing device 10.
 第1通信部37は、遠距離無線通信を行う遠距離無線通信部である。具体的には、第1通信部37は、ネットワークN1を介して、コントローラ20と通信可能に接続されている。 The first communication unit 37 is a telecommunications unit that performs telecommunications. Specifically, the first communication unit 37 is communicably connected to the controller 20 via the network N1.
 第2通信部38は、近距離無線通信を行う近距離無線通信部である。具体的には、第2通信部38は、ネットワークN3を介して、端末装置40と通信可能に接続されている。すなわち、移動体30は、第2通信部38を用いて、端末装置40と近距離無線通信を行う。 The second communication unit 38 is a short-range wireless communication unit that performs short-range wireless communication. Specifically, the second communication unit 38 is communicably connected to the terminal device 40 via the network N3. That is, the mobile body 30 uses the second communication unit 38 to perform short-range wireless communication with the terminal device 40.
 制御部39は、例えば、CPUやMPU等によって、図示しない記憶部に記憶されたプログラムがRAM等を作業領域として実行されることにより実現される。また、制御部39は、例えば、ASICやFPGA等の集積回路により実現されてもよい。制御部39は、ソフトウェアと、ハードウェアとの組み合わせで実現されてもよい。 The control unit 39 is realized by, for example, a CPU, an MPU, or the like executing a program stored in a storage unit (not shown) using a RAM or the like as a work area. Further, the control unit 39 may be realized by an integrated circuit such as an ASIC or FPGA. The control unit 39 may be realized by a combination of software and hardware.
 制御部39は、撮像制御部391と、映像データ取得部392と、駆動制御部393と、出力制御部394と、距離算出部395と、マップ生成部396と、AR信号制御部397と、通信制御部398とを備える。撮像制御部391と、映像データ取得部392と、駆動制御部393と、出力制御部394と、距離算出部395と、マップ生成部396と、AR信号制御部397と、通信制御部398とは、それぞれ、バスB6によって接続されている。 The control unit 39 communicates with the image pickup control unit 391, the video data acquisition unit 392, the drive control unit 393, the output control unit 394, the distance calculation unit 395, the map generation unit 396, and the AR signal control unit 397. It is provided with a control unit 398. The image pickup control unit 391, the video data acquisition unit 392, the drive control unit 393, the output control unit 394, the distance calculation unit 395, the map generation unit 396, the AR signal control unit 397, and the communication control unit 398 are , Each is connected by bus B6.
 撮像制御部391は、撮像部31を制御する。撮像制御部391は、撮像部31による撮像条件を設定して、撮像部31に撮像を行わせる。 The image pickup control unit 391 controls the image pickup unit 31. The image pickup control unit 391 sets the image pickup conditions by the image pickup unit 31 and causes the image pickup unit 31 to perform image pickup.
 映像データ取得部392は、映像データを取得する。映像データ取得部392は、撮像部31から移動体30の周辺の映像データを取得する。映像データ取得部392は、例えば、撮像部31から移動体30の前方の映像データを取得する。 The video data acquisition unit 392 acquires video data. The video data acquisition unit 392 acquires video data around the moving body 30 from the image pickup unit 31. The video data acquisition unit 392 acquires, for example, video data in front of the moving body 30 from the image pickup unit 31.
 The drive control unit 393 controls the driving of each unit of the moving body 30. In accordance with the user information stored in the storage unit 35, the drive control unit 393 drives the moving body 30, for example, to a height at which the image captured by the imaging unit 31 matches the view seen from the user's eye line when walking. Specifically, when the moving body 30 is a drone, the drive control unit 393 flies the moving body 30 at an altitude at which the image captured by the imaging unit 31 matches that view. In this case, the drive control unit 393 flies the moving body 30 so as to keep the altitude constant, based on the distance between the moving body 30 and the ground calculated by the distance calculation unit 395. Here, matching the view seen from the user's eye line is not limited to an exact match and may include falling within a predetermined range; likewise, a constant altitude is not limited to an exactly constant value and may include staying within a predetermined range. The moving body 30 may also be driven so that the image corresponds to the view seen by the user in a predetermined posture or motion other than walking, for example, when the user moves while seated in a wheelchair, sits on the floor, or lies down. The drive control unit 393 drives the drive unit 32 to move the moving body 30 in accordance with a control signal from the information processing device 10.
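 By way of a non-limiting illustration, such altitude holding within a tolerance band can be sketched as a simple proportional controller, as in the following Python fragment. The function name, the gain, and the tolerance value are assumptions introduced for illustration only and are not part of the disclosure.

    # Illustrative sketch of eye-height altitude holding; all names are assumed.

    def altitude_command(eye_height_m: float, ground_distance_m: float,
                         gain: float = 0.8, tolerance_m: float = 0.05) -> float:
        """Return a vertical-speed command (m/s) that drives the camera toward
        the user's eye-line height; zero inside the tolerance band, which is
        one way "constant within a predetermined range" can be realized."""
        error = eye_height_m - ground_distance_m
        if abs(error) <= tolerance_m:
            return 0.0
        return gain * error  # proportional correction toward the target height

    # Example: the user's eye line is 1.55 m and the ground sensor reads 1.30 m.
    print(altitude_command(1.55, 1.30))  # positive command (about 0.2) -> climb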
 The drive control unit 393 drives the drive unit 32 in accordance with a control signal from the information processing device 10 to change the direction in which the imaging unit 31 faces. For example, when the user's head turns to the right rear, the drive control unit 393 drives the drive unit 32 to turn the imaging unit 31 to the right. The drive control unit 393 may change the direction of the imaging unit 31 by changing the orientation of the moving body 30, or by driving a gimbal on which the imaging unit 31 is mounted.
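 As a non-limiting sketch of how these two options can be combined, the following Python fragment splits a requested yaw between the gimbal and the body of the moving body 30: the gimbal absorbs what its range allows, and the body turns for the remainder. The gimbal range and the function name are illustrative assumptions.

    # Illustrative sketch: splitting a head-pose yaw between gimbal and body.

    def camera_yaw_command(head_yaw_deg: float, gimbal_range_deg: float = 90.0):
        """Return (gimbal_yaw, body_yaw) so their sum equals the requested yaw."""
        gimbal_yaw = max(-gimbal_range_deg, min(gimbal_range_deg, head_yaw_deg))
        body_yaw = head_yaw_deg - gimbal_yaw  # remainder handled by turning the body
        return gimbal_yaw, body_yaw

    print(camera_yaw_command(120.0))  # (90.0, 30.0): gimbal at its limit, body turns 30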
 The output control unit 394 outputs various types of information. For example, the output control unit 394 outputs the video data acquired by the video data acquisition unit 392 to the information processing device 10 via the first communication unit 37, and stores the user information acquired via the first communication unit 37 in the storage unit 35.
 The distance calculation unit 395 calculates the distances between the moving body 30 and various objects. Based on the detection results of the sensor 34, the distance calculation unit 395 calculates, for example, the distance between the moving body 30 and the ground, and the distance between the moving body 30 and an obstacle.
 The map generation unit 396 generates a map of the space in which the moving body 30 moves. When the space in which the moving body 30 moves is a room, for example, the map generation unit 396 generates a map that includes the size and shape of the room, obstacles that hinder the movement of the moving body 30 such as pillars and large furniture, and obstacles that would hinder the user when walking through the room. Specifically, the map generation unit 396 creates a three-dimensional map of the room using photogrammetry, based on video data of the entire room captured by the imaging unit 31. Creating a three-dimensional map makes it possible to avoid collisions with obstacles and to avoid unnatural movements such as passing over a desk or another place across which the user would never move. In other words, the map generation unit 396 generates a three-dimensional map that includes information on the places to which the user can move.
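 By way of a non-limiting illustration, a movable-range check of this kind can be reduced to a lookup in an occupancy grid derived from the three-dimensional map. The following Python sketch assumes such a grid; its contents are invented for illustration.

    # Illustrative sketch of a movable-range check on a 2D occupancy grid.

    MOVABLE, OBSTACLE = 0, 1

    room = [
        [MOVABLE, MOVABLE, OBSTACLE],  # a pillar in the corner
        [MOVABLE, OBSTACLE, MOVABLE],  # a desk the moving body must not cross
        [MOVABLE, MOVABLE, MOVABLE],
    ]

    def is_movable(grid, row: int, col: int) -> bool:
        """True if the cell exists in the map and is free of obstacles."""
        in_bounds = 0 <= row < len(grid) and 0 <= col < len(grid[0])
        return in_bounds and grid[row][col] == MOVABLE

    print(is_movable(room, 1, 1))  # False: the desk cell is rejected
    print(is_movable(room, 2, 2))  # True: floor the user could actually walk on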
 The AR signal control unit 397 controls the operation of the AR signal transmission unit 36. The AR signal control unit 397 controls the AR signal transmission unit 36 to transmit an AR signal to the terminal device 40.
 The communication control unit 398 controls communication between the moving body 30 and external devices. Specifically, the communication control unit 398 controls the first communication unit 37 to control communication between the moving body 30 and the information processing device 10, and between the moving body 30 and the controller 20. The communication control unit 398 controls the second communication unit 38 to control communication between the moving body 30 and the terminal device 40.
[Terminal device]
 The configuration of the terminal device according to the embodiment will be described with reference to FIG. 6. FIG. 6 is a block diagram showing a configuration example of the terminal device according to the embodiment.
 As shown in FIG. 6, the terminal device 40 includes an imaging unit 41, a display unit 42, a microphone 43, a speaker 44, an operation unit 45, a storage unit 46, an AR signal reception unit 47, a first communication unit 48, a second communication unit 49, and a control unit 50. These units are connected to one another by a bus B7.
 The imaging unit 41 captures images of the surroundings of the terminal device 40. The imaging unit 41 includes an image sensor (not shown) and a circuit that generates video data based on the output of the image sensor. Examples of the image sensor include, but are not limited to, a CMOS image sensor and a CCD.
 The display unit 42 displays various images, for example, the video data captured by the imaging unit 41. The display unit 42 is a display such as a liquid crystal display (LCD) or an organic EL (Electro-Luminescence) display.
 The microphone 43 picks up the voice spoken by the user. The microphone 43 is arranged, for example, on a housing constituting the terminal device 40. The microphone 43 outputs voice data of the picked-up voice to the voice data acquisition unit 503.
 The speaker 44 outputs voice. The speaker 44 outputs voice based on voice data, acquired from the information processing device 10, of speech by the user of the information processing device 10. The speaker 44 is arranged, for example, on a housing constituting the terminal device 40.
 The operation unit 45 receives various operations on the terminal device 40, for example, an operation for starting communication and an operation for ending communication. Examples of the operation unit 45 include buttons, switches, and a touch panel, but the operation unit 45 is not limited to these. The operation unit 45 is arranged, for example, on a housing constituting the terminal device 40. The operation unit 45 outputs an operation signal corresponding to the received operation to the operation control unit 507.
 The storage unit 46 stores various types of information, for example, user information about the user of the information processing device 10. The user information includes various physical information such as the user's height and eye-line height. The storage unit 46 can be realized by, for example, a semiconductor memory element such as a RAM or a flash memory, or a storage device such as a hard disk or a solid-state drive.
 The AR signal reception unit 47 receives, from the moving body 30, an AR signal for superimposing and displaying a virtual image as augmented reality on the view of the real world. For example, the AR signal reception unit 47 receives from the moving body 30 an AR signal for superimposing and displaying the avatar of the user U1 on the moving body 30 photographed and displayed by the terminal device 40.
 The first communication unit 48 is a long-range wireless communication unit that performs long-range wireless communication. Specifically, the first communication unit 48 is communicably connected to the information processing device 10 via the network N1.
 The second communication unit 49 is a short-range wireless communication unit that performs short-range wireless communication. Specifically, the second communication unit 49 is communicably connected to the moving body 30 via the network N3.
 The control unit 50 is realized, for example, by a CPU, an MPU, or the like executing a program stored in a storage unit (not shown) using a RAM or the like as a work area. The control unit 50 may also be realized by an integrated circuit such as an ASIC or an FPGA, or by a combination of software and hardware.
 The control unit 50 includes an imaging control unit 501, a video data acquisition unit 502, a voice data acquisition unit 503, a display control unit 504, a relative distance calculation unit 505, an avatar generation unit 506, an operation control unit 507, an output control unit 508, an AR signal control unit 509, and a communication control unit 510. These units are connected to one another by a bus B8.
 The imaging control unit 501 controls the imaging unit 41. The imaging control unit 501 sets the imaging conditions for the imaging unit 41 and causes the imaging unit 41 to capture images.
 The video data acquisition unit 502 acquires video data. The video data acquisition unit 502 acquires video data of the surroundings of the terminal device 40 from the imaging unit 41.
 The voice data acquisition unit 503 acquires voice data. The voice data acquisition unit 503 acquires, from the microphone 43, voice data of speech by the user. The voice data acquisition unit 503 also acquires, from the information processing device 10 via the first communication unit 48, voice data of speech by the user of the information processing device 10.
 The display control unit 504 causes the display unit 42 to display images. For example, the display control unit 504 causes the display unit 42 to display the video corresponding to the video data acquired by the video data acquisition unit 502. The display control unit 504 also superimposes, for example, the avatar of the user of the information processing device 10 generated by the avatar generation unit 506 on the moving body 30 included in the video displayed on the display unit 42.
 The relative distance calculation unit 505 calculates the relative distance between the terminal device 40 and the moving body 30. For example, the relative distance calculation unit 505 calculates the relative distance based on the proportion of the area occupied by the moving body 30 in the image displayed on the display unit 42. The relative distance calculation unit 505 may instead calculate the relative distance based on, for example, the radio wave intensity of the communication between the terminal device 40 and the moving body 30, or, when the terminal device 40 includes a ranging sensor (not shown) such as a ToF sensor, based on the detection result of the ranging sensor.
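 As a non-limiting sketch of the area-ratio approach, the following Python fragment assumes a pinhole-camera model, under which the apparent area of the moving body 30 falls off with the square of the distance, together with one calibrated reference pair; both the model and the calibration values are illustrative assumptions.

    import math

    # Illustrative sketch of the area-ratio method: under a pinhole-camera
    # assumption, apparent area scales with 1/d**2, so a single calibrated
    # pair (reference distance, reference area ratio) is sufficient.

    def relative_distance(area_ratio: float,
                          ref_distance_m: float = 1.0,
                          ref_area_ratio: float = 0.20) -> float:
        """Estimate the terminal-to-moving-body distance from the fraction
        of the displayed frame that the moving body occupies."""
        if area_ratio <= 0.0:
            raise ValueError("moving body not detected in the frame")
        return ref_distance_m * math.sqrt(ref_area_ratio / area_ratio)

    print(relative_distance(0.05))  # occupies 5% of the frame -> about 2.0 m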
 The avatar generation unit 506 generates an avatar to be displayed on the display unit 42 superimposed on the moving body 30. The avatar generation unit 506 generates the avatar of the user of the information processing device 10 based on, for example, the user information stored in the storage unit 46. The avatar generation unit 506 changes the size of the avatar according to, for example, the distance calculated by the relative distance calculation unit 505, and changes the facial expression, posture, and the like of the avatar based on, for example, the AR signal acquired by the AR signal control unit 509.
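 By way of a non-limiting illustration, the size change can be realized with the same pinhole assumption: the avatar is rendered at the user's real height and projected to pixels, so it shrinks naturally as the moving body 30 recedes. The focal length and the function name below are assumptions.

    # Illustrative sketch of distance-driven avatar scaling.

    def avatar_height_px(user_height_m: float, distance_m: float,
                         focal_length_px: float = 1500.0) -> int:
        """Pinhole projection: on-screen height = f * real height / distance."""
        return round(focal_length_px * user_height_m / distance_m)

    # A 1.6 m user rendered over the moving body at 2 m, then at 4 m:
    print(avatar_height_px(1.6, 2.0))  # 1200 px
    print(avatar_height_px(1.6, 4.0))  # 600 px: half the size at twice the distance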
 The operation control unit 507 acquires operation information on the operations received by the operation unit 45, for example, operation information for starting and ending communication with the information processing device 10 and the moving body 30, and operation information for generating the avatar of the user of the information processing device 10. The operation control unit 507 outputs a control signal corresponding to the acquired operation information to control the operation of the terminal device 40.
 The output control unit 508 causes the speaker 44 to output voice. For example, the output control unit 508 outputs the speech of the user of the information processing device 10, which the voice data acquisition unit 503 has acquired from the information processing device 10 via the first communication unit 48. The output control unit 508 also outputs, for example, the voice data that the voice data acquisition unit 503 has acquired from the microphone 43 to the information processing device 10 via the first communication unit 48.
 The AR signal control unit 509 controls the AR signal reception unit 47. The AR signal control unit 509 uses the AR signal reception unit 47 to receive the AR signal from the moving body 30.
 The communication control unit 510 controls communication between the terminal device 40 and external devices. Specifically, the communication control unit 510 controls the first communication unit 48 to control communication between the terminal device 40 and the information processing device 10, and controls the second communication unit 49 to control communication between the terminal device 40 and the moving body 30.
[How to use the moving body and the terminal device]
 A method of using the moving body and the terminal device according to the embodiment will be described with reference to FIG. 7. FIG. 7 is a diagram for explaining how the moving body and the terminal device according to the embodiment are used.
 As shown in FIG. 7, the moving body 30 is used by a user U2, such as a family member of the user U1. The user U2 uses the moving body 30, for example, in a room at home. Before use, the moving body 30 generates a three-dimensional map showing the state of the room. Specifically, the moving body 30 rises while keeping level and captures, with the imaging unit 31, all azimuth angles and elevation angles in the range of 0° to 90°. The moving body 30 generates a three-dimensional map including information on obstacles that hinder movement, such as a chair C. The moving body 30 then changes its altitude so that the height of the imaging unit 31 from the ground matches the eye line of the user U1 when walking. The moving body 30 may also be driven so that the image corresponds to the view seen by the user in a predetermined posture or motion other than walking, for example, when the user moves while seated in a wheelchair, sits on the floor, or lies down. The moving body 30 moves and changes its orientation according to the operations of the user U1 using the information processing device 10. According to the operation of the controller 20 by the user U1, the moving body 30 moves within the movable range shown in the map generated by the map generation unit 396. The moving body 30 changes its orientation based on the posture of the head of the user U1 wearing the information processing device 10 on the head; alternatively, it may change the orientation of the imaging unit 31 based on that head posture.
 The user U2 can talk with the user U1, who uses the information processing device 10, via the terminal device 40. When the user U1 and the user U2 talk, the moving body 30 photographs the user U2 with the imaging unit 31 and outputs the captured data to the information processing device 10. As a result, a video including the user U2 is displayed on the information processing device 10. The user U1 can therefore experience coming home and talking with the user U2 even while staying in a hospital or a care facility. The moving body 30 also outputs an AR signal including the posture information and the like of the user U1 to the terminal device 40.
 When talking with the user U1, the user U2 photographs the moving body 30 with the terminal device 40. Based on the AR signal, the terminal device 40 generates an avatar A1 of the user U1 and displays the video with the avatar A1 superimposed on the moving body 30. This allows the user U2 to talk with the user U1, who is in a hospital or a care facility, as if they were conversing at home.
[Processing of the information processing system]
 The processing of the information processing system according to the embodiment will be described with reference to FIG. 8. FIG. 8 is a flowchart showing an example of the processing flow of the information processing system according to the embodiment.
 First, the moving body 30 generates a map including the obstacles present in the room, based on video data obtained by photographing the room (step S10). The information processing device 10 outputs the user information of the user wearing the information processing device 10 to the moving body 30 (step S11). Based on the user information, the moving body 30 adjusts its altitude so that the image captured by the imaging unit 31 matches the view seen from the eye-line height of the user of the information processing device 10 (step S12). The moving body 30 photographs the environment in its traveling direction using the imaging unit 31 (step S13). The moving body 30 outputs the video data captured by the imaging unit 31 to the information processing device 10 (step S14).
 The information processing device 10 displays, on the display unit 11, the video corresponding to the video data received from the moving body 30 and an operation icon I superimposed on the video (step S15). The information processing device 10 acquires, from the sensor 15, posture information on the posture of the user of the information processing device 10 (step S16). The information processing device 10 outputs the acquired posture information to the moving body 30 (step S17). The moving body 30 changes its posture according to the posture information received from the information processing device 10 (step S18).
 The controller 20 acquires posture information for moving the moving body 30 (step S19). The controller 20 outputs the acquired posture information to the moving body 30 as moving-body control information (step S20). The moving body 30 moves according to the moving-body control information received from the controller 20 (step S21).
 The terminal device 40 captures a video including the moving body 30 using the imaging unit 41 (step S22). The terminal device 40 displays the video captured by the imaging unit 41 on the display unit 42 (step S23).
 The moving body 30 outputs an AR signal to the terminal device 40 (step S24). The terminal device 40 generates an avatar of the user of the information processing device 10 based on the AR signal received from the moving body 30 (step S25). The terminal device 40 superimposes the avatar of the user of the information processing device 10 on the moving body 30 displayed on the display unit 42 (step S26).
 The information processing device 10 and the terminal device 40 transmit and receive voice data to and from each other (step S27). A call is thereby started between the user of the information processing device 10 and the user of the terminal device 40.
[Processing of the information processing device]
 The flow of processing executed by the information processing device according to the embodiment will be described with reference to FIG. 9. FIG. 9 is a flowchart showing an example of the flow of the processing that the information processing device according to the embodiment performs for the moving body.
 The output control unit 196 outputs the user information about the user of the information processing device 10 to the moving body 30 via the first communication unit 17 (step S100). The video data acquisition unit 191 acquires video data, via the first communication unit 17, from the moving body 30 whose altitude has been adjusted according to the user information (step S110). The display control unit 194 causes the display unit 11 to display the video corresponding to the video data acquired by the video data acquisition unit 191 (step S120).
 The posture information acquisition unit 193 acquires, from the sensor 15, posture information on the posture of the user of the information processing device 10 (step S130). The output control unit 196 outputs the posture information acquired by the posture information acquisition unit 193 to the moving body 30 via the first communication unit 17 (step S140).
 The control unit 19 determines whether to end the processing (step S150). The control unit 19 determines to end the processing when it receives, for example, an operation to end the processing or an operation to turn off the power. When it is determined not to end the processing (step S150; No), the processing returns to step S110. When it is determined to end the processing (step S150; Yes), the processing of FIG. 9 ends.
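 As a non-limiting sketch, the loop of FIG. 9 (steps S100 to S150) can be expressed as follows in Python. The MobileLink and Headset classes are stubs invented for illustration; they stand in for the actual communication path and head-mounted display and are not part of the disclosure.

    # Illustrative sketch of the device-side loop of FIG. 9; stubs are assumed.

    class MobileLink:
        def send_user_info(self, info): print("S100 user info ->", info)
        def receive_video(self): return "frame"                    # S110
        def send_posture(self, posture): print("S140 posture ->", posture)

    class Headset:
        def __init__(self, frames: int): self._left = frames
        def end_requested(self) -> bool:                           # S150
            self._left -= 1
            return self._left < 0
        def display(self, frame): print("S120 display", frame)
        def read_posture(self): return {"yaw": 10.0}               # S130

    def run_device_loop(link: MobileLink, hmd: Headset, user_info) -> None:
        link.send_user_info(user_info)             # S100: once, before the loop
        while not hmd.end_requested():             # S150: end-operation check
            hmd.display(link.receive_video())      # S110-S120
            link.send_posture(hmd.read_posture())  # S130-S140

    run_device_loop(MobileLink(), Headset(frames=2), {"eye_height_m": 1.55})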
 The flow of processing executed by the information processing device according to the embodiment will be further described with reference to FIG. 10. FIG. 10 is a flowchart showing an example of the flow of the processing that the information processing device according to the embodiment performs for the terminal device.
 The voice data acquisition unit 192 acquires, from the microphone 12, voice data of speech by the user of the information processing device 10 (step S200). The output control unit 196 outputs the voice data acquired by the voice data acquisition unit 192 from the microphone 12 to the terminal device 40 via the first communication unit 17 (step S210).
 The voice data acquisition unit 192 acquires, from the terminal device 40 via the first communication unit 17, voice data of speech by the user of the terminal device 40 (step S220). The output control unit 196 outputs, from the speaker 13, the voice corresponding to the voice data that the voice data acquisition unit 192 acquired from the terminal device 40 (step S230).
 The control unit 19 determines whether to end the processing (step S240). The control unit 19 determines to end the processing when it receives, for example, an operation to end the processing or an operation to turn off the power. When it is determined not to end the processing (step S240; No), the processing returns to step S200. When it is determined to end the processing (step S240; Yes), the processing of FIG. 10 ends.
[Processing of the controller]
 The flow of processing executed by the controller according to the embodiment will be described with reference to FIG. 11. FIG. 11 is a flowchart showing an example of the flow of the processing that the controller according to the embodiment performs for the moving body.
 The posture information acquisition unit 252 acquires, from the sensor 22, position information of the left or right hand of the user holding the controller 20 (step S300). The moving body control unit 253 outputs, via the first communication unit 23, the position information acquired by the posture information acquisition unit 252 to the moving body 30 as moving-body control information for moving the moving body 30 (step S310).
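 By way of a non-limiting illustration, one plausible form of the moving-body control information derived from the hand position in step S300 is a velocity command with a dead zone, as in the following Python sketch; the gain and dead-zone values are assumptions.

    # Illustrative sketch: tracked hand displacement -> velocity command.

    def hand_to_velocity(hand_pos_m, rest_pos_m, gain: float = 2.0,
                         dead_zone_m: float = 0.03):
        """Map hand displacement from its rest position to (vx, vy, vz);
        small offsets inside the dead zone are ignored to avoid drift."""
        cmd = []
        for hand, rest in zip(hand_pos_m, rest_pos_m):
            offset = hand - rest
            cmd.append(0.0 if abs(offset) < dead_zone_m else gain * offset)
        return tuple(cmd)

    # Hand pushed 10 cm forward and 2 cm (inside the dead zone) to the side:
    print(hand_to_velocity((0.10, 0.02, 0.0), (0.0, 0.0, 0.0)))  # (0.2, 0.0, 0.0)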
 The control unit 25 determines whether to end the processing (step S320). The control unit 25 determines to end the processing when it receives, for example, an operation to end the processing or an operation to turn off the power. When it is determined not to end the processing (step S320; No), the processing returns to step S300. When it is determined to end the processing (step S320; Yes), the processing of FIG. 11 ends.
[Processing of the moving body]
 The processing of the moving body according to the embodiment will be described with reference to FIG. 12. FIG. 12 is a flowchart showing an example of the flow of the processing of the moving body according to the embodiment.
 The imaging control unit 391 photographs the environment around the moving body 30 using the imaging unit 31 (step S400). The video data acquisition unit 392 acquires video data of the surroundings of the moving body 30 from the imaging unit 31 (step S410). The map generation unit 396 generates a map including obstacle information, based on the video data of the surroundings of the moving body 30 acquired by the video data acquisition unit 392 (step S420).
 The drive control unit 393 acquires, via the first communication unit 37, user information about the user of the information processing device 10 (step S430). Based on the acquired user information, the drive control unit 393 adjusts the altitude so that the image captured by the imaging unit 31 matches the view seen from the user's eye line when walking (step S440).
 The imaging control unit 391 photographs the environment in the traveling direction of the moving body 30 using the imaging unit 31 (step S450). The output control unit 394 outputs video data of the video in the traveling direction of the moving body 30 to the information processing device 10 via the first communication unit 37 (step S460).
 The drive control unit 393 determines whether posture information has been acquired from the information processing device 10 via the first communication unit 37 (step S470). When it is determined that posture information has been acquired (step S470; Yes), the drive control unit 393 changes the posture of the moving body 30 according to the posture information (step S480). When it is determined that posture information has not been acquired (step S470; No), the processing proceeds to step S490.
 When the determination in step S470 is No, or after step S480, the drive control unit 393 determines whether moving-body control information has been acquired from the controller 20 via the first communication unit 37 (step S490). When it is determined that moving-body control information has been acquired (step S490; Yes), the drive control unit 393 moves the moving body 30 according to the moving-body control information (step S500). When it is determined that moving-body control information has not been acquired (step S490; No), the processing proceeds to step S510.
 When the determination in step S490 is No, or after step S500, the AR signal control unit 397 transmits an AR signal to the terminal device 40 (step S510).
 The control unit 39 determines whether to end the processing (step S520). The control unit 39 determines to end the processing when it receives, for example, an operation to end the processing or an operation to turn off the power. When it is determined not to end the processing (step S520; No), the processing returns to step S450. When it is determined to end the processing (step S520; Yes), the processing of FIG. 12 ends.
[Processing of the terminal device]
 The processing of the terminal device according to the embodiment will be described with reference to FIG. 13. FIG. 13 is a flowchart showing an example of the processing flow of the terminal device according to the embodiment.
 The imaging control unit 501 photographs the moving body 30 using the imaging unit 41 (step S600). The display control unit 504 displays, on the display unit 42, the video corresponding to the video data obtained by photographing the moving body 30 (step S610).
 The AR signal control unit 509 determines whether an AR signal has been received from the moving body 30 (step S620). When it is determined that an AR signal has been received (step S620; Yes), the avatar generation unit 506 generates an avatar of the user of the information processing device 10 based on the AR signal (step S630). When it is determined that no AR signal has been received (step S620; No), the processing proceeds to step S650. After step S630, the display control unit 504 superimposes the avatar generated by the avatar generation unit 506 on the moving body 30 displayed on the display unit 42 (step S640).
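 As a non-limiting sketch of this branch (steps S620 to S640), the following Python fragment overlays avatar attributes on the displayed frame only when an AR signal is present; the dictionary keys used for the AR signal are illustrative assumptions.

    from typing import Optional

    # Illustrative sketch of the AR branch of FIG. 13; key names are assumed.

    def compose_frame(frame: dict, ar_signal: Optional[dict]) -> dict:
        """Return the frame to display, overlaying an avatar only when an
        AR signal was received (step S620; Yes)."""
        if ar_signal is None:  # step S620; No: show the plain video
            return frame
        avatar = {"expression": ar_signal.get("expression", "neutral"),
                  "posture": ar_signal.get("posture", "standing")}  # step S630
        return {**frame, "overlay": avatar}  # step S640: superimpose the avatar

    print(compose_frame({"video": "room"}, None))
    print(compose_frame({"video": "room"}, {"expression": "smile"}))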
 When the determination in step S620 is No, or after step S640, the voice data acquisition unit 503 acquires, from the information processing device 10 via the first communication unit 48, voice data of speech by the user of the information processing device 10 (step S650). The output control unit 508 outputs, from the speaker 44, the voice corresponding to the voice data that the voice data acquisition unit 503 acquired from the information processing device 10 (step S660).
 The voice data acquisition unit 503 acquires, from the microphone 43, voice data of speech by the user of the terminal device 40 (step S670). The output control unit 508 outputs the voice data acquired by the voice data acquisition unit 503 from the microphone 43 to the information processing device 10 via the first communication unit 48 (step S680).
 The control unit 50 determines whether to end the processing (step S690). The control unit 50 determines to end the processing when it receives, for example, an operation to end the processing or an operation to turn off the power. When it is determined not to end the processing (step S690; No), the processing returns to step S600. When it is determined to end the processing (step S690; Yes), the processing of FIG. 13 ends.
 As described above, in the present embodiment, a user who is hospitalized or staying in a care facility operates the drone while viewing, on a head-mounted display, video that is captured by a camera provided on the drone and that matches the user's eye line when walking. This allows such a user to experience the sensation of temporarily returning home.
 The above embodiment describes the case where the moving body 30, such as a drone, is moved indoors, for example, at home, but the present disclosure is not limited to this. The present embodiment may also be used outdoors, as long as the moving body 30 can generate a map including obstacle information in advance. For example, by generating a map of a walking course, an elderly person who has been hospitalized or staying in a care facility for a long period can be given the pseudo-experience of taking a walk.
 Although embodiments of the present disclosure have been described above, the present disclosure is not limited by the contents of these embodiments. The components described above include those that can be easily conceived by those skilled in the art, those that are substantially the same, and those within a so-called range of equivalents. The components described above can be combined as appropriate, and various omissions, substitutions, and changes of the components can be made without departing from the gist of the embodiments described above.
 The information processing device, information processing system, information processing method, and program of the present embodiment can be used, for example, in a communication device that communicates with a user at a remote location using a wearable device and a moving body such as a robot.
 1 Information processing system
 10 Information processing device
 11,42 Display unit
 12,43 Microphone
 13,44 Speaker
 14,21,33,45 Operation unit
 15,22,34 Sensor
 16,35,46 Storage unit
 17,23,37,48 First communication unit
 18,24,38,49 Second communication unit
 19,25,39,50 Control unit
 20 Controller
 30 Moving body
 31,41 Imaging unit
 32 Drive unit
 36 AR signal transmission unit
 40 Terminal device
 47 AR signal reception unit
 191,392,502 Video data acquisition unit
 192,503 Voice data acquisition unit
 193,252 Posture information acquisition unit
 194,504 Display control unit
 195,251,507 Operation control unit
 196,394,508 Output control unit
 197,254,398,510 Communication control unit
 391,501 Imaging control unit
 393 Drive control unit
 395 Distance calculation unit
 396 Map generation unit
 397,509 AR signal control unit
 505 Relative distance calculation unit
 506 Avatar generation unit

Claims (5)

  1.  An information processing device comprising:
     a video data acquisition unit that acquires video data of a video from the eye line of a user at a first location, the video being captured by an imaging unit provided on a moving body that is movable in a second location according to an operation by the user;
     a display control unit that causes a display unit to display the video data;
     a posture information acquisition unit that acquires posture information on a posture of the user; and
     an operation control unit that changes a posture of the moving body according to the posture information acquired by the posture information acquisition unit.
  2.  The information processing device according to claim 1, wherein
     the display control unit superimposes, on the video data, an operation icon for moving the moving body with a controller.
  3.  An information processing system comprising:
     the information processing device according to claim 1 or 2; and
     a terminal device capable of communicating with the information processing device, wherein
     the terminal device includes:
     a video data acquisition unit that acquires a video of the moving body captured by an imaging unit;
     an avatar generation unit that generates an avatar of the user of the information processing device; and
     a display control unit that causes a display unit to display the video acquired by the video data acquisition unit and superimposes the avatar of the user on the moving body on the display unit.
  4.  An information processing method comprising:
     acquiring video data of a video from the eye line of a user at a first location, the video being captured by an imaging unit provided on a moving body that is movable in a second location according to an operation by the user;
     causing a display unit to display the video data;
     acquiring posture information on a posture of the user; and
     changing a posture of the moving body according to the acquired posture information.
  5.  A program that causes a computer to execute:
     acquiring video data of a video from the eye line of a user at a first location, the video being captured by an imaging unit provided on a moving body that is movable in a second location according to an operation by the user;
     causing a display unit to display the video data;
     acquiring posture information on a posture of the user; and
     changing a posture of the moving body according to the acquired posture information.
PCT/JP2020/047282 2020-07-28 2020-12-17 Information processing device, information processing system, information processing method, and program WO2022024412A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-127567 2020-07-28
JP2020127567A JP2022024779A (en) 2020-07-28 2020-07-28 Information processing device, information processing system, information processing method, and program

Publications (1)

Publication Number Publication Date
WO2022024412A1 (en)

Family

ID=80037884

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/047282 WO2022024412A1 (en) 2020-07-28 2020-12-17 Information processing device, information processing system, information processing method, and program

Country Status (2)

Country Link
JP (1) JP2022024779A (en)
WO (1) WO2022024412A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013038622A (en) * 2011-08-09 2013-02-21 Topcon Corp Remote control system
JP2016115328A (en) * 2014-12-17 2016-06-23 富士ゼロックス株式会社 Method for calculation execution, calculation processing system, and program
WO2019163031A1 (en) * 2018-02-21 2019-08-29 株式会社ソニー・インタラクティブエンタテインメント Image processing device and image processing method


Also Published As

Publication number Publication date
JP2022024779A (en) 2022-02-09

Similar Documents

Publication Publication Date Title
JP6212667B1 (en) Method executed by computer to communicate via virtual space, program causing computer to execute the method, and information processing apparatus
WO2016017245A1 (en) Information processing device, information processing method, and image display system
JP6229089B1 (en) Method executed by computer to communicate via virtual space, program causing computer to execute the method, and information processing apparatus
JP4386367B2 (en) Communication robot improvement system
US11443540B2 (en) Information processing apparatus and information processing method
EP3196734B1 (en) Control device, control method, and program
JPWO2017213070A1 (en) Information processing apparatus and method, and recording medium
US20180299948A1 (en) Method for communicating via virtual space and system for executing the method
WO2017064926A1 (en) Information processing device and information processing method
JP2006139525A (en) Autonomous mobile robot
WO2022024412A1 (en) Information processing device, information processing system, information processing method, and program
WO2020209167A1 (en) Information processing device, information processing method, and program
JP7203157B2 (en) Video processing device and program
JP7287798B2 (en) Remote camera system, control system, video output method, virtual camera work system, and program
JP4517085B2 (en) Robot remote control system
US11518036B2 (en) Service providing system, service providing method and management apparatus for service providing system
JP2004363987A (en) Image presentation system
JP6321247B1 (en) Method executed by computer for moving in virtual space, program for causing computer to execute the method, and information processing apparatus
US20200234046A1 (en) Information providing system, information providing method and management apparatus for information providing system
WO2023228432A1 (en) Robot, robot control method, and computer program
JP2019106009A (en) Autonomous travel body and autonomous travel body system
WO2022149497A1 (en) Information processing device, information processing method, and computer program
WO2022149496A1 (en) Entertainment system and robot
US20230415346A1 (en) Operation system, operation method, and storage medium
US20230206546A1 (en) Terminal device, method, and computer program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20947451

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20947451

Country of ref document: EP

Kind code of ref document: A1