WO2018123431A1 - Interactive robot - Google Patents

Interactive robot

Info

Publication number
WO2018123431A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
robot
image
upper unit
user
Prior art date
Application number
PCT/JP2017/043112
Other languages
French (fr)
Japanese (ja)
Inventor
Takahiro Iijima
Original Assignee
Panasonic IP Management Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic IP Management Co., Ltd.
Publication of WO2018123431A1 publication Critical patent/WO2018123431A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00Manipulators mounted on wheels or on carriages

Definitions

  • This disclosure relates to an interactive robot that can communicate with a user through conversation.
  • Patent Document 1 discloses a robot apparatus that communicates with people (including adults, children, and pets) in a home environment.
  • the robot apparatus includes a drive unit composed of a plurality of links and joints connecting the links, a task instruction input unit for inputting a task instruction, a drive control unit that controls the operation of the drive unit based on the input task and determines a no-entry area consisting of the space required for the operation, and an area display unit that displays the no-entry area. According to this robot apparatus, appropriate safety measures for people in the home environment can be suitably taken.
  • Patent Document 2 discloses a robot apparatus that can identify a person.
  • the robot apparatus includes a person identification device comprising a video acquisition means that acquires an image, a head detection and tracking means that detects a human head from the image, a front face alignment means that acquires a frontal face image from the detected partial image of the head, a face feature extraction means that converts the frontal face image into a feature value, a face identification means that identifies a person from the feature value using an identification dictionary, and an identification dictionary storage means that stores the identification dictionary; an overall control unit that controls the operation of the robot; and a moving means that moves the robot in response to instructions from the overall control unit.
  • According to this robot apparatus, a person can be identified even in an environment where lighting conditions are not constant, such as a garden environment.
  • This disclosure provides an interactive robot that can communicate with a user through conversation.
  • an interactive robot capable of communicating with a user through conversation.
  • the interactive robot includes an upper unit, a lower unit, and a joint part that connects the upper unit and the lower unit and is foldable so as to raise and lower the upper unit relative to the lower unit.
  • the upper unit includes a projection device that projects an image and an imaging device that captures an image.
  • the lower unit includes a drive device that drives a wheel for moving the robot, and a control device that controls the operation of the robot.
  • the robot can take a first state, in which the upper unit is placed in contact with the lower unit and the projection device, the imaging device, and the joint unit are housed in the upper and lower units, and a second state, in which the upper unit is lifted away from the lower unit and at least a part of the projection device, the imaging device, and the joint unit is exposed.
  • FIG. 1 is a perspective view of the home robot according to one embodiment of the present disclosure (head unit open).
  • FIG. 2 is a perspective view of the home robot (head unit raised and away from the body unit).
  • FIG. 3 is a perspective view of the home robot (head unit lowered and in contact with the body unit).
  • FIG. 4 is an exploded view of the home robot.
  • FIG. 5 is a perspective view of the home robot with the head unit lifted up from the body unit.
  • FIG. 6 is a perspective view of the joint unit that connects the head unit and the body unit.
  • FIG. 7 shows (A) a perspective view of the home robot in the state of FIG. 5 viewed from the front, and (B) a perspective view of the same state viewed from the side.
  • FIGS. 8 to 10 are diagrams illustrating postures of the home robot while projecting images.
  • FIG. 11 is a block diagram showing the internal configuration of the home robot.
  • FIG. 12 is a diagram illustrating the home robot projecting a video.
  • FIG. 13 is a diagram illustrating the home robot interacting with the user.
  • FIGS. 1 to 3 are perspective views of the home robot according to the first embodiment of the present disclosure.
  • the home robot 100 blends naturally into the living space and has an egg-shaped design that people find mentally and psychologically approachable.
  • the home robot 100 is divided into two parts, a head unit 10 (an example of an upper unit) and a body unit 50 (an example of a lower unit).
  • the position of the dividing boundary (joint surface) is above the center position in the height direction.
  • because the radius of the head unit 10 can thereby be made small and its weight reduced, the center of gravity of the entire robot 100 is positioned low, making the robot difficult to tip over.
  • the appearance of the head unit 10 separated from the body unit 50, as shown in FIGS. 1 and 2, evokes the image of an eggshell breaking open, which makes the robot feel familiar.
  • the head part 10 and the body part 50 are connected by a joint part 30. Details of the configuration of the joint unit 30 will be described later.
  • the robot can take two states: a state in which the head unit 10 is lifted away from the body unit 50 as shown in FIGS. 1 and 2 (hereinafter referred to as the "open state"), and a state in which the head unit 10 is placed in contact with the body unit 50 as shown in FIG. 3 (hereinafter referred to as the "closed state").
  • a projector 11 for projecting an image and a camera 13 are fixedly attached to the head unit 10.
  • a recess 41a for accommodating the joint unit 30 (arm 32) in the closed state is provided in the head unit 10 (see FIG. 1).
  • the body unit 50 is likewise provided with a recess 41b for accommodating the joint unit 30 (arm 32) as well as the projector 11 and the camera 13 of the head unit 10 in the closed state (see FIG. 2).
  • FIG. 4 is an exploded view of the home robot 100.
  • the head unit 10 includes a first upper housing 10a and a second upper housing 10b as outer shell cases.
  • the body part 50 includes a first lower casing 50a, a second lower casing 50b, and a third lower casing 50c as outer shell cases.
  • the housings 10a, 10b, and 50a to 50c are made of resin.
  • the projector 11 and the camera 13 are attached to the first upper housing 10a.
  • the first upper housing 10a is provided with an arm attachment portion 10d for attaching the joint unit 30.
  • a speaker hole 17b is provided in the upper part of the second upper housing 10b.
  • the second upper housing 10b is a case that covers the projector 11 and the like mounted on the first upper housing 10a.
  • Lens covers 11b and 13b for protecting the respective lenses are attached to the projector 11 and the camera 13 attached to the head unit 10.
  • a base 51 for arranging hardware parts (circuits, etc.) is arranged in the body part 50.
  • an infrared sensor 55b (see FIG. 11) is disposed at each position 55 of the second lower housing 50b. The infrared sensors 55b are arranged at equal intervals at four locations on the second lower housing 50b.
  • a wheel 57 for moving the home robot 100 is attached in the lower region of the body portion 50.
  • the wheel 57 is attached at four locations in the lower region of the body portion 50.
  • a wheel cover 57b that covers and protects the wheel opening is attached over each wheel 57.
  • next, the joint unit 30, the mechanism that connects the head unit 10 and the body unit 50, will be described.
  • FIG. 5 is a perspective view of the home robot 100 with the head unit 10 raised from the body unit 50.
  • FIG. 6 is a perspective view of the joint unit 30 in the home robot 100 in the posture shown in FIG.
  • FIG. 7A is a perspective view of the home robot 100 in the state shown in FIG. 5, and is a view seen from the front of the home robot 100.
  • FIG. 7B is a perspective view of the home robot 100 in the same state, as viewed from the side of the home robot 100.
  • the joint part 30 is composed of a first arm 31 attached to the body part 50 side and a second arm 32 attached to the head part 10 side.
  • the first and second arms 31 and 32 are made of metal.
  • the first arm 31 is rotatably connected to the first lower housing 50a of the body part 50.
  • the second arm 32 is rotatably connected to the arm attachment portion 10d of the first upper housing 10a of the head unit 10.
  • the first arm 31 and the second arm 32 are rotatably connected to each other.
  • the joint part 30 connects the head part 10 and the body part 50 by three joint mechanisms using two arms.
  • the irradiation angle (tilt angle) of the projector can be widely changed while keeping the balance of the center of gravity optimal.
  • An actuator 35 for driving the arm is provided at the joint where the first arm 31 is connected to the first lower housing 50a. At the joint where the first arm 31 and the second arm 32 are coupled, an arm driving actuator 36 is provided. At the joint where the second arm 32 is connected to the arm mounting portion 10d of the head portion 10, an arm driving actuator 37 is provided. These actuators 35 to 37 drive the first and second arms 31 and 32 under the control of the controller 51.
  • the actuators 35 to 37 are servo motors, for example.
  • By driving the actuator 35, the angle of the first arm 31 relative to the body unit 50 can be adjusted.
  • By driving the actuator 36, the angle of the second arm 32 relative to the first arm 31 can be adjusted.
  • By driving the actuator 37, the angle of the head unit 10 relative to the second arm 32 can be adjusted.
  • in this way, the height of the projection position of the projector 11 and the height of the imaging area of the camera 13 can be adjusted.
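The adjustment described above can be sketched as simple two-link forward kinematics: the three joint angles determine the head unit's height, horizontal offset, and tilt. The link lengths and the convention of measuring angles from the vertical are assumptions for illustration only; the patent gives no dimensions.

```python
import math

# Hypothetical link lengths in metres; the patent specifies no dimensions.
L1 = 0.10  # first arm 31
L2 = 0.08  # second arm 32

def projector_pose(theta1, theta2, theta3):
    """Forward kinematics of the two-arm joint: theta1 is the angle of
    arm 31 relative to the body (actuator 35), theta2 the angle of arm 32
    relative to arm 31 (actuator 36), theta3 the angle of the head unit
    relative to arm 32 (actuator 37). Angles are radians measured from
    vertical. Returns (horizontal offset, height, projector tilt)."""
    x = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    z = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    tilt = theta1 + theta2 + theta3  # the tilt accumulates all three joints
    return x, z, tilt
```

With all angles at zero, the arms stand straight up and the head sits at its maximum height with no tilt; bending the joints trades height for horizontal reach and changes the projection angle.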
  • FIGS. 8A to 8C are diagrams illustrating the posture of the home robot 100 in a state where an image is projected toward the front lower side.
  • FIGS. 9A to 9C are diagrams illustrating the posture of the home robot 100 in a state where an image is projected toward the front.
  • FIGS. 10A to 10C are diagrams illustrating the posture of the home robot 100 in a state where an image is projected upward.
  • FIGS. 8(A), 9(A), and 10(A) are front views;
  • FIGS. 8(B), 9(B), and 10(B) are side views;
  • FIGS. 8(C), 9(C), and 10(C) are rear views.
  • the projection direction of the projector 11 can be freely changed by adjusting the angles of the first and second arms 31 and 32 of the joint portion 30.
  • when the image is projected upward, the opening angle of the head unit 10 is increased.
  • the controller 51 controls the angles of the first and second arms 31 and 32 so that the balance is not lost due to the weight of the head unit 10 and the robot 100 does not fall.
  • in particular, when the image is projected directly upward, where the weight of the head unit 10 could upset the balance, the angles of the first and second arms 31 and 32 are adjusted to prevent the robot 100 from tipping over.
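The balance rule the controller must enforce can be sketched as a center-of-mass check: the combined center of mass of the head and body must stay over the wheel footprint. All masses and dimensions below are illustrative assumptions, not values from the patent.

```python
def is_stable(head_offset_x, head_mass=0.5, body_mass=1.5,
              half_wheelbase=0.07):
    """Return True if the combined centre of mass stays over the wheel
    footprint. The body is assumed centred at x = 0; head_offset_x is the
    horizontal displacement (metres) of the head unit produced by the
    arm angles. All numeric defaults are hypothetical."""
    com_x = (head_mass * head_offset_x) / (head_mass + body_mass)
    return abs(com_x) <= half_wheelbase
```

Under this sketch, the controller would reject any combination of arm angles whose resulting head offset makes `is_stable` false, and pick a nearby posture that keeps the projection direction while staying balanced.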
  • FIG. 11 is a block diagram showing the electrical configuration of the home robot 100. As shown in FIG. 11, the home robot 100 includes a projector 11, a camera 13, a microphone 15, a speaker 17, a temperature sensor 19, and a gyro sensor 21 in the head unit 10.
  • the home robot 100 includes a controller 51, a communication device 53, a data storage unit 54, an infrared sensor 55b, a wheel drive unit 56, and a battery 59 in the body unit 50.
  • the home robot 100 is provided with arm drive actuators 35 to 37 at the connecting portions of the head portion 10, the body portion 50 and the joint portion 30.
  • the projector 11 is a projection device that projects a high-quality image such as 4K.
  • the projector 11 receives the video signal from the controller 51 and projects the video indicated by the video signal.
  • the camera 13 is an imaging device that includes an optical system and an image sensor, and shoots a subject to generate image data.
  • the microphone 15 inputs external sound, converts it into a sound signal, and transmits it to the controller 51.
  • the speaker 17 outputs sound based on the sound signal from the controller 51.
  • the temperature sensor 19 detects the temperature inside the head unit 10 (particularly, around the projector 11).
  • the gyro sensor 21 detects the movement (angular acceleration) of the head unit 10.
  • the infrared sensor 55 b is provided below the housing 50 b of the body unit 50 and detects the presence or absence of an obstacle around the home robot 100.
  • the data storage unit 54 is a recording medium that stores control parameters, data, control programs, and the like necessary for realizing the functions of the home robot 100.
  • the data storage unit 54 can be composed of, for example, a hard disk (HDD), a semiconductor storage device (SSD), or an optical disk medium.
  • the controller 51 is a control device that controls the operation of the home robot 100.
  • the controller 51 includes a CPU, a RAM, a video signal processing circuit, an audio signal processing circuit, and the like that control the operation of the home robot 100 as a whole.
  • the CPU realizes the function of the home robot 100 by executing a control program (software).
  • the function of the home robot 100 may be realized by cooperation of software and hardware, or may be realized only by a hardware circuit designed exclusively. That is, an MPU, DSP, FPGA, ASIC, or the like may be mounted instead of the CPU.
  • the wireless communication unit 53 is a communication module (communication circuit) for performing communication according to the Bluetooth (registered trademark) standard, the WiFi standard, or the like.
  • the wireless communication unit 53 may include a communication module (circuit) for performing communication according to a communication standard for wide-area communication such as 3G, 4G, LTE, WiMAX (registered trademark).
  • the home robot 100 has four independent wheels 57.
  • the wheel drive unit 56 is a drive circuit that generates control signals for independently driving the four wheels 57.
  • the wheel 57 is composed of an omni wheel that enables various movements.
  • the omni wheel is configured by arranging a plurality of barrel-shaped rotating bodies on the circumference.
  • the omni wheel can realize movement in multiple directions by a combination of rotation of the main body (wheel) on the shaft (back and forth movement) and rotation of the barrel-shaped rotating body on the circumference (left and right movement).
  • the wheel drive unit 56 independently drives the four wheels 57 under the control of the controller 51, thereby realizing various movements, including linear movement and turning, in the front-rear and left-right directions.
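The independent drive of the four omni wheels can be sketched with standard omni-wheel inverse kinematics: each wheel's angular speed is a linear combination of the desired body velocity and rotation rate. The wheel layout (four wheels at 45°, 135°, 225°, 315°) and all dimensions are assumptions for illustration.

```python
import math

# Assumed symmetric layout and dimensions; not specified in the patent.
WHEEL_ANGLES = [math.radians(a) for a in (45, 135, 225, 315)]
ROBOT_RADIUS = 0.08  # distance from body centre to each wheel (m)
WHEEL_RADIUS = 0.02  # wheel radius (m)

def wheel_speeds(vx, vy, omega):
    """Inverse kinematics: angular speed (rad/s) of each omni wheel for a
    desired body velocity (vx forward, vy left, in m/s) and rotation rate
    omega (rad/s, counter-clockwise)."""
    return [(-math.sin(a) * vx + math.cos(a) * vy + ROBOT_RADIUS * omega)
            / WHEEL_RADIUS
            for a in WHEEL_ANGLES]
```

A pure rotation command drives all four wheels at the same speed, while a pure translation produces speeds that cancel in sum — which is how the combination of wheel rotation and roller rotation yields motion in any direction without turning first.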
  • the battery 59 supplies power for causing each part of the home robot 100 to function.
  • the battery 59 is a rechargeable secondary battery.
  • the home robot 100 has a voice recognition function and can accept voice instructions from the user. Specifically, the voice from the user is input through the microphone 15 mounted on the head unit 10.
  • the controller 51 performs AD conversion on the audio signal from the microphone 15 and recognizes the content of the user instruction by performing voice recognition based on the converted audio data.
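The instruction-recognition step can be sketched as a minimal keyword dispatch over the recognized transcript. The command table and action names are hypothetical stand-ins; the patent does not specify how the controller maps recognized speech to actions.

```python
# Hypothetical command table; the patent only states that the controller
# recognises the content of the user's instruction from the speech.
COMMANDS = {
    "project": "start_projection",
    "stop": "shutdown",
    "come here": "move_to_user",
}

def interpret(transcript):
    """Map a recognised transcript to a robot action by simple keyword
    matching - a minimal stand-in for the recognition step above."""
    text = transcript.lower()
    for keyword, action in COMMANDS.items():
        if keyword in text:
            return action
    return "ignore"
```

A real implementation would use a proper speech/intent model, but the control flow — transcript in, action label out, fall through to "ignore" — is the same shape.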
  • the home robot 100 can interact with the user by outputting sound from the speaker 17.
  • Various kinds of voices such as adult / child voices, male / female voices, beep sounds, and the like can be output as voice types.
  • the home robot 100 stores various types of vocabulary information in the data storage unit 54, and can output information having the same meaning in different expressions depending on situations in a conversation with the user.
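The situation-dependent choice of expression can be sketched with a small vocabulary table. The table contents and situation labels are hypothetical stand-ins for the vocabulary information the patent says is stored in the data storage unit 54.

```python
import random

# Hypothetical vocabulary table: several surface forms with the same
# meaning, keyed by conversational situation.
VOCABULARY = {
    "greeting": {
        "casual": ["Hi!", "Hey there!"],
        "polite": ["Good morning.", "Hello, how can I help you?"],
    },
}

def phrase(meaning, situation, rng=random):
    """Pick one of the stored expressions for the given meaning and
    situation, so repeated conversations do not sound identical."""
    return rng.choice(VOCABULARY[meaning][situation])
```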
  • When the function of the home robot 100 is stopped, the head unit 10 is lowered and brought into contact with the body unit 50, as shown in FIG. 3. In this closed state, the projector 11 and the camera 13 are housed inside the head unit 10 and the body unit 50 so as not to be exposed to the outside. At this time, the home robot 100 has the egg shape shown in FIG. 3.
  • in the closed state, the voice recognition function via the microphone 15 remains on, while the other functions are turned off in a standby state to save energy.
  • when a start instruction is received, the controller 51 wakes up, the head unit 10 rises from the body unit 50 as shown in FIG. 1 or FIG. 2, and the home robot 100 enters the open state. In this open state, the projector 11 and the camera 13 are exposed from the head unit 10 and the body unit 50, enabling video projection and image shooting.
  • the home robot 100 of the present embodiment transforms from the egg shape shown in FIG. 3 into the shape shown in FIGS. 1 and 2.
  • such a transformation can give the user the impression that a new creature has hatched from a broken egg, and is reminiscent of living creatures.
  • the home robot 100 can be connected to an access point via the wireless communication unit 53 and can be connected to a network (for example, the Internet) via the access point. Therefore, the home robot 100 can access a server and other devices connected on the network (cloud), and can acquire various content information.
  • the controller 51 of the home robot 100 determines obstacles and surrounding conditions around the robot based on the detection signals from the infrared sensors 55b. For example, when the controller 51 determines that there is an obstacle nearby, it determines a movement path that avoids the obstacle and controls the wheel drive unit 56 accordingly. If, while moving on a floor or table, the controller 51 detects a hole in the floor surface ahead or that the table surface ends ahead, it determines a movement path that prevents the robot from falling into the hole or off the table, and controls the wheel drive unit 56.
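The avoidance behavior can be sketched as a simple decision rule over the four infrared readings. The sensor ordering, distance threshold, and action names are assumptions for illustration; the patent describes only the behavior, not the algorithm.

```python
def choose_motion(ir_front, ir_right, ir_back, ir_left, floor_ahead=True):
    """Decision sketch for the four equally spaced infrared sensors
    (readings in cm): never advance over a missing floor surface
    (hole / table edge), and turn away from a close frontal obstacle
    toward the freer side. The 15 cm threshold is illustrative."""
    OBSTACLE_CM = 15
    if not floor_ahead:
        return "reverse"  # avoid falling into the hole / off the table
    if ir_front < OBSTACLE_CM:
        return "turn_left" if ir_left > ir_right else "turn_right"
    return "forward"  # path is clear (ir_back unused in this sketch)
```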
  • When the home robot 100 receives a shutdown (power off) instruction from the user, it changes from the open state to the closed state. At this time, in order to prevent the user's fingers from being pinched between the head unit 10 and the body unit 50, the head unit 10 may be held with a certain gap from the body unit 50 instead of being brought into full contact. In addition, a warning message or warning sound may be output before shutdown begins.
  • the home robot 100 has a function of shutting down the function of the robot 100 (that is, stopping the projector 11) when the temperature of the head unit 10 becomes high. For this reason, the controller 51 determines whether or not the robot 100 has reached a high temperature based on the temperature detected by the temperature sensor 19. Specifically, when the temperature detected by the temperature sensor 19 exceeds a predetermined value, the controller 51 determines that the robot 100 is hot and shuts down (stops the projector 11). At that time, the controller 51 may output a voice message (warning message) “Please cool down a little” from the speaker 17 and then automatically shut down. Accordingly, the user can recognize that the robot 100 will be shut down soon because the temperature of the robot 100 is high.
  • the home robot 100 is shut down after changing its posture from the open state to the closed state. Thereby, contact with the user's fingers can be prevented.
  • the controller 51 may determine whether or not the robot 100 is hot based on the continuous operation time of the projector 11 in addition to the temperature of the head unit 10.
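The shutdown decision can be sketched as a simple threshold rule combining both criteria — the temperature around the projector and the projector's continuous operation time. The limit values are assumptions, not values from the patent.

```python
def should_shut_down(temp_c, runtime_min, temp_limit=70.0,
                     runtime_limit=120):
    """Return True when the robot should shut down (stop the projector):
    either the temperature detected by the temperature sensor exceeds a
    predetermined value, or the projector has run continuously for too
    long. Both limits here are illustrative assumptions."""
    return temp_c > temp_limit or runtime_min > runtime_limit
```

In the behavior described above, a True result would first trigger the warning message ("Please cool down a little"), then the posture change to the closed state, then the actual shutdown.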
  • the home robot 100 performs cute, emotion-expressing gestures (drooping, nodding, shaking its body, etc.) according to various scenes. For example, when the home robot 100 recognizes a question from the user and wants to indicate "YES" in response, it moves the head unit 10 up and down so that it appears to be nodding. Specifically, the controller 51 drives the actuator 37 at the connection between the second arm 32 and the head unit 10 to make the head unit 10 vibrate slightly up and down relative to the second arm 32. Conversely, when indicating "NO" in response to the user's question, the entire home robot 100 is swung left and right.
  • the controller 51 controls the movement of the four wheels 57 so that the entire home robot 100 is swung left and right.
  • embarrassment or anger may be expressed by bringing the head unit 10 into the closed state in contact with the body unit 50. Emotions can thus be expressed through these various movements.
  • the home robot 100 has a function of capturing an image in front of the robot 100 by the camera 13.
  • An image captured by the camera 13 may be stored in the data storage unit 54.
  • the controller 51 continuously acquires images from the camera 13 and can recognize the situation in front of the robot 100 by analyzing them. For example, based on the image analysis results, the controller 51 can detect whether a user (person) is in front, the user's position, the number of people, the orientation of the user's face, and so on.
  • the home robot 100 can project a video (content information 80) from the projector 11 as shown in FIG.
  • the projection position of the image can be changed by changing the direction of the projector 11.
  • the vertical (tilt) direction of the projector 11 can be changed by controlling the arm drive actuators 35 to 37 to change the height and tilt angle of the head unit 10 (i.e., the projector 11). In the vertical direction, the projection position can be changed within a range of approximately 180 degrees, from almost directly below to directly above.
  • the horizontal direction (pan direction) of the projector 11 can be changed by controlling the wheel 57 to change the horizontal direction (pan direction) of the entire robot 100.
  • an image can be projected on various places such as a table surface, a wall, a floor, and a ceiling.
  • the controller 51 of the home robot 100 may recognize the user's position from the image analysis results and control the image projection operation of the projector 11 based on that position. For example, when the controller 51 detects that the user is at the position where the video is to be projected, it may, for safety, control the projector 11 to stop the projection operation or reduce the intensity of the projection light. This prevents the user from being irradiated with strong image light from the projector 11.
  • alternatively, when the home robot 100 (the controller 51) detects that the user is at the position where the video is to be projected, it may control the orientation of the projector 11 so as to project the video at a position where the user is not present. This likewise prevents the user from being irradiated with the image light from the projector 11.
  • when the home robot 100 changes its orientation, it projects an image within a predetermined time after the change, so that the user does not feel that the robot has turned away.
  • the controller 51 may project the image at a position that is easy for the user to see (for example, the area in front of the user). In doing so, the controller 51 detects the user's face from the captured image, determines its orientation, and adjusts the orientation of the projected video (vertically and horizontally) so that the user sees the content right-side up. For example, the controller 51 controls the projection position and orientation of the video so that, even when the robot is beside the user, a correctly oriented image as seen from the user is displayed in the area in front of the user. In this way, the user can always view the image in the correct orientation regardless of the position of the robot 100.
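The orientation step can be sketched as computing the rotation to apply to the projected content so that its "up" direction points away from the user, making the content right-side up from the user's viewpoint. The planar coordinate convention (positions in the floor/table plane) is an assumption for illustration.

```python
import math

def content_rotation(user_xy, projection_xy):
    """Return the rotation in degrees (0-360) to apply to the projected
    content so that the image's 'up' direction points from the user
    toward the projection spot, i.e. the user reads it right-side up.
    Positions are (x, y) in the projection plane."""
    dx = projection_xy[0] - user_xy[0]
    dy = projection_xy[1] - user_xy[1]
    # angle of the user->projection direction, measured from the +y axis
    return math.degrees(math.atan2(dx, dy)) % 360.0
```

For a user directly "below" the projection spot no rotation is needed; for a user to the side, the content is rotated so that it still reads correctly from where the user stands, matching the behavior described above.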
  • when projecting an image from the projector 11, the controller 51 of the home robot 100 performs keystone correction on the image to be projected, based on the relative positional relationship between the projection surface and the projector 11, so that a correctly shaped image is displayed on the projection surface.
  • the home robot 100 can create and project an image obtained by inverting the image vertically and horizontally based on the original image signal.
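The vertical/horizontal inversion can be sketched as plain raster flips over the original image signal; the image is represented as a list of pixel rows purely for illustration.

```python
def invert_image(img, vertical=False, horizontal=False):
    """Flip a raster image (a list of pixel rows) vertically and/or
    horizontally, as done before projection when the chosen surface and
    robot pose require mirrored content. Returns a new image; the input
    is left unmodified."""
    rows = img[::-1] if vertical else [row[:] for row in img]
    if horizontal:
        rows = [row[::-1] for row in rows]
    return rows
```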
  • the projector 11 may be provided with a shake correction function. To this end, the projector 11 includes a correction lens for changing the optical path, and a drive mechanism that moves the correction lens in a plane orthogonal to the optical axis of the projection light in accordance with the detected shake.
  • the projector 11 receives a signal indicating the shake of the head unit 10 from the gyro sensor 21, and moves the correction lens in a plane orthogonal to the optical axis of the projection light so as to cancel the shake.
  • the home robot 100 can thus project an image at the desired position without blur, even when it moves or turns during projection.
  • a shake correction technique can be realized by a known camera shake correction technique used in a general camera.
  • alternatively, the video signal may be controlled according to the detected shake: the position of the image (object) indicated by the video signal may be shifted so as to cancel the shake.
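The video-signal variant of shake correction can be sketched as integrating the angular rate reported by the gyro sensor and shifting the image by the opposite amount. The pixels-per-radian scale factor is an assumed projector parameter, not a value from the patent.

```python
class ShakeCompensator:
    """Sketch of electronic shake correction: integrate the gyro's angular
    rate into an accumulated shake angle and return the image shift that
    cancels it. PIXELS_PER_RAD is an assumed scale relating projector
    rotation to on-surface pixel displacement."""

    PIXELS_PER_RAD = 800.0

    def __init__(self):
        self.angle = 0.0  # accumulated shake angle (rad)

    def update(self, angular_rate, dt):
        """Feed one gyro sample (rad/s over dt seconds); return the signed
        horizontal pixel shift that cancels the accumulated shake."""
        self.angle += angular_rate * dt
        return -self.angle * self.PIXELS_PER_RAD
```

The optical variant described earlier moves the correction lens instead; this electronic variant achieves the same cancellation by shifting the image position within the video signal.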
  • the home robot 100 can capture a book or document with the camera 13 and recognize the information written in it from the captured image.
  • the controller 51 has an OCR (Optical Character Recognition) function and recognizes text from a captured image.
  • the controller 51 may synthesize speech based on the recognized text content and output it from the speaker 17. Thereby, the text of the book can be read out.
  • the controller 51 may read an answer sheet filled in by the user, grade it, and convey the result to the user by video or audio.
  • the controller 51 may analyze the image, recognize a musical score, and play music according to the score.
  • the home robot 100 configured as described above can present various information (video, audio) to the user while talking to the user.
  • the home robot 100 is an interactive robot that can communicate with a user through conversation.
  • the home robot 100 includes the head unit 10 (an example of an upper unit), the body unit 50 (an example of a lower unit), and a bendable joint unit 30 that connects the head unit 10 and the body unit 50 and raises and lowers the head unit 10 relative to the body unit 50.
  • the head unit 10 includes a projector 11 (an example of a projection device) that projects an image, and a camera 13 (an example of an imaging device) that captures an image.
  • the body unit 50 includes a wheel drive unit 56 (an example of a drive device) that drives a wheel 57 for moving the robot 100, and a controller 51 (an example of a control device) that controls the operation of the robot 100.
  • the home robot 100 can take a closed state in which the head unit 10 is placed in contact with the body unit 50 and the projector 11, the camera 13, and the joint unit 30 are stored in the head unit 10 and the body unit 50 (an example of the first state; see FIG. 3), and an open state in which the head unit 10 is lifted away from the body unit 50 and at least a part of the projector 11, the camera 13, and the joint unit 30 is exposed (an example of the second state; see FIGS. 1 and 2).
  • the home robot 100 configured as described above can present various information (video, audio) while talking with the user. It can therefore provide support in various forms, such as nursing care support for the elderly (prevention of dementia, depression, etc.), child education (distance education, learning support, watching over children when parents are absent, etc.), and recreation (games, reading aloud, etc.) (see FIGS. 13(A) and 13(B)).
  • the home robot 100 has an egg shape when closed. By having an egg-shaped design, the home robot 100 can naturally blend into the living space and make people feel mentally and psychologically friendly.
  • the first embodiment has been described as an example of the technique disclosed in the present application.
  • the technology in the present disclosure is not limited to this, and can also be applied to an embodiment in which changes, replacements, additions, omissions, and the like are appropriately performed.
  • the home robot of the present disclosure is an interactive robot, and can provide various user support such as recreation, education, and nursing care at home.

Abstract

An interactive robot (100) can communicate with a user by means of conversation, and is provided with: an upper unit (10); a lower unit (50); and a joint (30) that couples the upper unit and the lower unit and that is freely foldable so as to raise/lower the upper unit with respect to the lower unit. The upper unit (10) is provided with: a projection device (11) that projects a video; and an imaging device (13) that photographs an image. The lower unit is provided with: a driving device that drives wheels for moving the robot; and a control device that controls the motion of the robot. The robot can take a first state where the upper unit is mounted in contact with the lower unit, and the projection device, the imaging device, and the joint are housed in the upper and lower units, and a second state where the upper unit is raised away from the lower unit, and at least some of the projection device, the imaging device, and the joint are exposed.

Description

Interactive robot
 This disclosure relates to an interactive robot that can communicate with a user through conversation.
 In recent years, robots for home use have advanced toward practical application, and robots that can move around the house to clean or to watch the house while the owner is away have been realized.
 For example, Patent Document 1 discloses a robot apparatus that communicates with people (including adults, children, and pets) in a home environment. The robot apparatus includes a drive unit composed of a plurality of links and joints connecting the links, a task instruction input unit for inputting a task instruction, a drive control unit that controls the operation of the drive unit based on the input task and determines a no-entry area consisting of the space required for the operation, and an area display unit that displays the no-entry area. According to this robot apparatus, appropriate safety measures for people in the home environment can be suitably taken.
Patent Document 2 discloses a robot apparatus that can identify a person. The robot apparatus includes a person identification device comprising a video acquisition unit that acquires images, a head detection and tracking unit that detects a human head in an image, a frontal-face alignment unit that acquires a frontal face image from the detected partial image of the head, a face feature extraction unit that converts the frontal face image into a feature value, a face identification unit that identifies a person from the feature value using an identification dictionary, and an identification dictionary storage unit that stores the identification dictionary; an overall control unit that controls the operation of the robot; and a moving unit that moves the robot in response to instructions from the overall control unit. According to this robot apparatus, a person can be identified even in an environment where lighting conditions are not constant, such as a garden environment.
Patent Document 1: JP 2012-236244 A
Patent Document 2: JP 2002-056388 A
This disclosure provides an interactive robot that can communicate with a user through conversation.
In one aspect of the present disclosure, an interactive robot capable of communicating with a user through conversation is provided. The interactive robot includes an upper unit, a lower unit, and a joint part that connects the upper unit and the lower unit and is foldable so as to raise and lower the upper unit relative to the lower unit. The upper unit includes a projection device that projects video and an imaging device that captures images. The lower unit includes a drive device that drives wheels for moving the robot, and a control device that controls the operation of the robot. The robot can take a first state, in which the upper unit rests in contact with the lower unit and the projection device, the imaging device, and the joint part are housed within the upper and lower units, and a second state, in which the upper unit is raised away from the lower unit and at least part of the projection device, the imaging device, and the joint part is exposed.
FIG. 1 is a perspective view of a home robot according to an embodiment of the present disclosure (with the head part open).
FIG. 2 is a perspective view of the home robot (with the head part raised away from the body part).
FIG. 3 is a perspective view of the home robot (with the head part lowered and in contact with the body part).
FIG. 4 is an exploded view of the home robot.
FIG. 5 is a perspective view of the home robot with the head part lifted up from the body part.
FIG. 6 is a perspective view of the joint part that connects the head part and the body part.
FIG. 7(A) is a transparent view of the home robot in the state shown in FIG. 5, seen from the front; FIG. 7(B) is a transparent view of the same state, seen from the side.
FIG. 8 illustrates the posture of the home robot while projecting video toward the lower front.
FIG. 9 illustrates the posture of the home robot while projecting video straight ahead.
FIG. 10 illustrates the posture of the home robot while projecting video straight up.
FIG. 11 is a block diagram showing the internal configuration of the home robot.
FIG. 12 illustrates the state of the home robot while projecting video.
FIG. 13 illustrates the state of the home robot while interacting with the user.
Hereinafter, embodiments will be described in detail with reference to the drawings as appropriate. However, unnecessarily detailed description may be omitted. For example, detailed description of already well-known matters and duplicate description of substantially identical configurations may be omitted. This is to avoid making the following description unnecessarily redundant and to facilitate understanding by those skilled in the art.
Note that the inventor(s) provide the accompanying drawings and the following description so that those skilled in the art can fully understand the present disclosure, and they are not intended to limit the subject matter described in the claims.
(Embodiment 1)
Hereinafter, a home robot, which is an embodiment of the interactive robot according to the present disclosure, will be described in detail with reference to the accompanying drawings.
[1-1. Configuration]
FIGS. 1 to 3 are perspective views of the home robot according to Embodiment 1 of the present disclosure.
As shown in FIGS. 1 to 3, the home robot 100 has an egg-shaped design that blends naturally into the living space and is mentally and psychologically appealing to people. The home robot 100 is divided into two parts: a head part 10 (an example of an upper unit) and a body part 50 (an example of a lower unit). The boundary of the division (the joint surface) is located above the center in the height direction. This allows the radius of the head part 10 to be reduced and its weight lowered, so the center of gravity of the entire robot 100 can be positioned lower, making the robot less likely to tip over. In addition, the appearance of the head part 10 separated from the body part 50, as shown in FIGS. 1 and 2, is reminiscent of a cracked eggshell and can make the robot feel familiar to people.
The head part 10 and the body part 50 are connected by a joint part 30. Details of the configuration of the joint part 30 will be described later.
The home robot 100 can take two states: a state in which the head part 10 is raised and separated from the body part 50 as shown in FIGS. 1 and 2 (hereinafter the "open state"), and a state in which the head part 10 rests in contact with the body part 50 as shown in FIG. 3 (hereinafter the "closed state").
A projector 11 that projects video and a camera 13 are fixedly attached to the head part 10. The head part 10 is also provided with a recess 41a for housing the joint part 30 (32) in the closed state (see FIG. 1). Similarly, the body part 50 is provided with a recess 41b for housing the joint part 30 (32) as well as the projector 11 and camera 13 of the head part 10 in the closed state (see FIG. 2).
FIG. 4 is an exploded view of the home robot 100. The head part 10 includes a first upper housing 10a and a second upper housing 10b as outer shell cases. The body part 50 includes a first lower housing 50a, a second lower housing 50b, and a third lower housing 50c as outer shell cases. Each of the housings 10a, 10b, and 50a to 50c is made of resin.
The projector 11, the camera 13, and other components are attached to the first upper housing 10a. The first upper housing 10a is provided with an arm attachment part 10d for attaching the joint part 30. A speaker hole 17b is provided in the upper portion of the second upper housing 10b. The second upper housing 10b is a case that covers the projector 11 and other components mounted on the first upper housing 10a.
Lens covers 11b and 13b for protecting the respective lenses are attached to the projector 11 and the camera 13 on the head part 10.
A board 51 for mounting hardware components (circuits and the like) is arranged in the body part 50. In the body part 50, an infrared sensor 55b (see FIG. 11) is disposed at a position 55 of the second lower housing 50b. The infrared sensors 55b are arranged at equal intervals at four locations on the second lower housing 50b.
Wheels 57 for moving the home robot 100 are attached in the lower region of the body part 50. The wheels 57 are attached at four locations in the lower region of the body part 50. A wheel cover 57b for protecting the wheel is attached to each wheel 57.
[1-1-1. Joint part]
Hereinafter, the joint part 30, which is a mechanism for connecting the head part 10 and the body part 50, will be described.
FIG. 5 is a perspective view of the home robot 100 with the head part 10 raised from the body part 50. FIG. 6 is a perspective view of the joint part 30 of the home robot 100 in the posture shown in FIG. 5. FIG. 7(A) is a transparent view of the home robot 100 in the state shown in FIG. 5, seen from the front of the home robot 100. FIG. 7(B) is a transparent view of the home robot 100 in the same state, seen from the side of the home robot 100.
As shown in FIGS. 5 to 7(B), the joint part 30 is composed of a first arm 31 attached on the body part 50 side and a second arm 32 attached on the head part 10 side. The first and second arms 31 and 32 are made of metal. The first arm 31 is rotatably connected to the first lower housing 50a of the body part 50. The second arm 32 is rotatably connected to the arm attachment part 10d of the first upper housing 10a of the head part 10. The first arm 31 and the second arm 32 are rotatably connected to each other. In this way, the joint part 30 connects the head part 10 and the body part 50 through three joint mechanisms using the two arms. This allows the irradiation angle (tilt angle) of the projector to be varied over a wide range while keeping the center-of-gravity balance optimal.
An arm-driving actuator 35 is provided at the joint where the first arm 31 is connected to the first lower housing 50a. An arm-driving actuator 36 is provided at the joint where the first arm 31 and the second arm 32 are connected. An arm-driving actuator 37 is provided at the joint where the second arm 32 is connected to the arm attachment part 10d of the head part 10. These actuators 35 to 37 drive the first and second arms 31 and 32 under the control of the controller 51. The actuators 35 to 37 are, for example, servo motors.
By driving the actuator 35, the angle of the first arm 31 with respect to the body part 50 can be adjusted. By driving the actuator 36, the angle of the second arm 32 with respect to the first arm 31 can be adjusted. By driving the actuator 37, the angle of the head part 10 with respect to the second arm 32 can be adjusted. By driving these actuators 35 to 37, the height of the projection position of the projector 11 or the height of the imaging area of the camera 13 can be adjusted.
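The relationship between the three joint angles and the resulting projector height and tilt can be sketched as planar forward kinematics over the two-arm chain. The link lengths, base height, and angle conventions below are illustrative assumptions; the patent does not give any dimensions.

```python
import math

# Hypothetical dimensions (m); the patent specifies no actual values.
L1 = 0.10      # length of the first arm 31
L2 = 0.08      # length of the second arm 32
BASE_Z = 0.12  # height of the actuator-35 joint above the floor

def head_pose(theta1, theta2, theta3):
    """Return (height, tilt) of the projector for the three joint angles.

    theta1: first arm 31 relative to the body 50 (actuator 35)
    theta2: second arm 32 relative to the first arm (actuator 36)
    theta3: head 10 relative to the second arm (actuator 37)
    All angles are in radians, measured from the horizontal.
    """
    z = BASE_Z + L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    tilt = theta1 + theta2 + theta3  # projection (tilt) angle of the head
    return z, tilt
```

With both arms vertical and the head aligned with them, `head_pose(math.pi / 2, 0.0, 0.0)` places the projector at the full 0.30 m height, pointing straight up.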
FIGS. 8(A) to 8(C) illustrate the posture of the home robot 100 while projecting video toward the lower front. FIGS. 9(A) to 9(C) illustrate the posture of the home robot 100 while projecting video straight ahead. FIGS. 10(A) to 10(C) illustrate the posture of the home robot 100 while projecting video straight up. FIGS. 8(A), 9(A), and 10(A) are front views; FIGS. 8(B), 9(B), and 10(B) are side views; and FIGS. 8(C), 9(C), and 10(C) are rear views.
As shown in these figures, the projection direction of the projector 11 can be freely changed by adjusting the angles of the first and second arms 31 and 32 of the joint part 30. In particular, when projecting video to a higher position, the opening angle of the head part 10 increases. In this case, the weight of the head part 10 may upset the balance and cause the robot 100 to tip over. Therefore, in the home robot 100, the controller 51 controls the respective angles of the first and second arms 31 and 32 so that the weight of the head part 10 does not upset the balance and tip the robot 100 over. For example, when projecting video straight up as shown in FIGS. 10(A) to 10(C), the respective angles of the first and second arms 31 and 32 are adjusted so that the weight of the head part 10 does not upset the balance and tip the robot 100 over.
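One way to frame this balance constraint is to keep the horizontal offset of the head's mass above the wheel support area. This is a minimal sketch under assumed dimensions; the patent does not disclose how the controller 51 actually computes stability.

```python
import math

def head_offset(theta1, theta2, l1=0.10, l2=0.08):
    """Horizontal displacement (m) of the head 10 from the base joint,
    given the two arm angles (radians from the horizontal)."""
    return l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)

def is_stable(theta1, theta2, support_radius=0.09):
    """True if the head stays above the assumed wheel support circle,
    so the head's weight cannot tip the robot 100 over."""
    return abs(head_offset(theta1, theta2)) <= support_radius
```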
[1-1-2. Electrical configuration]
FIG. 11 is a block diagram showing the electrical configuration of the home robot 100. As shown in FIG. 11, the home robot 100 includes, in the head part 10, a projector 11, a camera 13, a microphone 15, a speaker 17, a temperature sensor 19, and a gyro sensor 21.
The home robot 100 also includes, in the body part 50, a controller 51, a communication device 53, a data storage unit 54, an infrared sensor 55b, a wheel drive unit 56, and a battery 59.
Furthermore, the home robot 100 includes the arm-driving actuators 35 to 37 at the respective connecting portions of the head part 10, the body part 50, and the joint part 30.
The projector 11 is a projection device that projects high-resolution video such as 4K. The projector 11 receives a video signal from the controller 51 and projects the video indicated by the video signal.
The camera 13 is an imaging device that includes an optical system and an image sensor and that photographs a subject to generate image data. The microphone 15 picks up external sound, converts it into an audio signal, and transmits the signal to the controller 51. The speaker 17 outputs sound based on an audio signal from the controller 51.
The temperature sensor 19 detects the temperature inside the head part 10 (in particular, around the projector 11). The gyro sensor 21 detects the movement (angular acceleration) of the head part 10.
The infrared sensor 55b is provided in the lower portion of the housing 50b of the body part 50 and detects the presence or absence of obstacles around the home robot 100.
The data storage unit 54 is a recording medium that stores control parameters, data, control programs, and the like necessary for realizing the functions of the home robot 100. The data storage unit 54 can be composed of, for example, a hard disk drive (HDD), a semiconductor storage device (SSD), or an optical disk medium.
The controller 51 is a control device that controls the operation of the home robot 100. The controller 51 internally includes a CPU that controls the operation of the entire home robot 100, a RAM, a video signal processing circuit, an audio signal processing circuit, and the like. The CPU realizes the functions of the home robot 100 by executing a control program (software). The functions of the home robot 100 may be realized by cooperation of software and hardware, or may be realized only by dedicated hardware circuits. That is, an MPU, DSP, FPGA, ASIC, or the like may be mounted instead of the CPU.
The wireless communication unit 53 is a communication module (communication circuit) for performing communication in accordance with the Bluetooth (registered trademark) standard, the WiFi standard, or the like. The wireless communication unit 53 may also include a communication module (circuit) for performing communication in accordance with a wide-area communication standard such as 3G, 4G, LTE, or WiMAX (registered trademark).
The home robot 100 includes four independent wheels 57. The wheel drive unit 56 is a drive circuit that generates control signals for driving the four wheels 57 independently. Each wheel 57 is an omni wheel that enables diverse movements. An omni wheel is configured by arranging a plurality of barrel-shaped rollers around its circumference. An omni wheel can realize movement in multiple directions through a combination of the rotation of the wheel body about its axle (forward and backward movement) and the rotation of the barrel-shaped rollers on the circumference (lateral movement). The wheel drive unit 56 drives the four wheels 57 independently under the control of the controller 51, thereby realizing various movements including straight movement in the front-rear and left-right directions and turning.
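The independent-drive scheme above can be sketched with a standard omnidirectional mixing formula. The X-shaped mounting layout, wheel radius, and lever arm are assumptions; the patent only states that the four omni wheels are driven independently.

```python
def wheel_speeds(vx, vy, omega, r=0.03, d=0.10):
    """Map a body velocity command to four wheel angular speeds (rad/s).

    vx: forward speed (m/s), vy: leftward speed (m/s),
    omega: yaw rate (rad/s), r: wheel radius, d: effective lever arm.
    Returned order: front-left, front-right, rear-left, rear-right.
    """
    return [
        (vx - vy - d * omega) / r,
        (vx + vy + d * omega) / r,
        (vx + vy - d * omega) / r,
        (vx - vy + d * omega) / r,
    ]
```

Driving straight ahead commands all four wheels equally, while a pure turn (`vx = vy = 0`) spins the left and right wheels in opposite directions.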
The battery 59 supplies power for operating each part of the home robot 100. The battery 59 is a rechargeable secondary battery.
[1-2. Operation]
The operation of the home robot 100 configured as described above will be described below. The functions described below are performed under the control of the controller 51.
The home robot 100 has a voice recognition function and can accept voice instructions from the user. Specifically, the user's voice is input through the microphone 15 mounted on the head part 10. The controller 51 performs A/D conversion on the audio signal from the microphone 15 and recognizes the content of the user's instruction by performing voice recognition on the converted audio data.
The home robot 100 can interact with the user by outputting sound from the speaker 17. It can output various types of voices, such as adult or child voices, male or female voices, and beep sounds. In addition, the home robot 100 stores various vocabulary information in the data storage unit 54 and, in conversation with the user, can express information with the same meaning in different ways depending on the situation.
When its functions are stopped, the home robot 100 is in the closed state, with the head part 10 lowered and in contact with the body part 50, as shown in FIG. 3. In this closed state, the projector 11 and the camera 13 are housed inside the head part 10 and the body part 50 so as not to be exposed to the outside. At this time, the home robot 100 has the egg shape shown in FIG. 3.
In the closed state, the voice recognition function via the microphone 15 remains ON, while the other functions are turned OFF for energy saving, putting the robot in a standby state. When a voice instruction is received from the user in the closed state, the controller 51 wakes up, the head part 10 rises from the body part 50 as shown in FIG. 1 or FIG. 2, and the home robot 100 enters the open state. In this open state, the projector 11 and the camera 13 are exposed from the head part 10 and the body part 50, enabling video projection and image capture.
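The standby and wake-up behavior described above amounts to a small state machine. The sketch below uses invented method names; the actual control flow inside the controller 51 is not disclosed.

```python
class HomeRobot:
    """Minimal model of the closed (standby) / open (active) states."""

    def __init__(self):
        self.state = "closed"   # egg shape: standby, voice input only
        self.projector_on = False
        self.camera_on = False

    def on_voice_instruction(self):
        """Wake up: raise the head 10 and expose projector and camera."""
        if self.state == "closed":
            self.state = "open"
            self.projector_on = True
            self.camera_on = True

    def shut_down(self):
        """Return to the egg shape; only voice recognition stays active."""
        self.projector_on = False
        self.camera_on = False
        self.state = "closed"
```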
In this way, the home robot 100 of the present embodiment transforms from the egg shape shown in FIG. 3 into the shape shown in FIGS. 1 and 2. Such a transformation can give the user the impression that an egg has cracked and a new creature has been born, evoking a sense of lifelikeness.
The home robot 100 connects to an access point via the wireless communication unit 53 and can connect to a network (for example, the Internet) via the access point. The home robot 100 can therefore access servers and other devices connected on the network (cloud) and acquire various content information.
The controller 51 of the home robot 100 determines obstacles around the robot and the surrounding conditions based on the detection signals from the infrared sensors 55b. For example, when it determines that there is an obstacle nearby, the controller 51 determines a movement path that avoids the obstacle and controls the wheel drive unit 56 accordingly. Also, when it detects, while moving on a floor or a table, that there is a hole in the floor ahead or that there is no table surface ahead, the controller 51 determines a movement path that prevents the robot from falling into the hole or off the table surface, and controls the wheel drive unit 56.
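The avoidance logic can be sketched as a simple decision over the sensor inputs. The sensor ordering, the distance threshold, and the floor-detection input are illustrative assumptions:

```python
def plan_motion(ir_distances, floor_ahead):
    """Choose the next motion from the four IR sensor readings.

    ir_distances: distances (m) reported by the four infrared sensors
    55b, assumed ordered front, right, rear, left. floor_ahead: False
    when a hole or the edge of a table is detected in the travel
    direction.
    """
    OBSTACLE_NEAR = 0.15  # m, assumed threshold
    if not floor_ahead:
        return "stop"       # avoid falling into a hole or off the table
    if ir_distances[0] < OBSTACLE_NEAR:
        return "detour"     # obstacle in front: re-plan the path
    return "forward"
```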
When the home robot 100 receives a shutdown (power-off) instruction from the user, it changes from the open state to the closed state. At this time, in order to prevent the user's fingers from being pinched between the head part 10 and the body part 50, the head part 10 and the body part 50 may be held with a certain gap between them without making contact. In addition, a spoken warning message or a warning sound may be output before the shutdown starts.
The home robot 100 also has a function of shutting down the robot 100 (that is, stopping the projector 11) when the temperature of the head part 10 becomes high. For this purpose, the controller 51 determines whether the robot 100 has become hot based on the temperature detected by the temperature sensor 19. Specifically, when the temperature detected by the temperature sensor 19 exceeds a predetermined value, the controller 51 determines that the robot 100 is hot and shuts it down (stops the projector 11). At that time, the controller 51 may output a voice message (warning message) such as "Please let me cool down for a moment" from the speaker 17 and then shut down automatically. This allows the user to recognize that the robot 100 will shut down soon because its temperature has become high. At shutdown, the home robot 100 changes its posture from the open state to the closed state before shutting down. This prevents the user's fingers or the like from touching an exposed part of the hot projector 11. Note that the controller 51 may determine whether the robot 100 is hot based on the continuous operating time of the projector 11 in addition to the temperature of the head part 10.
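The overheat handling above can be sketched as follows. The temperature threshold and runtime limit are invented numbers (the patent only says "a predetermined value"), and the `robot` and `speak` interfaces are hypothetical:

```python
TEMP_LIMIT_C = 70.0        # assumed "predetermined value"
MAX_RUNTIME_S = 2 * 3600   # assumed continuous-operation limit

def is_overheated(temp_c, projector_runtime_s):
    """High head temperature or long continuous projector use."""
    return temp_c > TEMP_LIMIT_C or projector_runtime_s > MAX_RUNTIME_S

def overheat_shutdown(robot, speak, temp_c, projector_runtime_s):
    """Warn, fold to the closed state, then power off, in that order,
    so the hot projector 11 is never left exposed."""
    if not is_overheated(temp_c, projector_runtime_s):
        return False
    speak("Please let me cool down for a moment")  # warning message
    robot.close()       # open state to closed state
    robot.power_off()
    return True
```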
The home robot 100 performs endearing, expressive movements (looking down, nodding, shaking its body, and so on) according to various scenes. For example, the home robot 100 recognizes a question from the user by voice recognition and also recognizes the content of the question. To answer "YES" to the question, the home robot 100 moves the head part 10 up and down so that it appears to nod. Specifically, the controller 51 drives the actuator 37 at the connection between the second arm 32 and the head part 10 to slightly oscillate the head part 10 up and down with respect to the second arm 32. Alternatively, to answer "NO" to the user's question, the robot swings its entire body left and right. In this case, the controller 51 controls the movement of the four wheels 57 so that the entire home robot 100 swings left and right. In addition, embarrassment or anger may be expressed by bringing the head part 10 into the closed state in contact with the body part 50. Emotions can be expressed through these various movements.
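These gestures can be sketched as short sequences of motion primitives. The primitive names and amplitudes are invented for illustration; only the mapping (nod via actuator 37, shake via the wheels) comes from the text above.

```python
def gesture(answer):
    """Return a list of (primitive, amount) commands for an answer.

    "yes": nod by oscillating actuator 37 (head 10 vs. second arm 32).
    "no":  swing the whole body left and right with the wheels 57.
    Amounts are illustrative degrees.
    """
    if answer == "yes":
        return [("actuator37", +5), ("actuator37", -5)] * 2
    if answer == "no":
        return [("body_yaw", +10), ("body_yaw", -10)] * 2
    return []
```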
The home robot 100 has a function of capturing images in front of the robot 100 with the camera 13. Images captured by the camera 13 may be stored in the data storage unit 54. The controller 51 can also continuously acquire images from the camera 13 and recognize the situation in front of the robot 100 by analyzing the acquired images. For example, based on the result of the image analysis, the controller 51 can detect whether there is a user (person) ahead, the user's position, the number of people, the orientation of the user's face, and so on.
As shown in FIG. 12, the home robot 100 can project video (content information 80) from the projector 11. The projection position of the video can be changed by changing the orientation of the projector 11. The vertical orientation of the projector 11 can be changed by controlling the arm-driving actuators 35 to 37 to change the height and tilt angle of the head part 10 (that is, the projector 11). In the vertical direction, the projection position can be changed over a range of approximately 180 degrees, from almost straight down to straight up. The horizontal orientation (pan direction) of the projector 11 can be changed by controlling the wheels 57 to change the horizontal orientation (pan direction) of the entire robot 100. By changing the orientation of the projector 11 as appropriate, video can be projected onto various places such as a table surface, a wall, the floor, or the ceiling.
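Aiming the projector thus splits into a pan handled by the wheels and a tilt handled by the arm actuators. A minimal sketch follows; the angle conventions and clamping bounds are assumptions beyond the roughly 180-degree tilt range stated above.

```python
def aim_projector(azimuth_deg, elevation_deg):
    """Split a desired projection direction into two commands.

    Pan: rotate the whole robot 100 with the wheels 57.
    Tilt: set the head angle via actuators 35 to 37, limited to the
    roughly 180-degree range (straight down to straight up).
    Returns (pan_deg, tilt_deg).
    """
    pan = azimuth_deg % 360.0
    tilt = max(-90.0, min(90.0, elevation_deg))
    return pan, tilt
```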
 The controller 51 of the home robot 100 may recognize the user's position from the image analysis results and control the projection operation of the projector 11 based on that position. For example, when the controller 51 detects that a user is at the position where video is about to be projected, it may, for safety, control the projector 11 to stop the projection operation or to reduce the intensity of the projection light. This prevents the user from being exposed to intense projection light from the projector 11.
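The stop-or-dim safety decision might be sketched as follows. The rectangular beam footprint, the `dim_margin` value, and the three action labels are illustrative assumptions layered on top of the behavior the paragraph describes:

```python
def projection_action(beam_rect, users, dim_margin=0.3):
    """Decide the safe projection action given the projected region and user positions.

    beam_rect: (x0, y0, x1, y1) footprint of the projected image on the surface.
    users: list of (x, y) user positions in the same coordinate frame.
    Returns "project", "dim", or "stop".
    """
    x0, y0, x1, y1 = beam_rect
    for ux, uy in users:
        if x0 <= ux <= x1 and y0 <= uy <= y1:
            return "stop"                       # user inside the beam: stop projecting
        if (x0 - dim_margin <= ux <= x1 + dim_margin
                and y0 - dim_margin <= uy <= y1 + dim_margin):
            return "dim"                        # user close to the beam: weaken the light
    return "project"
```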
 Alternatively, when the home robot 100 (the controller 51) detects that a user is at the position where video is about to be projected, it may control the orientation of the projector 11 so as to project the video at a position where no user is present. This likewise prevents the user from being exposed to the projection light from the projector 11. In addition, when the orientation of the home robot 100 is changed, the video is projected within a predetermined time after the change of orientation. This is so that the user is not left with the sad impression of having been turned away from.
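Both behaviors in this paragraph, picking a projection spot clear of every user and bounding the delay before projection resumes after the robot turns, might be sketched as below. The clearance radius, the two-second default delay, and the function names are assumed values for illustration only:

```python
def choose_projection_spot(candidates, users, min_clearance=0.5):
    """Pick the first candidate projection spot far enough from every user."""
    def dist2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    for spot in candidates:
        if all(dist2(spot, u) >= min_clearance ** 2 for u in users):
            return spot
    return None  # nowhere safe: the caller should stop projection instead

def must_project_by(turn_completed_at, max_delay_s=2.0):
    """Deadline by which projection must resume after the robot turns away,
    so the user is not left feeling ignored."""
    return turn_completed_at + max_delay_s
```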
 The controller 51 may also project the video at a position that is easy for the user to see (for example, an area in front of the user). In doing so, the controller 51 detects the user's face in the captured image, determines the orientation of the face, and adjusts the orientation of the video (content) as appropriate (flipping it vertically, rotating it horizontally, and so on) so that the user sees the content in the correct orientation. For example, the controller 51 controls the projection position and the orientation of the video so that, even when projecting from the user's side, the video appears in the area in front of the user and correctly oriented as seen from the user. In this way, the user can always view correctly oriented video regardless of the position of the robot 100.
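The orientation adjustment can be reduced to a single rotation angle applied to the content before projection. In this illustrative sketch the content's default "up" is assumed to be the +y axis of a hypothetical floor coordinate frame, and the rule is that "up" should point from the user across the projection area:

```python
import math

def content_rotation_deg(user_xy, projection_xy):
    """Counter-clockwise rotation (degrees) to apply to the projected content
    so that its 'up' direction points from the user toward the projection,
    i.e. the user sees it right side up when looking at the projected area.
    """
    dx = projection_xy[0] - user_xy[0]
    dy = projection_xy[1] - user_xy[1]
    # Angle of the user->projection direction, measured from the +y ("up") axis.
    return math.degrees(math.atan2(dx, dy)) % 360
```

A user standing on the opposite side of the content gets a 180-degree flip, matching the vertical inversion mentioned above.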
 When projecting video from the projector 11, the controller 51 of the home robot 100 applies keystone correction to the projected video, based on the relative positional relationship between the projection surface and the projector 11, so that the video appears with the correct shape on the projection surface. The home robot 100 can also generate and project video that is flipped vertically or horizontally relative to the original video signal.
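A first-order model of a vertical keystone pre-warp is sketched below. The throw ratio, the flat vertical projection surface, and the geometry are all my own simplifying assumptions; a production implementation would compute a full planar homography from the measured surface pose instead:

```python
import math

def keystone_top_inset(width, tilt_deg, throw_ratio=1.5):
    """First-order vertical keystone pre-warp.

    When the projector is tilted upward by tilt_deg against a vertical surface,
    the top edge of the image lands farther away and spreads wider than the
    bottom edge. Shrinking the top edge of the source image by the inverse
    ratio makes the projected image rectangular again. Returns the number of
    source pixels by which to inset each top corner.
    """
    half_fov = math.degrees(math.atan(0.5 / throw_ratio))  # vertical half-angle
    # Projection distance (and hence spread) grows as 1/cos(angle off normal).
    scale_top = (math.cos(math.radians(tilt_deg + half_fov))
                 / math.cos(math.radians(tilt_deg - half_fov)))
    return width * (1.0 - scale_top) / 2.0
```

With zero tilt the inset is zero (no correction needed); any upward tilt produces a positive inset, i.e. a trapezoidal pre-warp that narrows the top of the source image.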
 If the home robot 100 performs the nodding motion described above while projecting video, there is a problem in that the projected image shakes. To address this, the projector 11 may be provided with a shake correction function. To that end, the projector 11 includes a correction lens for altering the optical path and a drive mechanism that moves the correction lens, in response to the shake, within a plane orthogonal to the optical axis of the projection light. The projector 11 receives a signal indicating the shake of the head unit 10 from the gyro sensor 21 and moves the correction lens within the plane orthogonal to the optical axis of the projection light so as to cancel the shake. This allows the home robot 100 to project video at the desired position without shake even while performing the nodding motion during projection. Such shake correction can be realized with the known image stabilization techniques used in ordinary cameras. Alternatively, the video signal may be controlled according to the detected shake; that is, the position of the image (object) represented by the video signal may be shifted so as to cancel the shake.
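The signal-path variant (shifting the image rather than the lens) can be sketched as a per-frame integration of the gyro rate; the sign convention and the pixels-per-degree scale factor here are illustrative assumptions, not values from the disclosure:

```python
def stabilizing_shift(gyro_rate_dps, dt_s, pixels_per_degree):
    """Image shift (x, y) in pixels that cancels the head-unit shake measured
    by the gyro over one frame interval.

    gyro_rate_dps: (pitch_rate, yaw_rate) in degrees per second.
    dt_s: frame interval in seconds.
    """
    dpitch = gyro_rate_dps[0] * dt_s   # integrated pitch over the frame
    dyaw = gyro_rate_dps[1] * dt_s     # integrated yaw over the frame
    # Opposite sign: move the image against the motion so it stays put on the wall.
    return (-dyaw * pixels_per_degree, -dpitch * pixels_per_degree)
```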
 The home robot 100 can photograph a book or document with the camera 13 and recognize the information written in it from the captured image. For example, the controller 51 has an OCR (Optical Character Recognition) function and recognizes text in the captured image. The controller 51 may synthesize speech based on the recognized text and output it from the speaker 17, allowing the robot to read the text of a book aloud. Alternatively, the controller 51 may read an answer sheet on which answers have been written, grade it, and convey the grading result to the user by video or audio. The controller 51 may also analyze an image, recognize musical notes, and play music according to those notes.
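The grading feature might look like the following sketch once OCR has produced per-question answer strings. The normalization rule (ignoring spacing and case, which are common OCR noise) and the points-per-question scoring scheme are illustrative assumptions:

```python
def grade_answers(recognized, answer_key, points_per_question=10):
    """Grade OCR-recognized answers against an answer key.

    recognized: dict question_number -> recognized answer string (from OCR).
    answer_key: dict question_number -> correct answer string.
    Returns (score, sorted list of wrongly answered question numbers).
    """
    def norm(s):
        return "".join(s.split()).lower()   # tolerate spacing/case OCR noise
    wrong = [q for q, correct in answer_key.items()
             if norm(recognized.get(q, "")) != norm(correct)]
    score = (len(answer_key) - len(wrong)) * points_per_question
    return score, sorted(wrong)
```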
 The home robot 100 configured as described above can present various kinds of information (video and audio) to the user while conversing with the user.
 [1-3. Effects]
 As described above, the home robot 100 according to the present embodiment is an interactive robot capable of communicating with a user through conversation. The home robot 100 includes a head unit 10 (an example of an upper unit), a body unit 50 (an example of a lower unit), and a joint unit 30 that connects the head unit 10 and the body unit 50 and can bend so as to raise and lower the head unit 10 relative to the body unit 50. The head unit 10 includes a projector 11 (an example of a projection device) that projects video and a camera 13 (an example of an imaging device) that captures images. The body unit 50 includes a wheel drive unit 56 (an example of a drive device) that drives the wheel 57 for moving the robot 100, and a controller 51 (an example of a control device) that controls the operation of the robot 100. The home robot 100 can assume a closed state (an example of a first state; see FIG. 3) in which the head unit 10 rests in contact with the body unit 50 and the projector 11, the camera 13, and the joint unit 30 are housed within the head unit 10 and the body unit 50. The home robot 100 can also assume an open state (an example of a second state; see FIGS. 1 and 2) in which the head unit 10 rises away from the body unit 50 and at least part of the projector 11, the camera 13, and the joint unit 30 is exposed.
 The home robot 100 configured as described above can present various kinds of information (video and audio) while conversing with the user. It can therefore provide many kinds of support in the home, such as care support for the elderly (e.g., prevention of dementia and late-life depression), children's education (distance learning, study support, supervision while parents are away, etc.), and recreation (games, reading books aloud, etc.) (see FIGS. 13(A) and 13(B)).
 The home robot 100 has an egg shape in the closed state. Thanks to this egg-shaped design, the home robot 100 blends naturally into the living space and feels mentally and psychologically approachable.
 As described above, the first embodiment has been described as an illustration of the technology disclosed in the present application. However, the technology of the present disclosure is not limited to this embodiment and is also applicable to embodiments in which modifications, substitutions, additions, omissions, and the like have been made as appropriate. It is also possible to combine the components described in the above embodiment to form new embodiments.
 As described above, embodiments have been described as illustrations of the technology of the present disclosure. The accompanying drawings and detailed description have been provided for that purpose.
 Accordingly, the components shown in the accompanying drawings and described in the detailed description may include not only components essential to solving the problem but also components that are not essential to solving the problem and are included merely to illustrate the technology. The mere fact that such non-essential components appear in the accompanying drawings or the detailed description should therefore not be taken as an indication that they are essential.
 Furthermore, since the above embodiment is intended to illustrate the technology of the present disclosure, various modifications, substitutions, additions, omissions, and the like can be made within the scope of the claims and their equivalents.
 The home robot of the present disclosure is an interactive robot that can provide various kinds of user support in the home, such as recreation, education, and nursing care.

Claims (10)

  1.  An interactive robot capable of communicating with a user through conversation, the robot comprising:
     an upper unit;
     a lower unit; and
     a joint unit that connects the upper unit and the lower unit and can bend so as to raise and lower the upper unit relative to the lower unit,
     wherein the upper unit includes a projection device that projects video and an imaging device that captures images,
     the lower unit includes a drive device that drives a wheel for moving the robot and a control device that controls operation of the robot, and
     the robot can assume:
      a first state in which the upper unit is placed in contact with the lower unit and the projection device, the imaging device, and the joint unit are housed within the upper unit and the lower unit; and
      a second state in which the upper unit rises away from the lower unit and at least part of the projection device, the imaging device, and the joint unit is exposed.
  2.  The interactive robot according to claim 1, wherein an overall shape of the robot in the first state is an egg shape.
  3.  The interactive robot according to claim 2, wherein a boundary between the upper unit and the lower unit is located above a center position in a height direction of the robot.
  4.  The interactive robot according to claim 1, wherein the joint unit comprises a first arm rotatably connected to the lower unit, and a second arm having one end rotatably connected to the first arm and another end rotatably connected to the upper unit.
  5.  The interactive robot according to claim 1, further comprising a microphone that receives sound and converts it into an audio signal, and a speaker that outputs sound based on an audio signal.
  6.  The interactive robot according to claim 1, further comprising a wireless communication unit for performing wireless communication with an external network.
  7.  The interactive robot according to claim 1, wherein the control device analyzes an image captured by the imaging device to detect a position of a user, and controls a projection position of the video by the projection device based on the detected user position.
  8.  The interactive robot according to claim 7, wherein the control device controls the projection device to project the video in a direction in which no user is present.
  9.  The interactive robot according to claim 7, wherein the control device controls the projection device to project the video at a position where the user can easily view it.
  10.  The interactive robot according to claim 1, wherein the control device determines whether the upper unit is at a high temperature based on a temperature in the upper unit or an operating time of the projection device, and, when determining that the upper unit is at a high temperature, changes the robot to the first state and stops operation of the projection device.
PCT/JP2017/043112 2016-12-28 2017-11-30 Interactive robot WO2018123431A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662439575P 2016-12-28 2016-12-28
US62/439,575 2016-12-28

Publications (1)

Publication Number Publication Date
WO2018123431A1 true WO2018123431A1 (en) 2018-07-05

Family

ID=62707219

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/043112 WO2018123431A1 (en) 2016-12-28 2017-11-30 Interactive robot

Country Status (1)

Country Link
WO (1) WO2018123431A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108877347A (en) * 2018-08-02 2018-11-23 安徽硕威智能科技有限公司 Classroom outdoor scene reproducing interactive tutoring system based on robot projection function
CN109015685A (en) * 2018-08-28 2018-12-18 广东海翔教育科技有限公司 One kind being used for educational robot
WO2019070050A1 (en) * 2017-10-06 2019-04-11 本田技研工業株式会社 Mobile robot
WO2020253118A1 (en) * 2019-06-20 2020-12-24 深圳前海微众银行股份有限公司 Business development method and apparatus
US20230191632A1 (en) * 2021-12-22 2023-06-22 Lg Electronics Inc. Mobile robot

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005084789A (en) * 2003-09-05 2005-03-31 Advanced Telecommunication Research Institute International Personal computer
JP2005313308A (en) * 2004-03-30 2005-11-10 Nec Corp Robot, robot control method, robot control program, and thinking device
JP2012519264A (en) * 2009-02-27 2012-08-23 アール. ブルックス アソシエイツ インコーポレーティッド Inspection system and inspection process using magnetic inspection vehicle
WO2013099104A1 (en) * 2011-12-28 2013-07-04 パナソニック株式会社 Robot arm

Similar Documents

Publication Publication Date Title
WO2018123431A1 (en) Interactive robot
US10921818B2 (en) Robot
US10120386B2 (en) Robotic creature and method of operation
US10307911B2 (en) Robot
US9517559B2 (en) Robot control system, robot control method and output control method
WO2016068262A1 (en) Communication robot
US20180154513A1 (en) Robot
JP2009113136A (en) Robot
CN109262606B (en) Apparatus, method, recording medium, and robot
JP2011204145A (en) Moving device, moving method and program
JP6565853B2 (en) Communication device
US20180376069A1 (en) Erroneous operation-preventable robot, robot control method, and recording medium
US11065769B2 (en) Robot, method for operating the same, and server connected thereto
CN111163906A (en) Mobile electronic device and operation method thereof
US20220347860A1 (en) Social Interaction Robot
JP2005335053A (en) Robot, robot control apparatus and robot control method
JP2023095918A (en) Robot, method for controlling robot, and program
Pierce et al. “Mask-Bot 2i”: An active customisable robotic head with interchangeable face
JP6586810B2 (en) toy
US20230195401A1 (en) Information processing apparatus and information processing method
JP6686583B2 (en) Robots and programs
US20220291665A1 (en) Information processing apparatus, control method, and program
US20220055224A1 (en) Configurable and Interactive Robotic Systems
WO2023243431A1 (en) Nursing-care robot, method for controlling nursing-care robot, and information processing device
WO2024004623A1 (en) Robot and robot control method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17885696

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17885696

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP