US20180185764A1 - Robot - Google Patents

Robot

Info

Publication number
US20180185764A1
US20180185764A1 (US 2018/0185764 A1); application US 15/905,893
Authority
US
United States
Prior art keywords
housing
robot
travelling
change
weight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/905,893
Other languages
English (en)
Inventor
Ryouta Miyazaki
Kento Ogawa
Seiya Higuchi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Intellectual Property Management Co Ltd
Original Assignee
Panasonic Intellectual Property Management Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Intellectual Property Management Co Ltd filed Critical Panasonic Intellectual Property Management Co Ltd
Assigned to PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD. reassignment PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HIGUCHI, SEIYA, MIYAZAKI, RYOUTA, OGAWA, KENTO
Publication of US20180185764A1


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H33/00 Other toys
    • A63H33/005 Motorised rolling toys
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H11/00 Self-movable toy figures
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H29/00 Drive mechanisms for toys in general
    • A63H29/08 Driving mechanisms actuated by balls or weights
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H29/00 Drive mechanisms for toys in general
    • A63H29/22 Electric drives
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H3/00 Dolls
    • A63H3/006 Dolls provided with electrical lighting
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H3/00 Dolls
    • A63H3/28 Arrangements of sound-producing means in dolls; Means in dolls for producing sounds
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H33/00 Other toys
    • A63H33/26 Magnetic or electric toys
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08 Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0891 Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for land vehicles
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H2200/00 Computerized interactive toys, e.g. dolls

Definitions

  • the present disclosure relates to a robot that determines its own state.
  • International Publication No. WO2000/032360 discloses a multi-legged walking robot having four legs (for example, page 8, lines 15 to 17).
  • the multi-legged walking robot disclosed in International Publication No. WO2000/032360 includes an acceleration sensor that detects acceleration in three-axis (X-axis, Y-axis, and Z-axis) directions, and an angular velocity sensor that detects rotational angular velocity in three rotational directions (R-angle, P-angle, and Y-angle) (for example, page 8, line 26 to page 9, line 8).
  • the robot stops the motion of its legs (for example, page 10, lines 13 to 20). This can prevent the robot from injuring the user (for example, page 6, lines 11 to 12).
  • the techniques disclosed here feature a robot that includes: a spherical housing; a frame disposed in the housing; a display unit that is provided on the frame and that displays at least a portion of a face of the robot; a set of drive wheels that are provided on the frame and that rotate and move the housing while being in contact with an inner circumferential face of the housing; a weight drive mechanism that is provided on the frame and that reciprocates a weight in a predetermined direction; an angular velocity sensor that detects angular velocity about a crosswise direction perpendicular to a travelling direction of the housing; and a control circuit that, if it determines, while the housing is being rotated and moved, that a rotational angle of the housing when viewed from the front in the travelling direction changes upward beyond a predetermined angle, based on a change in the angular velocity about the crosswise direction, moves the weight frontward in the travelling direction of the housing by a distance corresponding to the rotational angle.
  • FIG. 1 is a perspective view illustrating the external appearance of a robot according to an embodiment of the present disclosure
  • FIG. 2 is a perspective view illustrating the inside of the robot according to the embodiment of the present disclosure
  • FIG. 3 is a side view illustrating the inside of the robot according to the embodiment of the present disclosure when viewed from A in FIG. 2 ;
  • FIG. 4 is a side view illustrating linear movement of the robot according to the embodiment of the present disclosure when viewed from A in FIG. 2 ;
  • FIG. 5 is a plan view illustrating rotation of the robot according to the embodiment of the present disclosure when viewed from B in FIG. 2 ;
  • FIG. 6 is a perspective view illustrating rotation of the robot according to the embodiment of the present disclosure.
  • FIG. 7 is a view illustrating a weight drive mechanism in the side view of FIG. 3 ;
  • FIG. 8A is a perspective view illustrating the operation of the drive mechanism for the counterweight to drive the counterweight in a predetermined linear direction
  • FIG. 8B is a side view illustrating the operation of the counterweight drive mechanism to drive the counterweight in the predetermined linear direction
  • FIG. 8C is a side view illustrating the state where the counterweight reciprocates in the predetermined linear direction in the side view of FIG. 3 ;
  • FIG. 9A is a perspective view illustrating the operation of the counterweight drive mechanism to rotate the swing arm
  • FIG. 9B is a side view illustrating the operation of the counterweight drive mechanism to rotate the swing arm
  • FIG. 9C is a plan view illustrating the state where the swing arm of the robot according to the embodiment of the present disclosure rotates when viewed from B in FIG. 2 ;
  • FIG. 10 is a side view illustrating the robot's attitude in which the counterweight is located to the front when viewed from A in FIG. 2 ;
  • FIG. 11 is a side view illustrating the robot's attitude in which the counterweight is located to the rear when viewed from A in FIG. 2 ;
  • FIG. 12 is a front view illustrating the robot's attitude in which the counterweight is located to the right when viewed from C in FIG. 2 ;
  • FIG. 13 is a front view illustrating the robot's attitude in which the counterweight is located to the left when viewed from C in FIG. 2 ;
  • FIG. 14 is a view illustrating an example of overall configuration of a robot system using the robot according to the embodiment of the present disclosure
  • FIG. 15 is a block diagram illustrating the robot according to the embodiment of the present disclosure.
  • FIG. 16 is a flow chart illustrating an example of a main routine of the robot according to the embodiment of the present disclosure
  • FIG. 17 is a flow chart illustrating details of travelling state determination processing (S 103 in FIG. 16 );
  • FIG. 18 is a flow chart illustrating details of moving state determination processing (S 201 in FIG. 17 );
  • FIG. 19 is a flow chart illustrating details of attitude determination processing (S 203 in FIG. 17 );
  • FIG. 20 is a view illustrating an attitude angle of the robot
  • FIG. 21 is a graph illustrating the attitude determination processing
  • FIG. 22 is a flow chart illustrating details of frictional surface travelling determination processing (S 205 in FIG. 17 );
  • FIG. 23A is a schematic view illustrating the state of the robot during “normal travelling” as the travelling state
  • FIG. 23B is a schematic view illustrating the state of the robot during “frictional surface travelling” as the travelling state
  • FIG. 23C is a schematic view illustrating the state of the robot during “uphill travelling” as the travelling state
  • FIG. 24A is a graph illustrating a shift of acceleration Az in the vertical direction, which is exerted on the robot according to the travelling state;
  • FIG. 24B is a graph illustrating a shift of acceleration Az′ exerted on the robot according to the travelling state
  • FIG. 25 is a flow chart illustrating details of idling control processing (S 105 in FIG. 16 );
  • FIG. 26A is a view illustrating the idling control processing
  • FIG. 26B is a view illustrating the idling control processing
  • FIG. 26C is a view illustrating the idling control processing
  • FIG. 26D is a view illustrating the idling control processing
  • FIG. 26E is a view illustrating the idling control processing
  • FIG. 27 is a flow chart illustrating details of attitude direction control processing (S 106 in FIG. 16 ).
  • International Publication No. WO2000/032360 discloses a multi-legged walking robot with four legs, which includes an acceleration sensor and an angular velocity sensor.
  • In International Publication No. WO2000/032360, using two threshold values ( 61 , 62 ), variances of the outputs detected by the acceleration sensor and the angular velocity sensor are classified into three categories to determine whether the robot is acting on the ground, is lifted up, or is set down (for example, page 9, lines 5 to 14).
  • the Inventor examined a robot having a spherical housing and a set of drive wheels provided in contact with the inner circumferential face of the housing and configured to rotate the housing.
  • a frame is provided inside the robot, and a display unit that displays at least a portion of the face of the robot is provided to the frame.
  • the robot has no hands or legs because they may obstruct rotation.
  • the Inventor found that the position of the face of the travelling robot, that is, the attitude of the robot, changed depending on the material of the floor surface on which the robot travels. For example, when the robot travels on wooden flooring having a low friction coefficient, the robot's face is oriented forward. Meanwhile, when the robot travels on a carpet having a high friction coefficient, the robot's face is oriented upward. Hence, the Inventor found that, even though the robot was moved by the same travel processing, the position of the robot's face, that is, the attitude of the robot, varied depending on the material of the floor surface rather than on internal processing of the robot.
  • a robot includes a spherical housing, a frame disposed in the housing, a display unit that is provided on the frame, a set of drive wheels that are provided on the frame, a weight drive mechanism that is provided on the frame, an angular velocity sensor, and a control circuit.
  • the display unit displays at least a portion of a face of the robot.
  • the set of drive wheels rotate and move the housing while being in contact with an inner circumferential face of the housing.
  • the weight drive mechanism reciprocates a weight in a predetermined direction.
  • the angular velocity sensor detects angular velocity about a crosswise direction that is perpendicular to a travelling direction of the housing.
  • if the control circuit determines, while the housing is being rotated and moved, that a rotational angle of the housing when viewed from the front in the travelling direction changes upward beyond a predetermined angle, based on a change in the angular velocity about the crosswise direction, the control circuit moves the weight frontward in the travelling direction of the housing by a distance corresponding to the rotational angle.
  • the weight is moved forward in the travelling direction of the housing by a distance corresponding to the rotational angle.
  • the position of the robot's face, that is, the attitude of the robot, can thus be prevented from changing unnaturally due to the material of the floor surface rather than the internal processing of the robot, even under the same travelling processing.
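The correction rule described above can be sketched in Python. This is a minimal illustration only: the threshold, gain, and travel limit are assumed values, not taken from the patent, and the function names are hypothetical.

```python
# Sketch of the claimed attitude correction: while the housing is driven,
# integrate the pitch angular velocity; once the pitch angle exceeds a
# predetermined angle, move the counterweight frontward by a distance
# corresponding to that angle. All constants below are assumptions.

PITCH_THRESHOLD_DEG = 10.0  # assumed "predetermined angle"
GAIN_MM_PER_DEG = 1.5       # assumed mapping from excess angle to weight travel
MAX_TRAVEL_MM = 30.0        # assumed mechanical limit of the swing arm

def update_pitch(pitch_deg, pitch_rate_deg_s, dt):
    """Integrate the angular velocity about the crosswise (pitch) axis."""
    return pitch_deg + pitch_rate_deg_s * dt

def weight_offset(pitch_deg):
    """Distance to move the counterweight frontward; 0 below the threshold."""
    if pitch_deg <= PITCH_THRESHOLD_DEG:
        return 0.0
    travel = (pitch_deg - PITCH_THRESHOLD_DEG) * GAIN_MM_PER_DEG
    return min(travel, MAX_TRAVEL_MM)
```

Shifting the weight forward moves the center of gravity toward the travelling direction, which counteracts the upward rotation of the face.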
  • FIG. 1 is a perspective view illustrating the external appearance of a robot 1 according to an embodiment of the present disclosure.
  • the robot 1 includes a spherical housing 101 .
  • the housing 101 is formed of a transparent or translucent member, for example.
  • FIG. 2 is a perspective view illustrating the inside of the robot 1 according to the embodiment of the present disclosure.
  • a frame 102 is disposed in the housing 101 .
  • the frame 102 has a first rotating plate 103 and a second rotating plate 104 .
  • the first rotating plate 103 is located above the second rotating plate 104 .
  • the first rotating plate 103 and the second rotating plate 104 correspond to an example of a base.
  • a first display unit 105 and a second display unit 106 are provided on the upper face of the first rotating plate 103 .
  • a third display unit 107 is provided on the upper face of the second rotating plate 104 .
  • the first display unit 105 , the second display unit 106 , and the third display unit 107 each are configured of a plurality of light emitting diodes.
  • the first display unit 105 , the second display unit 106 , and the third display unit 107 can display information of facial expressions of the robot 1 .
  • the first display unit 105 , the second display unit 106 , and the third display unit 107 individually control lighting of the plurality of light emitting diodes to display a portion of the face of the robot 1 such as an eye and a mouth as illustrated in FIG. 1 .
  • the first display unit 105 displays an image of the left eye
  • the second display unit 106 displays an image of the right eye
  • the third display unit 107 displays an image of the mouth.
  • the images of the left eye, the right eye, and the mouth penetrate the housing 101 , which is made of a transparent or translucent member, and are emitted to the outside.
  • a camera 108 is provided on the upper face of the first rotating plate 103 .
  • the camera 108 acquires an image of environment around the robot 1 .
  • the camera 108 constitutes a portion of the face of the robot 1 , such as a nose.
  • an optical axis of the camera 108 is oriented to the front of the robot 1 . Therefore, the camera 108 can take an image of an object to be recognized presented to the front of the robot.
  • a control circuit 109 is provided on the upper face of the first rotating plate 103 .
  • the control circuit 109 controls various operations of the robot 1 . Details of the control circuit 109 will be described later with reference to FIG. 15 .
  • a first drive wheel 110 and a second drive wheel 111 each are provided on the lower face of the second rotating plate 104 , and are in contact with the inner circumferential face of the housing 101 .
  • the first drive wheel 110 has a first motor 112 that drives the first drive wheel 110 .
  • the second drive wheel 111 has a second motor 113 that drives the second drive wheel 111 . That is, the first drive wheel 110 and the second drive wheel 111 are driven by the respective independent motors. Details of the operation of the robot 1 driven by the first drive wheel 110 and the second drive wheel 111 will be described later.
  • the first drive wheel 110 and the second drive wheel 111 constitute a pair of drive wheels.
  • FIG. 3 is a side view illustrating the inside of the robot 1 according to the embodiment of the present disclosure when viewed from A in FIG. 2 .
  • a counterweight 114 (an example of a weight) is provided between the first rotating plate 103 and the second rotating plate 104 .
  • the counterweight 114 is located somewhat below the center of the housing 101 . Accordingly, the center of gravity of the robot 1 is located below the center of the housing 101 . This can stabilize the operation of the robot 1 . Viewing from A means that the robot 1 is viewed from right toward left.
  • To drive the counterweight 114 , the robot 1 includes a guide shaft 115 that specifies the moving direction of the counterweight 114 , a swing arm 116 that specifies the position of the counterweight 114 in the rotating direction, a rotational motor 117 that rotates the swing arm 116 , a rotating shaft 118 that connects the swing arm 116 to the rotational motor 117 , a belt 119 used to drive the counterweight 114 ( FIGS. 8A and 8B ), a motor pulley 120 that is in contact with the belt 119 ( FIGS. 8A and 8B ), and a weight drive motor, not illustrated, that rotates the motor pulley 120 .
  • the weight drive motor is built into the counterweight 114 . Details of the operation of the robot 1 driven by the counterweight 114 will be described later.
  • the rotating shaft 118 extends perpendicular to a drive axis of the first drive wheel 110 and the second drive wheel 111 .
  • the rotating shaft 118 corresponds to an example of a shaft provided on the frame 102 .
  • the first drive wheel 110 and the second drive wheel 111 are spaced gradually farther apart from each other toward the ground.
  • the drive axis of the first drive wheel 110 and the second drive wheel 111 is, for example, a virtual axis connecting the centers of the first drive wheel 110 and the second drive wheel 111 to each other.
  • when the first drive wheel 110 and the second drive wheel 111 share an actual axle, that actual axle becomes the drive axis of the first drive wheel 110 and the second drive wheel 111 .
  • the robot 1 further includes a power source not illustrated and a microphone 217 ( FIG. 15 ).
  • the robot 1 is charged by a charger not illustrated.
  • the microphone 217 acquires sound of environment around the robot 1 .
  • FIG. 4 is a side view illustrating linear movement of the robot 1 according to the embodiment of the present disclosure when viewed from A in FIG. 2 .
  • FIG. 5 is a plan view illustrating the rotation of the robot 1 according to the embodiment of the present disclosure when viewed from B in FIG. 2 .
  • FIG. 6 is a perspective view illustrating the rotation of the robot 1 according to the embodiment of the present disclosure. Looking from B means that the robot is viewed from above.
  • when the first drive wheel 110 and the second drive wheel 111 are rotated in opposite directions, the resulting motive power causes the housing 101 to rotate about a vertical axis passing through its center. That is, the robot 1 rotates clockwise or counterclockwise on the spot. In this manner, the robot 1 moves forward, moves rearward, or rotates.
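Because the two drive wheels have independent motors, the motion of the housing follows the usual differential-drive behavior. A small sketch (the function and rate conventions are illustrative assumptions, not from the patent):

```python
# Hypothetical classifier of the housing motion from the two wheel rates:
# equal rates translate the robot straight, opposite rates spin it in place
# about the vertical axis, and unequal same-sign rates curve it.
def housing_motion(left_rate, right_rate):
    forward = (left_rate + right_rate) / 2.0  # mean rate drives translation
    turn = right_rate - left_rate             # rate difference drives rotation
    if forward == 0.0 and turn == 0.0:
        return "suspended"
    if turn == 0.0:
        return "move forward" if forward > 0 else "move rearward"
    if forward == 0.0:
        return "rotate in place"
    return "curve"
```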
  • FIG. 7 is a view illustrating the weight drive mechanism in the side view of FIG. 3 .
  • FIG. 8A is a perspective view illustrating the operation of the drive mechanism for the counterweight 114 to drive the counterweight 114 in a predetermined linear direction.
  • FIG. 8B is a side view illustrating the operation of the drive mechanism for the counterweight 114 to drive the counterweight 114 in a predetermined linear direction.
  • FIG. 8C is a side view illustrating the state where the counterweight 114 reciprocates in a predetermined linear direction in the side view of FIG. 3 .
  • FIG. 9A is a perspective view illustrating the operation of the drive mechanism for the counterweight 114 to rotate the swing arm 116 .
  • FIG. 9B is a side view illustrating the operation of the weight drive mechanism to rotate the swing arm 116 .
  • FIG. 9C is a plan view illustrating the state where the swing arm 116 of the robot 1 according to the embodiment of the present disclosure rotates when viewed from B in FIG. 2 .
  • the center of the swing arm 116 is a default position of the counterweight 114 .
  • the first rotating plate 103 and the second rotating plate 104 become substantially parallel to the floor surface, and the parts forming the face of the robot 1 , for example, the eyes, nose, and mouth, are oriented in a default direction.
  • the weight drive motor not illustrated built in the counterweight 114 rotates the motor pulley 120 coupled to the weight drive motor.
  • the rotated motor pulley 120 rolls on the belt 119 , such that the counterweight 114 moves in the swing arm 116 .
  • the counterweight 114 reciprocates in the linear direction in the swing arm 116 by changing the rotating direction of the motor pulley 120 , that is, the driving direction of the weight drive motor.
  • the counterweight 114 reciprocates in the swing arm 116 along the guide shaft 115 in the linear direction.
  • the rotational motor 117 rotates the rotating shaft 118 to rotate the swing arm 116 connected to the rotating shaft 118 ( FIG. 3 ).
  • the swing arm 116 can be rotated clockwise and counterclockwise.
  • FIG. 10 is a side view illustrating the attitude of the robot 1 in which the counterweight 114 is located to the front when viewed from A in FIG. 2 .
  • FIG. 11 is a side view illustrating the attitude of the robot 1 in which the counterweight 114 is located to the rear when viewed from A in FIG. 2 .
  • FIG. 12 is a front view illustrating the attitude of the robot 1 in which the counterweight 114 is located to the right when viewed from C in FIG. 2 .
  • FIG. 13 is a front view illustrating the attitude of the robot 1 in which the counterweight 114 is located to the left when viewed from C in FIG. 2 . Looking from C means that the robot 1 is viewed from the front.
  • In the state where the swing arm 116 is perpendicular to the front of the robot 1 , when the counterweight 114 reciprocates from one end to the other end of the swing arm 116 , the robot 1 alternately tilts forward and rearward as represented by the arrow 121 and the arrow 122 , respectively. That is, the robot 1 tilts by a predetermined angle in the vertical direction.
  • the first display unit 105 , the second display unit 106 , and the third display unit 107 express a portion of the face of the robot 1 , such as eyes and a mouth.
  • the robot 1 can be alternately tilted forward and rearward using the counterweight 114 , as if the robot 1 is short of breath or sleepy.
  • the robot 1 can notify the user that remaining power of the power source is small, without displaying information on the remaining power, which is unrelated to the face, on the first display unit 105 , the second display unit 106 , and the third display unit 107 .
  • In the state where the swing arm 116 is parallel to the front of the robot 1 , when the counterweight 114 reciprocates from one end to the other end of the swing arm 116 , the robot 1 alternately tilts right and left as represented by the arrow 123 and the arrow 124 , respectively. That is, the robot 1 swings side to side by a predetermined angle.
  • the first display unit 105 , the second display unit 106 , and the third display unit 107 express a portion of the face of the robot 1 , such as eyes and a mouth.
  • the robot 1 can be alternately tilted rightward and leftward using the counterweight 114 , as if the robot 1 feels good or is thinking deeply.
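The relation between the swing-arm orientation and the resulting tilt can be tabulated in a small sketch; the string labels are illustrative, not from the patent:

```python
# Hypothetical lookup of the tilt produced by moving the counterweight to one
# end of the swing arm: a perpendicular arm pitches the robot fore/aft
# (arrows 121/122), a parallel arm tilts it right/left (arrows 123/124).
TILT = {
    ("perpendicular", "front"): "tilt forward",   # arrow 121
    ("perpendicular", "rear"): "tilt rearward",   # arrow 122
    ("parallel", "right"): "tilt right",          # arrow 123
    ("parallel", "left"): "tilt left",            # arrow 124
}

def tilt(arm_orientation, weight_end):
    """Return the tilt for a given arm orientation and counterweight end."""
    return TILT[(arm_orientation, weight_end)]
```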
  • FIG. 14 is a view illustrating an example of overall configuration of a robot system 1500 using the robot 1 according to the embodiment of the present disclosure.
  • the robot system 1500 includes a cloud server 3 , a portable terminal 4 , and the robot 1 .
  • the robot 1 is connected to the Internet via Wifi (registered trademark), and thereby to the cloud server 3 .
  • the robot 1 is also connected to the portable terminal 4 via Wifi (registered trademark), for example.
  • a user 1501 is a child
  • users 1502 , 1503 are parents of the child.
  • an application cooperating with the robot 1 is installed on the portable terminal 4 .
  • the portable terminal 4 can issue various instructions to the robot 1 using the application, and display the image recognition result described with reference to FIG. 14 .
  • When receiving a request from the portable terminal 4 to read a picture book to the child, the robot 1 reads the picture book aloud to the child. When accepting a question during the reading, the robot 1 transmits the question to the cloud server 3 , receives an answer to the question from the cloud server 3 , and gives the answer.
  • the user 1501 can treat the robot 1 like a pet, and learn language through communication with the robot 1 .
  • FIG. 15 is a block diagram illustrating the robot 1 according to the embodiment of the present disclosure.
  • the robot 1 includes the control circuit 109 , a display unit 211 , a shaft control unit 213 , the rotating shaft 118 , a housing drive wheel control unit 214 , a housing drive wheel 212 , a weight drive mechanism control unit 215 , a weight drive mechanism 218 , an attitude detection unit 219 , the microphone 217 , a speaker 216 , the camera 108 , and a communication unit 210 .
  • the control circuit 109 is configured of a computer including a memory 206 , a main control unit 200 configured of a processor such as a CPU, a display information output control unit 205 , and a timer not illustrated that checks the time.
  • the memory 206 is configured of, for example, a nonvolatile rewritable storage device that stores a program for controlling the robot 1 and so on.
  • the main control unit 200 executes the control program for controlling the robot 1 , which is stored in the memory 206 . Thereby, the main control unit 200 functions as a travelling state determination unit 201 , an avoidance action control unit 202 , and an attitude control unit 203 .
  • the attitude detection unit 219 includes an acceleration sensor 221 and an angular velocity sensor 222 .
  • the acceleration sensor 221 is configured of a three-axis acceleration sensor attached to the first rotating plate 103 . As illustrated in FIG. 2 , the acceleration sensor 221 detects an acceleration (an example of a first acceleration) in a vertical direction (Z direction), an acceleration in a crosswise direction (X direction), and an acceleration (an example of second acceleration) in a front-rear direction (Y direction).
  • the vertical direction is orthogonal to the principal plane of the first rotating plate 103 .
  • the crosswise direction is a right-left direction when the robot 1 is viewed from the front.
  • the front-rear direction is orthogonal to the vertical direction and the crosswise direction. Accordingly, the front-rear direction is parallel to the principal plane of the first rotating plate 103 .
  • the acceleration sensor 221 outputs the detected acceleration in the three directions to the main control unit 200 .
  • the acceleration sensor 221 and the angular velocity sensor 222 may be attached to the lower face of the first rotating plate 103 , or the upper or lower face of the second rotating plate 104 , rather than the upper face of the first rotating plate 103 .
  • the angular velocity sensor 222 detects the angular velocity of the robot 1 about the crosswise direction, that is, the angular velocity of the robot 1 in a pitch direction. Further, the angular velocity sensor 222 detects the angular velocity of the robot 1 about the vertical direction, that is, the angular velocity of the robot 1 in a yaw direction. Further, the angular velocity sensor 222 detects the angular velocity of the robot 1 about the front-rear direction, that is, the angular velocity of the robot 1 in a roll direction.
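The three sensed rotations above follow the axis conventions given earlier (X crosswise, Y front-rear, Z vertical). As a small illustrative aid (the labels are assumptions, not identifiers from the patent):

```python
# Mapping of the robot's body axes to the rotation each angular velocity
# channel represents, per the description: pitch about the crosswise axis,
# yaw about the vertical axis, roll about the front-rear axis.
AXIS_TO_ROTATION = {
    "X (crosswise)": "pitch",
    "Z (vertical)": "yaw",
    "Y (front-rear)": "roll",
}

def rotation_name(axis):
    """Return the rotation name for a given body axis label."""
    return AXIS_TO_ROTATION[axis]
```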
  • the microphone 217 is provided on the frame 102 , converts sound into an electric signal, and outputs the electric signal to the main control unit 200 .
  • the microphone 217 may be attached to the upper face of the first rotating plate 103 , or may be attached to the upper face of the second rotating plate 104 .
  • the main control unit 200 recognizes whether or not the user's voice is present in the sound acquired by the microphone 217 , and stores voice recognition results in the memory 206 to manage the voice recognition results.
  • the main control unit 200 compares voice recognition data stored in the memory 206 with the acquired sound, and recognizes speech contents and the user who spoke.
  • the speaker 216 is provided on the frame 102 such that an output face is oriented to the front, and converts the electric signal of sound into physical vibrations.
  • the main control unit 200 outputs a predetermined voice via the speaker 216 to enable the robot 1 to speak.
  • the camera 108 takes an image in front of the robot 1 (Y direction), and outputs the image (hereinafter referred to as taken image) to the main control unit 200 .
  • the main control unit 200 recognizes presence/absence, position, and size of the user's face from the taken image acquired by the camera 108 , and stores face recognition results in the memory 206 to manage the face recognition results.
  • the main control unit 200 generates a command based on the voice recognition result and the face recognition result, and outputs the command to the display information output control unit 205 , the shaft control unit 213 , the housing drive wheel control unit 214 , the weight drive mechanism control unit 215 , and the communication unit 210 .
  • the display information output control unit 205 displays information on facial expression of the robot 1 on the display unit 211 .
  • the display unit 211 is configured of the first display unit 105 , the second display unit 106 , and the third display unit 107 , which are described with reference to FIG. 2 .
  • the shaft control unit 213 rotates the rotating shaft 118 described with reference to FIGS. 9A and 9B .
  • the shaft control unit 213 is configured of the rotational motor 117 described with reference to FIGS. 9A and 9B .
  • the housing drive wheel control unit 214 operates the housing drive wheel 212 of the robot 1 .
  • the housing drive wheel control unit 214 is configured of the first motor 112 and the second motor 113 , which are described with reference to FIG. 2 .
  • the housing drive wheel 212 is configured of the first drive wheel 110 and the second drive wheel 111 , which are described with reference to FIG. 2 .
  • the housing drive wheel 212 corresponds to an example of a set of drive wheels.
  • the weight drive mechanism control unit 215 operates the weight drive mechanism 218 of the robot 1 .
  • the weight drive mechanism control unit 215 is configured of a weight drive motor not illustrated built in the counterweight 114 .
  • the weight drive mechanism 218 is configured of the guide shaft 115 , the swing arm 116 , the rotational motor 117 , the belt 119 , the motor pulley 120 , and the weight drive motor not illustrated, which are described with reference to FIGS. 3, 8A, and 8B .
  • the communication unit 210 is configured of a communication device capable of connecting the robot 1 to the cloud server 3 ( FIG. 14 ).
  • Examples of the communication unit 210 include, but are not limited to, a wireless LAN communication device such as Wi-Fi (registered trademark). According to a command from the main control unit 200 , the communication unit 210 communicates with the cloud server 3 .
  • FIG. 16 is a flow chart illustrating an example of a main routine of the robot 1 according to the embodiment of the present disclosure.
  • the flow chart in FIG. 16 is periodically performed at sampling interval ⁇ t.
  • the main control unit 200 checks whether or not the first motor 112 and the second motor 113 rotate (S 101 ).
  • the main control unit 200 differentiates rotational angles of the first motor 112 and the second motor 113 , which are detected by respective encoders of the first motor 112 and the second motor 113 to find rotational rates of the first motor 112 and the second motor 113 .
  • the main control unit 200 may determine that the robot 1 is “not rotating”, that is, “suspended” when both of the found rotational rates of the first motor 112 and the second motor 113 are substantially 0, and determine that the robot 1 is “rotating”, when at least one of the rotational rates of the first motor 112 and the second motor 113 is not substantially 0.
  • if it is determined that the robot 1 is “rotating” in S 101 (YES in S 102 ), the main control unit 200 advances the processing to S 103 . Meanwhile, if it is determined that the robot 1 is “not rotating” in S 101 (NO in S 102 ), the main control unit 200 finishes the processing.
  • the travelling state determination unit 201 executes travelling state determination processing. Details of the travelling state determination processing will be described later with reference to FIG. 17 .
  • the processing branches depending on the result of the travelling state determination processing (S 103 ). That is, if the result of the travelling state determination processing indicates “idling” (“idling” in S 104 ), the avoidance action control unit 202 executes idling control processing (S 105 ), and finishes the processing. Details of the idling control processing will be described later with reference to FIG. 25 . If the result of the travelling state determination processing indicates “uphill travelling” (“uphill travelling” in S 104 ), the main control unit 200 finishes the processing.
  • if the result of the travelling state determination processing indicates “frictional surface travelling” (“frictional surface travelling” in S 104 ), the attitude control unit 203 executes attitude control processing (S 106 ), and finishes the processing. Details of the attitude control processing will be described later with reference to FIG. 27 .
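As an illustrative aid, the branching of the main routine (S 101 to S 106 ) may be sketched as follows; this sketch is not part of the disclosure, and the function name `dispatch` and the returned action strings are assumptions introduced here.

```python
def dispatch(motor1_rate, motor2_rate, travelling_state):
    """Return the action taken by the main routine (FIG. 16, S101-S106)."""
    # S101-S102: if neither motor is rotating, finish the processing
    if abs(motor1_rate) < 1e-9 and abs(motor2_rate) < 1e-9:
        return "finish"
    # S104: branch on the result of the travelling state determination (S103)
    if travelling_state == "idling":
        return "idling control"        # S105 (FIG. 25)
    if travelling_state == "frictional surface travelling":
        return "attitude control"      # S106 (FIG. 27)
    # "uphill travelling" and "normal travelling": finish the processing
    return "finish"
```

The routine would be invoked once per sampling interval Δt, as stated for the flow chart of FIG. 16.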
  • the travelling state refers to the travelling state of the robot 1 while the first motor 112 and the second motor 113 are rotating, and includes “idling”, “uphill travelling”, “frictional surface travelling”, and “normal travelling”.
  • the “frictional surface travelling” refers to the state where the robot 1 is travelling on a floor surface whose friction coefficient is higher than the typical friction coefficient by a certain value or more (for example, a carpet).
  • the robot 1 is designed such that the Y direction becomes parallel to the travelling direction in FIG. 2 when the robot 1 is travelling at a predetermined target rate on a wood flooring floor having the typical friction coefficient.
  • the position of the first to third display units 105 to 107 at this time is the reference position of the face of the robot 1 .
  • during frictional surface travelling, the angle formed between the Y direction and the travelling direction increases due to friction, turning the face of the robot 1 upward from the reference position.
  • in that case, the face orientation is returned to the reference position.
  • the “normal travelling” refers to the state where the robot 1 is travelling on a flat floor surface having the typical friction coefficient.
  • the “uphill travelling” refers to the state where the robot 1 is going uphill.
  • the “idling” refers to the state where the first motor 112 and the second motor 113 are rotating, but the robot 1 is static.
  • FIG. 17 is a flow chart illustrating details of the travelling state determination processing (S 103 in FIG. 16 ).
  • the travelling state determination unit 201 executes moving state determination processing (S 201 ). Details of the moving state determination processing will be described later with reference to FIG. 18 .
  • the travelling state determination unit 201 executes attitude change determination processing (S 203 ). Details of the attitude change determination processing will be described later with reference to FIG. 19 . Meanwhile, if the result of the moving state determination processing does not indicate “moving state” (NO in S 202 ), the travelling state determination unit 201 determines the travelling state of the robot 1 as “idling” (S 210 ), and the processing returns to S 104 in FIG. 16 .
  • the travelling state determination unit 201 executes frictional surface travelling determination processing (S 205 ). Details of the frictional surface travelling determination processing will be described later with reference to FIG. 22 . Meanwhile, if the result of the attitude change determination processing indicates “no attitude change” (NO in S 204 ), the travelling state determination unit 201 determines the travelling state of the robot 1 as “normal travelling” (S 209 ), and the processing returns to S 104 in FIG. 16 .
  • the travelling state determination unit 201 determines the travelling state as “uphill travelling” (S 207 ), and the processing returns to S 104 in FIG. 16 .
  • the travelling state determination unit 201 determines the travelling state of the robot 1 as “frictional surface travelling” (S 208 ), and the processing returns to S 104 in FIG. 16 .
  • FIG. 18 is a flow chart illustrating details of moving state determination processing (S 201 in FIG. 17 ).
  • the travelling state determination unit 201 acquires acceleration A from the acceleration sensor 221 (S 301 ).
  • the travelling state determination unit 201 integrates the acceleration Ay in the Y direction among the acceleration A acquired in S 301 to calculate the current rate Vy of the robot 1 in the Y direction (S 302 ).
  • if the current rate Vy is not 0 (YES in S 303 ), the travelling state determination unit 201 determines that the robot 1 is in the “moving state” (S 304 ).
  • the “moving state” refers to the state where the first motor 112 and the second motor 113 do not idle and the robot 1 is actually travelling. Specifically, the “moving state” includes the above-mentioned “uphill travelling”, “frictional surface travelling”, and “normal travelling”. Meanwhile, if the current rate Vy of the robot 1 in the Y direction is 0 (NO in S 303 ), the travelling state determination unit 201 returns the processing to S 202 in FIG. 17 . In the case of NO in S 303 , NO is selected in S 202 in FIG. 17 , and the travelling state of the robot 1 is determined as “idling” (S 210 ).
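The moving state determination (S 301 to S 304 ) may be sketched as follows, under the assumption of simple rectangular numerical integration of the sampled acceleration Ay; the function name, the return strings, and the threshold `eps` are hypothetical.

```python
def moving_state(ay_samples, dt, eps=1e-3):
    """Estimate the rate Vy by integrating the Y-direction acceleration Ay
    over the sampling interval dt (S301-S302), then decide whether the robot
    is actually travelling (S303-S304)."""
    vy = sum(a * dt for a in ay_samples)  # rectangular integration of Ay
    # "not moving" leads, via NO in S303 and S202, to the "idling" result (S210)
    return "moving state" if abs(vy) > eps else "not moving"
```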
  • FIG. 19 is a flow chart illustrating details of attitude determination processing (S 203 in FIG. 17 ).
  • the travelling state determination unit 201 acquires the acceleration A from the acceleration sensor 221 , and angular velocity ⁇ from the angular velocity sensor 222 (S 401 ).
  • the travelling state determination unit 201 calculates an amount of change ⁇ of attitude angle ⁇ that is the angle of the robot 1 in the pitch direction from angular velocity ⁇ p in the pitch direction among the angular velocity ⁇ acquired in S 401 (S 402 ).
  • FIG. 20 is a view illustrating the attitude angle ⁇ of the robot 1 .
  • FIG. 20 illustrates the state having the attitude angle ⁇ of 0.
  • the attitude angle ⁇ refers to the angle that forms the Y direction with a reference direction D 1 .
  • the reference direction D 1 is a direction acquired by projecting the travelling direction of the robot 1 onto a horizontal surface E 1 .
  • the travelling state determination unit 201 calculates the current attitude angle ⁇ (S 403 ).
  • values of the acceleration Az′ calculated in S 404 for at least a certain period are stored in the memory to be used in the frictional surface travelling determination processing described below ( FIG. 22 ).
  • the acceleration Az′ is an example of a second value.
  • the symbol “ ⁇ ” added to g ⁇ cos ⁇ means that upward is represented by plus, and downward is represented by minus.
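The calculation of the second value Az′ in S 404 , that is, excluding the gravitational component −g · cos θ from the detected acceleration Az, may be sketched as follows; the function name is hypothetical, and upward is represented by plus and downward by minus as in the description above.

```python
import math

G = 9.8  # gravitational acceleration (m/s^2)

def second_value(az, theta_deg):
    """Exclude the gravitational component -G*cos(theta) from the detected
    vertical acceleration Az to obtain Az' (S404)."""
    return az - (-G * math.cos(math.radians(theta_deg)))
```

During normal travelling (θ = 0, Az = −g), Az′ is substantially 0, consistent with the waveforms at time T 1 in FIG. 24B.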
  • FIG. 21 is a graph illustrating the attitude determination processing; the vertical axis represents the angular velocity ωp (degree/sec) in the pitch direction, and the horizontal axis represents time. In FIG. 21 , dotted lines drawn in parallel to the vertical axis represent sampling points.
  • a waveform W 1 indicates a shift of the angular velocity ωp with time. Since the area between the waveform W 1 and the time axis represents an integrated value of the angular velocity ωp, the area corresponds to the attitude angle θ.
  • the lower limit angle ⁇ L is the attitude angle ⁇ that satisfies a condition for starting timekeeping of determination time TD.
  • the travelling state determination unit 201 increments a count for keeping the determination time TD (S 406 ). Since the flow chart of FIG. 19 is performed every sampling interval Δt, the count is incremented every sampling interval Δt.
  • in the attitude determination processing, when the attitude angle θ exceeds the lower limit angle θL, keeping of the determination time TD is started. This is because, during frictional surface travelling and uphill travelling of the robot 1 , the attitude angle θ is assumed to remain at the lower limit angle θL or more. Therefore, the lower limit angle θL adopts the minimum value of the attitude angle θ assumed during frictional surface travelling or uphill travelling of the robot 1 .
  • the travelling state determination unit 201 advances the processing to S 411 .
  • the travelling state determination unit 201 determines the result of the attitude determination processing as “attitude change” (S 408 ), and finishes keeping of the determination time TD (S 409 ). In this case, the travelling state determination unit 201 may reset the count of the determination time TD to 0.
  • the robot 1 performs frictional surface travelling and uphill travelling while keeping a certain level of attitude angle ⁇ .
  • the travelling state determination unit 201 determines that the attitude of the robot 1 has changed. This can prevent the travelling state determination unit 201 from wrongly determining that the robot 1 is conducting frictional surface travelling or uphill travelling due to a temporary change in the attitude angle θ caused, for example, when the robot 1 runs onto garbage on the wood flooring floor.
  • the travelling state determination unit 201 sets an attitude control angle ⁇ C to the current attitude angle ⁇ (S 410 ), and returns the processing to S 204 in FIG. 17 .
  • the attitude control angle θC becomes the attitude angle θ of the robot 1 at the end point EP of the determination time TD ( FIG. 21 ). That is, the attitude control angle θC becomes the lower limit angle θL plus the amount of change Δθ_TD of the attitude angle θ over the determination time TD. Accordingly, even when the attitude angle θ continues to increase after the end point EP, the attitude control angle θC remains the attitude angle θ at the end point EP.
  • the travelling state determination unit 201 determines the result of the attitude determination processing as “no attitude change”, and returns the processing to S 204 in FIG. 17 .
  • the attitude angle θ may repeatedly fluctuate up and down around the lower limit angle θL.
  • in that case, if the count were left as it is, the travelling state determination unit 201 might wrongly determine “attitude change” due to the accumulated value of the count.
  • for this reason, the processing in S 411 , S 412 is provided. This can prevent the value in the count from being accumulated when the attitude angle θ repeatedly fluctuates up and down around the lower limit angle θL.
  • as a result, the travelling state determination unit 201 can be prevented from wrongly determining “attitude change”.
  • the attitude determination processing will be summarized with reference to FIG. 21 .
  • the travelling state determination unit 201 acquires the angular velocity ωp at the sampling interval Δt, and adds up the acquired angular velocity ωp to monitor the current attitude angle θ. Then, when the attitude angle θ reaches the lower limit angle θL, the travelling state determination unit 201 determines that the start point SP, at which keeping of the determination time TD is started, has arrived, and starts to keep the determination time TD. Then, if the attitude angle θ becomes less than the lower limit angle θL within the determination time TD, the travelling state determination unit 201 selects NO in S 405 in FIG. 19 and determines the result as “no attitude change” (S 411 ). Meanwhile, if the attitude angle θ remains at the lower limit angle θL or more until the end point EP of the determination time TD, the travelling state determination unit 201 determines the result as “attitude change” (S 408 in FIG. 19 ).
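The determination summarized above may be sketched as follows; the numeric values of the lower limit angle θL and the determination time TD are assumptions for illustration only, and the count reset mirrors the processing in S 411 , S 412 .

```python
def attitude_change(omega_p_samples, dt, theta_l=5.0, td=1.0):
    """Accumulate the pitch angular velocity omega_p into the attitude
    angle theta and keep the determination time TD while theta stays at
    or above the lower limit angle theta_l (FIG. 19 / FIG. 21)."""
    theta, held = 0.0, 0
    need = round(td / dt)                # samples spanning TD
    for omega_p in omega_p_samples:
        theta += omega_p * dt            # S402-S403: integrate omega_p
        if theta >= theta_l:
            held += 1                    # S406: increment the TD count
            if held >= need:             # S407: TD has elapsed
                return "attitude change", theta   # S408: theta_C = theta at EP
        else:
            held = 0                     # S411-S412: reset on a dip below theta_l
    return "no attitude change", theta
```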
  • FIG. 22 is a flow chart illustrating details of frictional surface travelling determination processing (S 205 in FIG. 17 ).
  • the travelling state determination unit 201 determines the result as “frictional surface travelling” (S 503 ).
  • when the processing in FIG. 22 is finished, the processing returns to S 206 in FIG. 17 .
  • FIG. 23A is a schematic view illustrating the state of the robot 1 during “normal travelling”.
  • FIG. 23B is a schematic view illustrating the state of the robot 1 during “frictional surface travelling”.
  • FIG. 23C is a schematic view illustrating the state of the robot 1 during “uphill travelling”.
  • FIG. 24A is a graph illustrating a shift of the acceleration Az exerted on the robot 1 in the vertical direction with time according to the travelling state.
  • FIG. 24B is a graph illustrating a shift of the acceleration Az′ exerted on the robot 1 with time according to the travelling state.
  • in FIG. 24A , the vertical axis represents the acceleration Az, and the horizontal axis represents time.
  • in FIG. 24B , the vertical axis represents the acceleration Az′, and the horizontal axis represents time.
  • waveforms W 211 , W 221 represent accelerations Az, Az′, respectively, exerted when the travelling state is switched from “normal travelling: time T 1 ” to “frictional surface travelling: time T 2 ”, and waveforms W 212 , W 222 represent accelerations Az, Az′, respectively, exerted when the travelling state is switched from “normal travelling: time T 1 ” to “uphill travelling: time T 2 ”.
  • the robot 1 travels on a flat floor face FA having the typical friction coefficient at a predetermined target rate.
  • since the robot 1 is designed such that the Y direction is parallel to the floor face FA, the Y direction becomes parallel to the travelling direction D 2 of the robot 1 .
  • since a gravitational component ( −g ) is added to the robot 1 in the Z direction, the acceleration Az in both the waveforms W 211 , W 212 is −g, as represented by time T 1 in FIG. 24A .
  • the acceleration Az′ in both the waveforms W 221 , W 222 remains substantially 0. Pulsation of the waveforms in FIGS. 24A and 24B is caused by vibrations of the floor and the like.
  • the robot 1 is oriented upward with the attitude angle ⁇ with respect to the travelling direction D 2 that is parallel to the floor face FB, and travels on the floor face FB at a rate V in the travelling direction D 2 . Accordingly, during frictional surface travelling, the robot 1 has rate Vy in the Y direction and rate Vz in the Z direction.
  • the rate V decreases once due to friction and the like; however, since the robot 1 is controlled to travel at a uniform rate, the rate V soon returns to the target rate.
  • acceleration az caused by a change in the rate Vz is added to the robot 1 in the Z direction.
  • since the rate Vz decreases and then increases, the acceleration az changes in the − direction and then in the + direction. Therefore, in the transient period of frictional surface travelling, as represented by the waveform W 221 in FIG. 24B , the waveform of the acceleration Az′ protrudes downward.
  • the robot 1 travels on the sloping road FC at the rate V while being oriented upward with the inclination angle ⁇ with respect to the reference direction D 1 .
  • since the Y direction of the robot 1 becomes parallel to the sloping road FC (travelling direction D 2 ), the robot 1 has only a rate component in the Y direction, and has no rate component in the Z direction.
  • accordingly, unlike in frictional surface travelling, the acceleration az caused by the rate Vz is not added to the robot 1 ; only the acceleration −g · cos θ caused by gravity is added to the robot 1 .
  • the acceleration Az gradually increases from the ⁇ side to the +side according to cos ⁇ .
  • the travelling state determination unit 201 determines the travelling state of the robot 1 as frictional surface travelling (YES in S 501 ). Meanwhile, if the acceleration Az′ is not kept to be less than the reference value for the certain time, the travelling state determination unit 201 determines the travelling state of the robot 1 as uphill travelling (NO in S 501 ).
  • the accelerations Az′ for a certain time are stored in the memory.
  • the travelling state determination unit 201 can calculate the waveform of the acceleration Az′ from values of the acceleration Az′ calculated during a certain time T 24 starting from the time P 24 .
  • the waveform protrudes downward as represented by the waveform W 221
  • the acceleration Az′ is kept to be less than the reference value for the certain time, and the travelling state is determined as frictional surface travelling.
  • the waveform is flat as represented by the waveform W 222
  • the acceleration Az′ is kept at the reference value or more for the certain time, the travelling state is determined as uphill travelling.
  • the certain time T 24 may be the above-described transient period.
  • the reference value may be a value that is lower than 0 by a certain margin.
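The determination in S 501 may be sketched as follows; the numeric reference value and the certain time T 24 are assumptions for illustration, following the description that the reference value is lower than 0 by a certain margin.

```python
def frictional_surface_determination(az_prime_samples, dt, ref=-0.5, t24=0.5):
    """If Az' stays below the reference value for the certain time T24
    (the downward-protruding waveform W221 in FIG. 24B), determine
    frictional surface travelling; otherwise uphill travelling (S501)."""
    held, need = 0, round(t24 / dt)
    for az_p in az_prime_samples:
        held = held + 1 if az_p < ref else 0   # reset when Az' rises above ref
        if held >= need:
            return "frictional surface travelling"   # YES in S501 -> S503
    return "uphill travelling"                        # NO in S501
```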
  • FIG. 25 is a flow chart illustrating details of idling control processing (S 105 in FIG. 16 ).
  • FIGS. 26A, 26B, 26C, 26D, and 26E are views illustrating the idling control processing.
  • FIGS. 26A, 26B, 26C, 26D, and 26E illustrate the robot 1 when viewed from above.
  • the step number expressed as “S + numerical value” in FIGS. 26A to 26E corresponds to the step number expressed as “S + numerical value” in FIG. 25 .
  • an obstacle 2600 obstructs movement of the robot 1 , and the robot 1 is idling.
  • the obstacle 2600 is a power line; however, this is merely an example.
  • an object such as a wall may be the obstacle 2600 .
  • the avoidance action control unit 202 rotates the first drive wheel 110 and the second drive wheel 111 reversely (S 601 ).
  • the avoidance action control unit 202 may issue a command to reversely rotate the first drive wheel 110 and the second drive wheel 111 to the housing drive wheel control unit 214 , thereby moving the robot 1 in an opposite direction D 262 to the current travelling direction (D 261 ).
  • the robot 1 attempts to travel in the direction D 262 .
  • the travelling state determination unit 201 executes the moving state determination processing (S 602 ). Details of the moving state determination processing have been described with reference to FIG. 18 and thus, detailed description thereof is omitted.
  • the robot 1 can travel in the direction D 262 , and the avoidance action control unit 202 rotates the robot 1 by 180 degrees (S 613 ), to bring the robot 1 into normal travelling using the direction D 262 as the travelling direction (S 614 ).
  • the avoidance action control unit 202 may output a command to rotate the first drive wheel 110 and the second drive wheel 111 in opposite directions until the robot 1 rotates by 180 degrees to the housing drive wheel control unit 214 , thereby rotating the robot 1 by 180 degrees.
  • the avoidance action control unit 202 may monitor the rotational angle of the robot 1 in the yaw direction by integrating the angular velocity ωy in the yaw direction, which is detected by the angular velocity sensor 222 , and determine that the robot 1 has rotated by 180 degrees when the rotational angle becomes 180 degrees.
  • the robot 1 cannot travel in the direction D 261 or the direction D 262 , and the avoidance action control unit 202 rotates the robot 1 counterclockwise by 90 degrees to change the travelling direction of the robot 1 to a direction D 263 (S 604 ). In this case, as illustrated in FIG. 26C , the robot 1 attempts to travel in the direction D 263 .
  • the travelling state determination unit 201 executes the moving state determination processing again (S 605 ).
  • the robot 1 can travel in the direction D 263 , and the avoidance action control unit 202 brings the robot 1 into normal travelling using the direction D 263 as the travelling direction (S 614 ).
  • the avoidance action control unit 202 rotates the robot 1 from the current travelling direction (direction D 263 ) by 180 degrees as illustrated in FIG. 26D , to change the travelling direction of the robot 1 to a direction D 264 (S 607 ).
  • the travelling state determination unit 201 executes the moving state determination processing again (S 608 ).
  • the result in S 608 indicates “moving state” (NO in S 609 )
  • the robot 1 can travel in the direction D 264
  • the avoidance action control unit 202 brings the robot 1 into normal travelling using the direction D 264 as the travelling direction (S 614 ).
  • the avoidance action control unit 202 determines that the avoidance action cannot be made and executes the processing in S 610 to S 612 .
  • the avoidance action control unit 202 rotates the robot 1 clockwise from the current travelling direction (direction D 264 ) by 90 degrees to change the travelling direction of the robot 1 to a direction D 265 .
  • the avoidance action control unit 202 outputs a command to move the counterweight 114 to an end in the opposite direction (D 266 ) to the current travelling direction (D 265 ) to the weight drive mechanism control unit 215 (S 611 ).
  • the weight drive mechanism control unit 215 moves the counterweight 114 to the rear end of the swing arm 116 (S 612 ).
  • the counterweight 114 is moved to the rear end of the swing arm 116 , such that the robot 1 leans rearward as represented by the arrow 122 . This imitates that the robot 1 hits against the obstacle 2600 and turns over.
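The sequence of trial moves in the idling control may be sketched as follows; `can_move` stands in for a trial move followed by the moving state determination, and the function name, direction labels, and returned action strings are all hypothetical.

```python
def idling_control(can_move):
    """Try escape directions in the order of FIG. 25; if none succeeds,
    lean the robot rearward using the counterweight (S610-S612)."""
    if can_move("reverse"):             # S601-S603: travel in D262
        return "rotate 180 and travel"  # S613-S614
    if can_move("left 90"):             # S604-S606: travel in D263
        return "travel"                 # S614
    if can_move("about-face"):          # S607-S609: travel in D264
        return "travel"                 # S614
    # S610-S612: move the counterweight 114 to the rear end of the swing
    # arm 116 so that the robot leans rearward (imitates turning over)
    return "lean rearward"
```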
  • FIG. 27 is a flow chart illustrating details of the attitude control processing (S 106 in FIG. 16 ).
  • the attitude control processing is executed when the travelling state of the robot 1 is determined as frictional surface travelling in S 104 .
  • the attitude control unit 203 acquires the attitude control angle ⁇ C set by the travelling state determination unit 201 in S 410 in FIG. 19 (S 701 ).
  • the attitude control unit 203 calculates a movement amount of the counterweight 114 , which corresponds to the attitude control angle ⁇ C (S 702 ).
  • K is a coefficient for converting the angle difference Δθ into the movement amount D (that is, D = K × Δθ), and is D_max/θ_max.
  • D_max denotes the maximum amplitude of the counterweight 114 .
  • the maximum amplitude D_max is a length from the center of the swing arm to the front or rear end.
  • ⁇ _max is the attitude angle ⁇ of the robot 1 found when the counterweight 114 is located at the maximum amplitude D_max.
  • Δθ is the difference between the current attitude angle θ and the attitude control angle θC. For example, when the current attitude angle θ is 0 degrees, and the attitude control angle θC is 10 degrees, Δθ becomes 10 degrees.
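The movement amount calculation in S 702 (D = K × Δθ with K = D_max/θ_max) may be sketched as follows; the numeric values of D_max and θ_max are assumptions for illustration only.

```python
def movement_amount(theta_now, theta_c, d_max=30.0, theta_max=15.0):
    """Convert the angle difference dtheta = theta_C - theta into the
    forward movement amount D of the counterweight (S702)."""
    k = d_max / theta_max         # K = D_max / theta_max
    dtheta = theta_c - theta_now  # difference from the current attitude angle
    return k * dtheta             # D = K * dtheta
```

With the assumed values, an attitude control angle of 10 degrees from a current angle of 0 degrees yields D = 2 × 10 = 20.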
  • the attitude control unit 203 outputs a command to move the counterweight 114 forward by the movement amount D calculated in S 702 to the weight drive mechanism control unit 215 , thereby moving the counterweight 114 to the position corresponding to the attitude control angle ⁇ C (S 703 ).
  • the Y direction of the robot 1 is tilted upward with respect to a travelling direction D 2 by the attitude angle ⁇ .
  • the counterweight 114 may be moved forward by the movement amount D corresponding to the attitude angle ⁇ .
  • the attitude control unit 203 moves the counterweight 114 forward by the movement amount corresponding to the attitude control angle ⁇ C. This can match the Y direction with the travelling direction D 2 to return the face of the robot 1 to the default position.
  • the robot 1 in this embodiment can be prevented from unnaturally travelling with the face oriented upward, depending on the material of the floor surface.
  • the counterweight 114 is not moved.
  • although the face of the robot 1 is oriented slightly upward, the amount is small and thus, the face of the robot 1 need not be oriented downward.
  • when the attitude angle θ is less than the lower limit angle θL, the result is determined as “no attitude change”.
  • in that case, the attitude control processing is not executed.
  • the counterweight 114 is moved to the rear end of the swing arm 116 , and the face of the robot 1 is oriented upward. This can imitate that the robot 1 hits against the obstacle 2600 and turns over.
  • the face of the robot 1 is oriented upward to imitate that the robot 1 turns over.
  • the present disclosure is not limited to this, and when the robot 1 cannot move due to the presence of the obstacle 2600 , the counterweight 114 may be kept at the default position.
  • the acceleration sensor 221 is provided, but the acceleration sensor 221 may be omitted.
  • the attitude angle θ can be calculated from the angular velocity detected by the angular velocity sensor 222 , and the face of the robot 1 can be directed downward by the attitude angle θ.
  • upward is set as plus, and downward is set as minus.
  • upward may be set as minus, and downward may be set as plus.
  • a robot includes a spherical housing, a frame disposed in the housing, a display unit that is provided on the frame, a set of drive wheels that are provided on the frame, a weight drive mechanism that is provided on the frame, an angular velocity sensor, and a control circuit.
  • the display unit displays at least a portion of a face of the robot.
  • the set of drive wheels rotate and move the housing while being in contact with an inner circumferential face of the housing.
  • the weight drive mechanism reciprocates a weight in a predetermined direction.
  • the angular velocity sensor detects angular velocity about a crosswise direction that is perpendicular to a travelling direction of the housing.
  • if the control circuit determines, while the housing is being rotated and moved, that a rotational angle of the housing when viewed from the front in the travelling direction changes upward beyond a predetermined angle, based on a change in the angular velocity about the crosswise direction, the control circuit moves the weight forward in the travelling direction of the housing by a distance corresponding to the rotational angle.
  • a weight drive mechanism that reciprocates the weight in a predetermined direction is provided on the frame, and an angular velocity sensor that detects angular velocity about the crosswise direction that is perpendicular to the travelling direction of the housing is provided.
  • the position of the display unit moves upward, when viewed in the travelling direction, as the movement of the housing in the travelling direction is restricted by friction between the housing and the floor surface.
  • the weight is moved forward in the travelling direction of the housing by a distance corresponding to the rotational angle.
  • the position of the robot's face, that is, the attitude of the robot, can be prevented from changing unnaturally due to the material of the floor surface rather than the internal processing of the robot, even under the same travelling processing.
  • a robot includes a spherical housing, a frame that is disposed in the housing, a display unit that is provided on the frame, a set of drive wheels that are provided on the frame, a weight drive mechanism that is provided on the frame, an acceleration sensor, an angular velocity sensor, and a control circuit.
  • the frame includes a base.
  • the display unit displays at least a portion of a face of the robot.
  • the set of drive wheels rotate and move the housing while being in contact with an inner circumferential face of the housing.
  • the weight drive mechanism reciprocates a weight in a predetermined direction.
  • the acceleration sensor detects a first acceleration in a vertical direction that is perpendicular to the base.
  • the angular velocity sensor detects angular velocity about a crosswise direction that is perpendicular to a travelling direction of the housing.
  • the control circuit acquires a second value by excluding a gravitational component from a first value indicative of the first acceleration outputted from the acceleration sensor.
  • if the control circuit determines, while the housing is being rotated and moved, that the second value changes from a reference value beyond a first change range and reaches a value corresponding to a downward direction that is perpendicular to the base, and that the rotational angle of the housing when viewed from the front in the travelling direction changes upward beyond a predetermined angle, based on a change in the angular velocity about the crosswise direction, the control circuit moves the weight forward in the travelling direction of the housing by a distance corresponding to the rotational angle.
  • the position of the robot's face, that is, the attitude of the robot, can be prevented from changing unnaturally due to the material of the floor surface rather than the internal processing of the robot, even under the same travelling processing.
  • if the control circuit determines, while the housing is being rotated and moved, that the second value changes within the first change range, and that the rotational angle of the housing when viewed from the front in the travelling direction changes upward beyond the predetermined angle, based on the change in the angular velocity about the crosswise direction, the control circuit does not move the weight forward in the travelling direction of the housing.
  • when the detection results of the acceleration sensor and the angular velocity sensor indicate that the second value changes within the first change range, and that the rotational angle of the housing when viewed from the front in the travelling direction changes upward beyond the predetermined angle based on the change in the angular velocity about the crosswise direction, the robot's face is oriented upward and, further, the robot itself moves upward. Therefore, it can be estimated that the robot is travelling on a sloping road, for example.
  • the weight is not moved forward in the travelling direction of the housing.
  • the case where the robot goes uphill can be distinguished from the case where the robot travels on the carpet having a high friction coefficient.
  • the weight is not moved forward in the travelling direction of the housing, with the robot's face oriented upward.
  • the acceleration sensor detects a second acceleration in the travelling direction of the housing that is parallel to the base, and, while the housing is being rotated and moved, the control circuit moves the weight rearward in the travelling direction of the housing if the second value changes within the first change range, the change in the second acceleration falls within a second change range, and the change in the rotational angle of the housing falls within the predetermined angle.
  • waveforms outputted from the acceleration sensor and the angular velocity sensor indicate the following state.
  • the second value changes within the first change range
  • the change in the second acceleration falls within a second change range
  • the change in the rotational angle of the housing falls within the predetermined angle. That is, since the robot does not go uphill but travels on a flat surface, the second value changes within the first change range. Since the robot hits against the wall and cannot move forward, the change in the second acceleration in the travelling direction of the housing falls within the second change range. Since the robot hits against the wall but is not restricted in travelling by friction between the housing and the floor surface, the robot's face does not turn upward, and the robot idles without changing its attitude. Accordingly, the rotational angle of the housing falls within the predetermined angle.
  • the weight is moved rearward in the travelling direction of the housing.
  • the robot's face is oriented upward. That is, when the robot hits against the wall or the like during travelling, the robot's face is intentionally oriented upward to imitate the robot hitting against the wall and turning over.
  • the moving direction of the weight varies depending on whether the robot hits against the wall or the like during travelling and becomes idle, or travels on the carpet having a high friction coefficient.
  • the robot's attitude is corrected to turn the robot's face upward on purpose, as if the robot had turned over. This can convey to the user that the robot has hit against the wall or the like.
  • the acceleration sensor detects a second acceleration in the travelling direction of the housing that is parallel to the base, and, while the housing is being rotated and moved, the control circuit does not move the weight forward in the travelling direction of the housing if the second value changes within the first change range, a change in the second acceleration falls within a second change range, and a change in the rotational angle of the housing falls within the predetermined angle.
  • the case where the robot hits against the wall or the like during travelling and becomes idle is distinguished from the case where the robot travels on the carpet having a high friction coefficient.
  • the robot is not restricted in travelling by friction between the housing and the floor surface.
  • the weight is not moved forward in the travelling direction of the housing, so that the attitude of the robot remains unchanged.
  • a robot includes a spherical housing, a frame that is disposed in the housing, a display unit that is provided on the frame, a set of drive wheels that are provided on the frame, a weight drive mechanism that is provided on the frame, an acceleration sensor, an angular velocity sensor, and a control circuit.
  • the frame includes a base.
  • the display unit displays at least a portion of a face of the robot.
  • the set of drive wheels rotate and move the housing while being in contact with an inner circumferential face of the housing.
  • the weight drive mechanism reciprocates a weight in a predetermined direction.
  • the acceleration sensor detects a first acceleration in a vertical direction that is perpendicular to the base.
  • the angular velocity sensor detects angular velocity about a crosswise direction that is perpendicular to a travelling direction of the housing.
  • the control circuit acquires a second value by excluding a gravitational component from a first value indicative of the first acceleration outputted from the acceleration sensor.
  • when the control circuit determines, while the housing is being rotated and moved, that the second value changes from a reference value beyond a first change range and reaches a value corresponding to a downward direction that is perpendicular to the base, and that the housing when viewed from the front in the travelling direction rotates from a reference position upward beyond a predetermined angle based on a change in the angular velocity about the crosswise direction, the control circuit determines a rotational angle of the housing based on a change in the angular velocity about the crosswise direction during a predetermined time after the start of the rotation of the housing from the reference position, and moves the weight from an initial position of the weight forward in the travelling direction of the housing by a distance corresponding to the rotational angle.
  • when the housing rotates upward from the reference position beyond the predetermined angle when viewed from the front in the travelling direction, the display unit is estimated to have moved upward because driving is restricted by friction or the like.
  • the rotational angle of the housing is determined based on the change in the angular velocity about the crosswise direction during a predetermined time after the start of the rotation of the housing from the reference position, and the weight is moved forward from an initial position of the weight in the travelling direction of the housing by a distance corresponding to the rotational angle.
  • the position of the robot's face, that is, the attitude of the robot, can be prevented from changing unnaturally due to the material of the floor surface rather than the internal processing of the robot, even though the same travelling processing is performed.
  • if the control circuit determines, based on the change in the angular velocity about the crosswise direction, that the rotation of the housing from the reference position returns to the predetermined angle or less before the predetermined time elapses, the control circuit does not move the weight.
  • the display unit may be temporarily moved upward by friction between the housing and the floor surface, for example, when the robot passes over garbage on the floor. In such a case, if the display unit were moved downward, the robot would travel with its face oriented downward even after passing over the garbage.
  • if the rotation of the housing from the reference position returns to the predetermined angle or less before the predetermined time elapses, control to move the weight is not performed.
  • the control circuit does not move the weight if the control circuit determines that the second value changes from the reference value beyond the first change range and reaches the value corresponding to the downward direction that is perpendicular to the base, and that the upward rotation of the housing from the reference position when viewed from the front in the travelling direction falls within the predetermined angle based on the change in the angular velocity about the crosswise direction.
  • the control circuit does not move the weight if the control circuit determines that the second value changes within the first change range, and that the housing when viewed from the front in the travelling direction rotates from the reference position upward beyond the predetermined angle based on the change in the angular velocity about the crosswise direction.
  • the acceleration sensor detects a second acceleration in the travelling direction of the housing that is parallel to the base, and, while the housing is being rotated and moved, the control circuit moves the weight from an initial position of the weight rearward in the travelling direction of the housing if the control circuit determines that the second value changes within the first change range, that the change in the second acceleration falls within a second change range, and that the upward rotation of the housing from the reference position when viewed from the front in the travelling direction falls within the predetermined angle or less based on the change in the angular velocity about the crosswise direction.
  • the acceleration sensor detects a second acceleration in the travelling direction of the housing that is parallel to the base, and, while the housing is being rotated and moved, the control circuit does not perform control to move the weight if the control circuit determines that the second value changes within the first change range, that the change in the second acceleration falls within a second change range, and that the upward rotation of the housing from the reference position when viewed from the front in the travelling direction falls within the predetermined angle or less based on the change in the angular velocity about the crosswise direction.
  • the present disclosure is advantageous in that the robot can be caused to travel without presenting unnatural appearance.
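Taken together, the determinations above amount to a small state classifier over the gravity-compensated vertical acceleration (the "second value"), the forward acceleration, and the pitch rotation of the housing. The Python sketch below illustrates that logic; the threshold values, type names, and the assumption that the check runs while the drive wheels are commanded to move are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from enum import Enum, auto


class TravelState(Enum):
    NORMAL = auto()          # ordinary travel on a flat, low-friction floor
    UPHILL = auto()          # travelling on a sloping road
    HIGH_FRICTION = auto()   # e.g. travelling on a carpet; face pushed upward
    WALL_COLLISION = auto()  # blocked by a wall, wheels spinning idly


@dataclass
class SensorSnapshot:
    vertical_accel: float  # first acceleration, perpendicular to the base (m/s^2)
    forward_accel: float   # second acceleration, along the travelling direction
    pitch_angle: float     # integrated angular velocity about the crosswise axis (deg)


GRAVITY = 9.8               # gravitational component excluded from the first value
FIRST_CHANGE_RANGE = 0.5    # tolerance band for the second value (assumed, m/s^2)
SECOND_CHANGE_RANGE = 0.3   # tolerance band for the forward acceleration (assumed)
PREDETERMINED_ANGLE = 10.0  # upward-rotation threshold (assumed, deg)


def classify(s: SensorSnapshot, reference: float = 0.0) -> TravelState:
    """Classify the travel state while the drive wheels are being driven."""
    # Second value: vertical acceleration with the gravitational component removed.
    second_value = s.vertical_accel - GRAVITY
    within_first = abs(second_value - reference) <= FIRST_CHANGE_RANGE
    within_second = abs(s.forward_accel) <= SECOND_CHANGE_RANGE
    rotated_up = s.pitch_angle > PREDETERMINED_ANGLE

    if rotated_up:
        # Face oriented upward: a slope if the body also keeps moving normally;
        # if the second value swings downward beyond the first change range,
        # the tilt is attributed to floor friction (e.g. a carpet).
        if within_first:
            return TravelState.UPHILL
        if second_value - reference < -FIRST_CHANGE_RANGE:
            return TravelState.HIGH_FRICTION
    elif within_first and within_second:
        # No vertical change, no forward acceleration, attitude unchanged:
        # the housing is presumed blocked by a wall and spinning idly.
        return TravelState.WALL_COLLISION
    return TravelState.NORMAL


def weight_action(state: TravelState, pitch_angle: float) -> str:
    """Decide how to move the weight so the robot's attitude stays natural."""
    if state is TravelState.HIGH_FRICTION:
        # Counteract the friction-induced tilt in proportion to the rotation.
        return f"move weight forward by a distance for {pitch_angle:.1f} deg"
    if state is TravelState.WALL_COLLISION:
        return "move weight rearward (turn the face upward on purpose)"
    return "keep weight at its initial position"
```

A real implementation would also apply the debounce described above: if the housing returns to the predetermined angle or less before the predetermined time elapses (e.g. when rolling over a piece of garbage), no weight movement is issued.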

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)
  • Toys (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
US15/905,893 2016-07-08 2018-02-27 Robot Abandoned US20180185764A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2016135805A JP2018005810A (ja) 2016-07-08 2016-07-08 ロボット
JP2016-135805 2016-07-08
PCT/JP2017/022041 WO2018008345A1 (ja) 2016-07-08 2017-06-15 ロボット

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/022041 Continuation WO2018008345A1 (ja) 2016-07-08 2017-06-15 ロボット

Publications (1)

Publication Number Publication Date
US20180185764A1 true US20180185764A1 (en) 2018-07-05

Family

ID=60912690

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/905,893 Abandoned US20180185764A1 (en) 2016-07-08 2018-02-27 Robot

Country Status (6)

Country Link
US (1) US20180185764A1 (ja)
EP (1) EP3483687A4 (ja)
JP (1) JP2018005810A (ja)
CN (1) CN107850896A (ja)
CA (1) CA2998310A1 (ja)
WO (1) WO2018008345A1 (ja)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3584662B1 (en) * 2018-06-19 2022-04-13 Panasonic Intellectual Property Management Co., Ltd. Mobile robot
CN112809745B (zh) * 2021-01-28 2023-12-15 深圳市水务工程检测有限公司 一种检测机器人移动姿势调节的方法

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110294397A1 (en) * 2010-05-25 2011-12-01 Fun Tram Corporation Remote control ball assembly
US20140179197A1 (en) * 2012-12-21 2014-06-26 Dyana Bradley Toy apparatus with simulated lcd screen face
US20160381257A1 (en) * 2015-06-29 2016-12-29 Asustek Computer Inc. Sphere panorama image capturing device

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4601675A (en) * 1984-05-25 1986-07-22 Robinson Donald E Mechanized toy ball
GB2319756A (en) * 1996-11-28 1998-06-03 Leif Levon Spherical vehicle
JP2000218578A (ja) * 1999-02-03 2000-08-08 Sony Corp 球形ロボット
JP2004148439A (ja) * 2002-10-30 2004-05-27 Sony Corp 球形ロボット及び球形ロボットの制御方法
JP2005342818A (ja) * 2004-06-01 2005-12-15 Furukawa Electric Co Ltd:The 一足球体輪移動ロボット
SE0402672D0 (sv) * 2004-11-02 2004-11-02 Viktor Kaznov Ball robot
JP2006136962A (ja) * 2004-11-11 2006-06-01 Hitachi Ltd 移動ロボット
US10281915B2 (en) * 2011-01-05 2019-05-07 Sphero, Inc. Multi-purposed self-propelled device
CN103135549A (zh) * 2012-12-21 2013-06-05 北京邮电大学 一种具有视觉反馈的球形机器人运动控制系统及运动控制方法
KR20160016830A (ko) * 2013-05-06 2016-02-15 스페로, 아이엔씨. 다목적 자체 추진 장치

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11490018B2 (en) * 2017-07-06 2022-11-01 Japan Aerospace Exploration Agency Mobile image pickup device
US20190094874A1 (en) * 2017-09-22 2019-03-28 Panasonic Intellectual Property Management Co., Ltd. Robot
US10921818B2 (en) * 2017-09-22 2021-02-16 Panasonic Intellectual Property Management Co., Ltd. Robot

Also Published As

Publication number Publication date
WO2018008345A1 (ja) 2018-01-11
CN107850896A (zh) 2018-03-27
EP3483687A4 (en) 2019-07-31
EP3483687A1 (en) 2019-05-15
JP2018005810A (ja) 2018-01-11
CA2998310A1 (en) 2018-01-11

Similar Documents

Publication Publication Date Title
US10507400B2 (en) Robot
US10921818B2 (en) Robot
US10307911B2 (en) Robot
US20180185764A1 (en) Robot
US20190015993A1 (en) Robot
US10799806B2 (en) Robot
JP7273566B2 (ja) ロボット、ロボットの制御方法及びプログラム
JP2021157203A (ja) 移動体制御装置および移動体制御方法、並びにプログラム
JP2020058596A (ja) ロボット
JP2018000845A (ja) ロボット
CN111867696B (zh) 信息处理装置、信息处理方法及程序
JP2020179005A (ja) ロボット及び方法
CN114867540B (zh) 信息处理装置、信息处理方法和信息处理程序
EP4353428A1 (en) Robot apparatus and controlling method thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LT

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MIYAZAKI, RYOUTA;OGAWA, KENTO;HIGUCHI, SEIYA;SIGNING DATES FROM 20180215 TO 20180216;REEL/FRAME:045759/0077

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION