WO2014162161A1 - Shoulder and arm arrangement for a humanoid robot - Google Patents
Shoulder and arm arrangement for a humanoid robot
- Publication number
- WO2014162161A1 (PCT/IB2013/000956)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- arm
- axis
- robot
- torso
- leg
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H11/00—Self-movable toy figures
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J18/00—Arms
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/02—Casings; Cabinets ; Supports therefor; Mountings therein
- H04R1/028—Casings; Cabinets ; Supports therefor; Mountings therein associated with devices performing functions other than acoustics, e.g. electric candles
Definitions
- Robots are electromechanical devices capable of performing at least partially automated tasks.
- Humanoid robots are types of robots that are designed to aesthetically resemble a human being.
- the actions of humanoid robots may also be programmed such that the humanoid robot performs actions similar to those of humans.
- Humanoid robots may be used for performing tasks, for research purposes, or for entertainment.
- One exemplary embodiment relates to a robot including a torso section.
- the robot further includes a first arm having a first arm center axis, the first arm coupled to the torso section through a first shoulder joint, wherein the first arm is configured to rotate about a first axis and a second axis, wherein the second axis is generally perpendicular to the first axis.
- the robot further including a second arm having a second arm center axis, the second arm coupled to the torso section through a second shoulder joint, wherein the second arm is configured to rotate about the first axis and a third axis, wherein the third axis is generally perpendicular to the first axis.
- the first arm center axis defines a center of the first arm and the first arm center axis is offset from the first axis.
- the second arm center axis defines a center of the second arm and the second arm center axis is offset from the first axis.
- Another exemplary embodiment relates to a robot configured to play an audio file.
- the robot includes a torso section.
- the robot further includes a first arm having a first arm center axis, the first arm coupled to the torso section through a first shoulder joint, wherein the first arm is configured to rotate about a first axis and a second axis, wherein the second axis is generally perpendicular to the first axis.
- the robot includes a second arm having a second arm center axis, the second arm coupled to the torso section through a second shoulder joint, wherein the second arm is configured to rotate about the first axis and a third axis, wherein the third axis is generally perpendicular to the first axis.
- the robot further includes a first leg coupled to the torso section through a first hip joint, wherein the first leg is configured to rotate about a first hip axis.
- the robot includes a second leg coupled to the torso section through a second hip joint, wherein the second leg is configured to rotate about a second hip axis, and wherein the second leg is coupled to an opposite side of the torso section than the first leg.
- the robot further includes a first speaker having a generally circular front face, wherein the first speaker is coupled to the first leg.
- the robot includes a second speaker having a generally circular front face, wherein the second speaker is coupled to the second leg.
- the first arm center axis defines a center of the first arm.
- the first arm center axis is offset from the first axis.
- the second arm center axis defines a center of the second arm.
- the second arm center axis is offset from the first axis.
- the first hip axis intersects a center point of the generally circular front face of the first speaker.
- Still another exemplary embodiment relates to a robot configured to play an audio file.
- the robot includes a torso section.
- the robot further includes a first arm having a first arm center axis, the first arm coupled to the torso section through a first shoulder joint, wherein the first arm is configured to rotate about a first axis and a second axis, wherein the second axis is generally perpendicular to the first axis.
- the robot includes a second arm having a second arm center axis, the second arm coupled to the torso section through a second shoulder joint, wherein the second arm is configured to rotate about the first axis and a third axis, wherein the third axis is generally perpendicular to the first axis.
- the robot further includes a first speaker and second speaker.
- Yet another exemplary embodiment relates to a method in a robot configured to transform between a folded position and an unfolded position, the robot including a processor and memory.
- the method includes receiving an instruction to transform into the folded position from a user input of the robot.
- the method further includes executing, by the processor, a transformation program stored in the memory. The execution of the transformation program causes a first arm of the robot to fold against a torso of the robot, wherein the first arm has a first arm center axis that defines a center of the first arm, and the first arm is configured to rotate about a shoulder axis that is offset from the first arm center axis such that when the first arm is folded against the torso of the robot, the first arm pivots towards a front side of the torso.
- the execution of the transformation program causes a second arm of the robot to fold against the torso of the robot, wherein the second arm has a second arm center axis that defines a center of the second arm, and the second arm is configured to rotate about the shoulder axis, wherein the second arm center axis is offset from the shoulder axis such that when the second arm is folded against the torso of the robot, the second arm pivots towards a front side of the torso.
- the execution of the transformation program causes a first leg of the robot to fold against the torso, wherein the first leg is coupled to the same side of the torso as the first arm, wherein when the first leg is folded against the torso, the first leg pivots towards a back side of the torso behind the first arm.
- the execution of the transformation program causes a second leg of the robot to fold against the torso, wherein the second leg is coupled to the same side of the torso as the second arm, wherein when the second leg is folded against the torso, the second leg pivots towards a back side of the torso behind the second arm.
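To make the ordering of these folding steps concrete, the following is a minimal, hypothetical sketch of how such a transformation program could be sequenced in software. The joint names, target angles, and helper types are invented for illustration and are not taken from the disclosure; the point is only the ordering (the arms pivot toward the front of the torso first, then the legs fold behind them).

```python
# Hypothetical sketch of a fold sequence; joint names, angles, and speeds are
# illustrative placeholders, not values from the disclosure.
from dataclasses import dataclass
from typing import List


@dataclass
class JointCommand:
    joint: str        # e.g. "right_shoulder_S1"
    angle_deg: float  # target angular position
    speed_dps: float  # angular speed, degrees per second


# Ordering mirrors the described method: arms pivot toward the front of the
# torso first, then each leg folds toward the back side, behind its arm.
FOLD_SEQUENCE: List[List[JointCommand]] = [
    [JointCommand("right_shoulder_S1", 180.0, 60.0),
     JointCommand("left_shoulder_S1", 180.0, 60.0)],   # arms fold against the torso
    [JointCommand("right_hip", 170.0, 45.0),
     JointCommand("left_hip", 170.0, 45.0)],           # legs fold behind the arms
]


def execute(sequence: List[List[JointCommand]]) -> None:
    """Run each stage in order; commands within a stage are issued together."""
    for stage in sequence:
        for cmd in stage:
            print(f"move {cmd.joint} -> {cmd.angle_deg} deg at {cmd.speed_dps} deg/s")


if __name__ == "__main__":
    execute(FOLD_SEQUENCE)
```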
- FIG. 1A is a perspective view schematic drawing of a humanoid robot according to an exemplary embodiment.
- FIG. 1B is a front view schematic drawing of the humanoid robot of FIG. 1A according to the exemplary embodiment.
- FIG. 2A is a perspective view schematic drawing of a humanoid robot having a portion of its body removed according to an exemplary embodiment.
- FIG. 2B is a front view schematic drawing of a humanoid robot having a portion of its body removed according to an exemplary embodiment.
- FIG. 3 is a front view schematic drawing of a humanoid robot according to an exemplary embodiment.
- FIG. 4 is a general block diagram of a system for controlling a humanoid robot according to an exemplary embodiment.
- FIG. 5 is a front view schematic drawing of a humanoid robot according to an exemplary embodiment.
- FIG. 6 is a front view schematic drawing of a humanoid robot according to an exemplary embodiment.
- FIG. 7 is a perspective view schematic drawing of a humanoid robot during a stage of a transformation process according to an exemplary embodiment.
- FIG. 8 is a perspective view schematic drawing of a humanoid robot during a stage of a transformation process according to the exemplary embodiment of FIG. 7.
- FIG. 9 is a perspective view schematic drawing of a humanoid robot during a stage of a transformation process according to the exemplary embodiment of FIG. 7.
- FIG. 10 is a perspective view schematic drawing of a humanoid robot during a stage of a transformation process according to the exemplary embodiment of FIG. 7.
- FIG. 11 is a perspective view schematic drawing of a humanoid robot during a stage of a transformation process according to the exemplary embodiment of FIG. 7.
- FIG. 12 is a perspective view schematic drawing of a humanoid robot during a stage of a transformation process according to the exemplary embodiment of FIG. 7.
- FIG. 13 is a perspective view schematic drawing of a humanoid robot during a stage of a transformation process having a portion of its body removed according to the exemplary embodiment of FIG. 7.
- FIG. 14 is a top schematic drawing of a humanoid robot during a stage of a transformation process according to the exemplary embodiment of FIG. 7.
- FIG. 15 is a top schematic drawing of a humanoid robot during a stage of a transformation process according to the exemplary embodiment of FIG. 7.
- FIG. 16 is a top schematic drawing of a humanoid robot during a stage of a transformation process according to the exemplary embodiment of FIG. 7.
- FIG. 17 is a top schematic drawing of a humanoid robot during a stage of a transformation process according to the exemplary embodiment of FIG. 7.
- FIG. 18 is a top schematic drawing of a humanoid robot during a stage of a transformation process according to the exemplary embodiment of FIG. 7.
- FIG. 19 is a top schematic drawing of a humanoid robot during a stage of a transformation process according to the exemplary embodiment of FIG. 7.
- FIG. 20 is a top schematic drawing of a humanoid robot during a stage of a transformation process according to the exemplary embodiment of FIG. 7.
- Robot 100 is shown according to an exemplary embodiment.
- Robot 100 is a humanoid robot.
- Robot 100 is shaped to approximately resemble a human body when in an unfolded position.
- Robot 100 includes head 101, torso 102, arms 103 and 104, and legs 105 and 106.
- Head 101 is coupled to torso 102 through neck 107.
- Arms 103 and 104 are respectively coupled to torso 102 via shoulder joints 108 and 109.
- Legs 105 and 106 are respectively coupled to torso 102 through hip joints 110 and 111.
- Arms 103 and 104 include elbow joints 112 and 113.
- Arms 103 and 104 further include wrist joints 114 and 115.
- Arms 103 and 104 may be rotated and/or bent about any of shoulder joints 108 and 109, elbow joints 112 and 113, and wrist joints 114 and 115.
- Legs 105 and 106 may be rotated about hip joints 110 and 111.
- Arm 104 includes hand 116, forearm 118, and upper arm 120.
- Hand 116 is connected to forearm 118 through wrist joint 114.
- Wrist joint 114 allows hand 116 to rotate with respect to forearm 118.
- Forearm 118 is connected to upper arm 120 through elbow joint 112.
- Elbow joint 112 allows forearm 118 to rotate with respect to upper arm 120.
- Upper arm 120 is connected to torso 102 through shoulder joint 108. Shoulder joint 108 allows upper arm 120 to rotate with respect to torso 102.
- Arm 103 includes hand 117, forearm 119, and upper arm 121. Hand 117 is connected to forearm 119 through wrist joint 115.
- Wrist joint 115 allows hand 117 to rotate with respect to forearm 119.
- Forearm 119 is connected to upper arm 121 through elbow joint 113.
- Elbow joint 113 allows forearm 119 to rotate with respect to upper arm 121.
- Upper arm 121 is connected to torso 102 through shoulder joint 109.
- Shoulder joint 109 allows upper arm 121 to rotate with respect to torso 102.
- Leg 106 includes foot 122.
- Leg 105 includes foot 123.
- Foot 122 and foot 123 provide support to robot 100 when robot 100 is in a standing position (as shown in FIG. 1A and FIG. 1B).
- Foot 122 and foot 123 may each be composed of multiple parts such that foot 122 can fold and retract into leg 106 and such that foot 123 can fold and retract into leg 105.
- Robot 100 includes speakers 131, 132, and 133.
- Speaker 131 is contained within and coupled to torso 102.
- Speaker 132 is coupled to torso 102 and leg 106 at or near hip joint 110.
- Speaker 133 is coupled to torso 102 and leg 105 at or near hip joint 111.
- speakers 131, 132, and 133 form a 2.1 channel stereo speaker system for audio playback.
- Speaker 131 may be a subwoofer.
- Speakers 132 and 133 may be tweeters.
- FIG. 2A and 2B show various axes of rotational freedom for various moving parts of an upper body of robot 100.
- The description that follows is specific to one side of robot 100 (e.g., the right side or the left side).
- The opposite side of robot 100 includes the same arrangement of components, but in a mirrored manner for mounting on the opposite side of torso 102, as robot 100 may be symmetrical about a center axis.
- A detailed description of the components of legs 105 and 106 follows with respect to FIG. 3.
- Shoulder joint 108 is configured to allow rotation of arm 104 about two different axes with respect to torso 102.
- Arm 104 rotates about axis S1.
- Arm 104 is configured to rotate about axis S1 and about axis S2.
- Axis SI is generally perpendicular to axis S2.
- Motor 201 is configured to rotate arm 104 with respect to torso 102 about axis S1.
- Motor 201 may be a DC motor.
- Motor 201 may be coupled to a gearbox configured to rotate arm 104 at a slower rotational speed than the rotational speed of motor 201 output.
- Arm 104 is not centered on axis S1.
- Arm center axis AC is offset by distance 202 from axis S1.
- As discussed below, offsetting arm center axis AC by distance 202 from axis S1 allows for compact folding of arm 104 next to leg 106 when robot 100 is in the folded position. Additionally, the offset of arm center axis AC from axis S1 helps ensure that robot 100 has a human-like appearance, as arms 103 and 104 remain closer to the front of torso 102 than to the rear of torso 102.
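As a rough numerical illustration of the effect of that offset (the value of distance 202 is not specified here, so the number below is a placeholder), the sketch computes where arm center axis AC sits relative to shoulder axis S1 before and after a 180-degree rotation about S1; with zero offset the arm would stay centered on S1 regardless of rotation.

```python
import math

# Placeholder value for distance 202 (offset between shoulder axis S1 and arm
# center axis AC); the disclosure does not give a number.
DISTANCE_202_MM = 15.0

def ac_position(rotation_deg: float, offset_mm: float = DISTANCE_202_MM) -> float:
    """Signed position of arm center axis AC relative to S1 (along the
    front/back direction of the torso) after rotating the arm about S1."""
    return offset_mm * math.cos(math.radians(rotation_deg))

print(ac_position(0.0))    # +15.0 mm: AC sits on one side of S1
print(ac_position(180.0))  # -15.0 mm: a 180-degree rotation flips AC to the other side
```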
- Elbow joint 112 is configured to allow rotation of forearm 118 with respect to upper arm 120 about axis E.
- Axis E is generally parallel to axis S2.
- Elbow joint 112 includes a motor.
- the motor of elbow joint 112 may be a DC servo motor.
- the motor of elbow joint 112 may be coupled to a gearbox configured to rotate forearm 118 with respect to upper arm 120 at a slower rotational speed than the rotational speed of the motor output.
- elbow joint 112 may also be configured to rotate forearm 118 with respect to upper arm 120 about arm center axis AC.
- elbow joint 112 may include a first motor for effectuating rotation of forearm 118 about axis E and a second motor for effectuating rotation of forearm 118 about arm center axis AC.
- Each motor may be a DC servo motor.
- Wrist joint 114 is configured to allow rotation of hand 116 with respect to forearm 118 about two axes.
- Hand 116 may rotate with respect to forearm 118 about axis W and arm center axis AC.
- Axis W is generally perpendicular to arm center axis AC.
- wrist joint 114 may include two motors: a first motor for effectuating rotation of hand 116 about axis W and a second motor for effectuating rotation of hand 116 about arm center axis AC.
- Each motor may be a DC servo motor.
- Each motor may be coupled to a gearbox configured to rotate hand 116 with respect to forearm 118 at a slower rotational speed than the rotational speed of the motor output.
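The speed reduction that these gearboxes provide follows the usual gear-reduction relations; the figures below are placeholders chosen only to show the arithmetic (a reduction lowers output speed and, ideally, raises torque by the same ratio).

```python
# Placeholder motor figures; none of these numbers come from the disclosure.
motor_speed_rpm = 6000.0   # assumed DC servo motor output speed
motor_torque_nm = 0.02     # assumed motor shaft torque
gear_ratio = 100.0         # assumed 100:1 reduction

# Ideal (lossless) gear-reduction relations.
joint_speed_rpm = motor_speed_rpm / gear_ratio   # slower rotation at the joint
joint_torque_nm = motor_torque_nm * gear_ratio   # proportionally higher torque

print(f"joint speed: {joint_speed_rpm:.1f} rpm, joint torque: {joint_torque_nm:.1f} N*m")
```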
- Arm 104 and arm 103 have similar arrangements with respect to torso 102; the only difference is that arm 103 is a left-side arm and arm 104 is a right-side arm. Both arm 103 and arm 104 utilize the same arrangement of motors to make the various segments of each arm rotate with respect to each other and with respect to torso 102.
- Head 101 is supported on the top of torso 102 by neck 107.
- Neck 107 is configured to allow rotation of head 101 with respect to torso 102 about two axes.
- Head 101 can rotate about axis H1.
- When head 101 rotates about axis H1, robot 100 emulates a human turning his head left or right.
- Head 101 can also rotate about axis H2.
- Axis H2 is generally perpendicular to axis H1.
- When head 101 rotates about axis H2, robot 100 emulates a human nodding his head up and down.
- Head 101 can rotate about axis H2 into a cavity within torso 102 when robot 100 is in the folded position.
- Neck 107 may form a ball and socket type joint with torso 102 of robot 100 to enable rotational motion of head 101 about axes HI and H2.
- Rotational motion of head 101 may be driven by a first and second motor: a first motor for effectuating rotation of head 101 about axis HI and a second motor for effectuating rotation of head 101 about axis H2.
- Each motor may be a DC servo motor.
- Each motor may be coupled to a gearbox configured to rotate head 101 with respect to torso 102 at a slower rotational speed than the rotational speed of the motor output.
- When foot 123 is folded, foot 123 may retract inside leg 105 and be received in internal compartment 310 of leg 105.
- The internal components of leg 106, and the arrangement of those components, are generally the same as described above with respect to leg 105. However, the internal components and their arrangement within leg 106 are mirrored for proper placement on the opposite side of torso 102.
- Robot 100 is shown according to an exemplary embodiment.
- Robot 100 may include various lighting features.
- The lighting features may include torso lighting features 501, arm lighting features 502, leg lighting features 503, and face lighting features 504.
- Body lighting features 501, arm lighting features 502, and leg lighting features 503 are shown as linear lighting features.
- Body lighting features 501, arm lighting features 502, and leg lighting features 503 may alternatively be curves, dots, or other geometric shapes.
- Face lighting features 504 are drawn to represent eyes of robot 100.
- Face lighting features 504 may further include lighted areas representing other facial features of a human (e.g., nose, mouth, ears, hair, etc.).
- Robot 100 may include a greater or lesser number of lighting features than disclosed in FIG. 3.
- Lighting features include LED lights located behind opaque or transparent portions of the body shell of robot 100. The LEDs used in creating the lighting features may be multicolored LEDs such that the lighting elements can light up in different colors at different points in a routine.
- Each of body lighting features 501, arm lighting features 502, leg lighting features 503, and face lighting features 504 is independently controllable by a controller (as discussed below with respect to FIG. 4) in terms of activation (being on or off), brightness level, and emitted color.
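One possible way for a controller to model that per-feature state (activation, brightness, color) is sketched below; the feature names and the RGB color representation are assumptions made for illustration only.

```python
from dataclasses import dataclass
from typing import Dict, Tuple


@dataclass
class LightingFeature:
    on: bool = False
    brightness: float = 1.0                        # 0.0 .. 1.0
    color: Tuple[int, int, int] = (255, 255, 255)  # RGB, for a multicolored LED


# Keys are illustrative stand-ins for lighting features 501-504.
lights: Dict[str, LightingFeature] = {
    "torso_501": LightingFeature(),
    "arm_502": LightingFeature(),
    "leg_503": LightingFeature(),
    "eyes_504": LightingFeature(),
}

# Each feature is controlled independently, e.g. light the eyes blue at half
# brightness without changing any other feature.
lights["eyes_504"] = LightingFeature(on=True, brightness=0.5, color=(0, 0, 255))
print(lights["eyes_504"])
```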
- Robot 100 includes central processing unit 601 and memory 602.
- Memory 602 stores various programming modules and code that, when executed by central processing unit 601, control the operation of robot 100.
- Memory 602 may include random access memory, read-only memory (e.g., non-transitory memory for storing programming modules), and flash memory storage. Memory 602 may further include a removable memory media reader for enabling a user of robot 100 to provide removable memory media (e.g., a MicroSD card, an SD card, etc.) containing additional instructions and/or data files. Memory 602 is configured to store data files (e.g., audio data files, routine data files, etc.).
- Central processing unit 601 communicates data and electrical signals with various peripheral devices, including network interface 603, USB controller 604, auxiliary audio input 605, user inputs 606, audio amplifier 607, servo controller 608, and lighting elements 609. Any of the above-mentioned peripheral components, along with central processing unit 601 and memory 602, may be part of a system-on-chip integrated circuit or may be stand-alone circuit components. Power supply 610 provides operational power to robot 100 and all of the components of robot 100.
- Network interface 603 enables robot 100 to communicate with user device 615, either directly (e.g., through an ad-hoc network connection) or through network 616 (e.g., through the Internet or a local area network).
- User device 615 may be a computer, a laptop, a tablet computing device, a smart phone, a media player, or a PDA.
- User device 615 can communicate with robot 100 through system software loaded on user device 615 (e.g., a stand-alone program running on user device 615 such as a smart phone application, a system program loaded by accessing a system website, etc.).
- User device 615 may send robot 100 operating instructions and various data files.
- Network interface 603 further enables robot 100 to communicate with server 617 through network 616.
- Server 617 may include audio data files and/or dance routine data files.
- Network interface 603 may include a wireless transceiver and a wireless radio antenna.
- Network interface 603 may be able to communicate over multiple wireless frequencies and networking protocols at the same time (e.g., over an 802.11 WiFi connection and a Bluetooth connection at the same time).
- network interface 603 includes a separate WiFi transceiver and antenna (e.g., an 802.11 wireless transceiver and antenna) and a separate Bluetooth transceiver and antenna.
- Central processing unit 601 may receive audio and dance routine data files and store them in memory 602 for later playback and performance.
- Central processing unit 601 may also transmit data to user device 615 and/or server 617 (e.g., status reports, remaining memory 602 capacity, the identity of data files stored on memory 602, any detected errors associated with any peripheral devices, etc.). Central processing unit 601 may also receive system updates (e.g., an updated robot 100 operating system or programming modules).
- USB controller 604 is a version 1.0, 2.0, or 3.0 USB controller.
- USB controller 604 includes a female USB port on the exterior of the body of robot 100.
- The USB port may be a standard-size USB port (e.g., a mini-USB port, a micro-USB port, etc.) or a proprietary USB port.
- USB controller 604 is configured to establish a data connection between user device 615 and central processing unit 601 when user device 615 is plugged into USB controller 604 via a USB cable. Once a data connection is established between user device 615 and central processing unit 601, central processing unit 601 may receive audio and dance routine data files and store them in memory 602 for later playback and performance.
- Central processing unit 601 may also transmit data to user device 615 (e.g., status reports, remaining memory 602 capacity, the identity of data files stored on memory 602, any detected errors associated with any peripheral devices, etc.). Central processing unit 601 may also receive system updates (e.g., an updated robot 100 operating system or programming modules). In some arrangements, USB controller 604 is also configured to provide electrical power to robot 100 to charge power supply 610 when an external power source (e.g., grid power, an external battery pack, etc.) is connected to robot 100 through USB controller 604 and/or to provide operational power to robot 100.
- Auxiliary audio input 605 is configured to allow a user of robot 100 to provide an audio stream to central processing unit 601 for playback over speakers 620.
- Auxiliary audio input 605 may be a standard 3.5 mm stereo input.
- auxiliary audio input 605 is drawn as providing an audio stream to central processing unit, auxiliary audio input 605 may be directly connected to audio amplifier 607 for direct playback through speakers 620.
- User inputs 606 may include any number of buttons (e.g., spring-loaded push buttons), switches, and/or dials.
- User inputs 606 include at least a power switch (i.e., an on/off button), audio playback controls (e.g., a play button, a stop button, a pause button, a fast forward button, a rewind button, a next button, a previous button, a volume up button, and a volume down button), a fold button, and an unfold button.
- User inputs 606 may also include a remote control.
- the remote control may communicate with robot 100 through network interface 603 (e.g., over a Bluetooth connection), over an infrared communication link, over another type of wireless radio communication link, or through a wired connection with robot 100.
- The remote control may include any of the buttons, switches, and/or dials discussed above.
- Audio amplifier 607 amplifies low-power audio signals received from central processing unit 601 and/or auxiliary audio input 605 to a higher power level suitable for driving speakers 620.
- Audio amplifier 607 may be a 2.1 channel audio amplifier.
- Speakers 620 include speakers 131, 132, and 133. However, it should be appreciated that alternate configurations of robot 100 may include fewer speakers (e.g., two speakers or one speaker) or more speakers (e.g., four speakers, five speakers, etc.).
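As a sketch of what 2.1-channel routing could look like in software (the disclosure does not specify a crossover, so the 200 Hz cutoff, filter order, and sample rate below are assumptions), the left and right channels feed the two smaller speakers while a low-pass-filtered mono sum feeds the subwoofer.

```python
import numpy as np
from scipy.signal import butter, lfilter

fs = 44100  # sample rate (assumed)
t = np.linspace(0, 1.0, fs, endpoint=False)
# Stereo test signal: low-frequency content plus some treble.
left = np.sin(2 * np.pi * 100 * t) + 0.5 * np.sin(2 * np.pi * 1000 * t)
right = np.sin(2 * np.pi * 100 * t) + 0.5 * np.sin(2 * np.pi * 1500 * t)

# 2.1 routing sketch: left/right channels go to the two smaller speakers
# (e.g. speakers 132 and 133), while the subwoofer (e.g. speaker 131) gets a
# low-pass-filtered mono sum.
b, a = butter(4, 200, btype="low", fs=fs)   # 200 Hz crossover (assumed)
sub = lfilter(b, a, 0.5 * (left + right))

print(left.shape, right.shape, sub.shape)
```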
- Servo controller 608 is in communication with motors 630.
- Motors 630 include the above discussed motors associated with each joint of robot 100 and associated with wheels of feet 122 and 123. Accordingly, servo controller 608 can individually control an individual motor by sending instructions and/or routing power to the individual motor.
- Servo controller 608 sends and receives data signals to and from central processing unit 601.
- the data received by servo controller 608 generally relates to an instruction to turn an individual motor to a certain angular position at a designated angular speed. Accordingly, servo controller 608 parses the instruction and executes the instruction to effectuate the instructed motor motion.
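A bare-bones sketch of parsing and executing such an instruction is shown below; the text instruction format, the tick interval, and the simulated motion are all invented for illustration and do not reflect the actual servo controller protocol.

```python
import time

def parse_instruction(raw: str):
    """Parse an invented 'motor_id:target_deg@speed_dps' instruction string."""
    motor_id, rest = raw.split(":")
    target_deg, speed_dps = (float(x) for x in rest.split("@"))
    return motor_id, target_deg, speed_dps


def run_to_position(current_deg: float, target_deg: float, speed_dps: float,
                    tick_s: float = 0.02) -> float:
    """Step toward the target at the designated angular speed (simulation only;
    a real controller would drive the motor and read its encoder each tick)."""
    step = speed_dps * tick_s
    while abs(target_deg - current_deg) > step:
        current_deg += step if target_deg > current_deg else -step
        time.sleep(tick_s)
    return target_deg


motor, target, speed = parse_instruction("right_shoulder_S1:90@45")
print(motor, run_to_position(0.0, target, speed))
```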
- Each motor of robot 100 is a DC servo motor and may include a servo motor encoder.
- the servo motor encoder provides feedback relating to an angular position of the associated servo motor with respect to a reference position.
- Servo controller 608 may report the various angular positions of individual motors among motors 630 to central processing unit 601.
- Central processing unit 601 also controls the operation of lighting elements 609.
- Lighting elements 609 include any lighted features of robot 100 (e.g., lighting features 501, 502, 503, and 504).
- Central processing unit 601 is configured to activate and deactivate individual lighting features of lighting elements 609 in response to an executed program.
- For example, the eyes of robot 100 (e.g., lighting features 504) may light up independently of lighting elements on torso 102 of robot 100 (e.g., lighting features 501).
- Central processing unit 601 is configured to independently adjust the brightness and/or color of individual lighting features of lighting elements 609 in response to an executed routine program.
- Power supply 610 provides operational power to robot 100 and all components of robot 100.
- Power supply 610 may include a battery.
- the battery may be a rechargeable battery (e.g., a lithium-polymer battery).
- Power supply 610 may be charged through power routed from USB controller 604 when USB controller 604 is connected to an external power source (e.g., when connected to user device 615). Alternatively, power supply 610 may be charged through a separate power port located on robot 100.
- When an external power supply (e.g., grid power, an external battery pack, etc.) is connected, operational power for robot 100 may be provided by the external power supply while power supply 610 is receiving a charge.
- Robot 100 is shown according to exemplary embodiments.
- Central processing unit 601 controls robot 100.
- Central processing unit 601 may instruct servo controller 608 to move robot 100 in a predefined sequence. Additionally, central processing unit 601 may send audio signals to audio amplifier 607 for playback over speakers 620 during the predefined movement sequence. Accordingly, robot 100 may be configured to move according to a predefined choreography in synchronization with audio playing over speakers 620. Such movement during audio playback gives the appearance that robot 100 is dancing to the audio being emitted from speakers 620. Depending on the program executed by central processing unit 601, robot 100 may move arms 103 and 104, legs 105 and 106, and head 101.
- Central processing unit 601 may also move robot 100 by controlling motors associated with wheels of feet 122 and 123. Central processing unit 601 may also control lighting elements 609 for added visual effect during the audio playback and movement of robot 100.
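One simple way to keep such predefined movements in step with audio playback is to timestamp each movement keyframe against the audio timeline, as in the sketch below; the keyframe format, joint names, and angles are hypothetical.

```python
import time

# Hypothetical dance routine: (seconds from song start, joint, target angle in degrees).
ROUTINE = [
    (0.0, "head_H1", 30.0),
    (0.5, "right_shoulder_S2", 45.0),
    (1.0, "left_shoulder_S2", 45.0),
    (1.5, "head_H1", -30.0),
]

def perform(routine):
    """Dispatch each keyframe when the playback clock reaches its timestamp."""
    start = time.monotonic()
    for t_key, joint, angle in routine:
        while time.monotonic() - start < t_key:   # wait for the audio timeline
            time.sleep(0.005)
        print(f"{t_key:4.1f}s  move {joint} to {angle} deg")

perform(ROUTINE)
```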
- Robot 100 is shown progressing through folding steps from a humanoid robot to a speaker box according to an exemplary embodiment.
- Central processing unit 601 is configured to transform robot 100 from an unfolded position wherein robot 100 resembles a human (as shown in FIG. 1) to a folded position wherein robot 100 resembles a speaker box (as shown in FIG. 12).
- Central processing unit 601 executes the folding routine program or unfolding routine program stored in memory 602 when instructed to by the user of robot 100.
- The instruction to fold or unfold can be provided through user inputs 606 (e.g., a fold button, an unfold button, a transform button, etc.), through a user remote, or from a command received from user device 615.
- The instruction to fold or unfold may also be embedded in a dance routine. For example, when robot 100 is performing a dance routine, robot 100 may initially transform from the folded position to the unfolded position at the start of a song, and then retransform from the unfolded position to the folded position at the conclusion of the song.
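Continuing the hypothetical routine format sketched above, an embedded fold or unfold command can simply be another timestamped entry in the same timeline, e.g. an unfold at the start of the song and a fold at its end; the event names and song length are placeholders.

```python
SONG_LENGTH_S = 180.0  # placeholder song length

# Hypothetical routine with transform events embedded at the song boundaries.
ROUTINE_WITH_TRANSFORM = [
    (0.0, "transform", "unfold"),            # speaker box -> humanoid at song start
    (2.0, "head_H1", 30.0),
    (2.5, "right_shoulder_S2", 45.0),
    (SONG_LENGTH_S, "transform", "fold"),    # humanoid -> speaker box at song end
]

for t_key, target, value in ROUTINE_WITH_TRANSFORM:
    print(t_key, target, value)
```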
- Central processing unit 601 instructs arms 103 and 104 to fold in adjacent to torso 102 and speaker 131.
- Once arms 103 and 104 are in position, hands 116 and 117 rotate up and underneath speaker 131.
- Arms 103 and 104, along with hands 116 and 117, form a generally "U" shape around torso 102 and speaker 131.
- Arm 104 folds up against a front portion of torso 102 because arm center axis AC is offset from shoulder axis S1.
- This folding arrangement leaves space 902 for receiving leg 106.
- A similarly shaped space exists on the opposite side of torso 102 behind arm 103.
- In FIG. 8 through FIG. 10, the second stage of the folding process is shown according to an exemplary embodiment.
- Central processing unit 601 instructs legs 106 and 105 to begin rotating up towards torso 102 and behind arms 103 and 104.
- Legs 106 and 105 are folded up against torso 102.
- Leg 106 fits into space 902 against torso 102 and just behind arm 104.
- Feet 122 and 123 retract into legs 106 and 105, respectively.
- Robot 100 begins to take on the appearance of a speaker box instead of a humanoid robot.
- Central processing unit 601 instructs head 101 to fold into a cavity within torso 102.
- Chest plate 1301 pivots away from torso 102.
- When in the fully folded position (as shown in FIG. 12), robot 100 no longer looks like a human and instead resembles a speaker box.
- In the folded position, head 101 sits in a cavity of torso 102 above speaker 131.
- In one embodiment, robot 100 in the speaker form can have arms 103 and 104 folded disproportionately relative to the body of the robot, such that arms 103 and 104 are disposed towards the chest of torso 102.
- This configuration advantageously saves space for legs 105 and 106 to fold on the sides of torso 102. This configuration is aesthetically acceptable when robot 100 is in the speaker form.
- Typically, the axis of rotation of a shoulder joint is symmetrical with the shoulder or arm center axis to guarantee a proportionate design.
- In one embodiment, axis S1 of the shoulder joint of robot 100 is asymmetrical or offset with respect to arm center axis AC, and is disposed more towards the back of robot 100 compared to axis AC when in speaker form.
- In one embodiment, arms 103 and 104 rotate 180 degrees about axis S1 so that axis AC is more towards the back of robot 100 compared to axis S1, preventing a disproportionate human body structure that would be aesthetically unacceptable.
- FIGS. 1 and 12 show the difference in placement of arms 103 and 104 in the humanoid form and speaker form, respectively.
- Robot 100 is shown from a top perspective to further emphasize the offset between shoulder axis S1 and the center axes of arms 103 and 104.
- The offset nature of shoulder axis S1 enables the arms to rotate forward, saving space so that legs 105 and 106 can later be rotated up adjacent to torso 102.
- Robot 100 can transform from the folded position to the unfolded position by performing the above steps in the reverse order.
- Central processing unit 601 begins the process of transforming robot 100 from the folded position to the unfolded position when it receives a command from user inputs 606, a user remote control, or from user device 615.
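In terms of the fold-sequence sketch given earlier, performing the steps in reverse order amounts to replaying the fold stages backwards with each joint returning to its unfolded angle; the stage contents and angles below are again placeholders, not values from the disclosure.

```python
# Placeholder fold stages: each stage is a list of (joint, folded angle in degrees).
FOLD_STAGES = [
    [("right_shoulder_S1", 180.0), ("left_shoulder_S1", 180.0)],  # arms to the torso front
    [("right_hip", 170.0), ("left_hip", 170.0)],                  # legs behind the arms
]

UNFOLDED_ANGLE_DEG = 0.0  # placeholder humanoid-pose angle for every joint

def unfold_stages(fold_stages):
    """Replay the fold stages in reverse order (legs first, then arms),
    returning every joint to its unfolded angle."""
    return [[(joint, UNFOLDED_ANGLE_DEG) for joint, _ in stage]
            for stage in reversed(fold_stages)]

for stage in unfold_stages(FOLD_STAGES):
    print(stage)
```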
- the present disclosure contemplates methods, systems, and program products on any machine-readable media for accomplishing various operations.
- the embodiments of the present disclosure may be implemented using existing computer processors, or by a special purpose computer processor for an appropriate system, incorporated for this or another purpose, or by a hardwired system.
- Embodiments within the scope of the present disclosure include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon.
- Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor.
- machine-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor.
- When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired and wireless), any such connection is properly termed a machine-readable medium.
- Machine-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
- Toys (AREA)
Abstract
The invention relates to a robot that includes a torso section, a first arm, and a second arm. The first arm has a first arm center axis. The first arm is coupled to the torso through a first shoulder joint and is configured to pivot about a first axis and a second axis. The second axis is generally perpendicular to the first axis. The second arm has a second arm center axis and is coupled to the torso section through a second shoulder joint. The second arm is configured to pivot about the first axis and a third axis. The third axis is generally perpendicular to the first axis. The first arm center axis defines the center of the first arm, and the second arm center axis defines the center of the second arm. The first and second arm center axes are offset from the first axis.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IB2013/000956 WO2014162161A1 (fr) | 2013-04-01 | 2013-04-01 | Agencement d'épaule et de bras pour robot humanoïde |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IB2013/000956 WO2014162161A1 (fr) | 2013-04-01 | 2013-04-01 | Agencement d'épaule et de bras pour robot humanoïde |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014162161A1 (fr) | 2014-10-09 |
Family
ID=51657645
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2013/000956 WO2014162161A1 (fr) | 2013-04-01 | 2013-04-01 | Agencement d'épaule et de bras pour robot humanoïde |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2014162161A1 (fr) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0316709Y2 (fr) * | 1984-10-31 | 1991-04-10 | ||
JP2006247780A (ja) * | 2005-03-10 | 2006-09-21 | Advanced Telecommunication Research Institute International | コミュニケーションロボット |
JP2006289508A (ja) * | 2005-04-05 | 2006-10-26 | Sony Corp | ロボット装置及びその表情制御方法 |
JP2011000681A (ja) * | 2009-06-19 | 2011-01-06 | Advanced Telecommunication Research Institute International | コミュニケーションロボット |
JP2011140096A (ja) * | 2010-01-07 | 2011-07-21 | Kazunori Seki | 重心移動装置を有する2足歩行ロボット及び重心移動方法 |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017029263A1 (fr) * | 2015-08-14 | 2017-02-23 | KBee AG | Système robotique et partie boîtier correspondante |
US10625414B2 (en) | 2015-08-14 | 2020-04-21 | Franka Emika Gmbh | Robotic system and housing part for such robotic system |
US10843344B2 (en) | 2015-10-08 | 2020-11-24 | Sami Haddadin | Robot system |
US10981278B2 (en) | 2015-10-08 | 2021-04-20 | Kastanienbaum GmbH | Robot system |
US11040455B2 (en) | 2015-10-08 | 2021-06-22 | Haddadin Beteiligungs Ug | Robot system and method for controlling a robot system |
US11358275B2 (en) | 2016-04-20 | 2022-06-14 | Franka Emika Gmbh | Drive unit for a robot and method for manufacturing the same |
US11623355B2 (en) | 2016-04-20 | 2023-04-11 | Kastanienbaum GmbH | Method for producing a robot and device for carrying out said method |
CN108393906A (zh) * | 2018-03-01 | 2018-08-14 | 深圳普得技术有限公司 | 一种控制机器人实现机械律动方法及机器人 |
CN108393906B (zh) * | 2018-03-01 | 2021-04-27 | 深圳小墨智能科技有限公司 | 一种控制机器人实现机械律动方法及机器人 |
CN109840923A (zh) * | 2019-01-22 | 2019-06-04 | 绍兴文理学院 | 基于机器人舞蹈姿态镜像子图像获得方位特征的方法 |
CN109927044A (zh) * | 2019-02-20 | 2019-06-25 | 浙江机电职业技术学院 | 一种多关节机器人及其信息识别处理的使用方法 |
CN114432714A (zh) * | 2021-11-12 | 2022-05-06 | 株式会社万代 | 玩具部件和人形玩具 |
CN114432714B (zh) * | 2021-11-12 | 2024-03-08 | 株式会社万代 | 玩具部件和人形玩具 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014162161A1 (fr) | Agencement d'épaule et de bras pour robot humanoïde | |
US10001843B2 (en) | Modular sensing device implementing state machine gesture interpretation | |
JP2019510524A (ja) | 変更可能な特性を有するロボット | |
WO2014162162A1 (fr) | Système sonore pour un robot humanoïde | |
CN201163417Y (zh) | 具有人脸识别的智能机器人 | |
CN204856222U (zh) | 一种基于物联网技术的全向移动平台控制系统 | |
CN102416265A (zh) | 变形金刚机器人玩具及方法 | |
CN108283814A (zh) | 可编程全自动变形机器人 | |
CN110209309A (zh) | 信息处理装置及存储程序的计算机可读介质 | |
CN104908838B (zh) | 一种立方体结构的变形轮式人形机器人 | |
CN108098787A (zh) | 一种语音互动机器人结构及其系统 | |
CN104924802A (zh) | 残疾人头部控制自动翻书机 | |
US10004328B2 (en) | Movable table | |
CN207318974U (zh) | 电动床和电动床控制系统 | |
CN103801074A (zh) | 便携式终端机用智能玩具驱动系统 | |
CN108176056A (zh) | 一种变脸发声公仔 | |
CN205627058U (zh) | 一种智能招财猫机器人 | |
CN207950641U (zh) | 可编程全自动变形机器人 | |
CN205325698U (zh) | 一种基于安卓系统的智能迎宾机器人 | |
CN207745523U (zh) | 一种变脸发声公仔 | |
CN206980060U (zh) | 家庭式vr—xd座椅 | |
CN203909616U (zh) | 一种基于智能手机无线控制的机器人 | |
US12045088B2 (en) | Thin portable communication terminal, and control method and control program thereof | |
CN205969049U (zh) | 迎宾机器人 | |
US8955750B2 (en) | System and method for interactive mobile gaming |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13880963; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 13880963; Country of ref document: EP; Kind code of ref document: A1 |