WO2012141130A1 - Robot for a care recipient - Google Patents

Robot for a care recipient

Info

Publication number
WO2012141130A1
Authority
WO
WIPO (PCT)
Prior art keywords
state
robot
sensor
control unit
sleep
Prior art date
Application number
PCT/JP2012/059662
Other languages
English (en)
Japanese (ja)
Inventor
政芳 加納 (Masayoshi Kano)
Original Assignee
株式会社東郷製作所 (Togo Seisakusyo Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社東郷製作所 (Togo Seisakusyo Co., Ltd.)
Publication of WO2012141130A1

Links

Images

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63H - TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H13/00 - Toy figures with self-moving parts, with or without movement of the toy as a whole
    • A63H13/005 - Toy figures with self-moving parts, with or without movement of the toy as a whole with self-moving head or facial features
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63H - TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H3/00 - Dolls
    • A63H3/001 - Dolls simulating physiological processes, e.g. heartbeat, breathing or fever
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63H - TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H3/00 - Dolls
    • A63H3/003 - Dolls specially adapted for a particular function not connected with dolls
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
    • G09B23/281 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine for pregnancy, birth or obstetrics
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63H - TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H2200/00 - Computerized interactive toys, e.g. dolls

Definitions

  • The present invention relates to a robot for a care recipient, and particularly to a robot for a care recipient that simulates an infant.
  • Japanese Patent Application Laid-Open No. 2010-149276 discloses an animal-type robot. Based on sequentially supplied input information and the next input supplied, the robot determines its next operation from its current operation. This allows the robot to act autonomously like a pet and provide comfort to humans.
  • Japanese Patent Application Laid-Open No. 2002-018147 discloses an animal-type robot equipped with a response device.
  • Such a robot can comfort or soothe humans by simulating curious, pet-like behavior.
  • The care-recipient robot disclosed herein can simulate an infant.
  • The robot includes an external stimulus sensor that detects an external stimulus, an internal state sensor that detects an internal state of the robot, a state expression device that represents a facial expression of the robot, and a control unit that determines the internal state of the robot based on the detection signal from the external stimulus sensor and the detection signal from the internal state sensor and that operates the state expression device in accordance with the internal state of the robot.
  • The internal state of the robot includes a normal state representing a good mood, a desire state representing a worse mood than the normal state, and a sleep state representing sleeping.
  • The robot can simulate an infant by expressing facial expressions according to its internal state. The care recipient therefore becomes conscious of taking care of the robot; that is, the robot can assist the care recipient in discovering his or her own sense of worth and purpose in life.
  • The drawings include a partial enlarged view, a perspective view of an arm, a further partial enlarged view, a block diagram of the robot showing signal input and output, a diagram showing the states of the robot's software, schematic flowcharts of the normal state, the desire state, and the sleep state, a flowchart of the mode determination process of each state, and detailed flowcharts of the mode determination process in the normal state, the desire state, and the sleep state.
  • The care-recipient robot 1 of the present invention will be described with reference to the drawings.
  • The robot 1 has a head 10 and a torso (body portion) 20.
  • The robot 1 is wrapped in clothing similar to infant clothing, and has an appearance, shape, size, weight, and the like simulating an infant.
  • The robot 1 behaves like an infant.
  • A care recipient or the like who takes care of the robot 1 in a pseudo manner is referred to as a robot carer.
  • By taking care of the robot 1, the robot carer experiences a child-rearing sensation.
  • The robot carer becomes aware of taking care of the robot 1. That is, the robot 1 can support the care recipient or the like (robot carer) in discovering his or her own sense of worth and purpose in life.
  • The care recipient usually needs the assistance of a carer, but with respect to the robot 1, the care recipient is the robot carer.
  • The robot 1 has a control unit CU as shown in FIG.
  • The control unit CU includes a calculation unit C10 such as a CPU, a storage unit M10 that stores a control program, and a storage unit M20 that stores data such as crying sounds, laughing sounds, and sleeping (breathing) sounds.
  • The control unit CU takes in detection signals from the detection means Sxx and outputs drive signals to the actuators Axx.
  • The control unit CU can be connected to other devices such as a personal computer PC via a communication line or the like, so that the control program can be rewritten or its operation verified from the personal computer PC.
  • The storage means M10 and M20 include flash memory and can store the state from before the power source BT is charged or replaced. The robot 1 can thereby resume operation according to that state after the power source BT has been charged or replaced (a sketch of this behavior follows below).
  • The power source BT can be configured integrally with the pseudo diaper.
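  • The stored-state behavior can be pictured with the short Python sketch below. It is only an illustration of the idea: the file name, the JSON format, and the saved fields are assumptions, since the disclosure states only that flash memory preserves the state across charging or replacement of the power source BT.

```python
import json

STATE_FILE = "robot_state.json"  # hypothetical file in flash-backed storage

def save_state(state: str, fatigue: float) -> None:
    """Persist the current state before the power source BT is removed."""
    with open(STATE_FILE, "w") as f:
        json.dump({"state": state, "fatigue": fatigue}, f)

def load_state() -> dict:
    """Restore the stored state at start-up so operation resumes from it."""
    try:
        with open(STATE_FILE) as f:
            return json.load(f)
    except FileNotFoundError:
        # First boot: assume the normal state M3 with no accumulated fatigue.
        return {"state": "M3", "fatigue": 0.0}
```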
  • When the remaining battery level becomes low, the robot 1 enters the desire state and outputs a cry or the like from the speaker A22 or expresses a crying face.
  • In this way, the robot 1 requests connection of a charging cable so that the power source BT can be replaced or charged from a household power supply.
  • The crying face can be formed by operating the eyelid drive motor A11 and turning on or blinking the tear LEDs (A12).
  • The robot 1 may have a charging control circuit (not shown) to which a charging cable is connected in order to charge the power source BT.
  • The charging control circuit supplies power from the charging cable to the control unit CU and the actuators Axx when the charging cable is connected.
  • Otherwise, power is supplied from the power source BT to the control unit CU and the actuators Axx.
  • Optical sensors for detecting the brightness of light are provided on the forehead and the top of the head 10.
  • Pyroelectric sensors (infrared detection means) S12, which detect a robot carer from changes in the amount of infrared radiation, are provided at positions corresponding to the left and right ears of the head 10.
  • A touch sensor (contact detection means) S13 that detects that another object is in contact with the back of the head is provided at the back of the head 10.
  • FIG. 9 shows the face portion with the epidermis omitted.
  • Eyeballs 11 are provided at positions corresponding to the left and right eyes of the head 10.
  • The eyeballs 11 are connected to both ends of a shaft 11A and are rotated about a rotation axis X11 by the eyelid drive motor A11. The eyeballs 11 thereby rotate so that the eyelids open and close.
  • Tear LEDs (A12) representing tears are provided under the eyeballs 11 of the head 10.
  • Cheek LEDs (A14), which show the cheeks turning red, are also provided.
  • Lips 15 are provided at a position corresponding to the mouth of the head 10, and a lip drive motor A15 for opening and closing the lips 15 is provided on the back side of the lips 15. The periphery of the lips is covered with the epidermis, which deforms smoothly when the lips open and close.
  • A cam 15C is attached to the lip drive motor A15.
  • An eccentric shaft member 15S is provided at a position eccentric from the rotation axis X15A of the cam 15C.
  • The eccentric shaft member 15S is inserted through a slit provided in a lip driving member 15B.
  • When the cam 15C is rotated by the lip drive motor A15, the lip driving member 15B swings about the rotation axis X15, and the lips 15 move slowly and smoothly.
  • A neck turning motor A21 for turning the head 10 left and right is provided in the body portion 20.
  • A shaft 21A is connected to the neck turning motor A21, and a connection member 16B is connected so as to be rotatable about a rotation axis X16 perpendicular to the shaft 21A.
  • The head 10 is connected to the connection member 16B.
  • The shaft 21A is rotated by the neck turning motor A21, and the head 10 is thereby turned left and right.
  • A head swing motor A16 that swings the head 10 back and forth with respect to the neck is provided in the head 10.
  • A swing cam 16C is connected to the head swing motor A16, and an eccentric shaft member 16S is provided at an eccentric position of the swing cam 16C.
  • The eccentric shaft member 16S is fitted into a slit 21S formed at the tip of the shaft 21A.
  • The head swing motor A16 rotates the swing cam 16C, whereby the head 10 swings about the rotation axis X16 with respect to the shaft 21A.
  • The robot 1 can thus simulate the state of an infant whose neck is not yet steady. That is, if the robot carer does not support the head 10, the control unit CU may control the neck turning motor A21 and the head swing motor A16 so that the head 10 hangs down.
  • A camera (imaging means) S21 that detects the presence of a robot carer in front of the robot 1 from an image is provided on the upper portion of the body portion 20.
  • A microphone (voice detection means) S22 for detecting the presence and speech of the robot carer is also provided.
  • Touch sensors (contact detection means) S23 and S24 for detecting contact of an object are provided on the front and lower portions and on the back of the body portion 20. The touch sensors S23 and S24 can be attached at appropriate positions as needed.
  • An acceleration sensor (acceleration detection means) S25 for detecting whether the robot 1 is being shaken or rocked, and a temperature sensor (temperature detection means) S26 for detecting the temperature inside the body portion 20, are provided inside the body portion 20.
  • A power source BT is detachably attached to the body portion 20, and a battery sensor (power source detection means) SBT for detecting the remaining level of the power source BT is attached.
  • The body portion 20 may also be provided with a charge detection sensor (not shown) that is connected to the charging cable and detects that the power source BT is being charged.
  • A speaker (sound output device) A22 that outputs sound is provided on the upper portion of the body portion 20.
  • The body portion 20 accommodates the control unit CU.
  • The control unit CU receives detection signals from each input detection means, determines the state based on the detection signals, and operates the state expression device according to the internal state.
  • An arm 30 is provided on the body portion 20.
  • The right arm 30 shown in FIG. 12 and the left arm 30 (not shown) are formed in the same manner.
  • The arm 30 includes an upper arm 31B, a forearm 32, and a hand portion 33.
  • One end portion of the upper arm 31B is attached to the body portion 20.
  • An elbow rotating portion 31 is connected to the other end of the upper arm 31B, and one end of the forearm 32 is connected to the elbow rotating portion 31.
  • The hand portion 33 is connected to the other end of the forearm 32.
  • The upper arm 31B is provided with a turning motor A31 for turning the elbow rotating portion 31 about an axis X31 with respect to the upper arm 31B.
  • The forearm 32 is provided with a turning motor A32 for turning the forearm 32 about an axis Y32 with respect to the elbow rotating portion 31.
  • The hand portion 33 is provided with a turning motor A33 for turning the hand portion 33 about an axis Y33 with respect to the forearm 32.
  • A touch sensor S33 is provided in the central region of the tip of the hand portion 33 and can detect that the robot carer is touching the hand portion 33.
  • The latch mechanism disclosed in Japanese Patent Application Laid-Open No. 2007-301704 is provided at the joint about the axis Y32 shown in FIGS. 12 and 13.
  • The forearm 32 disengages from the elbow rotating portion 31 when a load equal to or greater than a predetermined value is applied.
  • The latch mechanism can thus form a dislocation state and prevent the drive mechanism from being damaged.
  • The latch mechanism is provided with a joint sensor (joint state detection means) S32 for detecting the dislocation state.
  • In the dislocation state, the robot 1 can express sadness or pain and can prompt the robot carer to reduce (reset) the joint.
  • When the joint is reduced, the robot 1 returns to the normal state and can express joy or the like.
  • The latch mechanism includes an engaging member 32K, an engaged member 32H, and a leaf spring 32D.
  • The engaging member 32K is provided in the elbow rotating portion 31 so as to be movable toward and away from the turning motor A32.
  • The engaged member 32H is formed with a recess 32G into which the engaging member 32K is inserted.
  • The leaf spring 32D biases the engaging member 32K toward the turning motor A32.
  • The joint sensor S32 has conductive terminals located at opposing positions of the engaged member 32H and the elbow rotating portion 31.
  • The two terminals contact and conduct in the reduced state, and are separated and insulated in the dislocation state.
  • The control unit CU can therefore determine the reduced state and the dislocation state based on the detection signal from the joint sensor S32 (a sketch of this test follows below).
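  • The conduction test reduces to a single boolean read. The sketch below shows the idea; the function names and the express() interface are assumptions, not language from the disclosure.

```python
def joint_state(terminals_conducting: bool) -> str:
    """Classify the Y32 joint from the S32 conduction test: the terminals
    touch and conduct in the reduced state, and separate (open circuit)
    in the dislocation state."""
    return "reduced" if terminals_conducting else "dislocated"

def react_to_joint(terminals_conducting: bool, express) -> None:
    # express(emotion) stands in for the state expression device Gj
    # (eyelid/lip motors, tear LEDs, speaker A22).
    if joint_state(terminals_conducting) == "dislocated":
        express("sadness")  # prompt the robot carer to reduce the joint
    else:
        express("joy")      # reduced again: back to the normal state
```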
  • The control unit CU is connected to an internal state sensor Gn, an external stimulus sensor Gg, and a state expression device Gj.
  • The internal state sensor Gn includes a timer (clock) ST that detects the current time, the battery sensor SBT, and the joint sensor S32, in order to detect the internal state of the robot 1.
  • The external stimulus sensor Gg includes the optical sensors S11 and S14, the pyroelectric sensors S12, the touch sensors S13, S23, S24, and S33, the camera S21, the microphone S22, and the acceleration sensor S25, in order to detect the external state from external stimuli.
  • The state expression device Gj includes motors such as the eyelid drive motor A11, LEDs such as the tear LEDs (A12), and the speaker A22, in order to represent the state of the robot 1.
  • The control unit CU determines the state of the robot 1 based on the external state and the internal state.
  • The state of the robot 1 includes an operation pause state M0, a sleep state M1, a desire state M2, and a normal state M3.
  • The control unit CU makes the robot 1 behave like an infant by operating the state expression device Gj according to the state.
  • The operation pause state M0 is a state in which the remaining battery level is low and the operation of the robot 1 is temporarily stopped.
  • The sleep state M1 is a state simulating that the infant is sleeping.
  • The desire state M2 is a state simulating that the infant's emotions are relatively unstable, for example crying.
  • The normal state M3 is a state simulating that the infant's emotions are relatively stable, for example being happy.
  • Each state can change to another state as shown in FIG. For example, from the operation pause state M0, when the remaining battery level increases due to charging or replacement of the power source BT, the robot can change to another state. The four states are sketched below.
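  • As a rough illustration, the four states can be modeled as below; the enum and its comments are an interpretive sketch, not part of the disclosure.

```python
from enum import Enum

class RobotState(Enum):
    OPERATION_PAUSE = "M0"  # battery nearly empty or overheated; operation stops
    SLEEP = "M1"            # simulates a sleeping infant, reduces battery use
    DESIRE = "M2"           # emotionally unstable, e.g. crying for care
    NORMAL = "M3"           # emotionally stable, e.g. happy

# Any state can change to another: e.g. OPERATION_PAUSE returns to an
# active state once the power source BT has been charged or replaced.
```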
  • In the normal state (FIG. 16), the control unit CU performs a person detection process S10A, a voice recognition process S10B, an emotion activation process S10C, an emotion expression process S10D, a mode determination process S10E, and a mode change process S10F.
  • In the person detection process S10A, the control unit CU determines the presence of a robot carer around the robot 1 based on detection signals from the pyroelectric sensors S12 and the camera S21.
  • In the voice recognition process S10B, the control unit CU recognizes the presence or absence of voice from the robot carer and the content of the voice based on the detection signal from the microphone S22 and the like.
  • In the emotion activation process S10C, the control unit CU determines an emotion based on the detection signals of the internal state sensor Gn and the external stimulus sensor Gg.
  • In the emotion expression process S10D, the control unit CU operates the state expression device Gj based on the emotion so as to express it.
  • In the mode determination process S10E, the control unit CU determines the state of the robot 1 based on the detection signals of the internal state sensor Gn and the external stimulus sensor Gg.
  • In the mode change process S10F, the control unit CU changes the state of the robot 1 to the determined state.
  • In the desire state (FIG. 17), the control unit CU performs a person detection process S20A, an emotion activation process S20C, an emotion expression process S20D, a mode determination process S20E, and a mode change process S20F.
  • In the sleep state (FIG. 18), the control unit CU performs a sleep process S30G, an awakening determination process S30H, a mode determination process S30E, and a mode change process S30F in that order.
  • In the sleep process S30G, the control unit CU controls the eyelid drive motor A11, the lip drive motor A15, the speaker A22, and the like, as shown in FIG.
  • In the awakening determination process S30H, the control unit CU determines whether the robot has been awakened based on the detection signals of the internal state sensor Gn and the external stimulus sensor Gg.
  • During sleep, the control unit CU periodically or randomly repeats first sleep breathing, which expresses shallow breathing, and second sleep breathing, which expresses deeper breathing than the first sleep breathing.
  • For the first sleep breathing, the control unit CU extracts a shallow sleep breath sound from the breath sounds stored in the storage means M20 and outputs it from the speaker A22.
  • For the second sleep breathing, the control unit CU extracts a deep sleep breath sound from the breath sounds and outputs it from the speaker A22.
  • The shallow breath sound is a sound whose respiration volume is smaller than that of the deep breath sound.
  • The storage means M20 stores breath sounds that differ in the depth of breathing, the duration of breathing, the volume of breathing, and the like.
  • The robot 1 can thus output appropriate breath sounds in the sleep state and can behave much like a sleeping infant. The robot 1 can therefore make the robot carer recognize its presence and give the robot carer a sense of symbiosis with and attachment to the robot 1. Putting an infant to sleep is a relatively difficult childcare task, so the robot carer obtains a sense of accomplishment from taking care of the robot 1 by listening to its breath sounds during sleep. That is, the robot 1 can support the robot carer in obtaining a child-rearing sensation (the breathing loop is sketched below).
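  • The alternation between the two breath sounds can be pictured as below. The playback interface, the clip names, and the pause duration are assumptions; the disclosure states only that shallow and deep sleep breathing repeat periodically or randomly.

```python
import itertools
import random
import time

# Hypothetical clips standing in for the breath sounds stored in M20.
BREATHS = ["breath_shallow.wav", "breath_deep.wav"]

def sleep_breathing(play_sound, is_sleeping, randomize=False):
    """Repeat the first (shallow) and second (deep) sleep breathing.

    play_sound(clip) is an assumed interface to the speaker A22;
    is_sleeping() should return False once the robot awakens.
    """
    periodic = itertools.cycle(BREATHS)
    while is_sleeping():
        clip = random.choice(BREATHS) if randomize else next(periodic)
        play_sound(clip)
        time.sleep(2.0)  # assumed pause between breaths
```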
  • The mode determination processes S10E, S20E, and S30E are executed by the control unit CU at predetermined intervals, for example every several tens to several hundreds of milliseconds, as shown in FIG.
  • In the mode determination process, it is first determined whether a pause condition is satisfied in states M1 to M3 (step S100). When the pause condition is satisfied (Yes), it is determined that the robot is in the operation pause state, and the process is terminated (step S600).
  • When the pause condition is not satisfied (No), it is determined whether a sleep condition is satisfied (step S200).
  • When the sleep condition is satisfied (Yes), it is determined that the robot is in the sleep state, and the process is terminated (step S601).
  • When the sleep condition is not satisfied (No), it is determined whether a desire condition is satisfied (step S300).
  • When the desire condition is satisfied (Yes), it is determined that the robot is in the desire state, and the process is terminated (step S602).
  • When the desire condition is not satisfied (No), it is determined that the robot is in the normal state, and the process is terminated (step S603). The priority order of these checks is sketched below.
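  • The fixed priority of the checks can be summarized in a few lines; the condition callables are placeholders for the threshold tests detailed below.

```python
def determine_mode(pause_cond, sleep_cond, desire_cond) -> str:
    """Mirrors steps S100/S600, S200/S601, S300/S602, and S603:
    the conditions are tested in fixed priority order."""
    if pause_cond():
        return "M0"  # operation pause state
    if sleep_cond():
        return "M1"  # sleep state
    if desire_cond():
        return "M2"  # desire state
    return "M3"      # normal state

# Intended to run every few tens to hundreds of milliseconds, e.g.:
# state = determine_mode(pause_cond, sleep_cond, desire_cond)
```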
  • In the mode determination process in the normal state, the control unit CU determines the play state flag M and the joy state flag Y as shown in FIG. 20 (step SA00), and obtains the fatigue degree parameter T (step SB00).
  • In step SA00, the control unit CU first determines whether the acceleration based on the detection signal of the acceleration sensor S25 is greater than or equal to a threshold Th6 (for example, 0.05 G) (step SA10). If the acceleration is greater than or equal to the threshold Th6 (Yes), the process proceeds to step SA22. If the acceleration is less than the threshold Th6 (No), the process proceeds to step SA12.
  • In step SA12, the control unit CU determines the presence or absence of voice input (talking) from the robot carer based on the detection signal from the microphone S22.
  • In the case of No in step SA12, it is determined whether optical sensor A is dark and optical sensor B is bright (step SA14). In the case of No in step SA14, it is determined whether any of the touch sensors S13 and S24 is in a non-contact state (step SA16).
  • In step SA16, when any of the touch sensors S13 and S24 is in a non-contact state (Yes), the control unit CU sets the addition value X to 1 as shown in FIG. 26 (step SA22).
  • Step SA22 is also performed when Yes is determined in steps SA10, SA12, and SA14. For example, if the robot carer is stroking the forehead or the top of the head of the robot 1, Yes is determined in step SA14.
  • In step SA16, when all of the touch sensors S13 and S24 are in contact (No), the control unit CU sets the addition value X to 0 as shown in FIG. 26 (step SA24).
  • The touch sensors S13 and S24 can be arranged so that all of them are in contact when the robot 1 is left lying on the floor; in that case, step SA24 is performed.
  • Next, the control unit CU multiplies the play parameter P (P[i-1]) by a coefficient (for example, 0.95) and adds the addition value X to obtain the latest value of the play parameter P (P[i]) (step SA30).
  • The control unit CU then determines whether the play parameter P is greater than or equal to a threshold Th4 (for example, 0.1) (step SA50). If the play parameter P is greater than or equal to the threshold Th4 (Yes), the control unit CU sets the play state flag M to 1 as shown in FIG. 26 and determines that the robot 1 is playing (step SA54). If the play parameter P is less than the threshold Th4 (No), the control unit CU clears the play state flag M to 0 and determines that the robot 1 is not playing (step SA52).
  • The control unit CU further determines whether the play parameter P is greater than or equal to a threshold Th5 (for example, 5.0) (step SA70). If the play parameter P is greater than or equal to the threshold Th5 (Yes), the control unit CU sets the joy state flag Y to 1, determines that the robot 1 is feeling joy (step SA74), and then proceeds to step SB00 (FIG. 20). If the play parameter P is less than the threshold Th5 (No), the control unit CU clears the joy state flag Y to 0, determines that the robot 1 is not feeling joy (step SA72), and then proceeds to step SB00 (FIG. 20). This update is sketched below.
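  • The play parameter behaves as a leaky integrator: each stimulated cycle adds 1, and the value decays by the factor 0.95 every cycle. A sketch using the example values above (the function name and packaging are assumptions):

```python
def update_play(p_prev: float, stimulated: bool,
                decay: float = 0.95, th4: float = 0.1, th5: float = 5.0):
    """Steps SA22/SA24, SA30, SA50, and SA70 in one pass.

    stimulated is True when any check in steps SA10-SA16 fires
    (acceleration >= Th6, voice input, stroking, or being picked up).
    """
    x = 1 if stimulated else 0   # addition value X (steps SA22/SA24)
    p = decay * p_prev + x       # P[i] = 0.95 * P[i-1] + X (step SA30)
    playing = p >= th4           # play state flag M (steps SA50-SA54)
    joyful = p >= th5            # joy state flag Y (steps SA70-SA74)
    return p, playing, joyful
```

With these values, sustained stimulation drives P toward 1 / (1 - 0.95) = 20, so the joy threshold of 5.0 is crossed only after several consecutive stimulated cycles, while P decays back below 0.1 only after roughly eighty unstimulated cycles.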
  • In step SB00 shown in FIG. 20, the control unit CU determines whether the current state is the normal state as shown in FIG. 27 (step SB10).
  • In the case of the normal state (Yes), the control unit CU determines whether the joy state flag Y is set (step SB20).
  • If the joy state flag Y is set (Yes), the control unit CU adds an addition value α1 (for example, 1.0) to the fatigue degree parameter T (T[i-1]) to obtain the latest fatigue degree parameter T (T[i]) (step SB51), and the process is terminated.
  • If it is determined in step SB20 that the joy state flag Y is cleared (No), the control unit CU adds an addition value α2 (for example, 0.5) to the fatigue degree parameter T (T[i-1]) to obtain the latest fatigue degree parameter T (T[i]) (step SB52), and the process is terminated.
  • If it is determined in step SB10 that the state is not the normal state, the control unit CU determines whether the state is the desire state (step SB30). In the case of the desire state (Yes), the control unit CU adds an addition value α3 (for example, 1.0) to the fatigue degree parameter T (T[i-1]) to obtain the latest fatigue degree parameter T (T[i]) (step SB53), and the process is terminated.
  • If it is determined in step SB30 that the state is not the desire state, the control unit CU determines whether the state is the sleep state (step SB40). In the case of the sleep state (Yes), the control unit CU subtracts a subtraction value β (for example, 0.1) from the fatigue degree parameter T (T[i-1]) to obtain the latest fatigue degree parameter T (T[i]) (step SB54), and the process is terminated. In the case of No in step SB40, the control unit CU simply terminates the process. These updates are sketched below.
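  • The fatigue bookkeeping of steps SB10 to SB54 amounts to a per-state increment or decrement. A sketch with the example values from the text (the symbol names mirror the reconstructed α1/α2/α3/β above, and clamping at zero is an added assumption):

```python
def update_fatigue(t_prev: float, state: str, joyful: bool) -> float:
    """Fatigue degree parameter T, per steps SB10-SB54."""
    if state == "M3":                              # normal state (step SB10)
        return t_prev + (1.0 if joyful else 0.5)   # α1 / α2 (SB51 / SB52)
    if state == "M2":                              # desire state (step SB30)
        return t_prev + 1.0                        # α3 (step SB53)
    if state == "M1":                              # sleep state (step SB40)
        return max(0.0, t_prev - 0.1)              # β: fatigue recovers (SB54)
    return t_prev                                  # otherwise unchanged
```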
  • In step S110 shown in FIG. 20, the control unit CU determines whether the remaining battery level is less than a threshold Th10 (for example, 10%) based on the detection signal from the battery sensor SBT. If it is less than the threshold Th10 (Yes), it is determined that the robot is in the operation pause state (step S600), and the process is terminated.
  • In the case of No in step S110, the control unit CU determines whether the internal temperature based on the detection signal of the temperature sensor S26 is higher than a threshold Th20 (for example, 40°C) (step S120). If it is higher than the threshold Th20 (Yes), it is determined that the robot is in the operation pause state (step S600), and the process is terminated.
  • In the case of No in step S120, the control unit CU determines whether the current time detected by the timer ST is night, for example from 21:00 to 6:00 (step S210). When it is determined to be night (Yes), it is determined that the robot is in the sleep state (step S601), and the process is terminated.
  • In the case of No in step S210, the control unit CU determines whether the remaining battery level is greater than or equal to the threshold Th10 (for example, 10%) and less than a threshold Th11 (for example, 20%) (step S220). If so (Yes), it is determined that the robot is in the sleep state (step S601), and the process is terminated.
  • In the case of No in step S220, the control unit CU determines whether the internal temperature is lower than or equal to the threshold Th20 (for example, 40°C) and higher than a threshold Th21 (for example, 35°C) (step S230). If so (Yes), it is determined that the robot is in the sleep state (step S601), and the process is terminated.
  • In the case of No in step S230, the control unit CU determines whether the fatigue degree indicated by the fatigue degree parameter T calculated in step SB00 is higher than a threshold Th31 (for example, a parameter value of 100) (step S240). If so (Yes), it is determined that the robot is in the sleep state (step S601), and the process is terminated.
  • In the case of No in step S240, the control unit CU determines whether the remaining battery level is greater than or equal to the threshold Th11 (for example, 20%) and less than a threshold Th12 (for example, 30%) and the power source is not being charged (step S310). Whether charging is in progress is determined based on the detection signal from the charge detection sensor. When Yes is determined in step S310, it is determined that the robot is in the desire state (step S602), and the process is terminated.
  • In the case of No in step S310, the control unit CU determines whether the internal temperature is higher than a threshold Th22 (for example, 32°C) and lower than the threshold Th21 (for example, 35°C) (step S320). When Yes is determined in step S320, it is determined that the robot is in the desire state (step S602), and the process is terminated.
  • In the case of No in step S320, the control unit CU determines whether the fatigue degree parameter T is less than or equal to the threshold Th31 (for example, a parameter value of 100) and the play state flag M is cleared (step S330). When Yes is determined in step S330, it is determined that the robot is in the desire state (step S602), and the process is terminated.
  • In the case of No in step S330, the control unit CU determines whether the joint has come off based on the detection signal of the joint sensor S32 (step S350). If the joint is dislocated (Yes), it is determined that the robot is in the desire state (step S602), and the process is terminated. If it is determined that the joint is not dislocated (No), it is determined that the robot is in the normal state (step S603), and the process is terminated. These desire conditions are gathered in the sketch below.
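  • Steps S310 to S350 can be gathered into one predicate. The example thresholds from the text are used; the function signature is an assumption.

```python
def desire_condition(battery_pct: float, charging: bool, temp_c: float,
                     fatigue: float, playing: bool, dislocated: bool) -> bool:
    """Desire condition of the normal-state flowchart (steps S310-S350)."""
    if 20 <= battery_pct < 30 and not charging:  # S310: Th11 <= level < Th12
        return True
    if 32 < temp_c < 35:                         # S320: Th22 < temp < Th21
        return True
    if fatigue <= 100 and not playing:           # S330: not tired yet, but bored
        return True
    return dislocated                            # S350: the joint has come off
```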
  • The mode determination process S20E in the desire state, shown in FIG. 17, follows the procedure of the flowchart shown in FIG. 21.
  • The processing procedure of FIG. 21 is the same as the processing procedure of the mode determination process in the normal state shown in FIG. 20.
  • The mode determination process S30E in the sleep state, shown in FIG. 18, follows the procedure of the flowchart shown in FIG. 22.
  • The processing procedure of FIG. 22 overlaps in large part with the processing procedure of the mode determination process in the normal state shown in FIG. 20.
  • The processing in steps SA00 to S210 shown in FIG. 22 is the same as the processing in steps SA00 to S210 shown in FIG. 20.
  • The process shown in FIG. 22 includes steps S224, S234, and S244 instead of steps S220, S230, and S240 in FIG. 20.
  • In step S224, the control unit CU determines whether the remaining battery level is greater than or equal to the threshold Th10 (for example, 10%) and less than a threshold Th13 (for example, 50%). When Yes is determined in step S224, it is determined that the robot remains in the sleep state (step S601), and the process is terminated.
  • In the case of No in step S224, the control unit CU determines whether the internal temperature is higher than a threshold Th23 (for example, 25°C) and lower than or equal to the threshold Th20 (for example, 40°C) (step S234). When Yes is determined in step S234, it is determined that the robot remains in the sleep state (step S601), and the process is terminated.
  • In the case of No in step S234, the control unit CU determines whether the fatigue degree indicated by the fatigue degree parameter T is higher than a threshold Th32 (for example, a parameter value of 50) (step S244). When Yes is determined in step S244, it is determined that the robot remains in the sleep state (step S601), and the process is terminated.
  • FIG. 23 shows the relationship between the remaining battery level, divided by the thresholds Th10 to Th13, and the states (operation pause state, sleep state, desire state, normal state) determined in the normal state (flowchart of FIG. 20), the desire state (flowchart of FIG. 21), and the sleep state (flowchart of FIG. 22).
  • When the remaining battery level is sufficiently high, both the sleep state and the desire state can transition to the normal state.
  • When the remaining battery level decreases, the state transitions to the desire state, and the robot 1 urges the carer to replace or charge the battery by crying or the like.
  • When the remaining battery level decreases further, the state transitions to the sleep state, and battery consumption is reduced.
  • When the remaining battery level decreases still further, the state transitions to the operation pause state. The battery bands are sketched below.
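  • Collecting the battery thresholds, the bands of FIG. 23 can be sketched as below. A sleeping robot stays asleep up to Th13 (50%) while an awake one falls asleep only below Th11 (20%), which acts as hysteresis; battery level is of course only one of the conditions (time of day, temperature, and fatigue are checked too), so this sketch isolates the battery bands alone.

```python
def state_from_battery(level_pct: float, sleeping_now: bool) -> str:
    """Battery bands per thresholds Th10-Th13 (10 / 20 / 30 / 50 %)."""
    if level_pct < 10:                        # below Th10: operation pause
        return "M0"
    sleep_upper = 50 if sleeping_now else 20  # Th13 keeps a sleeping robot asleep
    if level_pct < sleep_upper:               # sleep band
        return "M1"
    if level_pct < 30:                        # [Th11, Th12): ask for charging
        return "M2"
    return "M3"                               # at or above Th12: normal
```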
  • FIG. 24 shows the relationship between the internal temperature, divided by the thresholds Th20 to Th23, and the states (operation pause state, sleep state, desire state, normal state) determined in the normal state (flowchart of FIG. 20), the desire state (flowchart of FIG. 21), and the sleep state (flowchart of FIG. 22).
  • From the normal state and the desire state, the robot can transition to (or remain in) the normal state, for example, when the internal temperature is around normal room temperature or lower.
  • When the internal temperature rises, the state transitions to the desire state, and the robot 1 performs actions such as crying to prompt the carer to lower the temperature.
  • When the temperature rises further, the state transitions to the sleep state, and when the temperature rises still further, the state transitions to the operation pause state.
  • FIG. 25 shows the relationship between the fatigue degree, divided by the thresholds Th31 and Th32, and the states (operation pause state, sleep state, desire state, normal state) determined in the normal state (flowchart of FIG. 20), the desire state (flowchart of FIG. 21), and the sleep state (flowchart of FIG. 22).
  • The state transitions according to the fatigue degree and according to whether or not the robot is playing.
  • When the arm is dislocated and the dislocation state arises, it is determined in step S350 of FIG. 20 that the joint has come off, the desire state is determined in step S602, and the state changes from the normal state to the desire state. When the desire state is determined, pain, sadness, and the like are activated in step S20C of FIG. 17 and are expressed in step S20D.
  • For example, a sad expression is made by the eyelid drive motor A11, the lip drive motor A15, and the tear LEDs (A12), and a cry or a sound indicating that the arm hurts is output from the speaker A22.
  • When the arm joint is reduced to its original state, the normal state is determined, and pleasure is activated in step S10C of FIG. 16 and expressed in step S10D.
  • For example, a pleased expression is made by the eyelid drive motor A11, the lip drive motor A15, and the cheek LEDs (A14), and a sound of pleasure is output from the speaker A22.
  • As described above, the robot 1 can express a sleeping state while simulating an infant. As a result, the robot carer is led by the robot 1 to feel a child-rearing sensation. In this way, the robot 1 can assist the robot carer in discovering his or her own sense of worth and purpose in life.
  • The robot 1 can simulate an infant as shown in FIG.
  • The robot 1 includes an external stimulus sensor Gg that detects an external stimulus, an internal state sensor Gn that detects an internal state of the robot 1, a state expression device Gj that represents the facial expression of the robot 1, and a control unit CU that determines the internal state of the robot 1 based on the detection signal from the external stimulus sensor Gg and the detection signal from the internal state sensor Gn and that operates the state expression device Gj according to the internal state of the robot 1. The robot 1 can therefore simulate an infant by expressing a facial expression according to its internal state.
  • The internal state of the robot 1 includes a normal state representing a good mood, a desire state representing a worse mood than the normal state, and a sleep state representing sleeping.
  • The robot 1 can simulate an infant by expressing expressions according to its internal state. The care recipient therefore becomes conscious of taking care of the robot 1; that is, the robot 1 can support the care recipient in discovering his or her own sense of worth and purpose in life. The robot carer also tries to give the robot 1 a sense of security by touching it as gently as possible in order to put it into the sleep state. The robot carer can thereby have the consciousness of taking good care of the robot 1.
  • A robot carer such as an elderly person can take care of the robot 1 in a pseudo manner and can thereby feel useful to the robot 1.
  • The robot 1 can thus promote a child-rearing sensation in the mind of the care recipient or the like (robot carer), and can assist the care recipient or the like (robot carer) in discovering his or her own worth and purpose in life.
  • Care recipients (robot carers) include not only elderly people who need care but also people who are not elderly but are being cared for.
  • The state expression device Gj has an audio output device (the speaker A22) as shown in FIG.
  • When the control unit CU determines that the robot is in the sleep state, it controls the audio output device so that a shallow breath sound and a deep breath sound are repeated periodically or randomly as breathing during sleep. The robot 1 can therefore express more clearly that it is asleep, and can express that it is sleeping peacefully owing to the care given by the care recipient. The robot 1 can thereby assist the care recipient or the like (robot carer) in discovering his or her own sense of worth and purpose in life.
  • The internal state sensor Gn has a clock (the timer ST) as shown in FIG.
  • The control unit CU determines the sleep state when it determines, based on the signal from the clock, that it is nighttime. The robot 1 therefore goes to sleep at night instead of crying at night while simulating an infant. The robot 1 can thus reduce the stress of the care recipient and support the care recipient in continuing care.
  • The robot 1 includes a movable arm 30 as shown in FIGS. 12 and 14, a joint (about the axis Y32) that is provided on the arm 30, that dislocates when a force exceeding an allowable amount is applied, and that can be reduced (reset), a joint sensor S32 that detects the dislocation state and the reduced state of the joint, and the control unit CU, which operates the state expression device Gj so as to represent a bad mood when the dislocation state is determined based on the detection signal from the joint sensor S32.
  • In the dislocation state, the robot 1 requests reduction, for example by crying. The robot 1 can thereby actively request care from the care recipient. Crying can be expressed by outputting a voice or by turning on the tear LEDs (A12).
  • When the joint is reduced, the robot 1 makes an expression showing that it feels better, for example by expressing joy or smiling to show satisfaction. This helps the care recipient learn what kind of care is appropriate.
  • The joint sensor S32 may have the configuration shown in FIG. 13 or another configuration. For example, it may have an elastic body and a strain sensor attached to the elastic body; the strain sensor detects the deformation amount or torque of the elastic body, and the control unit CU may determine whether the joint is dislocated based on the detection signal of the strain sensor.
  • The joint sensor S32 and the latch mechanism may be provided at the joint for turning the forearm 32 about the axis Y32 as shown in FIGS. 12 and 13, or may be provided at another joint.
  • In each of the above determinations, "greater than or equal to" may be changed to "greater than", "greater than" to "greater than or equal to", "less than or equal to" to "less than", and "less than" to "less than or equal to".
  • The numerical values used for the determinations may be changed to other appropriate values.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Educational Technology (AREA)
  • Algebra (AREA)
  • Computational Mathematics (AREA)
  • Medical Informatics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Educational Administration (AREA)
  • Medicinal Chemistry (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Cardiology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physiology (AREA)
  • Pulmonology (AREA)
  • Gynecology & Obstetrics (AREA)
  • Pregnancy & Childbirth (AREA)
  • Reproductive Health (AREA)
  • Manipulator (AREA)
  • Toys (AREA)

Abstract

The present invention relates to a robot for a care recipient, which robot simulates an infant under care. This robot comprises: an external stimulus sensor that detects external stimuli; an internal state sensor that detects the internal state of the robot; a state expression device that represents the robot's expression; and a control unit that determines the state of the robot based on a detection signal from the external stimulus sensor and a detection signal from the internal state sensor, and that operates the state expression device in response to that state. The internal state of the robot includes a normal state representing a good mood, a desire state representing a worse mood than the normal state, and a sleep state representing sleep.
PCT/JP2012/059662 2011-04-11 2012-04-09 Robot for a care recipient WO2012141130A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011087562A JP2012220783A (ja) 2011-04-11 2011-04-11 Robot device for a care recipient
JP2011-087562 2011-04-11

Publications (1)

Publication Number Publication Date
WO2012141130A1 (fr) 2012-10-18

Family

ID=47009303

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/059662 WO2012141130A1 (fr) 2012-04-09 Robot for a care recipient

Country Status (2)

Country Link
JP (1) JP2012220783A (fr)
WO (1) WO2012141130A1 (fr)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160346917A1 (en) * 2015-05-29 2016-12-01 Hon Hai Precision Industry Co., Ltd. Interactive robot responding to human physical touches in manner of baby
US9597805B2 (en) 2014-03-28 2017-03-21 Nathaniel Bender Care apparatus
CN107053191A (zh) * 2016-12-31 2017-08-18 华为技术有限公司 Robot, server, and human-machine interaction method
WO2018181640A1 (fr) * 2017-03-29 2018-10-04 Groove X株式会社 Joint structure suitable for a robot joint
WO2019220899A1 (fr) * 2018-05-16 2019-11-21 富士フイルム株式会社 Biological information acquisition system, electronic device, biological information acquisition method, and biological information acquisition program
US20210046392A1 (en) * 2019-07-08 2021-02-18 Ripple Effects, Inc. Dynamic and variable controlled information system and methods for monitoring and adjusting behavior
JP2022142110A (ja) 2021-03-16 2022-09-30 カシオ計算機株式会社 Device control apparatus, device control method, and program
JP7556379B2 (ja) 2022-09-26 2024-09-26 カシオ計算機株式会社 Robot, robot control method, and program

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE112017003497B4 (de) 2016-07-11 2020-12-03 Groove X, Inc. Autonomously acting robot with a controlled amount of activity
CN108319168B (zh) * 2018-01-22 2021-03-23 五邑大学 Machine-perception-based intelligent robot and system therefor
JP7169029B1 (ja) * 2022-04-28 2022-11-10 ヴイストン株式会社 Baby-type interactive robot, baby-type interaction method, and baby-type interaction program

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004344417A (ja) * 2003-05-22 2004-12-09 Tomy Co Ltd Doll toy
JP3603148B2 (ja) * 1997-12-08 2004-12-22 リアリティワークス インコーポレーテッド (Realityworks, Inc.) Infant simulator

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3603148B2 (ja) * 1997-12-08 2004-12-22 リアリティワークス インコーポレーテッド (Realityworks, Inc.) Infant simulator
JP2004344417A (ja) * 2003-05-22 2004-12-09 Tomy Co Ltd Doll toy

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
MASAYOSHI KANO ET AL.: "Developing a Robot Babyloid That Cannot Do Anything", JOURNAL OF THE ROBOTICS SOCIETY OF JAPAN, vol. 29, no. 3, 15 April 2011 (2011-04-15), pages 76 - 83 *
MASAYOSHI KANO ET AL.: "Interaction Design based on Embodiment of Baby Doll Robot Babyloid and Kansei of Human", HAI SYMPOSIUM 2009 PROGRAM JIKKO IINKAI, 2009, Retrieved from the Internet <URL:http://www.ii. is.kit.ac.jp/hai2011/proceedings/HAI2009/pdf/ lb-4.pdf> [retrieved on 20120615] *

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9597805B2 (en) 2014-03-28 2017-03-21 Nathaniel Bender Care apparatus
US20160346917A1 (en) * 2015-05-29 2016-12-01 Hon Hai Precision Industry Co., Ltd. Interactive robot responding to human physical touches in manner of baby
CN107053191A (zh) * 2016-12-31 2017-08-18 华为技术有限公司 Robot, server, and human-machine interaction method
WO2018121624A1 (fr) * 2016-12-31 2018-07-05 华为技术有限公司 Robot, server, and human-machine interaction method
US11858118B2 (en) 2016-12-31 2024-01-02 Huawei Technologies Co., Ltd. Robot, server, and human-machine interaction method
US11519456B2 (en) 2017-03-29 2022-12-06 Groove X, Inc. Joint structure appropriate for robot joint
WO2018181640A1 (fr) * 2017-03-29 2018-10-04 Groove X株式会社 Joint structure suitable for a robot joint
JPWO2018181640A1 (ja) * 2017-03-29 2019-06-27 Groove X株式会社 Joint structure suitable for robot joints
WO2019220899A1 (fr) * 2018-05-16 2019-11-21 富士フイルム株式会社 Biological information acquisition system, electronic device, biological information acquisition method, and biological information acquisition program
JPWO2019220899A1 (ja) * 2018-05-16 2021-07-29 富士フイルム株式会社 Biological information acquisition system, electronic device, biological information acquisition method, and biological information acquisition program
US20210046392A1 (en) * 2019-07-08 2021-02-18 Ripple Effects, Inc. Dynamic and variable controlled information system and methods for monitoring and adjusting behavior
US11980825B2 (en) * 2019-07-08 2024-05-14 Ripple Effects, Inc. Dynamic and variable controlled information system and methods for monitoring and adjusting behavior
JP2022142110A (ja) 2021-03-16 2022-09-30 カシオ計算機株式会社 Device control apparatus, device control method, and program
JP7287411B2 (ja) 2021-03-16 2023-06-06 カシオ計算機株式会社 Device control apparatus, device control method, and program
JP7556379B2 (ja) 2022-09-26 2024-09-26 カシオ計算機株式会社 Robot, robot control method, and program

Also Published As

Publication number Publication date
JP2012220783A (ja) 2012-11-12

Similar Documents

Publication Publication Date Title
WO2012141130A1 (fr) Robot for a care recipient
CN109526208B (zh) 活动量受控制的行为自主型机器人
Meltzoff Elements of a developmental theory of imitation
Marti et al. Socially assistive robotics in the treatment of behavioural and psychological symptoms of dementia
Wada et al. Development and preliminary evaluation of a caregiver's manual for robot therapy using the therapeutic seal robot Paro
JP2018518328A (ja) 幼児の情動状態の監視、幼児の情動状態に関するデータの遠隔集約、及び幼児に関連する生理学的測定値の確定
WO2017199662A1 (fr) Autonomously acting robot and computer program
US10223497B2 (en) Infant learning receptivity detection system
Carrillo et al. Everyday technologies for Alzheimer's disease care: Research findings, directions, and challenges
JP2018519612A (ja) 着用可能幼児監視装置、幼児の健康監視、幼児のための環境条件の適性を監視するシステム、幼児の向きを判断するシステム、養護者監視装置、幼児の動きの判定、及び発達年齢に基づいてカスタマイズされた幼児向け学習コンテンツの提示
Tulsulkar et al. Can a humanoid social robot stimulate the interactivity of cognitively impaired elderly? A thorough study based on computer vision methods
CN108187210B (zh) 智能渲染虚拟现实调适睡眠情绪的方法、装置和系统
CN108012560B (zh) 智能婴幼儿监测系统和婴幼儿监测中心及婴幼儿学习接受度检测系统
JP2018517995A (ja) 集約幼児測定データの分析、幼児睡眠パターンの予想、幼児データに関連した観察に基づく幼児モデルの導出、推測を用いた幼児モデルの作成、及び幼児発達モデルの導出
Lancioni et al. Assistive technology for behavioral interventions for persons with severe/profound multiple disabilities: A selective overview
US20220299999A1 (en) Device control apparatus, device control method, and recording medium
Khosla et al. Enhancing emotional well being of elderly using assistive social robots in Australia
CN107924643B (zh) 婴幼儿发育分析方法及系统
US20200214613A1 (en) Apparatus, method and computer program for identifying an obsessive compulsive disorder event
US20220297307A1 (en) Device control apparatus, device control method, and recording medium
WO2021164700A1 (fr) Therapeutic robot for facilitating training and therapy for elderly people
JP2022074047A (ja) Interactive reminder companion
Maroto-Gómez et al. Bio-inspired Cognitive Decision-making to Personalize the Interaction and the Selection of Exercises of Social Assistive Robots in Elderly Care
JPWO2020111190A1 (ja) Robot provided with an operation device
CN104523240A (zh) 信息提示方法和电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12771094

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12771094

Country of ref document: EP

Kind code of ref document: A1