US20150165625A1 - Robot - Google Patents
- Publication number
- US20150165625A1 (application US14/568,846)
- Authority
- US
- United States
- Prior art keywords
- robot
- feedback
- touch sensor
- expressive
- emotionally
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40253—Soft arm robot, light, rubber, very compliant
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40414—Man robot interface, exchange of information between operator and robot
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40625—Tactile sensor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/004—Artificial life, i.e. computing arrangements simulating life
- G06N3/008—Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S901/00—Robots
- Y10S901/46—Sensing device
- Y10S901/47—Optical
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S901/00—Robots
- Y10S901/50—Miscellaneous
Description
- This application claims the benefit under 35 U.S.C. § 119(e) of co-pending U.S. Provisional Patent Application No. 61/915,253, entitled ROBOT, filed Dec. 12, 2013, which is incorporated by reference herein in its entirety.
- The present disclosure relates to robots having interactive interfaces and systems and methods for using and creating such robots.
- In at least one embodiment, the present disclosure provides a mechanism and/or system for an interactive robot to detect and infer differences between various kinds of touch.
- In at least one embodiment, the present disclosure provides a mechanism and/or system for an interactive robot to generate appropriate affective responses to detected touch inputs.
- The foregoing discussion is intended only to illustrate various aspects of certain embodiments disclosed in the present disclosure, and should not be taken as a disavowal of claim scope.
- Various features of the embodiments described herein are set forth with particularity in the appended claims. The various embodiments, however, both as to organization and methods of operation, together with the advantages thereof, may be understood in accordance with the following description taken in conjunction with the accompanying drawings as follows:
- FIG. 1 is a perspective view of a robot, according to various embodiments of the present disclosure.
- FIG. 2 is an elevation view of the robot of FIG. 1, according to various embodiments of the present disclosure.
- FIG. 3 is a plan view of the robot of FIG. 1, according to various embodiments of the present disclosure.
- FIG. 4 is an elevation view of the robot of FIG. 1 with various elements removed and various elements shown in transparency for illustrative purposes, depicting an emotion-expressing system, according to various embodiments of the present disclosure.
- FIG. 5 is a schematic depicting a control system for the robot emotion-expressing system of FIG. 4, according to various embodiments of the present disclosure.
- FIG. 6 is an emotional state graph, according to various embodiments of the present disclosure.
- The exemplifications set out herein illustrate various embodiments, in one form, and such exemplifications are not to be construed as limiting the scope of the appended claims in any manner.
- Numerous specific details are set forth to provide a thorough understanding of the overall structure, function, manufacture, and use of the embodiments as described in the specification and illustrated in the accompanying drawings. It will be understood by those skilled in the art, however, that the embodiments may be practiced without such specific details. In other instances, well-known operations, components, and elements have not been described in detail so as not to obscure the embodiments described in the specification. Those of ordinary skill in the art will understand that the embodiments described and illustrated herein are non-limiting examples, and thus it can be appreciated that the specific structural and functional details disclosed herein may be representative and illustrative. Variations and changes thereto may be made without departing from the scope of the claims.
- The terms “comprise” (and any form of comprise, such as “comprises” and “comprising”), “have” (and any form of have, such as “has” and “having”), “include” (and any form of include, such as “includes” and “including”) and “contain” (and any form of contain, such as “contains” and “containing”) are open-ended linking verbs. As a result, a system, device, or apparatus that “comprises,” “has,” “includes” or “contains” one or more elements possesses those one or more elements, but is not limited to possessing only those one or more elements. Likewise, an element of a system, device, or apparatus that “comprises,” “has,” “includes” or “contains” one or more features possesses those one or more features, but is not limited to possessing only those one or more features.
- The present disclosure relates to a novel and unique robot. In various instances, the present disclosure relates to an emotionally-expressive and/or communicative robot. In certain instances, the present disclosure relates to a robot that is touch-sensitive, and can communicate emotions and/or mood with feedback generators. For example, the present disclosure describes an interactive robotic interface that can detect the direction and pressure of touch on the robot's body, and can respond to the nature of this touch through the generation of light, sound, and/or movement.
- Referring to FIGS. 1-4, a robot 10 is depicted. The robot 10 includes a body 12, and can include additional features and/or elements supported on and/or extending from the body 12. A transparent or semi-transparent shell 18 can be positioned around at least a portion of the body 12. Referring primarily to FIGS. 1-3, the robot 10 can include eyes 16, which are supported on the shell 18. The body 12 of the robot 10 depicted in FIGS. 1-4 defines a dome-shaped body. The body 12 can be deformable. For example, the dome-shaped body 12 can be comprised of a rubber and/or rubber-like material, such as silicone rubber, for example, which can be configured to deform in response to external forces and/or touches, for example.
- The reader will further appreciate that the robot 10 can comprise various different shapes and/or styles. For example, the robot 10 can comprise a toy, such as the robotic toys disclosed in U.S. Design Pat. No. D714,881, entitled ROBOT, which issued on Oct. 7, 2014; U.S. Design Pat. No. D714,883, entitled ROBOT, which issued on Oct. 7, 2014; and U.S. Design Pat. No. D714,888, entitled ROBOT, which issued on Oct. 7, 2014, which are hereby incorporated by reference herein in their respective entireties. In various instances, the robot 10 can include additional features, such as additional facial features and/or body parts. Additionally or alternatively, the robot 10 can include various colors and/or designs. Moreover, the robot 10 can include additional control mechanisms, such as the various actuation systems disclosed in contemporaneously-filed U.S. patent application Ser. No. ______, entitled ROBOT, corresponding to Attorney Docket No. 130499, which is hereby incorporated by reference herein in its entirety.
- Referring primarily to FIG. 4, the robot 10 includes an affective or emotion-expressing system 20. In various instances, the emotion-expressing system 20 can be at least partially embedded and/or encased within the body 12. The emotion-expressing system 20 depicted in FIG. 4 includes a touch sensor 22, which is positioned in the center of the body 12. In various instances, the emotion-expressing system 20 can include a plurality of touch sensors 22. The touch sensor 22 is configured to detect the pressure, the location, and/or the direction, i.e., angle, of externally-applied forces. For example, the touch sensor(s) 22 can be embedded within the body 12, and can detect forces on various external surfaces of the body 12 and/or the robot 10.
- In at least one embodiment, the touch sensor 22 can be implemented with an OptoForce sensor. For example, the touch sensor 22 can be an optical sensor, as described in International Patent Application Publication No. WO 2013/072712 A1, entitled SENSOR DEVICE, filed on Nov. 16, 2012, which is hereby incorporated by reference herein in its entirety. The touch sensor 22 can detect the relative movement of LEDs and/or photosensors embedded and arranged in a cavity defined in a rubber body.
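- The publication does not specify the sensor's data format, but a three-axis optical force sensor of this kind is commonly read as a force vector (Fx, Fy, Fz). A minimal C++ sketch, under that assumption, of deriving the pressure magnitude and direction angles that the emotion-expressing system consumes (the struct and example values are hypothetical):

```cpp
#include <cmath>
#include <cstdio>

// Hypothetical reading from a three-axis optical force sensor; units are
// arbitrary sensor counts.
struct ForceSample {
    double fx, fy, fz;
};

// Pressure magnitude: the Euclidean norm of the force vector.
double magnitude(const ForceSample& f) {
    return std::sqrt(f.fx * f.fx + f.fy * f.fy + f.fz * f.fz);
}

// Direction of the applied force: azimuth around the dome and inclination
// from vertical, both in radians.
void direction(const ForceSample& f, double& azimuth, double& inclination) {
    azimuth = std::atan2(f.fy, f.fx);
    inclination = std::acos(f.fz / magnitude(f));
}

int main() {
    ForceSample poke{0.2, 0.1, 3.5};  // a mostly-downward, abrupt push
    double az, inc;
    direction(poke, az, inc);
    std::printf("|F| = %.2f, azimuth = %.2f rad, inclination = %.2f rad\n",
                magnitude(poke), az, inc);
}
```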
- In various instances, an emotion-expressing system can include feedback generators, which can be configured to emit visual, tactile, and/or auditory feedback, for example, based on the force(s) detected by the touch sensor(s) 22. For example, an emotion-expressing system can include at least one light, at least one speaker, and/or at least one actuator. Referring again to the affective system 20 depicted in FIG. 4, the system 20 includes a plurality of lights 24, a speaker 26, and an actuator 28, which can provide multimodal feedback to interactants, e.g., people who interact with the robot 10. The speaker 26 can be positioned on the body 12, such as on the bottom and/or underside of the body 12, for example.
- In various instances, the lights 24 can be arranged on the body 12. For example, an array of lights can be embedded below the surface and/or skin of the body 12. As depicted in FIGS. 1-3, the lights 24 can be arranged in a plurality of columns and/or lines. For example, the lights 24 can be arranged in a plurality of columns extending downward from the top of the dome-shaped body 12. A single light 24 can be positioned at the top of the dome-shaped body 12. In such instances, the lights 24 can form a star-shaped arrangement when viewed from the top (see FIG. 3). The lights 24 can be symmetrically arranged around the body 12, for example. In certain instances, the lights 24 can be arranged in at least one cluster and/or can be randomly positioned around the body 12. In various instances, the lights 24 can comprise light-emitting diodes (LEDs), for example. In certain instances, the lights 24 can comprise addressable color-controllable LEDs, for example. In at least one embodiment, the lights 24 can be implemented with WS2812B LEDs.
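- The mapping between each addressable LED's chain index and its physical location on the dome is not given; a minimal sketch of one plausible layout — a single apex pixel plus columns radiating downward, consistent with the star shape of FIG. 3 — might look like the following (the column and pixel counts are assumptions):

```cpp
#include <cstdio>

// Assumed layout: pixel 0 at the apex, then NUM_COLUMNS columns of
// LEDS_PER_COLUMN pixels each, running down the dome. Both counts are
// hypothetical; the publication only describes columns plus an apex light.
constexpr int NUM_COLUMNS = 8;
constexpr int LEDS_PER_COLUMN = 5;
constexpr double kPi = 3.14159265358979323846;

struct DomePosition {
    double azimuth;    // angle around the dome, in radians
    double elevation;  // 1.0 at the apex, approaching 0.0 at the rim
};

DomePosition locate(int pixelIndex) {
    if (pixelIndex == 0) return {0.0, 1.0};            // the single apex light
    int i = pixelIndex - 1;
    int column = i / LEDS_PER_COLUMN;                  // which arm of the star
    int row = i % LEDS_PER_COLUMN;                     // 0 = nearest the apex
    double azimuth = 2.0 * kPi * column / NUM_COLUMNS;
    double elevation = 1.0 - double(row + 1) / (LEDS_PER_COLUMN + 1);
    return {azimuth, elevation};
}

int main() {
    for (int p = 0; p < 1 + NUM_COLUMNS * LEDS_PER_COLUMN; ++p) {
        DomePosition d = locate(p);
        std::printf("pixel %2d -> azimuth %.2f rad, elevation %.2f\n",
                    p, d.azimuth, d.elevation);
    }
}
```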
- In certain instances, the actuator 28 can comprise a vibrator, which can be embedded within the body 12 of the robot 10. For example, the vibrator 28 can be positioned in the center of the body 12. The vibrator 28 can include a rotary motor with an off-center weight on its shaft, for example. In at least one embodiment, the vibrator 28 can be implemented with a Precision Microdrives 310-101 motor. Additionally or alternatively, an actuator of the emotion-expressing system 20 can include a rotary and/or linear actuator, which can be configured to move and/or deform the body 12 of the robot 10 and/or elements thereof in response to touch.
- Referring now to FIGS. 4 and 5, the emotion-expressing system 20 can include a controller 30, which can be in communication with the touch sensor(s) 22 and the feedback generators 24, 26, and 28. The touch sensor(s) 22 can communicate the detected magnitude, direction, and position of the external force to the controller 30. Software on the controller 30 can process data from the sensor(s) 22 and provide localized touch feedback. For example, the lights 24 in the vicinity of the location of an applied force can glow to indicate awareness of the touch. Furthermore, the controller 30 can integrate the recent history of applied touches to place the robot 10 in an emotional state that mediates the nature of the expressed feedback. In at least one embodiment, the controller 30 can be implemented with an Arduino Micro microcontroller board.
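- A hedged sketch of how such a controller's main loop might be organized — read the sensor, give immediate localized light feedback, fold the touch into the emotional state, then express that state. Every function body below is a simplified stand-in, not an interface from the disclosure:

```cpp
#include <algorithm>
#include <cstdio>

// Sketch of a sensor-to-feedback loop; the functions below are simplified
// stand-ins for hardware access.
struct Touch { double pressure, azimuth; bool active; };
struct Emotion { double valence, arousal; };

static double clampTo(double v, double lo, double hi) {
    return std::max(lo, std::min(hi, v));
}

Touch readTouchSensor(int step) {               // stub: synthetic touch stream
    return {step % 3 == 0 ? 2.5 : 0.4, 1.0, true};
}
void glowNear(double azimuth, double level) {   // stub: localized light feedback
    std::printf("glow near %.1f rad at level %.1f\n", azimuth, level);
}
void integrateTouch(Emotion& e, const Touch& t) {
    e.valence = clampTo(e.valence + (1.0 - t.pressure) * 0.1, -1.0, 1.0); // hard -> negative
    e.arousal = clampTo(0.9 * e.arousal + 0.1, 0.0, 1.0);                 // each touch arouses
}
void expressState(const Emotion& e) {           // stub: drive lights/speaker/vibrator
    std::printf("state: valence %.2f, arousal %.2f\n", e.valence, e.arousal);
}

int main() {
    Emotion state{0.0, 0.0};                    // neutral valence, no arousal
    for (int step = 0; step < 6; ++step) {      // a few iterations of the loop
        Touch t = readTouchSensor(step);
        if (t.active) {
            glowNear(t.azimuth, t.pressure);    // immediate localized response
            integrateTouch(state, t);           // recent history mediates emotion
        }
        expressState(state);
    }
}
```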
- Emotional state is defined as a location in a multi-dimensional space with axes representing various characteristics of emotion. In various instances, the emotional state of the robot 10 can shift with each new touch. Referring to FIG. 6, an emotional state graph 32 is depicted. The emotional state of the robot 10 can be defined within the two-dimensional plane of the emotional state graph 32. In other instances, the emotional state can be defined by three or more dimensions.
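- Read concretely, the state can be held as a point that each touch nudges around a bounded valence-arousal plane; a minimal sketch (the axis ranges are assumptions):

```cpp
#include <algorithm>
#include <cstdio>

// Emotional state as a location in a two-dimensional space: valence on one
// axis, arousal on the other. The ranges are assumptions for the sketch.
struct EmotionalState {
    double valence = 0.0;  // -1 (negative) .. +1 (positive); 0 is neutral
    double arousal = 0.0;  //  0 (none)     ..  1 (heightened)

    // Each new touch nudges the state; clamping keeps it on the graph.
    void shift(double dValence, double dArousal) {
        valence = std::clamp(valence + dValence, -1.0, 1.0);
        arousal = std::clamp(arousal + dArousal, 0.0, 1.0);
    }
};

int main() {
    EmotionalState s;
    s.shift(-0.3, +0.4);  // e.g., one hard poke: valence down, arousal up
    std::printf("valence %.2f, arousal %.2f\n", s.valence, s.arousal);
}
```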
- Touches detected by the touch sensor 22 can shift and/or update the position of the robot's 10 emotional state on the emotional state graph 32. Referring still to FIG. 6, an axis on the graph 32 corresponds to valence, which can refer to the favorableness of the touch. The valence spectrum can include positive touches in region 42 and negative touches in region 44. A neutral region or point 40 can be intermediate the positive region 42 and the negative region 44. The other axis on the graph 32 corresponds to arousal, which refers to the level of activity. The arousal spectrum can increase from no arousal to heightened arousal. Emotional modeling based on valence and arousal is further described in “Designing Sociable Robots” by Cynthia L. Breazeal (MIT Press, 2004), which is hereby incorporated by reference herein in its entirety.
- The sensor 22 can be configured to detect the force applied to the robot 10. For example, the sensor 22 can determine whether the detected force is associated with a light, gentle touch or a hard, abrupt touch. In various instances, the detected force of the touch can correspond to valence. For example, lighter touches, such as a gentle stroke, for example, can correspond to a positive valence value in region 42 of the graph 32. Moreover, harder touches, such as an abrupt punch, for example, can correspond to a negative valence value in region 44 of the graph 32.
- In certain instances, the sensor 22 in combination with the controller 30 can be configured to detect the frequency and/or timing of touches. For example, the controller 30 can store and/or access information regarding previous touches and can determine if the detected touches are associated with constant pressure or a sequence of touches, such as pokes, for example. In various instances, the frequency and/or timing of the touches can correspond to arousal. For example, constant pressure can correspond to a lower arousal level while a sequence of touches can correspond to a heightened arousal level.
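- One illustrative way to realize this is to keep timestamps of recent touch onsets and treat a high event rate (a burst of pokes) as heightened arousal, while an unbroken press registers as a single onset and stays low; the window length and saturation point below are assumptions:

```cpp
#include <deque>
#include <cstdio>

// Illustrative arousal estimator: frequent discrete touches (pokes) raise
// arousal; constant pressure registers once at onset and stays low.
class ArousalEstimator {
    std::deque<double> events_;             // timestamps of touch onsets, seconds
    static constexpr double kWindow = 5.0;  // look-back window (assumed)
public:
    void onTouchOnset(double t) { events_.push_back(t); }

    double arousal(double now) {
        while (!events_.empty() && now - events_.front() > kWindow)
            events_.pop_front();            // forget onsets outside the window
        double rate = events_.size() / kWindow;  // touch onsets per second
        return rate > 1.0 ? 1.0 : rate;          // saturate at heightened arousal
    }
};

int main() {
    ArousalEstimator poking;
    for (int i = 0; i < 6; ++i) poking.onTouchOnset(0.5 * i);  // rapid pokes
    std::printf("poking arousal:   %.2f\n", poking.arousal(3.0));

    ArousalEstimator pressing;
    pressing.onTouchOnset(0.0);                                // one sustained press
    std::printf("pressing arousal: %.2f\n", pressing.arousal(3.0));
}
```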
- The combination of valence and arousal can determine the emotional state of the robot 10. For example, when the robot 10 is highly aroused by positive valence touches, e.g., frequent, low-pressure pokes, the emotional state of the robot 10 can be joyful, as depicted in the upper, right corner of the graph 32 in FIG. 6. Referring still to FIG. 6, when the robot 10 is highly aroused by negative touches, e.g., frequent, high-pressure pokes, the emotional state of the robot 10 can be angry. If the arousal level of the robot 10 is low but the touches are positive, e.g., infrequent, low-pressure touches, the emotional state of the robot 10 can be calm and content, as depicted in the lower, right corner of graph 32 in FIG. 6. Referring still to FIG. 6, if the arousal level is low and the touches are strong and/or hurtful, e.g., infrequent, high-pressure touches, the emotional state of the robot 10 can be sad.
- The controller 30 can be configured to adjust the emotional state of the robot 10 based on the detected touches. For example, negative touches can shift the robot's 10 emotional state toward, into, and/or further into the negative region 44 and away from and/or out of the positive region 42. Positive touches can shift the robot's 10 emotional state toward, into, and/or further into the positive region 42 and away from and/or out of the negative region 44. Moreover, the change in emotional state can be greater when the arousal level is higher, and can be less when the arousal level is lower.
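- Read literally, that update rule sets the sign of the valence shift from the touch and scales its size by the current arousal level; a hedged sketch (the gains and thresholds are invented for illustration):

```cpp
#include <algorithm>
#include <cstdio>

// Illustrative state update: touch pressure sets the sign of the valence
// shift (gentle -> positive, hard -> negative), and the size of the shift
// grows with the current arousal level. All constants are assumptions.
struct State { double valence, arousal; };

void applyTouch(State& s, double pressure) {
    const double kGentle = 1.0;  // pressures below this read as positive
    const double kGain = 0.2;    // base step size per touch
    double dir = (pressure < kGentle) ? +1.0 : -1.0;
    double step = kGain * (0.5 + s.arousal);  // larger shifts when aroused
    s.valence = std::clamp(s.valence + dir * step, -1.0, 1.0);
}

int main() {
    State calm{0.3, 0.1}, excited{0.3, 0.9};
    applyTouch(calm, 3.0);     // a hard touch while calm: small negative shift
    applyTouch(excited, 3.0);  // the same touch while aroused: larger shift
    std::printf("calm -> %.2f, excited -> %.2f\n", calm.valence, excited.valence);
}
```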
- The feedback generators 24, 26, and 28 of the emotion-expressing system 20 can display qualities reflective and/or expressive of the emotional state and/or changes thereto. For example, harder touches can be configured to shift the robot toward a “negative” emotional state, while repetitive soft touches might place the robot in a “positive” emotional state. Referring again to FIG. 6, the robot 10 can be in a first emotional state at location 46 on the emotional state graph 32. If the sensor 22 of the emotion-expressing system 20 detects a strong, negative touch, the robot's 10 emotional state can shift to location 48, for example.
- In various embodiments, touch can be applied to various points on the body 12 of the robot 10. The touch can be recognized by the sensor(s) 22 described herein. Information about the pressure and direction of the applied forces can be continuously and/or repeatedly sent to the controller(s) 30. In various instances, the controller 30 can estimate the location on the body 12 from which the externally-applied touch would have produced the sensed force. The controller 30 can feed the information regarding the location, magnitude, and/or direction of the force to an algorithm and/or software package, which can adjust the emotional state of the robot 10 based on the touch. Moreover, the controller can direct the feedback generators 24, 26, and/or 28 to communicate the updated emotional state.
- The color, intensity, and spatiotemporal growth of the patterns can be determined by the emotional state. In one embodiment, the pressure of touch inversely influences the valence component of the emotional state, for example, and the quantity or frequency of touch influences the arousal component of the emotional state, for example. For example, the controller 30 can initiate visual patterns to be displayed on the lights 24 below the surface of the body 12, with an appropriate mapping between the address of each light 24 and its location on the body 12. The starting location of the patterns can be determined by the most recent location of touch, for example. In certain instances, a touch of a short duration, such as a poke, for example, can result in a shockwave of illuminated lights 24 from the point of contact. Additionally or alternatively, consistent pressure at a point of contact can result in light in the specific region, which can expand during the duration of the touch.
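- As a sketch of the two pattern families just described — a travelling shockwave after a poke, and a glowing region that widens under sustained pressure — each light's brightness can be computed from its distance to the contact point (the speeds and widths are invented):

```cpp
#include <cmath>
#include <cstdio>

// Illustrative brightness fields over the dome surface; 'd' is a light's
// distance from the point of contact and 't' the time since touch onset.
// All speeds and widths below are assumptions for the sketch.

// Poke: a ring of light radiating outward from the contact point.
double shockwave(double d, double t) {
    const double speed = 2.0, width = 0.3;
    double x = (d - speed * t) / width;
    return std::exp(-x * x);            // brightest on the expanding ring
}

// Sustained press: a glowing region that widens for the touch's duration.
double expandingGlow(double d, double t) {
    const double growth = 0.5;
    double radius = growth * t;
    return d <= radius ? 1.0 : 0.0;     // lit inside the growing region
}

int main() {
    for (double t = 0.0; t <= 1.0; t += 0.5)
        std::printf("t=%.1f  shockwave@d=1.0: %.2f  glow@d=0.4: %.2f\n",
                    t, shockwave(1.0, t), expandingGlow(0.4, t));
}
```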
- In certain instances, the color, intensity, and/or duration of the lights 24 can suggest the emotional state of the robot 10. The controller 30 can direct the lights to be illuminated in a series of light-emitting patterns indicative of the emotional state. For example, the controller 30 can be configured to adjust the color, movement, and pace of the lights. In certain instances, in response to a harder touch, such as a punch, for example, the controller 30 can direct the lights 24 to light up with a color suggestive of pain, such as red hues, for example. Moreover, in response to soft, repetitive touches, such as light strokes, for example, the controller 30 can direct the lights 24 to light up with a color suggestive of comfort, such as blue hues, for example.
- In certain instances, the color of the lights 24 can correspond to the mood of the robot 10. For example, a different color and/or series of colors can correspond to the four mood quadrants shown in FIG. 6, i.e., calm/content, sad, angry, and joyful. In various instances, the color blue can correspond to calmness and contentment. For example, when the robot 10 is calm and content, the lights 24 can pulse a shade of blue at a slow and steady rate. Purple, for example, can signify gloom and darkness and, thus, be associated with sadness. For example, when the robot 10 is sad, the lights 24 can pulse a purple hue slowly and inconsistently. In certain instances, red can be associated with anger to signify alarm and/or to communicate “stop”. When the robot 10 is angry, the lights 24 can pulse a red hue rapidly and inconsistently. In various instances, the joyful state of the robot 10 can correspond to the color yellow, which is associated with happiness and energy. When the robot 10 is joyful, the lights 24 can be configured to pulse a yellow hue at a frequent and steady rate. In certain instances, the robot 10 can be configured to generate a pattern of fast-paced, rainbow-colored lights when the pinnacle of extreme joyfulness is experienced.
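- Collecting the four quadrant mappings, a hedged color-and-pulse lookup might read as follows (the RGB values, pulse rates, and the arousal threshold are invented to match the qualitative description):

```cpp
#include <cstdio>

// Illustrative quadrant lookup. The color assignments follow the qualitative
// description (blue/calm, purple/sad, red/angry, yellow/joyful); the exact
// RGB values, rates, and the 0.5 arousal threshold are assumptions.
struct Expression {
    unsigned char r, g, b;  // LED color
    double pulseHz;         // pulse rate
    bool steady;            // steady vs. inconsistent pulsing
};

Expression expressionFor(double valence, double arousal) {
    bool positive = valence >= 0.0;
    bool aroused = arousal >= 0.5;
    if (positive && aroused)  return {255, 220, 0, 2.0, true};  // joyful: yellow, frequent, steady
    if (!positive && aroused) return {255, 0, 0, 3.0, false};   // angry: red, rapid, inconsistent
    if (positive)             return {0, 80, 255, 0.5, true};   // calm/content: blue, slow, steady
    return {140, 0, 180, 0.4, false};                           // sad: purple, slow, inconsistent
}

int main() {
    Expression e = expressionFor(0.8, 0.9);  // strongly positive and aroused
    std::printf("joyful -> rgb(%u,%u,%u) pulsing at %.1f Hz\n",
                (unsigned)e.r, (unsigned)e.g, (unsigned)e.b, e.pulseHz);
}
```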
- In various instances, the sounds produced by the speaker 26 can be generated from simple sound blocks, such as sinusoids and/or pre-recorded waveforms, for example. The sounds can be modulated and/or repeated according to the emotional state of the robot 10. In one embodiment, the slope of the overall prosodic or pitch envelope can be determined by the valence component, for example, and the frequency and quantity of sound blocks can be determined by the arousal component of the emotional state. For example, the pitch of sounds from the speaker 26 can move through a sequence from a low pitch to a high pitch as the valence shifts from the neutral position 40 to an increasingly positive valence level in region 42. Conversely, the pitch of sounds from the speaker 26 can move through a sequence from a high pitch to a low pitch as the valence shifts from the neutral position 40 to an increasingly negative valence level in region 44. Additionally, the output frequency of sounds from the speaker 26 can increase and the duration of sounds from the speaker can decrease as the robot 10 becomes more aroused, for example, and the output frequency of sounds from the speaker 26 can decrease and the duration of sounds from the speaker can increase as the arousal level of the robot 10 decreases, for example.
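- A minimal sketch of mapping the emotional state onto those sound-block parameters — pitch-envelope slope from valence, block rate and duration from arousal (all constants are invented):

```cpp
#include <cstdio>

// Illustrative mapping from emotional state to sound-block parameters:
// valence sets the slope of the pitch envelope (rising for positive,
// falling for negative), and arousal sets how often blocks play and how
// long they last. All constants are assumptions for the sketch.
struct SoundParams {
    double pitchSlope;       // semitones per second over the block
    double blocksPerSecond;  // how frequently sound blocks are emitted
    double blockDuration;    // seconds per block
};

SoundParams soundFor(double valence, double arousal) {
    SoundParams p;
    p.pitchSlope = 12.0 * valence;            // positive -> rising, negative -> falling
    p.blocksPerSecond = 0.5 + 3.5 * arousal;  // more aroused -> more frequent
    p.blockDuration = 1.0 - 0.8 * arousal;    // more aroused -> shorter blocks
    return p;
}

int main() {
    SoundParams joyful = soundFor(0.9, 0.9);
    SoundParams sad = soundFor(-0.7, 0.2);
    std::printf("joyful: slope %+.1f, %.1f blocks/s, %.2f s each\n",
                joyful.pitchSlope, joyful.blocksPerSecond, joyful.blockDuration);
    std::printf("sad:    slope %+.1f, %.1f blocks/s, %.2f s each\n",
                sad.pitchSlope, sad.blocksPerSecond, sad.blockDuration);
}
```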
- The actuator 28 can also be in communication with the controller 30 and can respond to the emotional state of the robot 10. For example, the actuator 28 can be actuated when the sensor 22 detects a touch, and the intensity of the vibrations and/or movements can be controlled by pulse-width modulation (PWM) according to the detected pressure applied to the body 12.
- While the present disclosure has been described as having certain designs, the various disclosed embodiments may be further modified within the scope of the disclosure. This application is therefore intended to cover any variations, uses, or adaptations of the disclosed embodiments using its general principles. Further, this application is intended to cover such departures from the present disclosure as come within known or customary practice in the relevant art.
- Any patent, publication, or other disclosure material, in whole or in part, that is said to be incorporated by reference herein is incorporated herein only to the extent that the incorporated material does not conflict with existing definitions, statements, or other disclosure material set forth in this disclosure. As such, and to the extent necessary, the disclosure as explicitly set forth herein supersedes any conflicting material incorporated herein by reference. Any material, or portion thereof, that is said to be incorporated by reference herein, but which conflicts with existing definitions, statements, or other disclosure material set forth herein will only be incorporated to the extent that no conflict arises between that incorporated material and the existing disclosure material.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/568,846 US9421688B2 (en) | 2013-12-12 | 2014-12-12 | Robot |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361915253P | 2013-12-12 | 2013-12-12 | |
US14/568,846 US9421688B2 (en) | 2013-12-12 | 2014-12-12 | Robot |
Publications (2)
Publication Number | Publication Date |
---|---|
US20150165625A1 true US20150165625A1 (en) | 2015-06-18 |
US9421688B2 US9421688B2 (en) | 2016-08-23 |
Family
ID=53367310
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/568,846 Active US9421688B2 (en) | 2013-12-12 | 2014-12-12 | Robot |
Country Status (1)
Country | Link |
---|---|
US (1) | US9421688B2 (en) |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9224273B1 (en) * | 2005-12-20 | 2015-12-29 | Diebold Self-Service Systems Division Of Diebold, Incorporated | Banking system controlled responsive to data bearing records |
HUP1100633A2 (en) | 2011-11-17 | 2013-06-28 | Pazmany Peter Katolikus Egyetem | Device with optical feedback for measuring force and pressure |
US9207755B2 (en) * | 2011-12-20 | 2015-12-08 | Iconicast, LLC | Method and system for emotion tracking, tagging, and rating and communication |
KR101410416B1 (en) * | 2011-12-21 | 2014-06-27 | 주식회사 케이티 | Remote control method, system and user interface |
USD714888S1 (en) | 2013-11-25 | 2014-10-07 | Beatbots LLC | Toy |
USD714881S1 (en) | 2013-11-25 | 2014-10-07 | Beatbots LLC | Toy |
USD714883S1 (en) | 2013-11-25 | 2014-10-07 | Beatbots LLC | Toy |
US9358475B2 (en) | 2013-12-12 | 2016-06-07 | Beatbots, LLC | Robot |
- 2014-12-12: US patent application US14/568,846 filed; issued as US9421688B2 (status: active)
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6373265B1 (en) * | 1999-02-02 | 2002-04-16 | Nitta Corporation | Electrostatic capacitive touch sensor |
US6347261B1 (en) * | 1999-08-04 | 2002-02-12 | Yamaha Hatsudoki Kabushiki Kaisha | User-machine interface system for enhanced interaction |
US8441467B2 (en) * | 2006-08-03 | 2013-05-14 | Perceptive Pixel Inc. | Multi-touch sensing display through frustrated total internal reflection |
US20090090305A1 (en) * | 2007-10-03 | 2009-04-09 | National University Of Singapore | System for humans and pets to interact remotely |
US20110137137A1 (en) * | 2009-12-08 | 2011-06-09 | Electronics And Telecommunications Research Institute | Sensing device of emotion signal and method thereof |
US20130078600A1 (en) * | 2011-08-29 | 2013-03-28 | Worcester Polytechnic Institute | System and method of pervasive developmental disorder interventions |
US20150100157A1 (en) * | 2012-04-04 | 2015-04-09 | Aldebaran Robotics S.A | Robot capable of incorporating natural dialogues with a user into the behaviour of same, and methods of programming and using said robot |
US9002768B2 (en) * | 2012-05-12 | 2015-04-07 | Mikhail Fedorov | Human-computer interface system |
US20140035603A1 (en) * | 2012-08-03 | 2014-02-06 | Xerox Corporation | Printed Stretch Sensor |
US20150277617A1 (en) * | 2014-03-28 | 2015-10-01 | Paul Gwin | Flexible sensor |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9358475B2 (en) | 2013-12-12 | 2016-06-07 | Beatbots, LLC | Robot |
WO2017000800A1 (en) * | 2015-06-30 | 2017-01-05 | 芋头科技(杭州)有限公司 | System and method for intelligently controlling activity of robot |
US9802314B2 (en) | 2015-10-01 | 2017-10-31 | Disney Enterprises, Inc. | Soft body robot for physical interaction with humans |
US10275982B2 (en) | 2016-05-13 | 2019-04-30 | Universal Entertainment Corporation | Attendant device, gaming machine, and dealer-alternate device |
US10192399B2 (en) * | 2016-05-13 | 2019-01-29 | Universal Entertainment Corporation | Operation device and dealer-alternate device |
US10068424B2 (en) | 2016-05-13 | 2018-09-04 | Universal Entertainment Corporation | Attendant device and gaming machine |
US10290181B2 (en) | 2016-05-13 | 2019-05-14 | Universal Entertainment Corporation | Attendant device and gaming machine |
WO2018208761A1 (en) * | 2017-05-11 | 2018-11-15 | Misty Robotics, Inc. | Infinite robot personalities |
US10850398B2 (en) | 2017-05-11 | 2020-12-01 | Misty Robotics, Inc. | Infinite robot personalities |
US20190232146A1 (en) * | 2018-01-26 | 2019-08-01 | Sony Corporation | Sporting display device and method |
US10799782B2 (en) * | 2018-01-26 | 2020-10-13 | Sony Corporation | Sporting display device and method |
WO2022194029A1 (en) * | 2021-03-15 | 2022-09-22 | 华为技术有限公司 | Robot feedback method and robot |
CN113414763A (en) * | 2021-06-21 | 2021-09-21 | 杭州电子科技大学 | Overlapped optical signal touch sensing system based on soft body arm and touch detection method thereof |
Also Published As
Publication number | Publication date |
---|---|
US9421688B2 (en) | 2016-08-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9421688B2 (en) | Robot | |
CN110832439B (en) | Luminous user input device | |
CN104750309B (en) | The button of touch panel is converted into rubbing the method and system of enhanced control | |
US11786831B2 (en) | Robot | |
CN100583007C (en) | Movable device with surface display information and interaction function | |
EP3456487A2 (en) | Robot, method of controlling the same, and program | |
KR101548156B1 (en) | A wireless exoskeleton haptic interface device for simultaneously delivering tactile and joint resistance and the method for comprising the same | |
US11231772B2 (en) | Apparatus control device, method of controlling apparatus, and non-transitory recording medium | |
JP2004237392A (en) | Robotic device and expression method of robotic device | |
JP2002283261A (en) | Robot device and its control method and storage medium | |
Hummel et al. | A lightweight electrotactile feedback device for grasp improvement in immersive virtual environments | |
JP5227362B2 (en) | Emotion engine, emotion engine system, and electronic device control method | |
JP2022113701A (en) | Equipment control device, equipment, and equipment control method and program | |
KR20180060567A (en) | Communion robot system for senior citizen | |
EP2478288B1 (en) | Luminaire and method for controlling a luminaire | |
KR20140006458A (en) | Light emitting apparatus having sensitivity function | |
WO2020190362A2 (en) | A social interaction robot | |
CN102411890A (en) | Toy with digitalized dynamic expressions | |
CN206514235U (en) | The healthy electric candle of touch emulation | |
US20070087658A1 (en) | Interactive toy including transparent container | |
US20190060601A1 (en) | Lighting lamps generating novel aesthetic experiences in humans and other beings using physical interactive systems | |
KR101727941B1 (en) | Interaction training device and method, and system thereof | |
JP7414735B2 (en) | Method for controlling multiple robot effectors | |
EP3324102B1 (en) | Electronic candle | |
KR200447079Y1 (en) | Led driving system of false eyelashes |
Legal Events
Code | Title | Description
---|---|---
AS | Assignment | Owner name: BEATBOTS, LLC, CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: MICHALOWSKI, MAREK P.; KATZ, GREGORY R.; HERSAN, THIAGO G.; SIGNING DATES FROM 20150415 TO 20150422; REEL/FRAME: 035780/0034
STCF | Information on status: patent grant | Free format text: PATENTED CASE
FEPP | Fee payment procedure | Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY
FEPP | Fee payment procedure | Free format text: SURCHARGE FOR LATE PAYMENT, SMALL ENTITY (ORIGINAL EVENT CODE: M2554); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY. Year of fee payment: 4
FEPP | Fee payment procedure | Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY