WO2019093952A1 - User interactive electronic system and method for controlling a robotic arm - Google Patents

User interactive electronic system and method for controlling a robotic arm

Info

Publication number
WO2019093952A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
robotic arm
attachment
sensor arrangement
electronic system
Prior art date
Application number
PCT/SE2018/051138
Other languages
French (fr)
Inventor
Jesper KOUTHOOFD
Original Assignee
Teenage Engineering Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Teenage Engineering Ab filed Critical Teenage Engineering Ab
Priority to CN202311646130.8A priority Critical patent/CN117798903A/en
Priority to CN201880071698.2A priority patent/CN111344117A/en
Priority to US16/762,338 priority patent/US11584018B2/en
Priority to EP18877044.0A priority patent/EP3710205A4/en
Publication of WO2019093952A1 publication Critical patent/WO2019093952A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/003Manipulators for entertainment
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/005Manipulators for mechanical processing tasks
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J15/00Gripping heads and other end effectors
    • B25J15/0019End effectors other than grippers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1628Programme controls characterised by the control loop
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1669Programme controls characterised by programming, planning systems for manipulators characterised by special application, e.g. multi-arm co-operation, assembly, grasping


Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Physics & Mathematics (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)
  • User Interface Of Digital Computer (AREA)
  • Toys (AREA)

Abstract

The present disclosure relates to a user interactive electronic system, the user interactive electronic system comprising a robotic arm and at least an attachment detachably affixed to a distal end of the robotic arm. The present disclosure also relates to a method for operating such a user interactive electronic system and to a computer program product.

Description

USER INTERACTIVE ELECTRONIC SYSTEM AND METHOD FOR
CONTROLLING A ROBOTIC ARM
TECHNICAL FIELD
The present disclosure relates to a user interactive electronic system, the user interactive electronic system comprising a robotic arm and at least an attachment detachably affixed to a distal end of the robotic arm. The present disclosure also relates to a method for operating such a user interactive electronic system and to a computer program product.
BACKGROUND
Various types of interactive autonomous robots exist, residing and functioning continually in the environment of a person, such as within a home environment. Recently, with the advances made within the area of artificial intelligence, robots have been presented that dynamically interact with the person/user based on e.g. a requirement of the person/user.
An example of such a robot is disclosed in US20140277735, presenting a telecommunications enabled robotic device adapted to persist in an environment of a user. The robotic device disclosed in US20140277735 is adapted to consider the social and emotional particulars of a particular situation and its user, and to interact accordingly. The robotic device disclosed in US20140277735 may for example be adapted to handle voice or image input for capturing the emotion of, recognizing the identity and gestures of, and maintaining interaction with the user.
The robotic device disclosed in US20140277735 shows an interesting approach to e.g. assisting a user/person in everyday situations within the home environment. However, the robotic device is strictly limited to assistance based on a fixed hardware configuration, relying only on possible software updates for further extending services to the user/person. Accordingly, there appears to be room for further improvement, specifically allowing for an improved flexibility of such a robotic device, making it future-proof to yet unforeseen services that could be provided to the user/person.
SUMMARY
In view of the above-mentioned and other drawbacks of the prior art, it is an object of the present disclosure to provide an improved robotic device for use in a home environment for assisting a user/person in everyday life. According to an aspect of the present disclosure, there is therefore provided a user interactive electronic system, comprising a base device comprising a controllable robotic arm, the robotic arm having a proximate end and a distal end, at least an attachment adapted to be detachably affixed to the distal end of the robotic arm, the attachment being one of a plurality of different types of attachments, a sensor arrangement adapted to acquire information indicative of a user behavior, and a control unit arranged in communication with the sensor arrangement and adapted to control movement of the robotic arm, wherein the control unit is further adapted to receive the information indicative of the user behavior from the sensor arrangement, determine the type of attachment affixed to the robotic arm, receive or form a movement pattern based on the type of attachment and a user desire related to the user behavior, and control the robotic arm based on the movement pattern.
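By way of a non-limiting illustration, the control flow defined above may be sketched in Python as follows. All class, method and key names (ControlUnit, acquire, attachment_type, execute, and the example behaviors and desires) are assumptions made for the illustration and are not part of the disclosure:

    from dataclasses import dataclass, field

    @dataclass
    class MovementPattern:
        # One sequence of target joint angles (degrees) per joint.
        joint_trajectories: dict = field(default_factory=dict)

    class ControlUnit:
        def __init__(self, sensors, arm, pattern_library):
            self.sensors = sensors            # sensor arrangement (camera, microphone, ...)
            self.arm = arm                    # controllable robotic arm with an affixed attachment
            self.patterns = pattern_library   # (attachment type, user desire) -> MovementPattern

        def derive_desire(self, behavior):
            # A user behavior (gesture, voice, presence, ...) is mapped to a user desire.
            return {"wave": "get_attention", "voice:pizza": "order_pizza"}.get(behavior)

        def step(self):
            behavior = self.sensors.acquire()                  # receive user-behavior information
            attachment = self.arm.attachment_type()            # determine the affixed attachment type
            desire = self.derive_desire(behavior)
            pattern = self.patterns.get((attachment, desire))  # receive or form a movement pattern
            if pattern is not None:
                self.arm.execute(pattern)                      # control the arm based on the pattern

    # Minimal stubs so the sketch runs end to end.
    class StubSensors:
        def acquire(self):
            return "wave"

    class StubArm:
        def attachment_type(self):
            return "display"
        def execute(self, pattern):
            print("moving:", pattern.joint_trajectories)

    library = {("display", "get_attention"): MovementPattern({"shoulder": [0, 30, 0]})}
    ControlUnit(StubSensors(), StubArm(), library).step()  # prints: moving: {'shoulder': [0, 30, 0]}

Note that, in this sketch, the same user desire combined with a different attachment type selects a different pattern, which is exactly the attachment-dependent behavior the disclosure describes.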
The present disclosure is based upon the realization that it would be advantageous to configure the user interactive electronic system such that it possibly may be continuously adaptable with different types of attachments. However, for allowing the operation of such a user interactive electronic system to be smooth and user friendly, not forcing the user to make configurations when the attachment is changed, the user interactive electronic system according to the present disclosure is configured to automatically determine the type of attachment. Thus, the user needs only to affix the attachment to the distal end of the robotic arm, and the user interactive electronic system will then automatically handle all settings necessary for further operation of the user interactive electronic system.
In addition, in accordance with the present disclosure, not only is the configuration based on the affixed attachment performed in an automated manner; the control of movement of the robotic arm will also be dependent on the type of attachment.
Accordingly, in a typical scenario the robotic arm may behave differently for a similar user behavior, i.e. when a first type of attachment is affixed compared to a similar situation where a second type of attachment is affixed.
Thus, in accordance with the present disclosure, the user interactive electronic system will (automatically) behave differently dependent on the affixed attachment, thus allowing the operation of the user interactive electronic system to be continuously developed once e.g. new types of attachments are made available. Accordingly, the services that may be provided to the user of the user interactive electronic system will not be limited to an initial hardware configuration of the user interactive electronic system. Rather, once future/further attachments are made available, such attachments may form part of allowing the user to be provided with future/further services.
As defined above, the sensor arrangement is adapted to acquire information indicative of a user behavior, where the user behavior for example may be selected from a group comprising the presence of a user, a location of the user, an identity of the user, a movement of the user, a gesture of the user, and a voice of the user. The control unit may then take this acquired information and thereby derive a user desire from it. That is, a user behavior will be used as an input for determining the desire of the user. For example, in case the user waves his hand up and down in front of the user interactive electronic system, this may be interpreted by the control unit as an instruction to the user interactive electronic system to perform a specific function.
Preferably, the robotic arm comprises a plurality of connected segments, typically connected to each other using at least one joint. Accordingly, the movement pattern for the user interactive electronic system may be allowed to take into account the structural set-up of the robotic arm, such as for example by allowing the robotic arm to move towards and away from the user/person interacting with the user interactive electronic system. It is preferred to arrange the movement pattern such that the movement of the robotic arm is "smooth" and perceived as relatively natural dependent on the structural appearance of the user interactive electronic system.
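One way such "smooth", natural-looking motion could be approximated is by easing each joint between poses instead of stepping abruptly. The following is a hypothetical sketch of such easing, not an algorithm taken from the disclosure:

    import math

    def eased_trajectory(start_deg, end_deg, steps=50):
        # Cosine ease-in/ease-out between two joint angles: the arm accelerates
        # and decelerates gently rather than jerking between poses.
        trajectory = []
        for i in range(steps + 1):
            t = i / steps                          # normalized time, 0.0 .. 1.0
            s = (1 - math.cos(math.pi * t)) / 2    # smooth-step weight, 0.0 .. 1.0
            trajectory.append(start_deg + (end_deg - start_deg) * s)
        return trajectory

    angles = eased_trajectory(0.0, 90.0)
    print(round(angles[1], 2), round(angles[25], 2), angles[-1])  # 0.09 45.0 90.0 (slow-fast-slow)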
In a possible embodiment the joints are configured to detect a user interaction with the robotic arm. That is, in case e.g. the user "pushes" or in any way provokes the robotic arm, this may form part of the user behavior used for determining the user desire. For example, in case the user "pushes down" the robotic arm (or the attachment), this may be interpreted (a user behavior generating a user desire) as a request to pause or shut down the user interactive electronic system. When the user later "lifts up" the robotic arm (or the attachment), this may possibly be interpreted as a way of re-activating the user interactive electronic system.
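A hypothetical sketch of how such joint-detected interactions could be mapped to pause and re-activation states (the event names are assumptions made for the illustration):

    class ArmInteractionMonitor:
        # Interprets manual pushes on the arm, reported by the joints, as
        # pause/resume commands in line with the embodiment described above.
        def __init__(self):
            self.active = True

        def on_joint_event(self, event):
            if event == "pushed_down" and self.active:
                self.active = False    # user pushed the arm down -> pause/shut down
            elif event == "lifted_up" and not self.active:
                self.active = True     # user lifted the arm -> re-activate
            return self.active

    monitor = ArmInteractionMonitor()
    print(monitor.on_joint_event("pushed_down"))  # False: system paused
    print(monitor.on_joint_event("lifted_up"))    # True: system re-activated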
In a preferred embodiment of the present disclosure the sensor arrangement comprises at least one of a camera and a microphone. Accordingly, the at least one camera and/or microphone may be used for locating the user/person as well as for collecting information as to the user behavior, to subsequently be used for determining the user desire. As such, the user interactive electronic system may e.g. be voice (in relation to the microphone) or movement (in relation to the camera) controlled. That is, the user may interact with the user interactive electronic system for instructing the user interactive electronic system to perform a service for the user.
It may also be advantageous to allow the user interactive electronic system to comprise at least one of a speaker element and a display element. The speaker element and/or display element may in some embodiments be used for providing feedback to the user, such as spoken (using the speaker element) and/or visual feedback. The speaker may possibly be used for playing music for the user.
The microphone, camera, speaker element and/or display element may be comprised with either of the base device and the attachment. In some embodiments the speaker element and/or microphone is comprised with the base device and at least one of the display element and the camera is comprised with the attachment.
In some embodiments it may be possible to allow the attachment to comprise e.g. a touch screen, a fingerprint sensor, an NFC reader, etc. Accordingly, in case for example an identity of the user is to be authenticated, the user interactive electronic system may be configured to detect a position of the user and extend (using the received/formed movement pattern for the present type of attachment) the distal end with the affixed attachment in a direction of the user. The user may then interact with e.g. the touch screen (possibly also showing a graphical user interface, GUI) for inputting e.g. a PIN code, or with the fingerprint sensor for fingerprint authentication.
In an alternative embodiment the attachment may comprise e.g. a fan. Accordingly, the user interactive electronic system may determine a position of the user and then extend (using the received/formed movement pattern for the present type of attachment) the distal end with the affixed "fan attachment" for "cooling" the user. Similarly, an attachment comprising a camera may be directed towards the user for taking a photo of the user.
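Extending the distal end towards a detected user position can be treated as a small inverse-kinematics problem. The sketch below assumes an idealized two-segment planar arm with invented segment lengths; it is illustrative only and not the disclosed control method:

    import math

    def two_link_ik(x, y, l1=0.30, l2=0.25):
        # Joint angles (radians) placing the tip of a two-segment planar arm at
        # (x, y); returns None when the target is out of reach.
        c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)  # law of cosines
        if not -1.0 <= c2 <= 1.0:
            return None
        elbow = math.acos(c2)
        shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                                 l1 + l2 * math.cos(elbow))
        return shoulder, elbow

    # Point the attachment towards a user detected 0.4 m away and 0.2 m up.
    print(two_link_ik(0.4, 0.2))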
It may in some embodiments of the present disclosure be desirable to equip the user interactive electronic system with means for allowing networked communication with e.g. a remote server, wherein the information indicative of the user behavior is provided to the server and the server is adapted for determining the user desire. Accordingly, resources available at the server may thereby be used for e.g. autonomous determination of the user desire. The resources at the server may in some embodiments at least partly implement functionality related to machine learning and data mining. The machine learning process may be either an unsupervised or a supervised machine learning process, for example related to the concept of artificial intelligence (AI). However, it should be understood that at least some of the mentioned determination may be performed locally at the user interactive electronic system. For example, the remote server may determine general instructions relating to the movement pattern (e.g. attachment type independent), to be "blended" with a type-related movement pattern available locally at the user interactive electronic system for determining instructions for controlling the robotic arm.
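Such "blending" could, by way of a non-limiting example, be a per-joint weighted combination of the server-provided general pattern and the locally stored type-specific pattern. The weighting rule below is an assumption made for the illustration:

    def blend_patterns(general, type_specific, weight=0.5):
        # Combine an attachment-independent pattern (from the server) with a
        # type-specific pattern (stored locally), joint by joint.
        blended = {}
        for joint in general.keys() & type_specific.keys():
            blended[joint] = [(1 - weight) * g + weight * t
                              for g, t in zip(general[joint], type_specific[joint])]
        return blended

    server_pattern = {"shoulder": [0.0, 20.0, 40.0]}   # general, attachment-type independent
    local_pattern = {"shoulder": [0.0, 40.0, 20.0]}    # specific to the affixed attachment type
    print(blend_patterns(server_pattern, local_pattern))  # {'shoulder': [0.0, 30.0, 30.0]}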
According to another aspect of the present disclosure, there is provided a computer implemented method for operating a user interactive electronic system, the system comprising a base device comprising a controllable robotic arm, the robotic arm having a proximate end and a distal end, at least an attachment adapted to be detachably affixed to the distal end of the robotic arm, the attachment being one of a plurality of different types of attachments, a sensor arrangement adapted to acquire information indicative of a user behavior, and a control unit arranged in communication with the sensor arrangement and adapted to control movement of the robotic arm, wherein the method comprises receiving the information indicative of the user behavior from the sensor arrangement, determining the type of attachment affixed to the robotic arm, receiving or forming a movement pattern based on the type of attachment and a user desire related to the user behavior, and controlling the robotic arm based on the movement pattern. This aspect of the present disclosure provides similar advantages as discussed above in relation to the previous aspects of the present disclosure.
According to a further aspect of the present disclosure, there is provided a computer program product comprising a non-transitory computer readable medium having stored thereon computer program means for a control unit adapted for controlling a user interactive electronic system, the user interactive electronic system comprising a base device comprising a controllable robotic arm, the robotic arm having a proximate end and a distal end, at least an attachment adapted to be detachably affixed to the distal end of the robotic arm, the attachment being one of a plurality of different types of attachments, a sensor arrangement adapted to acquire information indicative of a user behavior, and a control unit arranged in communication with the sensor arrangement and adapted to control movement of the robotic arm, wherein the computer program product comprises code for receiving the information indicative of the user behavior from the sensor arrangement, code for determining the type of attachment affixed to the robotic arm, code for receiving or forming a movement pattern based on the type of attachment and a user desire related to the user behavior, and code for controlling the robotic arm based on the movement pattern. Also this aspect of the present disclosure provides similar advantages as discussed above in relation to the previous aspects of the present disclosure.
The control unit is preferably an ASIC, a microprocessor or any other type of computing device. Software executed by the control unit for operating the inventive system may be stored on a computer readable medium, being any type of memory device, including one of a removable nonvolatile random access memory, a hard disk drive, a floppy disk, a CD-ROM, a DVD-ROM, a USB memory, an SD memory card, or a similar computer readable medium known in the art.
Further features of, and advantages with, the present disclosure will become apparent when studying the appended claims and the following description. The skilled addressee realizes that different features of the present disclosure may be combined to create embodiments other than those described in the following, without departing from the scope of the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
The various aspects of the present disclosure, including its particular features and advantages, will be readily understood from the following detailed description and the accompanying drawings, in which:
Figs. 1A and 1B schematically exemplify a user interactive electronic system according to an embodiment of the present disclosure;
Figs. 2A - 2C show examples of possible attachments to be comprised with a user interactive electronic system;
Fig. 3 is a flowchart illustrating the operation of the user interactive electronic system according to the present disclosure.
DETAILED DESCRIPTION
The present disclosure will now be described more fully hereinafter with reference to the accompanying drawings, in which currently preferred embodiments of the present disclosure are shown. The present disclosure may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for thoroughness and completeness, and fully convey the scope of the present disclosure to the skilled person. Like reference characters refer to like elements throughout. Turning now to the drawings and to Figs. 1A, 1B and 2A in particular, there is schematically illustrated an example embodiment of a user interactive electronic system 100. The user interactive electronic system 100 comprises a base device 102 and an attachment 104. The base device 102 comprises a base portion 106, forming a foundation for the user interactive electronic system 100.
The base portion 106 further comprises an elongated robotic arm having a proximate end and a distal end. The terms "proximal" and "distal" are used herein with reference to the base portion 106 and the attachment 104, the term "proximal end" referring to the portion of the robotic arm closest to the base portion 106 and the term "distal end" referring to the portion of the robotic arm located closest to the attachment 104. The robotic arm comprises a plurality of segments 108, connected with joints 110. In the illustrated embodiment the robotic arm comprises two segments 108 and three joints 110. Accordingly, the robotic arm may be moved in six degrees of freedom. It could of course be possible to arrange the robotic arm to comprise more or fewer segments 108 and joints 110.
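As a non-limiting illustration, the segment-and-joint topology described above could be represented as follows; the segment lengths and joint limits are invented for the example:

    from dataclasses import dataclass

    @dataclass
    class Joint:
        angle_deg: float = 0.0
        min_deg: float = -120.0
        max_deg: float = 120.0

    @dataclass
    class Segment:
        length_m: float

    # Three joints interleaved with two segments, proximal end to distal end,
    # mirroring the illustrated embodiment.
    arm = [Joint(), Segment(0.30), Joint(), Segment(0.25), Joint()]

    def command_joint(joint, angle_deg):
        # Clamp commanded angles to the joint's mechanical range.
        joint.angle_deg = max(joint.min_deg, min(joint.max_deg, angle_deg))

    command_joint(arm[0], 150.0)
    print(arm[0].angle_deg)  # 120.0: the command was clamped to the joint limit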
The base portion 106 further comprises a head portion 112. The head portion 112 may be rotated, preferably 360 degrees. The head portion 112 further comprises a first connector portion 114 adapted for mechanical and optionally electrical connection with a second (matching) connector portion 116 comprised with the attachment 104.
In the illustrated embodiment shown in Fig. 1A, the first 114 and second 116 connector portions are arranged to allow for both a mechanical and an electrical connection between the base device 102 and the attachment 104. However, it may in some embodiments be possible for each of the base device 102 and the attachment 104 to have separate power supplies, such as allowing the base device 102 to be connected to e.g. the mains and the attachment to comprise e.g. a rechargeable battery.
The base device 102 may further be arranged to comprise a first control unit (not shown), adapted to control at least the robotic arm. Similarly, the attachment 104 may in some embodiments comprise a second control unit (not shown). However, in some embodiments only one of the base device 102 and the attachment 104 is arranged to comprise a (single) control unit. Either or both of the base device 102 and the attachment 104 may comprise means for allowing network communication, such as wireless communication using e.g. Wi-Fi, Bluetooth or similar. In some embodiments the communication between the base device 102 and the attachment 104 is wired (such as using serial communication) through the electrical part of the first 114 and second 116 connector portions. However, the communication between the base device 102 and the attachment 104 may also or alternatively be wireless.
As mentioned above, either or both of the base device 102 and the attachment 104 may be arranged to comprise a sensor arrangement. Such a sensor arrangement may for example comprise a microphone, a camera, etc. Either or both of the base device 102 and the attachment 104 may comprise a speaker element and/or a display element.
The first, second and/or single control unit may include a microprocessor, microcontroller, programmable digital signal processor or another programmable device. Each control unit may also, or instead, include an application specific integrated circuit, a programmable gate array or programmable array logic, a programmable logic device, or a digital signal processor. Where the control unit includes a programmable device such as the microprocessor, microcontroller or programmable digital signal processor mentioned above, the processor may further include computer executable code that controls operation of the programmable device.
The exemplary attachment 104 shown in Figs. 1A, 1B and 2A further comprises a base plate 202 having a front side and a back side. As shown in Fig. 1A, the second connector portion 116 is arranged at the back side of the attachment 104. In the illustration shown in Figs. 1B and 2A, the front side of the attachment 104 is provided with a display element, in the illustrated embodiment comprising a plurality of light emitting elements, such as a plurality of light emitting diodes (LEDs) 204. The LEDs 204 are in the illustrated embodiment arranged in a matrix formation, where each of the LEDs 204 is individually controllable, typically for use in interaction with a user. Accordingly, the LEDs 204 may for example be used for showing an instruction to the user, or an emotion (such as by showing an illustration of an emoji, etc.). It should be understood that other types of display elements may be used, such as a liquid crystal display (LCD) for showing an image. Such a display element may also be configured for receiving user input, for example by arranging such a display element as a touch screen.
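A minimal sketch of driving such an individually controllable LED matrix is given below; the 8x8 resolution and the text rendering are assumptions made for the illustration:

    # One bit per individually controllable LED: 1 = on, 0 = off.
    def render(frame):
        for row in frame:
            print("".join("#" if led else "." for led in row))

    smile = [
        [0, 0, 0, 0, 0, 0, 0, 0],
        [0, 1, 1, 0, 0, 1, 1, 0],
        [0, 1, 1, 0, 0, 1, 1, 0],
        [0, 0, 0, 0, 0, 0, 0, 0],
        [1, 0, 0, 0, 0, 0, 0, 1],
        [0, 1, 0, 0, 0, 0, 1, 0],
        [0, 0, 1, 1, 1, 1, 0, 0],
        [0, 0, 0, 0, 0, 0, 0, 0],
    ]
    render(smile)  # a simple "emotion" glyph shown on the matrix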
Fig. 2B illustrates an alternative embodiment of an attachment 104', in essence corresponding to the attachment 104 shown in Figs. 1A, 1B and 2A. However, the embodiment shown in Fig. 2B further comprises a camera portion 206, in turn comprising an image sensor and lens arrangement 208.
A further exemplary embodiment of an attachment 104" is shown in Fig. 2C. The attachment 104" is arranged with a second connector portion 116 corresponding to those of the attachments 104 and 104'. However, rather than comprising a display element/camera, the attachment 104" comprises a fan portion including a plurality of fan blades 210. The attachment 104" may also, optionally, comprise a camera and/or microphone arranged e.g. at a center of the fan portion.
Turning finally to Fig. 1B in conjunction with Fig. 3, together illustrating an exemplary general operation of the user interactive electronic system 100. As discussed above, the sensor arrangement, such as including the mentioned microphone and/or the camera portion 206, may be adapted to collect data relating to the user, such as related to a behavior of the user. Typically, the control unit receives, S1, the information indicative of the user behavior. The control unit also receives information from the attachment 104 once the attachment has been affixed to the base device 102. Such information may for example comprise an indication of an identity of the attachment 104, allowing the control unit to determine a type of the attachment 104. Possibly, the control unit may receive the indication of the identity and send a request to e.g. a remotely located server holding information that allows the control unit to determine, S2, the type of attachment affixed to the robotic arm.
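A hypothetical sketch of this identity-to-type resolution, with a local lookup table and a stand-in for the request to the remotely located server (all identifiers are invented):

    # Step S2 as sketched here: the attachment reports an identity over the
    # connector; the type is resolved locally when known, otherwise via the server.
    LOCAL_TYPE_TABLE = {"TE-0001": "display", "TE-0002": "camera"}

    def query_server(identity):
        # Stand-in for a network request to the remote server's registry.
        return {"TE-0003": "fan"}.get(identity, "unknown")

    def determine_attachment_type(identity):
        return LOCAL_TYPE_TABLE.get(identity) or query_server(identity)

    print(determine_attachment_type("TE-0001"))  # display (resolved locally)
    print(determine_attachment_type("TE-0003"))  # fan (resolved via the server)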
In addition, the control unit may in some embodiments provide the data relating to the user behavior to the remote server, where the remote server inputs the data relating to the user behavior into a machine learning model for determining a user desire. This determination may however be at least partly (or fully) performed locally at the user interactive electronic system 100. Based on the type of attachment and the determined user desire, at least one of the remote server and the user interactive electronic system 100 may determine or form, S3, a movement pattern for the user interactive electronic system 100. The movement pattern may for example relate to extending the distal end of the robotic arm, and thus the attachment 104, in a direction where the user has previously been detected. For example, in case the user has provided a voice input to a microphone comprised with the user interactive electronic system 100, for example with a request to order a home delivered pizza, the robotic arm may, once the pizza order has been transmitted to the pizza restaurant, move, S4, in a "dancing manner" while showing a countdown timer at the display element indicating the time until the pizza will be delivered. The movement of the robotic arm will as such be dependent on the user desire.
The control functionality of the present disclosure may be implemented using existing computer processors, or by a special purpose computer processor for an appropriate system, incorporated for this or another purpose, or by a hardwired system. Embodiments within the scope of the present disclosure include program products comprising machine- readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a machine, the machine properly views the connection as a machine-readable medium. Thus, any such connection is properly termed a machine-readable medium.
Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
Although the figures may show a sequence, the order of the steps may differ from what is depicted. Also, two or more steps may be performed concurrently or with partial concurrence. Such variation will depend on the software and hardware systems chosen and on designer choice. All such variations are within the scope of the disclosure. Likewise, software implementations could be accomplished with standard programming techniques with rule based logic and other logic to accomplish the various connection steps, processing steps, comparison steps and decision steps. Additionally, even though the present disclosure has been described with reference to specific exemplifying embodiments thereof, many different alterations, modifications and the like will become apparent to those skilled in the art.
In addition, variations to the disclosed embodiments can be understood and effected by the skilled addressee in practicing the claimed present disclosure, from a study of the drawings, the disclosure, and the appended claims. Furthermore, in the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality.

Claims

1. A user interactive electronic system, comprising:
- a base device comprising a controllable robotic arm, the robotic arm having a proximate end and a distal end,
- at least an attachment adapted to be detachably affixed to the distal end of the robotic arm, the attachment being one of a plurality of different types of attachments,
- a sensor arrangement adapted to acquire information indicative of a user behavior, the user behavior being selected from a group comprising the presence of a user, a location of the user, an identity of the user, a movement of the user, a gesture of the user, a voice of the user, and
- a control unit arranged in communication with the sensor arrangement and adapted to control movement of the robotic arm,
wherein the control unit is further adapted to:
- receive the information indicative of the user behavior from the sensor arrangement,
- determine the type of attachment affixed to the robotic arm,
- receive or form a movement pattern based on the type of attachment and a user desire derived from the user behavior, and
- control the robotic arm based on the movement pattern.
2. The system according to claim 1, wherein the robotic arm comprises a plurality of connected segments.
3. The system according to claim 2, wherein the plurality of segments are connected to each other using at least one joint.
4. The system according to any one of the preceding claims, wherein the sensor arrangement comprises at least one of a camera and a microphone.
5. The system according to any one of the preceding claims, further comprising at least one of a speaker element and a display element.
6. The system according to any one of the preceding claims, further comprising means for allowing networked communication with a remote server, wherein the information indicative of the user behavior is provided to the server and the server is adapted for determining the user desire.
7. The system according to any one of the preceding claims, wherein the movement pattern is different for different types of attachments for a corresponding user desire.
8. A computer implemented method for operating a user interactive electronic system, the system comprising:
- a base device comprising a controllable robotic arm, the robotic arm having a proximate end and a distal end,
- at least an attachment adapted to be detachably affixed to the distal end of the robotic arm, the attachment being one of a plurality of different types of attachments,
- a sensor arrangement adapted to acquire information indicative of a user behavior, the user behavior being selected from a group comprising the presence of a user, a location of the user, an identity of the user, a movement of the user, a gesture of the user, a voice of the user, and
- a control unit arranged in communication with the sensor arrangement and adapted to control movement of the robotic arm,
wherein the method comprises:
- receiving the information indicative of the user behavior from the sensor arrangement,
- determining the type of attachment affixed to the robotic arm,
- receiving or forming a movement pattern based on the type of attachment and a user desire derived from the user behavior, and
- controlling the robotic arm based on the movement pattern.
9. The method according to claim 8, further comprising:
- establishing a network connection between the control unit and a remotely located server,
- providing the information indicative of the user behavior to the server, and
- determining, at the server, the user desire based on the information indicative of the user behavior.
10. The method according to any one of claims 8 and 9, further comprising:
- providing a plurality of different types of attachments,
- selecting one of the different types of attachments, and
- affixing the selected one attachment to the distal end of the robotic arm.
11. A computer program product comprising a non-transitory computer readable medium having stored thereon computer program means for a control unit adapted for controlling a user interactive electronic system, the user interactive electronic system comprising:
- a base device comprising a controllable robotic arm, the robotic arm having a proximate end and a distal end,
- at least one attachment adapted to be detachably affixed to the distal end of the robotic arm, the attachment being one of a plurality of different types of attachments,
- a sensor arrangement adapted to acquire information indicative of a user behavior, the user behavior being selected from a group comprising the presence of a user, a location of the user, an identity of the user, a movement of the user, a gesture of the user, a voice of the user, and
- a control unit arranged in communication with the sensor arrangement and adapted to control movement of the robotic arm,
wherein the computer program product comprises:
- code for receiving the information indicative of the user behavior from the sensor arrangement,
- code for determining the type of attachment affixed to the robotic arm,
- code for receiving or forming a movement pattern based on the type of attachment and a user desire derived from the user behavior, and
- code for controlling the robotic arm based on the movement pattern.
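For illustration only, the following minimal Python sketch shows one way the control flow recited in claims 1, 8 and 11 could be realized. Every name in it (UserBehavior, MOVEMENT_PATTERNS, control_step, and the callables) is hypothetical and not taken from the application; the dictionary of patterns simply reflects claim 7's point that the same user desire maps to a different movement pattern for a different attachment type.

```python
# Illustrative sketch only -- every name here is a hypothetical assumption.
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple


@dataclass
class UserBehavior:
    """Information indicative of a user behavior, as acquired by the sensor arrangement."""
    presence: bool
    location: Tuple[float, float]
    gesture: str          # e.g. "wave"
    voice_command: str    # e.g. "play music"


# Movement patterns keyed by (attachment type, user desire); the same desire
# yields a different pattern for a different attachment (cf. claim 7).
MOVEMENT_PATTERNS: Dict[Tuple[str, str], List[str]] = {
    ("lamp", "greet"): ["raise_arm", "tilt", "lower_arm"],
    ("speaker", "greet"): ["rotate_base", "nod"],
    ("camera", "follow"): ["track_user"],
}


def derive_user_desire(behavior: UserBehavior) -> str:
    """Derive a user desire from the observed user behavior."""
    if behavior.voice_command or behavior.gesture == "wave":
        return "greet"
    return "follow" if behavior.presence else "idle"


def control_step(
    read_sensors: Callable[[], UserBehavior],
    detect_attachment: Callable[[], str],
    actuate: Callable[[str], None],
) -> None:
    """One pass through the steps recited in claims 1, 8 and 11."""
    behavior = read_sensors()          # receive information from the sensor arrangement
    attachment = detect_attachment()   # determine the type of attachment affixed to the arm
    desire = derive_user_desire(behavior)
    pattern = MOVEMENT_PATTERNS.get((attachment, desire), [])  # receive or form a movement pattern
    for movement in pattern:           # control the robotic arm based on the pattern
        actuate(movement)
```

A caller would bind read_sensors, detect_attachment and actuate to the actual hardware; the dictionary stands in for whatever store of predefined or dynamically formed movement patterns an implementation might use.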
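Claims 6 and 9 additionally recite providing the user-behavior information to a remote server that determines the user desire. A hedged sketch of that round trip follows; the endpoint, payload shape and "user_desire" response field are assumptions made for the example, not details from the application.

```python
# Illustrative sketch only -- the endpoint, payload shape and response
# field ("user_desire") are hypothetical assumptions.
import json
import urllib.request


def determine_desire_remotely(behavior: dict, server_url: str) -> str:
    """Provide the user-behavior information to a remote server and
    receive back the user desire it determined (cf. claims 6 and 9)."""
    request = urllib.request.Request(
        server_url,
        data=json.dumps(behavior).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)["user_desire"]
```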
PCT/SE2018/051138 2017-11-13 2018-11-07 User interactive electronic system and method for controlling a robotic arm WO2019093952A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN202311646130.8A CN117798903A (en) 2017-11-13 2018-11-07 User interaction electronic system and method for controlling a robotic arm
CN201880071698.2A CN111344117A (en) 2017-11-13 2018-11-07 User interactive electronic system and method for controlling a robotic arm
US16/762,338 US11584018B2 (en) 2017-11-13 2018-11-07 User interactive electronic system and method for controlling a robotic arm
EP18877044.0A EP3710205A4 (en) 2017-11-13 2018-11-07 User interactive electronic system and method for controlling a robotic arm

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE1751404 2017-11-13
SE1751404-3 2017-11-13

Publications (1)

Publication Number Publication Date
WO2019093952A1 true WO2019093952A1 (en) 2019-05-16

Family

ID=66438967

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2018/051138 WO2019093952A1 (en) 2017-11-13 2018-11-07 User interactive electronic system and method for controlling a robotic arm

Country Status (4)

Country Link
US (1) US11584018B2 (en)
EP (1) EP3710205A4 (en)
CN (2) CN117798903A (en)
WO (1) WO2019093952A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD966279S1 (en) * 2020-06-18 2022-10-11 Moiin Co., Ltd. Holder for a display panel

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2281667B1 (en) * 2005-09-30 2013-04-17 iRobot Corporation Companion robot for personal interaction
US9475198B2 (en) * 2014-12-22 2016-10-25 Qualcomm Incorporated System and method for dynamic robot manipulator selection

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120185096A1 (en) * 2010-05-20 2012-07-19 Irobot Corporation Operating a Mobile Robot
US20140277735A1 (en) 2013-03-15 2014-09-18 JIBO, Inc. Apparatus and methods for providing a persistent companion device
US9687982B1 (en) 2015-05-27 2017-06-27 X Development Llc Adapting programming of a robot and/or control of the robot based on one or more parameters of an end effector of the robot
US20170190050A1 (en) * 2015-08-24 2017-07-06 Daniel Cookson Robot with hot-swapped end effectors
WO2017169826A1 (en) * 2016-03-28 2017-10-05 Groove X株式会社 Autonomous behavior robot that performs welcoming behavior

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3710205A4

Also Published As

Publication number Publication date
US11584018B2 (en) 2023-02-21
CN117798903A (en) 2024-04-02
CN111344117A (en) 2020-06-26
US20200338746A1 (en) 2020-10-29
EP3710205A4 (en) 2021-09-08
EP3710205A1 (en) 2020-09-23

Similar Documents

Publication Publication Date Title
JP7411133B2 (en) Keyboards for virtual reality display systems, augmented reality display systems, and mixed reality display systems
AU2020201306B2 (en) Implementation of biometric authentication
US11561519B2 (en) Systems and methods of gestural interaction in a pervasive computing environment
JP7341166B2 (en) Transmode input fusion for wearable systems
KR102143148B1 (en) Implementation of biometric authentication
CN107577229B (en) Mobile robot, movement control system, and movement control method
US20180006840A1 (en) Wearable device and controlling method thereof, and system for controlling smart home
JP6048321B2 (en) Self-propelled working device and program
US10212040B2 (en) Troubleshooting voice-enabled home setup
AU2018258679A1 (en) Light-emitting user input device
US20170364239A1 (en) Application icon customization
CN102301312A (en) Portable Engine For Entertainment, Education, Or Communication
CN104159360A (en) Illumination control method, device and equipment
CN104869304A (en) Method of displaying focus and electronic device applying the same
CN107969150A (en) Equipment for aiding in user in family
US9548012B1 (en) Adaptive ergonomic keyboard
KR102118054B1 (en) remote controller for a robot cleaner and a control method of the same
US11584018B2 (en) User interactive electronic system and method for controlling a robotic arm
JP2017012691A (en) Rehabilitation support device, rehabilitation support system, rehabilitation support method and program
US20240094888A1 (en) Method and apparatus for controlling devices
JP6666955B2 (en) Method for grasping space where electronic device is located using charger of electronic device, electronic device and charger
US20190364114A1 (en) Information processing apparatus, information processing system, and non-transitory computer readable medium
JP6436215B2 (en) Working device and working system
JP6264433B2 (en) Work equipment
JP2021145198A (en) Equipment operation system by eye contact

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 18877044
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
ENP Entry into the national phase
    Ref document number: 2018877044
    Country of ref document: EP
    Effective date: 20200615